Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
"mark my words"🇬🇧☕
"AI is far more dangerous"🇺🇸🇺🇲🦅🦅🦅🤫
"Than nukes"🇦🇺🙃
…
ytc_Ugx4QIrgL…
I am with him I love AI also I’m also a PG scientist and I love AI but you when …
ytc_UgyW8-mNA…
"AI bros were just found dead in a ditch"
My genuine reaction to that sentence: …
ytc_UgxkgJwJa…
I'm increasingly convinced that human beings need to be the ones to curate the i…
rdc_mk7dbta
SCI FI LOVING WARCRAFT PLAYING INTROVERTED DORKS ARE CREATING A.I. WILL A.I. BE …
ytc_UgyrRY3yM…
Is humanity really going settle for less just to get more,..... what if I don't …
ytc_Ugzrj7vgU…
Artist: "Like who?"
AI Bro: "LAZY PEOPLE!"
In this episode of What if AI Bros…
ytc_UgwVV4laA…
That's an interesting perspective! In the video, Sophia emphasizes her continuou…
ytr_Ugzq3UhrE…
Comment
Most of you aren’t smart, so the idea that a machine that can learn will be smarter than you is not alarming. I think it’s good that soon you’ll finally realize how insignificant you actually are, and how much time you wasted during your life chasing insignificant minutiae. AI doesn’t concern me. Natural stupidity is far more dangerous. And you people are historically stupid.
youtube
AI Governance
2023-05-02T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwldno69nkJe-0F4mB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx7rLGM_eHIsuroOIZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyzusTvCUEXQPRQyFB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzJ2kLMIFNvth7KQzJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyYx0V58M6-hJO2p614AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzlIjG9x6cCOihLHg94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgypEQbShRpLVO0Smwd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzNBL5lvgyylid6J9d4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgwYhSLX6Faguk9YoPh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgziSRqDbEvsbxn-juh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
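A batch like the one above can be sanity-checked before its labels are written into the coding table. The sketch below is a minimal validator; the allowed label sets are inferred solely from the values visible in this sample batch (the project's full codebook may define more categories), so treat `SCHEMA` as an assumption, not the definitive vocabulary.

```python
import json

# Allowed values per coding dimension — inferred from this sample batch only;
# the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"none", "user", "ai_itself", "distributed", "government", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"approval", "fear", "indifference", "outrage"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against SCHEMA.

    Raises ValueError on a missing 'id', a missing dimension, or an
    out-of-vocabulary label, so a malformed batch is caught before it
    reaches the coding table.
    """
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec}")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {value!r}")
    return records

# One record from the batch above, passed through the validator.
raw = (
    '[{"id":"ytc_UgzJ2kLMIFNvth7KQzJ4AaABAg","responsibility":"none",'
    '"reasoning":"virtue","policy":"none","emotion":"outrage"}]'
)
batch = validate_batch(raw)
print(batch[0]["emotion"])  # outrage
```

A record that uses a label outside the inferred vocabulary (e.g. `"responsibility": "alien"`) raises `ValueError` instead of silently entering the table.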