Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Because for better or worse humans can’t resist the unknown and let it be. Flyi… (rdc_oi0x0b5)
- Fear mongering on AI..6 months these guys will change their tune...ott finished … (ytc_Ugyy1nYLJ…)
- I'm a retired Maintenance Electrician from a Steel Mill. There is in no way th… (ytc_UgwUEES1e…)
- Exactly, and for an AI trained on Internet data, there is far more instances of … (rdc_kp0rre5)
- All the information used right now by AI is writhed or developed by humans. Sou … (ytc_UgwCLwpSY…)
- The experts that agree with AI "I am alive"... Find out who pays them, then re-… (ytc_UgxK8Oi6y…)
- Someone mentioned "Don't look up". I think a scenario where AI wipes people out … (ytc_UgyhdqRoD…)
- 02:27 “Without pain or pleasure, there's no preference, and rights are meaningle… (ytc_UgyDLUjEe…)
Comment
Human civilization consists of individual humans. Some of their decisions are wise and ethical, but great majority of human decisions are dictated by evolutionary randomness and intellectual short-sightedness. Still, our civilization is organized into institutions that are not as stupid as a random sample of individuals. Humans are currently crucial at teaching and testing AI ethics. Once AI is better at ethics than AI-less part of humanity as a whole, I do not see a reason why we should remain much more valuable than gorillas.
Platform: youtube | Topic: AI Governance | Posted: 2025-12-10T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxOJV2tBuwHtVW04ft4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"frustration"},
  {"id":"ytc_UgwBO7nHKh19jrnHqTZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxUWAa0kQDBr4n_r_94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx-j8oIbBt7y6qbyol4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgwHM7-46f0mdspqQ994AaABAg","responsibility":"investor","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxlocDupn-kUK_ZRhd4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw2Sks2eioeIZvULnB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzGDer4ewj9rBn5hj54AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugx2GcL9O8E5IytzG9h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyX6rXQCcRBysgEW-Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```
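Looking up one comment's coding in a raw batch response like the one above amounts to parsing the JSON array and indexing it by `id`. A minimal sketch of that lookup (the actual pipeline's parsing code is not shown here; `raw_response` below reuses two rows from the response above for illustration):

```python
import json

# Two rows excerpted from the raw batch response shown above.
raw_response = """
[
  {"id": "ytc_Ugx-j8oIbBt7y6qbyol4AaABAg",
   "responsibility": "distributed", "reasoning": "mixed",
   "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgwHM7-46f0mdspqQ994AaABAg",
   "responsibility": "investor", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]
"""

# Index the batch by comment ID so a single coding can be retrieved.
codings = {row["id"]: row for row in json.loads(raw_response)}

print(codings["ytc_Ugx-j8oIbBt7y6qbyol4AaABAg"]["policy"])  # liability
```

This matches the Coding Result table above: the displayed comment's row (distributed / mixed / liability / indifference) is the one keyed by its `ytc_…` ID.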