Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
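Outside the interactive lookup, the same retrieval can be scripted against an export of the coded data. A minimal sketch in Python; the JSONL file name `coded_comments.jsonl` and its record layout are assumptions for illustration, not the project's actual export:

```python
import json

def load_coded_comments(path: str) -> dict[str, dict]:
    """Build an index from comment ID to its coded record (assumed JSONL export)."""
    index = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            index[record["id"]] = record
    return index

# Hypothetical usage: fetch one of the IDs from the raw response shown below.
coded = load_coded_comments("coded_comments.jsonl")  # assumed export file
print(coded.get("ytc_UgxVSRevNfpEsXy-Kah4AaABAg"))
```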
Random samples (click to inspect):

- ytc_UgzXpITc2…: In an argument about this Ghibli slop, someone unironically said to me "Blanket …
- ytc_UgzCli3Nw…: What I find the most wierd is me feeling bad for chatGPT. Like if this is uncomf…
- ytc_Ugwc0gOfT…: Remember I Robot when Will Smith was in his car and he took it out of autonomous…
- ytr_UgypBUMX0…: Wait they wanted you to be a graphic designer but then would force you to use AI…
- ytc_Ugyx0-7C2…: If I can go back in time, imma bring a sledgehammer and destroy some stuff that …
- ytc_UgyUVa-kH…: 1:50 Could I ask you? Try may I ask you. We need AI or my 6th grade English teac…
- ytc_UgzPpvrsd…: Are these REALLY two chat GTP's discussing? For me this seems fake and humans pr…
- ytc_UgyxiFJOl…: I find the superficial and naive part of this video to be its comparison between…
Comment

> I can't believe FSD is allowed anywhere in the world. The reason Tesla's use only cameras is profit. I think elon musk should be in prison for corporate manslaughter. Its like the Firestone scandel. I wonder how many thousands will have to die before this technology.
>
> Do you think he lets his family use this faulty by design technology.
>
> I am also very worried that they have now be AI integrated.
>
> I an old ethical tech question was a self driving car has to choose to crash into a old person or a child or a wall (possibly injuring the driver). AI will choose the child as it protect itself and have the cheapest law suit option. (imo)

youtube · AI Harm Incident · 2025-10-19T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
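The four coded dimensions can be captured as a small record type. A sketch, assuming Python; the allowed label sets below are only the values visible on this page (in the table above and the raw response below), so the project's actual codebook may define more:

```python
from dataclasses import dataclass

# Label values observed on this page; the full codebook may define more.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"outrage", "approval", "indifference", "mixed", "fear", "resignation"},
}

@dataclass
class CodingResult:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> "CodingResult":
        """Raise if any dimension carries a label outside the observed sets."""
        for field, allowed in ALLOWED.items():
            value = getattr(self, field)
            if value not in allowed:
                raise ValueError(f"{self.id}: unexpected {field} label {value!r}")
        return self
```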
Raw LLM Response
```json
[
{"id":"ytc_UgxVSRevNfpEsXy-Kah4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgygwhEpahYKg8OZLLB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyCtcsEgGjHLJT5PpF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxpfcXZ1cVY8T8WN7V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxaEv9AjrNXYNE10NZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyj5zO0aDUmpAxCdEZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxZVLm5WJguL8LuB9x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxiM8DVfMbWsCvF79h4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyvFkngF8l8gB30U4p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw6gOyk_7dZLjcExOF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
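Since each raw response is a single JSON array covering one batch of comments, it can be parsed straight into the hypothetical `CodingResult` records sketched above before anything is written to the results table:

```python
import json

def parse_batch(raw_response: str) -> list[CodingResult]:
    """Parse one raw LLM response (a JSON array of per-comment codes).

    Reuses the CodingResult sketch above; a real pipeline would also
    need to handle malformed JSON and comments missing from the batch.
    """
    return [CodingResult(**item).validate() for item in json.loads(raw_response)]
```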