Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Good vid. With text it gets even sillier: I can write a novel in Arabic with an …" (`ytc_UgyOsrCOd…`)
- "Watch Bicentennial Man. Only when you have seen that film will you change your o…" (`ytr_UghYuSBQJ…`)
- "10:47 hahahahah Caterpillars don't jump from tree branch to tree branch. Hope …" (`ytc_Ugy59ZszC…`)
- "Believe it or not, some people are. I'm not one of them, as I'm an artist myself…" (`ytr_Ugxo3QVNa…`)
- "The whole arms race between actual artists and AI artists to make tools to count…" (`ytc_UgxGx6tNs…`)
- "if it's a manual you only press the clutch snd brake but in automatic cars it's …" (`ytc_UgxHg0mQZ…`)
- "I have no need for AI , I use my wife's intelligence, Don't believe me she's tha…" (`ytc_UgzB2LjYi…`)
- "The editing is pretty good, even tho you can still see the blur around the robot…" (`ytc_UgyAZNjPE…`)
Comment
Ngl I would just treat the robot as a equal if we don’t want a mechine up rising to be fair, I think a future of our machine and it’s creators coexist with a better one because what’s the say human beings and machines have a huge relationship in a way because of we created them we gave them the program and the come alive sure they don’t have a soul but if their programming is very well self-aware and moral then still doesn’t count as well assault but they have the mind of a human so treat machinery, like it’s your son or daughter or mother or grandparents all the best friend, because left to say I Think I prefer a zombie apocalypse instead of a robot apocalypse
youtube · AI Harm Incident · 2025-07-15T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgzIRYP8Wr_PuUa730p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyiMzXEi1osMGPQesB4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyrrDofqJe8eMOWcyV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzJB_XZHvCwIqFzP854AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzfHTFc1cfbHN8LBeN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx5mPL24_iJb86ohsR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxpsJbM-4gDbRGvPZl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxbn10fqyeOh-nMud14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyU-BEv9ErXajDqtO54AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx4OeFibnX1Pr-cTbx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}]
```