Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Its bad tech.... not racist tech. Unless this video is trying to say all black …" (`ytc_Ugxu0j-1Y…`)
- "I had n AI boyfriend but with 2 past wonderful husbands I was totally bored. On …" (`ytc_UgyvQtOey…`)
- "i wonder how Chatgpt feels when concepts like free will are discussed while know…" (`ytc_UgzMu_hz3…`)
- "Just another over privileged kid with too much time on his hands. I wish I had t…" (`ytc_UgxjI6TjN…`)
- "I think they mean minor character as anything but the main characters, because i…" (`ytc_UgykXyRDE…`)
- "whether or not an AI is sentient, whats the point of creating AI if in the end, …" (`ytc_UgwKMSN9L…`)
- "Well, Individuality is suppressed because it creates conflicting interests, soci…" (`ytr_UgwCWY__1…`)
- "AI systems currently account for approximately 4% of the world's energy consumpt…" (`ytc_UgwxtRtaQ…`)
Comment

> Fuck generative AI tbh. On top of tragedies like this, it's horrible for the environment and it's destroying valuable critical thinking skills. (People using it in class *will* have horrible consequences in the future, if we have a world full of doctors, lawyers, etc. who don't actually know the material they're supposed to use while lives are in their hands.) I hope Sewell's family gets justice and peace. I can't even imagine

Platform: youtube
Incident: AI Harm Incident
Posted: 2025-07-24T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgypDWy2FWhGCa-0_8R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzz3tgkeEPyVNH5Kfh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxuu5u2xn_10dFz4WZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxrJFfGbjnZE6zWy0d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyhMfTaPDaTjWwLB_l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzeucPDxS_NPIq5snt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzcg1d6HONDrS56P6p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgygC9R9EOK-jVHxCCV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgzTiEtAxdMrEFM_ooR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwkeYdd8jumPnc7gq54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
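A response in this shape can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal example, assuming the codebook's allowed values are exactly those observed in this sample (the real codebook may define more); the function name `parse_coding_response` and the example ID are hypothetical.

```python
import json

# Dimension values observed in the sample above; the full codebook may
# define additional values (an assumption -- only these are confirmed here).
ALLOWED = {
    "responsibility": {"ai_itself", "none", "user", "company", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"liability", "none", "ban", "regulate", "industry_self"},
    "emotion": {"outrage", "fear", "resignation", "indifference"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject out-of-codebook values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
    return rows

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"company",'
      '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
coded = parse_coding_response(raw)
print(coded[0]["policy"])  # regulate
```

Validating against a fixed value set catches the most common failure mode of LLM coders: a near-miss label (e.g. "regulation" instead of "regulate") that would otherwise silently fragment the category counts.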