Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

| Sample (truncated) | Comment ID |
|---|---|
| The actual and the important question is quite simple: Do we go full Ergo Proxy,… | ytc_UgjgEocxI… |
| Who the fu*k will go to automated restaurant and those ai chef cooking without k… | ytc_Ugw4aUsm5… |
| Tbh you should stop hating every (almost) single job is disappering bc of a.i (i… | ytr_UgzJfmeNe… |
| Haha, that's a clever one! Sophia's playful banter about wisdom definitely shows… | ytr_Ugw8oPiUj… |
| They wanted their cake and to eat it too. They wanted to be able to easily work … | rdc_fwhua7y |
| Would you like to destroy humanity one day? Robot: WHAT??? No, I don’t want to d… | ytc_Ugym6Y43E… |
| My autistic ass is cooked. I will not be able to tell human and robot apart, I a… | ytc_UgweQ7kai… |
| A couple of months ago I knew nothing about AI. Then I started to ask it to desc… | ytc_UgwQTJUc8… |
Comment
we cannot let any ai take control of something or have any sort of leverage because if they do we are fucked straight up, i dont get why people want walking robots amongst us when this is literally proof that they are a walking danger to us humans.
youtube · AI Harm Incident · 2025-09-10T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyDhMUxrOFS5Tl-C114AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxmYtal7_GMz0mQ7OR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgygJZY_v2OL7uPeK5J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugys6OpKsyGsTr8MRmh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyxUGGW5Pr1bv3fQK94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwzb8I-WWwlPz9yDbl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz7-LXpx6emjn0QO1F4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyLAad1baZrywknuQp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyJDtt1KKrl8oJ_oqJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzT473WlishavgK60t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
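The raw response above is a JSON array with one object per coded comment, keyed by comment ID, with one value per coding dimension. A minimal sketch of parsing and validating such a batch, assuming the set of allowed values observed in the examples on this page (the actual codebooks may define more categories):

```python
import json

# Allowed values per dimension, as observed in the examples above.
# This is an assumption for illustration; the real codebooks may be larger.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "fear", "mixed", "outrage"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codes by comment ID.

    Raises ValueError if a row is missing a dimension or uses a value
    outside the allowed set, so malformed model output fails loudly.
    """
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: invalid {dim!r} value {row.get(dim)!r}"
                )
        coded[row["id"]] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Example: one row from the batch shown above.
raw = ('[{"id":"ytc_UgxmYtal7_GMz0mQ7OR4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"ban","emotion":"fear"}]')
codes = parse_batch(raw)
```

Indexing by ID this way is what lets the dashboard look up the exact model output for any coded comment.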