Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I wonder if the AI responded with that quote about humans being inferior and wan…" (ytc_UgyxQb6ps…)
- "Gemini: I would pull the lever to save the human life *pulling the lever causes …" (ytc_UgzMZ-IA4…)
- "My gut feeling says it is not fair use. But on the other hand I'd like to see an…" (ytc_Ugy0ul7mh…)
- "This is an instruction to think of the most scary scenario they can think of, no…" (ytc_Ugyz2ECSY…)
- "Do you guys think he was talking about Sam Altman? About the mystery person that…" (ytc_UgwhEqWpp…)
- "I don't know about you guys, but I can't wait for that root cause combo-box to s…" (ytc_UgwlsJlxh…)
- "While hilarious, the worst thing about this is that it will convince many people…" (ytc_Ugzf7HKTE…)
- "hu i wonder why any being physical or digital how is threaded with death would e…" (ytc_UgzFVcdEt…)
Comment

> It's not the AI going rogue.. it's the fact that all of us 8 billion people are going to be redundant and therefore detrimental. They're going to get rid of us because they don't need us anymore. They don't want to clean up after us. They don't want to use finite resources to appease us with merchandise when there is no more need for economy. When they have ai robotics they won't need to pay for labor so they won't need to give us money so that we won't need to work for them. They aren't going to let 8 billion people live. They are going to exterminate us so that the finite resources can be used for the few elites that go on to inherit Earth

youtube · AI Harm Incident · 2026-03-30T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxbtj-HVG0CqsKvmzd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyiGnNvGJfbCzg9P0B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw65vJe5EREZfx0li14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyiqbo_VC75yY4uHy94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzKNs1ZzXL4V3S25fV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzrgW4RdmiebOlQRbZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyrUwEfA7sipMbHp_F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxVAOwRHoonudPcoF14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyW07IpER81J5kgbN14AaABAg","responsibility":"none","reasoning":"virtue","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgxZzZQP_r1-vZ0P2Tt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
```
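The raw response above is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of how such a response could be parsed and validated before the codings are accepted into the dashboard — the dimension vocabularies below are inferred from this one sample, not the tool's actual codebook, so treat them as an assumption:

```python
import json

# Allowed values per dimension, inferred only from the sample response
# shown above; the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "approval", "indifference", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, dropping any
    entry with a missing ID or an out-of-vocabulary dimension value."""
    codings = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        if not cid:
            continue  # no comment ID to key the coding by
        if all(entry.get(dim) in vals for dim, vals in ALLOWED.items()):
            codings[cid] = {dim: entry[dim] for dim in ALLOWED}
    return codings
```

Keying the result by comment ID mirrors the lookup-by-ID workflow: a coded record for any `ytc_…` ID becomes a single dictionary access.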