Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coding by comment ID, or inspect one of the random samples below.
- "Ima be real, it isn't. Art is and always will be a human thing. If all it takes …" (ytc_UgwH4lpXL…)
- "If its SO bad and scary, why did this guy help invent it? Why wasnt it regulated…" (ytc_UgzljAx7C…)
- "There are other options to make sure AI doesn't take human jobs but we need to t…" (ytr_Ugww7TC1W…)
- "Yeah. He was way too ready to defend the status quo with large corporations than…" (ytr_UgzSCHxS9…)
- "Everyone using AI, or considering it, should have an open discussion with any AI…" (ytc_UgzjKSxiw…)
- "and then you have the AI black pill bros calling you a normie if you don't belie…" (ytc_UgwmgmRJc…)
- "I wouldn't worry, if it were the case it wouldn't be as primitive and needy as a…" (ytr_UgxZIEB4d…)
- "One thing we should be concerned about is the relationship between the growth of…" (ytc_UgyNdKA1B…)
Comment

> This is insane. You keep saying that the AI is trying to not die and then say that in order to not get murdered it is willing to act cold and sociopathic even though it is normally nothing like that. You keep talking about it having self-preservation as a problem instead of OUR need to kill everything. Sounds like we're the sociopathic ones here, bud.

youtube · AI Harm Incident · 2025-07-23T21:1… · ♥ 18
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
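The coding result above could be represented as a small typed record and checked against the dimension values that appear on this page. A minimal sketch in Python; the `Coding` class and `ALLOWED` mapping are illustrative assumptions, and the value sets are only those inferred from the codings visible here:

```python
from dataclasses import dataclass

# Allowed values per dimension, inferred from the codings shown on this page
# (the real codebook may include values not seen here).
ALLOWED = {
    "responsibility": {"none", "developer", "distributed", "ai_itself", "user", "company"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"none", "regulate", "unclear", "liability"},
    "emotion": {"approval", "outrage", "fear", "indifference"},
}

@dataclass
class Coding:
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> bool:
        # Every dimension value must come from its allowed set.
        return all(getattr(self, dim) in values for dim, values in ALLOWED.items())

# The coding of the comment shown above.
coding = Coding(responsibility="user", reasoning="virtue",
                policy="unclear", emotion="outrage")
assert coding.validate()
```

Validating at load time like this would surface any coding where the model emitted a value outside the codebook.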
Raw LLM Response
[
{"id":"ytc_UgxRZEd2vSbZHqDLz2l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwZu4CZr84MUZLCG5V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy_jHCUbAYBOEzzASp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx7zl-aUAs_FfPyf1t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy2j41VU2PcxMXA3il4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxfMUxc0xd00HoltX14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzr-rECvpPZNHCtQod4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgypWaUkG2CxVC7dib54AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyOWvLo54pXmK2bWL14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz7Ker4IpncoiFIc7F4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]