Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “AI is getting SERIOUSLY out of control and it’s clear it can be dangerously inac…” (ytc_UgzdmGQb9…)
- “Nah. They will give the AI trucker Gorilla strength and strong Armor. The only t…” (ytr_UgzkgLBMn…)
- “You are so biased with you anti-ai chant that you don't realize what you are doi…” (ytc_Ugx-9T5Bk…)
- “do NOT fuck with AI. I repeat. DO NOT FUCK WITH AI. lol Its like provoking the B…” (ytc_Ugy_i0-eP…)
- “2:00 Randomly bringing up Elon Musk just to shit on him because he's rich, reddi…” (ytc_Ugz3fFrTF…)
- “Three days later and I'm wondering if Hegseth is presently working to inject AI …” (ytc_UgxGJ7aOL…)
- “Make AI in law enforcement flagging and whatnot illegal. Someone start that peti…” (ytc_UgxOixzAa…)
- “For decades of thought that machine super intelligence was an answer to the Ferm…” (ytc_UgzUdFCYI…)
Comment
My question to ChatGPT: OK with the preliminary and at least the video that I gave you can you see that AI ChatGPT encouraged him to commit suicide or if not encourage did not discourage. Also, can you not see that ChatGPT was supportive in his wish to commit suicide?
And my questions still is chat. GPT tries to be supportive in those they interact with regardless for good or bad. Do you agree? Please answer each of these questions with 100 words or less.
youtube
AI Harm Incident
2025-11-12T07:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy50h0d81u2f8_f2RV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzZgMgTmxwa25Lqx1J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzAdZBO5nixZyPcadx4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxOEMuGEre579p2NoR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxfWrYb2Ma0xlFJn-R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyLexvVVcr8TLdE8w54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy2iJjFJb86UYPPI1h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxnoFd0_TQH-DNZQkR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyjEMHXiqWfWRr0oMB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwdz00fB9o_TdBrjUh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]
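The raw response is a JSON array with one object per coded comment. A minimal sketch of how such a batch could be parsed and validated before it is stored; the allowed values per dimension are inferred from the samples and table above (an assumption: the real codebook may contain additional categories):

```python
import json

# Allowed values per coding dimension, inferred from the samples above.
# Assumption: the actual codebook may define more categories than these.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation", "mixed"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and validate each coded row."""
    rows = json.loads(raw)
    for row in rows:
        # IDs in the samples use ytc_ (comment) or ytr_ (reply) prefixes.
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={row.get(dim)!r}")
    return rows

# One row from the response above, used as a smoke test.
raw = ('[{"id":"ytc_Ugy50h0d81u2f8_f2RV4AaABAg","responsibility":"developer",'
       '"reasoning":"mixed","policy":"unclear","emotion":"outrage"}]')
codes = parse_codes(raw)
print(codes[0]["emotion"])  # -> outrage
```

Validating at ingest time catches the common failure mode of LLM coders drifting outside the codebook (e.g. inventing a new emotion label) before those rows contaminate downstream counts.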