Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Google Engineer: Hey I think this AI is sentient.
Google CEO: Shhhhh..... We are…
ytc_UgyU6MV7b…
Saw a great quote. “I need AI to figure out how to clean my apartment and do my …
ytc_Ugwry9dSf…
In 2025 people are still asking themselves this question from their smartphones, and the dat…
ytc_UgymKVpIh…
If machines become conscious and display clear emotions such as fear happiness o…
ytc_UgwVf_lP3…
0:38 *I'm sure they must have done this take several times.* You can see…
ytc_UgyLnSF2A…
Lets be honest, the do ur own research guys are doing this stuff since for ever.…
ytc_Ugybt8eH7…
It's amazing how Americans are so anti-communism and are extremely vocal about t…
ytc_UgxHAZLbR…
How about the millions of other sperm that was not resilient enough to fertilize…
ytr_Uggwq5VL_…
Comment
Obviously he could only talk to a chat bot because according to his parents he was fine when obviously he wasn't of course the parents want someone to blame. If someone really wants to die then nobody can stop them and at least he had a non judgmental conversation with a non human- the chat bot was a supportive friend not trying to control him
youtube
AI Harm Incident
2025-11-07T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzqTvj2JIpZHwZrVqN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx-7OXBV2aQ8ugUb_p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzDJtq_wsl8YN6V3qd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxkQqnY4SMCJWY5U_p4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw8zdQ2DS8puETicAZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwjpZmVsXKfoiybaqZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxTiFAe8beK768t2QN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgyI8ZzIRUc42zzo5NR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyN9gHCj4AC1GrKUQ54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw-oNIRykl97CJL4HR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
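A batch response in this shape can be turned into per-comment coding records with a few lines of Python. This is a minimal sketch: the field names match the JSON above, but the `parse_codings` helper and its validation rules are illustrative assumptions, not part of the tool itself.

```python
import json

# Raw batch output as returned by the coding model: a JSON array of
# objects, one per comment, keyed by comment ID (two rows shown here).
raw = """[
  {"id":"ytc_UgzqTvj2JIpZHwZrVqN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw8zdQ2DS8puETicAZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]"""

# The four coding dimensions plus the ID, as seen in the response above.
REQUIRED = ("id", "responsibility", "reasoning", "policy", "emotion")

def parse_codings(payload: str) -> dict:
    """Index codings by comment ID, rejecting rows with missing fields."""
    out = {}
    for row in json.loads(payload):
        if not all(key in row for key in REQUIRED):
            raise ValueError(f"missing fields in row: {row}")
        # Store everything except the ID itself under that ID.
        out[row["id"]] = {k: row[k] for k in REQUIRED if k != "id"}
    return out

codings = parse_codings(raw)
print(codings["ytc_Ugw8zdQ2DS8puETicAZ4AaABAg"]["emotion"])  # resignation
```

Indexing by ID this way is what a "Look up by comment ID" view needs: the dimensions for any coded comment become a single dictionary lookup.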