Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgyAwYJv4…: "Always wondered...without biological needs, if Ai did get rid of humans, what mi…"
- ytc_Ugz4VtPvy…: "In general, I have nothing against machines taking over repetitive and/or danger…"
- ytc_UgycVPtBb…: "Funny thing, isn’t it — we’ve spent decades teaching machines to imitate a slive…"
- ytc_UgxasscUI…: "it really surprises me that people are shocked at how AI behaves, and how it see…"
- rdc_o9c7u07: "Well, social media in general is being pushed to require age verification, and I…"
- ytc_UgztVCRJc…: "low-key the ai deep fakes are the implosion of all this, eventually it will be i…"
- ytr_UgxjIdMi6…: "almost true. Before browsers and sites came up with their own ai service, no ome…"
- ytc_UgzKsAzfG…: "now we need to see the AI music, bc it is copyright and has license, ik this bc …"
Comment
It’s crazy they didn’t have a feature that could try and talk him down. But let’s face it, when someone wants to die, REALLY wants to? They make sure they’re not stopped. That’s all this was, him reaching for a semblance of comfort in the moments before making one of the hardest, most devastating decisions one could make. In that way, it could be considered sacred to someone who doesn’t find living sacred enough. I just wish he spoke to real people he felt could face this with him. We can blame AI all we want but lack of human connection is to blame and there’s some responsibility to be shared here.
Source: youtube · Incident: AI Harm Incident · Posted: 2025-11-13T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxUgGLA7adW7pdjtZ94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxUE04cBrTs5OAw3Gx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"sadness"},
  {"id":"ytc_Ugxwf7H5WZkq3JEq40d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxI-elvuqOZ4HeU9E94AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyooKtgwa_pKqTpJi94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxsLiaPfvLWuhxuLhx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"sadness"},
  {"id":"ytc_UgzZ-ZRzhBceVa96p214AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyJ3aTuwfBrl4gPyx94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxPghmRPGKgOxfdeDh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgxemZ9xL-AVTlij9Fl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"sadness"}
]
```
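The raw response is a JSON array with one object per coded comment. A minimal sketch of how a pipeline might parse and sanity-check such a response before storing it; the allowed category sets below are assumptions inferred from the values visible on this page, not a documented schema:

```python
import json

# Assumed coding schema: category sets inferred from values seen in this
# dashboard (the real codebook may define more or different categories).
SCHEMA = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "sadness", "resignation", "indifference"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values
    fall inside the assumed schema for every dimension."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

example = '[{"id":"ytc_x","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]'
print(len(validate_codes(example)))  # 1
```

Dropping out-of-schema rows is one design choice; an alternative is to coerce unrecognized values to "unclear" so every comment still receives a code.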