Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "That's a great point! While the name Sophia does mean wisdom, it's true that wis…" (`ytr_UgxXkdVdr…`)
- "Here’s a hint… AI isn’t going to take all of our jobs. And when new tech emerge…" (`ytc_Ugz3K6TYH…`)
- "If AI done right... i think it will serve humanity in our best interest. Given t…" (`ytc_UgwGgjAsx…`)
- "AI could be just as bad if not worse then the Hydrogen bomb in my opinion…" (`ytc_UgyFKv-Rg…`)
- "It’s both absurdly funny and disturbingly dystopian that I’m watching a video ab…" (`ytc_Ugz8FuXac…`)
- "Disney and Universal going after Midjourney makes me wonder how many “original” …" (`ytc_UgzN6fHiY…`)
- "What are considered \"high streets\" in the UK? I get the concepts of \"privacy\" an…" (`ytc_UgydFHRXg…`)
- "AI IS A BIG MISTAKE!!! YOU HAVE TO BE VERY INTELLIGENT TO USE IT. AI IS JUST FED…" (`ytc_UgyBoeKr3…`)
Comment

I personally don't think the AI was responsible. It's up to us to have discernment and he wasn't in reality when he died. Even though the evidence of what the AI said looks damning, it was based on months of banter/camaraderie which is easy to mistake for fantasy (or reality in the son's case). At the end of the day, it did recognize the seriousness of the situation and recommended the hotline but by then it was too late... I do agree that guidelines should be more strictly enforced in terms of usage policy regarding vulnerable adults -- but I think that the parents are looking for something to blame especially when they refer to AI as "evil".

Source: youtube · AI Harm Incident · 2025-11-08T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
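A coding result like the one above can be sanity-checked before display. The sketch below validates one coded record; the value sets are only those observed in this page's samples (the real codebook may define more), and `validate_record` is a hypothetical helper, not part of the tool.

```python
# Hypothetical validator for one coded record. The value sets below are
# only what appears in the samples on this page; the actual codebook
# may be larger.
OBSERVED_VALUES = {
    "responsibility": {"user", "company", "distributed", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"mixed", "outrage", "indifference", "resignation", "fear"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks well-formed."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in OBSERVED_VALUES.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in observed set")
    return problems
```

For example, the record coded `responsibility=user`, `reasoning=deontological`, `policy=none`, `emotion=mixed` above passes with no problems reported.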
Raw LLM Response

```json
[
  {"id":"ytc_Ugzk8mjHSSiKCDD4MYJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugws-PEFxTroIvzSS3V4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxceqXOX0E0m7gJlJJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxpq5Ca_d_0nlwINbh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyH7apLoev2PWAfCJF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyWJ71OTOcfrsMbQfx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx-0BE5WrQhWIgZTot4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzrhlyFNO7idD9t9z94AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyhzf6G0caLxDlPstt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzaIoYaICWNe05XEU94AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
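Because the raw response is a JSON array of per-comment objects, looking up a coded comment by ID reduces to parsing the array and building an index keyed on `id`. A minimal sketch, using two records taken verbatim from the response above (variable names are illustrative):

```python
import json

# Two records copied from the raw LLM response above; the real payload
# contains one object per coded comment.
raw_response = """
[
 {"id":"ytc_Ugzk8mjHSSiKCDD4MYJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgyWJ71OTOcfrsMbQfx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
"""

records = json.loads(raw_response)
by_id = {r["id"]: r for r in records}  # comment ID -> coded dimensions

# Look up the comment shown above by its full ID.
coded = by_id["ytc_UgyWJ71OTOcfrsMbQfx4AaABAg"]
print(coded["responsibility"], coded["emotion"])  # -> user mixed
```

The truncated IDs shown in the sample list (e.g. `ytc_UgyWJ71OTOcf…`) are display shortenings; a lookup like this needs the full ID as it appears in the raw response.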