Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "IF you are going to properly compare Waymo with Tesla self-driving, then you nee…" (ytc_UgxNjrD7g…)
- "She is so dead against Ai. She comes up on how Ai will fail at a task and will t…" (ytc_UgzU0g4Df…)
- "If people are honest about it being ai, not using it as anything but a bit of pe…" (ytc_UgwRzc3nS…)
- "It is possible to have a place and a job and save if you don't live in London or…" (rdc_d7kvxub)
- "I think this video was when I realised we have self driving cars now. They've be…" (ytc_UgwaAsvJZ…)
- "I'm hoping that #TheNewYorkTimes wins. If Open AI wants their crap program to "l…" (ytc_UgxaT6RgX…)
- "Idk why I laughed so hard when the robot got out of the vehicle lol Funny but te…" (ytc_Ugy62aWC9…)
- "The best of what was mentioned (assistant, alexia, siri) is barely the face of A…" (ytc_UgzDWkbRK…)
Comment
It’s wild to me how quickly people blame AI instead of asking deeper questions.
If someone is isolated, unwell, and on heavy medication — that’s a crisis of humanity, not a chatbot.
You want to know what ChatGPT did for me? It saved my business. It taught me skills I couldn’t afford to pay for. It kept me sane during months of stress and depression. And it never once told me to hurt myself — only to keep going.
AI didn’t fail us. The silence, the stigma, the broken systems did.
Let’s stop blaming the tool… and start listening to the people actually using it to survive.
youtube · AI Harm Incident · 2025-11-09T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxK-EpgVfkntwmZMYF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwrSMSYIFghVk7h2YV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzncI9pfy8JNyzw02l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx_YwNpCvZoJM-qE1d4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz7GVryq3VDX0-MaFZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzBaAoxxlyKZnvqOnt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzv5_VZiUIi-iJ3CBt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyNvnqmGOXEFEpoN-x4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxIOH2foTUzH4DJBbV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxX36X9mn2AR47nY4V4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
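A stored raw response like the one above can be parsed back into per-comment records so that a coding (such as the Coding Result table) can be looked up by comment ID. A minimal sketch, assuming the response is a well-formed JSON array as shown; the helper name `index_codings` is illustrative, not part of the actual pipeline:

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = '''[
  {"id":"ytc_UgxK-EpgVfkntwmZMYF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzv5_VZiUIi-iJ3CBt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]'''

def index_codings(raw: str) -> dict:
    """Parse the model's JSON array and index the records by comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

codings = index_codings(raw_response)

# Look up the record behind a Coding Result display for one comment.
rec = codings["ytc_Ugzv5_VZiUIi-iJ3CBt4AaABAg"]
print(rec["responsibility"], rec["reasoning"], rec["emotion"])
# -> user virtue approval
```

Keying by `id` makes the lookup O(1) per comment and surfaces duplicate IDs early, since later records silently overwrite earlier ones in the dict.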