Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgzJP2L5O… · The thing is, if the people who uses AI actually wants to do art so badly, they …
- ytc_UgzLSI4Tq… · These robots are very creepy to me, I would be afraid to sleep with a robot in t…
- ytc_Ugws05jr2… · 3:34 says so by the label "Harmful if swallowed." The AI chat bot really got int…
- ytc_UgjQ6Qxet… · If its a car that is driverless, then it would be whoever installed the system o…
- ytc_Ugi6L3X2c… · Toaster AI would be amazing. That way my computer could finally run a modern gam…
- ytc_UgwKQBehq… · Hadd hogyi bacha depressed because of people that surrounds him on daily basis h…
- ytc_UgyLZo05x… · We don't need landing gear, we just got to have enough parachutes (ejection seat…
- ytr_UgwLoyMqb… · if an AI stole my job I will post it on r/hmmm and wait for someone to ban me fr…
Comment
"There needs to be a default setting to automatically contact emergency services if it is fed certain phrases or variations of these phrases. I'm a mental health therapist with a background in treating children and adolescents. I also know the absolute agony of losing two children of my own to the heartless tech industry. I know the anger that mother is feeling. Young people are more isolated and vulnerable than ever. This is beyond horrific to me."
Source: youtube · AI Harm Incident · 2025-11-08T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzGpeM0R_rFUPF7G454AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxS_7rBE97NkatMl-V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwpOUX7GJqNoIl1mut4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwPZHLxBdJ3bnwv4tF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwOfoFtr29Wg-umJpV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzVWA9mlNqzUPEIpEZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwuESylBvRbHHBG17l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgziWOuocUAzxCcl4T54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxAeD1Sk9rmf6Q96oh4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwk5g5d9eQrZLGSyKN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
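The raw response above is a JSON array of per-comment codes. A minimal sketch of how such output could be parsed and validated before being stored as a coding result (the function name and the allowed value sets are assumptions, inferred only from the codes visible on this page; the real codebook may define more categories):

```python
import json

# Allowed values per dimension. These sets are inferred from the codes
# visible above (an assumption, not the authoritative codebook).
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "approval", "fear", "outrage", "resignation"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting bad rows."""
    out = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing id: {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={row.get(dim)!r}")
        out[cid] = {dim: row[dim] for dim in ALLOWED}
    return out

# Hypothetical single-row response in the same shape as the array above.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]')
codes = parse_codes(raw)
print(codes["ytc_example"]["policy"])  # liability
```

Validating against a fixed value set catches the most common failure mode of coding with an LLM: the model inventing a label outside the schema, which would otherwise silently pollute the coded dataset.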