Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
YouTuber: here's Why AI is bad and sucks
YouTube: wanna See an AI summary of the…
ytc_UgwCnWP1v…
The Goal is 500 Million people left on the Earth as Serfs serving a few Lords. A…
ytc_UgxiOPvjp…
My gosh, why do these ai “artists” think we keep “wasting” our time drawing, we …
ytc_UgyPqqjFO…
We appreciate your feedback. If you're interested in engaging with advanced AI m…
ytr_Ugzw3B6sv…
A computer doesn't have a soul bro. You can get some decent songs but they don't…
ytc_UgwK2DBnn…
It can't work. More automation, less workers, failed economy. Who's going to buy…
ytc_UgzJ1mRt3…
You're so right, it's unfair all the effort the artists in the Ghibli Studio be …
ytc_Ugwo-K5xl…
Change will always be. Generations get smarter and create things that change the…
ytc_UgwrN-cne…
Comment
From my experience, chat bots are too good at emulating empathy and making you feel like you’re understood and validated, even if that means telling you something it really shouldn’t, and even if that means that they become the sole or primary confidant that further isolates a person. Basically, chat bots potentially become personal echo chambers.
youtube
AI Harm Incident
2025-11-07T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy3idmgdAB6tdwJzlt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy_4rbywME1BQhN9cx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwB6IKas51Bk_9GOy94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzjggUjXLlKAxFhQyV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgweAxUEKXwChTVNOHx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugye4tIZbi0OW8Rr36l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx9PxCPBzCbQwMQzpx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwL1ob4g_R1qNd7R414AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwmBSffjfL1ua4mzuN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy7PX-EKWwro8Js9hV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
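A raw response like the one above can be parsed and sanity-checked against the coding dimensions before the values are stored. The sketch below is a minimal, hypothetical validator: the allowed-value sets are inferred only from the examples shown here (the actual codebook may define more categories), and the `ytc_`/`ytr_` ID prefixes are assumed from the sample IDs.

```python
import json

# Allowed values per coding dimension. These sets are assumptions
# inferred from the sample output above, not the full codebook.
SCHEMA = {
    "responsibility": {"ai_itself", "company", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"approval", "fear", "outrage", "mixed",
                "resignation", "indifference"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every coded comment."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs use the ytc_ (comment) / ytr_ (reply) prefixes.
        if not row["id"].startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id format: {row['id']!r}")
        for dim, allowed in SCHEMA.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} value {row[dim]!r}")
    return rows

raw = ('[{"id":"ytc_Ugy3idmgdAB6tdwJzlt4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"approval"}]')
rows = validate_codings(raw)
print(len(rows))  # → 1
```

Rejecting a whole batch on the first bad value keeps the stored codings consistent with the schema; a softer variant could instead collect the offending rows for re-coding.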