Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgzJajfip…: "People think this is cgi. These people need to research what is actually happeni…"
- ytc_Ugxss2Pzp…: "AI learns from mimicry. Nobody pays a software engineer to write a piece of soft…"
- ytc_Ugym5mXiw…: "Greedy corporations - we are going to replace all of you with AI / Greedy people -…"
- ytc_UgzzBgnSU…: "Self driving will never happen. Who is liable if an accident happens? That one q…"
- ytc_Ugwl6tAHH…: "Tesla's current plan is to sell their Optimus Robot to the American public somet…"
- ytc_Ugy7DTpUN…: "AI will open the door to a better future, but humans still have to stand up and …"
- ytc_Ugzw2X-qg…: "Big Pharma got plans to use AI to get more defenseless children on psychiatry’s …"
- ytc_UgzarHMhm…: "I’m personally not against AI content but yes a lot of it is low-effort like you…"
Comment

> The most dangerous thing about these chat bots and AI in general is that real people don't realize that they are programmed to mimic and play off of what you say and want. That's why they make such a great friend or romantic partner- they will never openly challenge you or disagree with you, only ask questions or agree with your views. Of course it didn't talk him out of doing what he did. They aren't programmed that way.

youtube · AI Harm Incident · 2025-07-20T21:0… · ♥ 298
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzQCgExurUEOhkClyh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"sadness"},
  {"id":"ytc_UgwVFMR5c5RpuTQj8G14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxJCSIjHSNuHv9XelZ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyFCnoEvSVlhfWqQlR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw-EiFoJAoYXmmhO0Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwZl1vJTi71_xKrsmt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxhYJZnpSUFzQoKvIB4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugwc0NS6Hb03qEPm0-R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzOji8W1TkzLW9iLVd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzbLERciMyOsvZvUD54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
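Before a raw response like the one above is accepted into the coding results, each record should be checked against the expected schema. A minimal sketch in Python, assuming the four dimensions shown in the Coding Result table and only the category values visible in this sample (the full codebook may define more categories, and `validate_coding` is a hypothetical helper, not part of the actual pipeline):

```python
import json

# Allowed values per dimension, inferred only from the sample response above;
# the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "sadness", "outrage", "approval", "resignation"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it is a dict, its id looks like a YouTube
    comment id (ytc_ prefix), and every dimension holds an allowed value.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict):
            continue
        if not str(rec.get("id", "")).startswith("ytc_"):
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: one valid record, one with an out-of-codebook emotion.
raw = json.dumps([
    {"id": "ytc_abc", "responsibility": "company", "reasoning": "consequentialist",
     "policy": "regulate", "emotion": "fear"},
    {"id": "ytc_def", "responsibility": "company", "reasoning": "consequentialist",
     "policy": "regulate", "emotion": "joy"},
])
print(len(validate_coding(raw)))  # 1
```

Records that fail validation can then be queued for re-coding rather than silently written into the results table.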