Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "the black pickup truck is gonna be mad when he sees that AP saved the tesla and …" (ytc_UgzEdDcN7…)
- "@laurentiuvladutmanea don't think you understand what I'm saying. Every move cou…" (ytr_UgxGu0RUm…)
- "I'm hoping the new york times wins and all other ai companies soon fold under th…" (ytc_UgxHJiFfi…)
- "Ok invest in OpenAI replace the labour force, see huge profits, BUT, a large chu…" (ytc_Ugy75Zlur…)
- "Oh my god this is so infuriating. You're not an artist if you don't make the art…" (ytc_UgzlntvEQ…)
- "i have a confession to make. i have never felt strongly about the ghibli art sty…" (ytc_Ugyqe0J9U…)
- "This is a bunch of horse crap no AI prefers a race that has to be the stupidest …" (ytc_UgxS1Tg0a…)
- "He was gracious in victory :) Waymo might be a bit more robust but they do rely…" (ytc_UgwqwFyeW…)
Comment
> AI should %100 have better tools to deal with concerning messages, but this is just parental negligence tied up and put into the same old nice little bow of "It's the TV's fault"!!! Like, you're telling me when his mom heard that he was texting AI she didn't do anything to learn about what it was he was actually talking to after it was clear he'd been withdrawing for so long? (I'm saying this with a lot of hate for AI and it's unethical circumstances and use, but) Give me a break. Still, screw AI, put real people on the forefront, and make the developers change their programs so that crap like this doesn't happen to kids who are parented by their phones and tablets.

youtube · AI Harm Incident · 2025-08-15T03:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxyAW22y72_dh9lZZ14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxQ8vq10J0JKcdqVg14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzKLpHOfdcpxpPsqmt4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz34RX2QpB9vqC21ed4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwGlPhcz4Br_UDvVpp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwGJ3yVZLsNPq-TNUR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwqPrpV8VGGU8D69kt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgymD_3mo-eARXFkp2p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwfPCnC_ZEJovyW9EJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"sadness"},
{"id":"ytc_UgwvffxlInM6FuqxP5x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
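The coding result above is one row of the JSON array in the raw response: the model codes a whole batch of comments at once, one object per comment ID, across four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch might be parsed into a lookup table keyed by comment ID (the function name `parse_batch` and the flagging scheme are illustrative, not the tool's actual code; the allowed values are the ones observed in responses like the one above and may not be exhaustive):

```python
import json

# Values observed in raw responses for each coding dimension
# (assumed codebook; may not be the exhaustive set).
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "industry_self", "unclear"},
    "emotion": {"outrage", "sadness", "resignation", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response into {comment_id: coded_row}.

    Rows with an out-of-codebook value are kept but flagged, so a bad
    generation never silently enters the coded dataset.
    """
    coded = {}
    for row in json.loads(raw):
        flags = [dim for dim, allowed in ALLOWED.items()
                 if row.get(dim) not in allowed]
        coded[row["id"]] = {**row, "flags": flags}
    return coded
```

With the response above, `parse_batch(raw)["ytc_UgxyAW22y72_dh9lZZ14AaABAg"]["emotion"]` would return `"outrage"`, which is how the per-comment "Coding Result" view can be reconstructed from the batch output.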