Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "its sad that when I was a kid I dreamt the day AI would help humanity, and now h…" (ytc_Ugz5eC9Pp…)
- "From ABC News: "The National Center on Sexual Exploitation, an anti-pornography…" (ytc_UgziWF89g…)
- "easy solution to the problem is that we fix people before we go on to produce AI…" (ytc_UgxYy71hk…)
- "Big companies don't care about people needs, no one gonna use LLMS as last versi…" (ytr_Ugxh28s8U…)
- "I feel like people who say this have no idea what they are actually saying and i…" (ytr_UgzEwwk2m…)
- "That shit enrages me. I was fine until you infantilized me! Where tf in my con…" (ytr_UgxP5PEiC…)
- "Robot's wouldn't deserve rights, even if they had such thing as a "understanding…" (ytc_UgxbZOFTW…)
- "DO NOT, I REPEAT DO NOT SAY THANK YOU TO YOUR AI. YOU ARE WASTING VALUABLE RESOU…" (ytc_UgxQuf5Rs…)
Comment
Sorry for being a bit off topic, but ever noticed that if you ask an AI a plain question, it gives a great answer, but if you start the conversation by arguing with it about politics or something controversial, then it just starts contradicting everything you say, even saying things that are the opposite of what it said in other conversations? It's a little disturbing the biases that are starting to be put in AI, that it gives different answers to people even if the only difference is their political leaning. AI might help you figure out which part of your leg is hurting, but you really shouldn't just blindly trust everything it says.
Platform: youtube
Topic: AI Harm Incident
Posted: 2026-04-20T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugyq7F8uKd4-q6H9KVJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw-sACa30q38aUCiER4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzQVy8xXvsbGgG35HV4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwHXFYLZSlUeXxCJLd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyCux2GKQxk0BvIrGx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzQEUGuAWwaCn8fOFF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwhvOum004-Hp6wjCF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy3Gknio5-FAbynV4Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwKcZPI7CfR7CFmqCJ4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxi85BHGv50ld_SYnV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
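The raw response above is a JSON array of per-comment codes, one record per comment ID, with one value for each of the four coding dimensions shown in the table. A minimal sketch of how such a batch could be parsed, validated, and indexed for comment-ID lookup (the `ALLOWED` sets below are inferred from the values visible on this page, not from the actual codebook, which may define more categories):

```python
import json

# Allowed values per coding dimension — an assumption reconstructed from
# the sample output on this page, not the project's real codebook.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the codebook.

    Raises ValueError on a missing ID or an out-of-vocabulary dimension value,
    so malformed batches fail loudly instead of polluting the coded dataset.
    """
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

# One record from the batch above, used as a smoke test.
raw = ('[{"id":"ytc_UgwKcZPI7CfR7CFmqCJ4AaABAg","responsibility":"company",'
       '"reasoning":"contractualist","policy":"regulate","emotion":"fear"}]')
coded = validate_batch(raw)

# Index by comment ID to support the "look up by comment ID" view.
by_id = {rec["id"]: rec for rec in coded}
```

With this index, rendering the "Coding Result" table for a given comment is a single dictionary lookup on its ID.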