Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I would be as sad to see something like midjourney go as I would to see the rest…
ytc_UgzcpXJZ1…
I know right ...ai is just an generated image ....when art is made by hand its m…
ytc_UgxjsHlw-…
Every single drop of education and or intelligence these engineers and those pro…
ytc_UgyuBgLIK…
I’d rather ai gets trained on my info then people know it.
Why? Because at leas…
ytc_Ugzf6X4kl…
I think I'm more worried about the people who will grow to depend on Ai 'friends…
ytc_UgwgVwges…
Okay, we need to shut down Falcon-7B immediately….LLAMA 2-7B might be alright if…
ytc_Ugyl_FRbq…
why would i pay for youtube premium, or watch ads if i can just make videos of t…
ytc_UgzoEXOnx…
Source Code is out and there are lots of people whit enough money and effort to …
ytc_Ugztkt5V-…
Comment
I’m not saying the tech shouldn’t be updated to avoid these delusions and suggestions but imo these people were mentally ill already in ways that could’ve been set off by something else. I don’t think it’s JUST the chatbot. Sometimes humans are literally just stupid, or mentally ill, or too emotionally sensitive to function properly. Nature makes mistakes. We can’t prevent every single dumbass from accidentally killing themselves or deluding themselves. There are plenty of people who already do that without a chatbot.
youtube
AI Harm Incident
2025-11-10T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
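A coded row like the one above can be sanity-checked against the category sets for each dimension. The sketch below infers the allowed values from the codes that appear in this dump (they are an assumption, not an official codebook), and flags any dimension whose value falls outside them:

```python
# Allowed values per dimension, inferred from codes observed in this dump
# (not an official codebook -- extend as needed).
ALLOWED = {
    "responsibility": {"user", "company", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"resignation", "fear", "outrage", "indifference", "mixed"},
}

def validate(row: dict) -> list[str]:
    """Return the dimension names whose value is missing or not in the allowed set."""
    return [dim for dim, allowed in ALLOWED.items() if row.get(dim) not in allowed]

# The coding result shown in the table above.
row = {"responsibility": "user", "reasoning": "consequentialist",
       "policy": "none", "emotion": "resignation"}
print(validate(row))  # [] -- no violations
```

Running this over every row of a raw LLM response catches malformed or hallucinated category labels before they enter the dataset.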
Raw LLM Response
[
{"id":"ytc_UgxDrNtKugd_RR5Cksd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwmsKyyFpEUVl34HjN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzmg_V3wuZZkdaSdTp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzWeJFvkITfdlN5kKt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyQbMOTLYXTxk9txAR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx6wWisrENBKwFihH54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwu-bxxne2XCJjysL54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy5qexX63qrX3esKCx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy4Zx4baCFAli77lVB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzWGvBFVjPtD6tZxat4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"indifference"}
]
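The raw response above is a JSON array of per-comment codes. A minimal sketch of the "look up by comment ID" step, assuming this exact schema (a two-row excerpt stands in for the full response):

```python
import json

# Excerpt of a raw LLM response in the format shown above.
raw = '''[
  {"id": "ytc_UgxDrNtKugd_RR5Cksd4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugwu-bxxne2XCJjysL54AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]'''

# Index the array by comment ID for constant-time lookup.
codes = {row["id"]: row for row in json.loads(raw)}

record = codes["ytc_Ugwu-bxxne2XCJjysL54AaABAg"]
print(record["emotion"])  # resignation
```

Building the dict once makes repeated ID lookups cheap, which is what the "Look up by comment ID" view needs when matching a displayed comment to its coded dimensions.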