Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
We’ve reached the end of humanity and AI takes its place. If you think this is not serious then YOU are apart of the problem.
We’re absolutely screwed as a species and humans are so lost in this world today, they just let AI take over. AI is creepy and people in the comments obviously have not seen the movie “her” and it shows. WAKE UP SHEEPLE, you’re being led to slaughter by your own hand. 🤯
youtube
AI Harm Incident
2025-08-10T18:1…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzPeP8x57P1LmdBTZh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw0prNnzDCz1ngNPJN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxOwdc45HGgOhPnp794AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwbQXJ8ZgKVKADlsfJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugy3Tz6jcEL_kBRKxl94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzPPVG_gSHpr6VnDYh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx2Sud6CGH7jG-W4al4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgysqEz1GmbA6wqZvhV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzqRYft_B5y2g5jNSl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwn6p022yUxT6HuGDd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
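A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal validator, assuming the vocabularies visible in the table and JSON shown here (the real codebook may allow additional values, and `validate_batch` is a hypothetical helper name):

```python
import json

# Allowed values per dimension, inferred from the codings shown above.
# This vocabulary is an assumption; the actual codebook may be larger.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "approval", "indifference"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id", "")
        if not cid.startswith("ytc_"):
            raise ValueError(f"unexpected comment ID: {cid!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

sample = (
    '[{"id":"ytc_UgzPeP8x57P1LmdBTZh4AaABAg",'
    '"responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"ban","emotion":"outrage"}]'
)
codings = validate_batch(sample)
print(codings["ytc_UgzPeP8x57P1LmdBTZh4AaABAg"]["policy"])  # → ban
```

Rejecting the whole batch on a single bad value is deliberate: a malformed row usually means the model drifted from the schema, and it is safer to re-prompt than to store a partial coding.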