# Raw LLM Responses

Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below.

## Random samples
- "The Big Shitty Bill allows NO regulation on AI fir 10 years!!! There goes all yo…" (ytc_Ugz6FuvYM…)
- "I don't want the planet to be populated with human-looking robots and that's wha…" (ytc_UgxuV8tUE…)
- "The face when you stop and realize that 4% of a population being infected is con…" (rdc_g9teage)
- "I feel like Taiwan is the biggest risk in terms of China starting a war and need…" (rdc_gt67pm6)
- "am i a bad person if i sometimes use ai as a reference? its very few and far bet…" (ytc_UgxvTM4pU…)
- "Makes me sad to be honest. Facial recognition is to protect people like me again…" (ytc_Ugx1a_yJa…)
- "But the ai generated image looks very realistic if i saw that in a frame i would…" (ytc_Ugzg8jmWZ…)
- "I have a different belief on AI. While i think COMMERCIAL use, such selling ai a…" (ytc_Ugy6_nS8H…)
## Comment

> There's an 85% chance this video is clickbait. Not once in the video did an AI actually end someone's life. A hypothetical test and an actual death are not the same. AI also makes up stories that are completely unbelievable and are so silly.
>
> What will be the new threat when AI hasn't ended humanity in 5 years? Another five years? Alien invasion? World war? When climate change didn't work, when disease didn't work, when celestial disaster didn't work, what makes us think this will?

Source: youtube · AI Harm Incident · 2025-07-24T22:3…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
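The dimension table above can be rendered from a single coded record. A minimal sketch, assuming the record is a plain dict whose keys mirror the table's rows (the field names and record shape here are assumptions, not a confirmed schema):

```python
# One coded record, with the values shown in the table above.
# The dict shape is an assumption for illustration.
record = {
    "responsibility": "none",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "indifference",
    "coded_at": "2026-04-27T06:26:44.938723",
}

def to_markdown(rec):
    """Render a coded record as the two-column Markdown table used above."""
    rows = [
        ("Responsibility", rec["responsibility"]),
        ("Reasoning", rec["reasoning"]),
        ("Policy", rec["policy"]),
        ("Emotion", rec["emotion"]),
        ("Coded at", rec["coded_at"]),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {k} | {v} |" for k, v in rows]
    return "\n".join(lines)

print(to_markdown(record))
```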
## Raw LLM Response
[
{"id":"ytc_Ugxd0fxcDgj7BlB1sMp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwWd2FAC7KX4o7mYsh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy43wJiBBP8AaTf7ZB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwX2ZO8DJl3_iockjZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyASkV54ecsZRgAmbN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz7sHUJy0lb9fzKg3B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxN7AtitC0mxzZVApZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyb24ijYzTfYCpLMfl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzW01twe7yyAiHlnLV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzr-Q3yj1J9tY8PYz94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
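Since the raw response is a JSON array of per-comment codes, looking a comment up by its ID (as in the panel above) reduces to parsing the array and indexing it once. A hypothetical sketch using only the first two entries of the response shown above:

```python
import json

# First two entries of the raw LLM response above, verbatim.
raw = """[
  {"id":"ytc_Ugxd0fxcDgj7BlB1sMp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwWd2FAC7KX4o7mYsh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

codes = json.loads(raw)

# Build a one-time index so later lookups are O(1) per comment ID.
by_id = {entry["id"]: entry for entry in codes}

entry = by_id["ytc_UgwWd2FAC7KX4o7mYsh4AaABAg"]
print(entry["responsibility"], entry["emotion"])  # ai_itself indifference
```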