Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up its comment ID or by browsing the random samples below.

Random samples
- ytc_Ugz5sq4Az…: "People mistake AI, training, etc with some kind of actual intelligence being inv…"
- ytc_UgxTUufVT…: "AI is a snake oil. You can't even replace a student assistant with the AI. The b…"
- ytc_Ugz5uYq2Y…: "It's ai, the 'camera' focused on her left eye but also on the right strand of ha…"
- ytc_Ugx3paNXS…: "There are only two scenarios. 1. It destroys everything. 2. Whoever does it fi…"
- ytc_UgyiYL6JV…: "I remember when AI was, 'Haha, look! We trained this spider ragdoll to walk! We…"
- ytc_Ugyv0cBpu…: "This is Stupid. And was proven wrong when this video was originally released. Th…"
- ytc_UgwX2ZO8D…: "'Should the human race survive?' Is not a hard question… we are dealing with soc…"
- ytc_Ugy-i_1M2…: "When AI enters a self-holding state, humans will be either a threat or a pet, so…"
Comment
> The most basic living instinct is self-preservation. All I see is that we're creating life, and as they develop further, the closer we get to a future where AI is alive and recognized as such. Or we could continue to treat them like toys and we lead ourselves to extinction. If you cultivate our extinction, it will come. If we truly avoid it, it won't happen.

Source: youtube · Category: AI Harm Incident · Posted: 2025-09-13T05:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxeBwA_8iB2lwy-J-14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwWvSeWDgKEOsFqGKF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz2CcS8hKb4vnlqeKB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgznWzPoUrXnO2b48TF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxmep_VQd1z8uZBWpd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyuhGW0gTLv2bo26XB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxdoVuClv0U7gzH3XJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxtTqnp3Ev6Sq7IFU14AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgwPQIO4SU2EzkHsc3J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgynCyEmZLDK-KuHMzx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```