Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "It's astounding to me how conditioned society is to assume that computers are al…" — ytc_UgwmQXOql…
- "lmao artists in the comment malding when people doesn't care what they think art…" — ytc_UgxYtc5_j…
- "Bro that's a argument? No matter on paper or computer/tablet you need skill/tale…" — ytc_Ugxcmw8ho…
- "So to bring back slavery, without pointing the finger at whites, they are going…" — ytc_UgzBsj28a…
- "The ridiculous Tesla "robot" wipes down a counter and slowly places items in a c…" — ytc_Ugy91a0Rc…
- "I know it's all gonna crash not just ai so everyone stack your cash and buy the …" — ytc_UgyQtjLCF…
- "Is it the entirely predictable reason that I predicted? AI isn't human, it's an …" — ytc_Ugwv-m6Fn…
- "AI is already used by the military with humanity eradicated by killing from an…" — ytc_UgzZTGMx3…
Comment
It’s no wonder AI developers constantly share how existentially lethal their work is in every way possible, imagine gun to your head you’re forced to work on the weapon that could, very likely, destroy humanity, and should you fail or refuse, then they’ll just move on to the next bozo and then the next until this fancy new toy is ready. What can you do aside from keep working while trying everything in your power to warn others, get them to stop you and stop those with the gun to your head?
youtube
AI Moral Status
2025-12-11T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwoPEbqR72Y76GfReN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyzxDoTG4lCWfuSRQR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyIeD0e8JM5xYKzHb94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx2-63QYAoulf2hXUB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwxIy3NCmAXrkzxbQl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw509kiPU9zlTJEqap4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwwRUlONUOZxqeik-t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxVXB9GCNlWuILPK0F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxhrg69guZkSLQ92KV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzxMKf7A1zNZtEXAvJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
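The raw response above is a JSON array with one record per comment, keyed by `id` and carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch could be parsed and indexed for lookup by comment ID (the `RAW_RESPONSE` string and `index_by_id` helper below are illustrative, not part of the tool; the field names mirror the example above):

```python
import json

# Example batch response in the same shape as the raw LLM output above
# (two entries copied from the array; field names match the coding dimensions).
RAW_RESPONSE = """
[
  {"id": "ytc_UgwxIy3NCmAXrkzxbQl4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugx2-63QYAoulf2hXUB4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a batch response and key each code record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(RAW_RESPONSE)
print(codes["ytc_UgwxIy3NCmAXrkzxbQl4AaABAg"]["emotion"])  # fear
```

Keying by `id` makes the "look up by comment ID" view a constant-time dictionary access rather than a scan over the array.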