Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples
- "As a photographer my job should be super threatened by AI.... if anything it's t…" (ytc_UgzYHRC6q…)
- "if you truly are a fan of ghibli and you respect Miyazaki’s work, you would also…" (ytr_UgxjN69ql…)
- "So I made it to the end I am not here to argue with you I just wanted to raise s…" (ytc_UgwW1eO6G…)
- "Do you think there will be AI's deciding to work together to preserve themselves…" (ytc_Ugw6VYntO…)
- "I'm all on your side! Is AI Upscaling also using images like these? I do use Ups…" (ytc_Ugxh3U4no…)
- "Wouldn't you go for quality at less price? If you think your art can be copied b…" (ytr_UgxrUcITT…)
- "they already have what they are calling human sleeves...they implant a chip in y…" (ytc_UgwbF3SZD…)
- "okay, i think that was a pretty sensible conversation. and very enlightening in …" (ytr_UgyJeK0SZ…)
Comment
> If an AI model says something like “I won’t shut down unless X,” and you call that blackmail, you might also think your email login is extorting you when it won’t give you access until you type the right password. That’s not sentience—it’s just code doing what it was trained to do. Judd Rosenblatt knows that. So why sell it like a Terminator teaser trailer? Simple: control the narrative, control the tech, control the money. It's not about safety—it's about who gets to build the next gate and charge you to walk through it.
youtube · AI Moral Status · 2025-06-04T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx9Z3vlpnfCSYAsGTh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx3UIrT4hg2qv3lwe94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy-dEcjYCTCHHOHZMt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxO6NdSU3cTm3kwzh14AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgyK1QflWyq8KV1shLR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzc9FG0MDE4NjxwGKt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyKPO5kD-WeO1TGd8F4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwEdma-zfRyYB8zsSZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwuyUCeLL2YmEJcK4B4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugzb2cEEUZVMCZrNvJx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
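A raw batch like the one above has to be parsed and checked before its rows land in the coding table. Below is a minimal validation sketch; the allowed values per dimension are inferred only from the examples shown here (a real codebook may define more), and the sample row uses a hypothetical comment ID:

```python
import json

# Allowed codes per dimension, inferred from the sample output above;
# an actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject malformed rows."""
    rows = json.loads(raw)
    if not isinstance(rows, list):
        raise ValueError("expected a JSON array of coded comments")
    for row in rows:
        # Comment IDs in this dataset start with ytc_ (comment) or ytr_ (reply).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"bad comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} value {row.get(dim)!r}")
    return rows

# Hypothetical single-row batch for illustration.
sample = ('[{"id":"ytc_example123","responsibility":"company",'
          '"reasoning":"deontological","policy":"industry_self","emotion":"mixed"}]')
rows = validate_batch(sample)
print(len(rows))  # 1
```

Rejecting a whole batch on one bad row is deliberate here: it keeps a single hallucinated category from silently entering the results.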