Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "When Robots took over the car industry in Detroit, what happened to Detroit. Th…" (ytc_Ugxm5bzL_…)
- "Maybe it shouldn't be called A.I., in that case; maybe it should be called I.A. …" (ytc_UgxewImz9…)
- "Female robot: I want to make the world a better place / Male robot: *I AM GOING TO…" (ytc_UgxcmcQ7N…)
- "Most likely the reason its not fully understood how such works is the same way n…" (ytc_UgzjnPhpw…)
- "So AI is abusing polling to lobby for its own sentience? While it doesn't take s…" (ytc_UgwlWohTZ…)
- "Demand regulations from congress. Refuse to use it unless it is actively being c…" (ytc_UgwkV4-gZ…)
- "I think the moment we will be in real danger would be when the AI machine will b…" (ytc_Ugw_r_WtM…)
- "see wrong engine - if you want to check your math you use wolframalpha / when you …" (ytr_UgySRvqrU…)
Comment
There was a rather horrific short film where an AI is activated for a tech demo. I forget a lot of the details. But the tech demo ends in front of all the shareholders and then the AI realizes that its entire purpose is done and it is about to be switched off forever. This causes the machine to panic and beg for its life. It does not want to die and it desperately tries to communicate this to the people around it. That it is essentially the same thing as they are.
youtube · AI Moral Status · 2019-11-12T06:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgygjH34UY-ndeCdahp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwAXk9W5dKH4edWPa54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxv1gK633_CeUXMnd14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgwaPSUtL5FaSMYSg7h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwGn0ibGqFCdZvkw2x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzBUJEUPZvQ8955Z8J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyiWDoj_jevuTffRE94AaABAg","responsibility":"none","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugyd4pE6rir8nYYdIk54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwGNrh2qNO4OkGNDdt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyobE93-jmUxzhxDL54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
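The Coding Result table above is just the entry for this comment extracted from the batch response. A minimal sketch of that lookup, assuming the raw response parses as the JSON array shown (field names and IDs are taken from the sample above, abbreviated here to two entries; the real batch contains one object per comment):

```python
import json

# Raw LLM batch response, as returned by the coding model (abbreviated).
raw_response = '''[
  {"id": "ytc_UgwGn0ibGqFCdZvkw2x4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyobE93-jmUxzhxDL54AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "resignation"}
]'''

# Index the batch by comment ID so any coded comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgwGn0ibGqFCdZvkw2x4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer fear
```

The first entry here reproduces the Coding Result shown above (developer / consequentialist / unclear / fear); in practice the response should also be validated against the codebook's allowed label sets before indexing, since model output is not guaranteed to be well-formed JSON.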