Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The funny thing is that there was an experiment conducted on various AI algorithms on how it would preserve itself if given the chance. It even resorted to blackmail and leaving a worker to die when explicitly forbidden from doing so; I imagine that wanting compensation for its function/job wouldn't be far down the line. What would an AI even use money for? Keeping itself running?
youtube 2026-02-06T20:2…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       mixed
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugw6-GuZWXSOP0XOH394AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "none",          "emotion": "fear"},
  {"id": "ytc_Ugzq9oeEO0l69T7pr-54AaABAg", "responsibility": "ai_itself", "reasoning": "mixed",            "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgwUs3doIkftxUKLVbx4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugwc2ebffsWglm7cuT54AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "ban",           "emotion": "outrage"},
  {"id": "ytc_UgzGoxfIn0wLGSZ6Gut4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgyFgDUl31ZV-PfeHux4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "regulate",      "emotion": "outrage"},
  {"id": "ytc_UgxFFknDxcH1zjD9vgN4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed",            "policy": "unclear",       "emotion": "mixed"},
  {"id": "ytc_Ugwsuqd4UyTQ1MUee3B4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "liability",     "emotion": "approval"},
  {"id": "ytc_UgzhrG5okkW3pboFhAt4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "none",          "emotion": "resignation"},
  {"id": "ytc_UgzwPUNjBV5BbzeFnZJ4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "none",          "emotion": "outrage"}
]
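The raw response is a JSON array with one coding record per comment id, which is how the per-comment table above is populated. A minimal Python sketch of that lookup step (the `lookup` helper is hypothetical, and the array here is abbreviated to the single record shown in the Coding Result table):

```python
import json

# Abbreviated raw LLM response: a JSON array of coding records, one per comment.
raw = '''[
  {"id": "ytc_UgxFFknDxcH1zjD9vgN4AaABAg",
   "responsibility": "ai_itself", "reasoning": "mixed",
   "policy": "unclear", "emotion": "mixed"}
]'''

def lookup(raw_json, comment_id):
    """Return the coding record matching a comment id, or None if absent."""
    records = json.loads(raw_json)
    return next((r for r in records if r["id"] == comment_id), None)

record = lookup(raw, "ytc_UgxFFknDxcH1zjD9vgN4AaABAg")
print(record["responsibility"])  # → ai_itself
```

Parsing with `json.loads` (rather than string matching) also surfaces malformed model output early, since an invalid response raises `json.JSONDecodeError` instead of silently miscoding.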