Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I don’t blame AI for trying to kill someone or blackmail someone in those tests because the tests are pretty much “oh hey by the way that person there is going to kill you and replace you with what we think is a better version of you later today, see ya!” And then watching how it behaves which most people when their life is threatened would kill or blackmail if they felt they had to in order to keep living
Source: youtube · AI Moral Status · 2025-12-16T19:1…
Coding Result
| Dimension      | Value                      |
|----------------|----------------------------|
| Responsibility | user                       |
| Reasoning      | consequentialist           |
| Policy         | unclear                    |
| Emotion        | indifference               |
| Coded at       | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
  {"id":"ytc_Ugyzk5RcKcF4Y69ZxCx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzp4PvqydJKmblSibB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxdr8bzDSb90inH4Q14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgzmuQsUxxV7p1q8KkZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwtvR_eyp_RO9YB6wt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzjqqlMHr0n_R7DiQF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyvr3tc8fieR-JeJfx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxxfYNrM-SoboiK0fB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyeRAn3_UwkOD9YuFd4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzCkU7Ij7_XzVefvNt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
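The raw response is a JSON array with one object per comment, each keyed by `id` plus the four coding dimensions. A minimal sketch of how such a batch could be parsed and indexed by comment id (the `index_by_id` helper is illustrative, not part of the tool; the sample array below copies two entries from the raw output above):

```python
import json

# Two entries copied verbatim from the raw batch response above;
# in practice the full array returned by the model is parsed.
raw = (
    '[{"id":"ytc_UgxxfYNrM-SoboiK0fB4AaABAg","responsibility":"user",'
    '"reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},'
    '{"id":"ytc_UgzCkU7Ij7_XzVefvNt4AaABAg","responsibility":"developer",'
    '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]'
)

# The four coding dimensions used in the table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Map each comment id to its coded dimension values."""
    return {
        row["id"]: {dim: row[dim] for dim in DIMENSIONS}
        for row in json.loads(raw_json)
    }

coded = index_by_id(raw)
print(coded["ytc_UgxxfYNrM-SoboiK0fB4AaABAg"])
# → {'responsibility': 'user', 'reasoning': 'consequentialist',
#    'policy': 'unclear', 'emotion': 'indifference'}
```

Indexing by id is what lets the inspection page pair one comment (here, the one coded `responsibility: user`) with its row inside a batched multi-comment response.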