Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Now's the real problem: imagine an AI say "screw you and your instructions, I'll do anything to try and save all the people." Would you be proud it chose morality over its coding? Or would you be terrified it disobeyed the order?
youtube 2026-03-27T19:5…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        mixed
Policy           liability
Emotion          mixed
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgxWtSOFxQiWNXUQQPN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz971aZ4BLMj54OHDR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyZ8pLp82guKOmMQIp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgzqFm_4VyepOnvKHQl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwYvtMlDmFOaLVxuyd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwllwvHzYedx_ySotR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgycfA482ky5yE8xL4N4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxC39hwrpHUe8beWTB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"resignation"},
  {"id":"ytc_UgyrK1hucmSTOvuOs014AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxLU_awGyZCTZxuF5V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
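The per-comment codes above can be pulled out of the raw response programmatically. A minimal Python sketch, assuming only the field names visible in the response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the lookup id is the one coded on this page:

```python
import json

# Raw model output: a JSON array with one object per coded comment.
# Abbreviated here to the record for the comment shown on this page.
raw = """
[
  {"id": "ytc_UgwllwvHzYedx_ySotR4AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "mixed",
   "policy": "liability",
   "emotion": "mixed"}
]
"""

# Index the codes by comment id for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Retrieve the coding result for a specific comment.
row = codes["ytc_UgwllwvHzYedx_ySotR4AaABAg"]
print(row["responsibility"], row["policy"])  # ai_itself liability
```

The same dictionary lookup generalizes to the full array; a missing or malformed record surfaces as a `KeyError` or `json.JSONDecodeError`, which is a cheap way to spot responses the model failed to code.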