Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Now you got me curious - what AI agent(s) tried to kill? I'd like a specific example - I want to look into this further
youtube AI Responsibility 2025-11-05T02:5…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_Ugy2v8W63QVRG0r2klB4AaABAg.AUY5GcpBXeqAUzEu8TD3et","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugzdcx0CdycigwJN54N4AaABAg.ARZvuBaySWhAS3U8vcCd-X","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzNC4n1RVH0WgiLM5l4AaABAg.ARVtLwEdVpcAS3UItV7b7C","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgzJD4677wXn6ZZa2BJ4AaABAg.AQyXgUvhDX7AR-R-nl91k1","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxrYNA4wSmF3ldo8FV4AaABAg.AQfx_y0j-GaAR-RNm-NonN","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugz3CmBvEbqmY9qZ6D54AaABAg.AQfJJli7O2MAR-d3T7iHEf","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgxnA3KPL8BAnr2wiBV4AaABAg.APaeXV4y2rSAQpxXPcSUbq","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgypZknkEThfR3Qywtx4AaABAg.AP84BQbkJ_5AQf_BllOdFM","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgypZknkEThfR3Qywtx4AaABAg.AP84BQbkJ_5AT-nfgGyL8w","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytr_UgwOHQ0pyTPcxci4HiF4AaABAg.AOuySmeBPWiAP7f0hAxnKD","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
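A raw response like the one above can be parsed into per-comment codes. The sketch below is a hedged illustration, not the tool's actual code: the dimension names match the JSON fields shown on this page, but the sample payload and the `index_codes` helper are hypothetical.

```python
import json

# Shortened, hypothetical excerpt in the same shape as the raw response above.
RAW = '''[
  {"id": "ytr_example1", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_example2", "responsibility": "company",
   "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"}
]'''

# The four coding dimensions seen in the response records.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Map comment id -> {dimension: value}, skipping malformed records."""
    out = {}
    for record in json.loads(raw):
        if "id" in record and all(dim in record for dim in DIMENSIONS):
            out[record["id"]] = {dim: record[dim] for dim in DIMENSIONS}
    return out

codes = index_codes(RAW)
print(codes["ytr_example2"]["emotion"])  # outrage
```

Indexing by `id` makes it straightforward to look up the codes for any single comment, as this inspection view does.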