Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I’m so confused, did the chat bot tell him to kill himself? What made him think that killing himself would unite him with an AI chat box? And yes the interviewer needs to practice his facial expressions - seems too smirkey for this type of conversation. Maybe he is nervous?
YouTube · AI Harm Incident · 2026-01-26T21:5… ♥ 2
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugzqs3hb5KO3vBvL8op4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugxh5o-yvGFw2nEHQpB4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyyoJCkH_-ykPc5y4d4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzcY9P4mIWswYGoS_N4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwO8Ep920a-DU1sxX14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwM1cWZPunmpYA6iop4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwxDp1IH6h30U5PM294AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyVOwOBKJl8Mm_ql6R4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwFWfGIm3vN4agg-3J4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzoqcINIQ0m3DTfeGF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
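The raw response above is a JSON array that codes a batch of comments along four dimensions. A minimal Python sketch of how such a response could be parsed into a per-comment lookup is shown below; the helper name `parse_coding_response` is hypothetical, and the label sets are inferred only from the values visible in this particular response (the full codebook may contain more labels).

```python
import json

# Label sets inferred from the values seen in the raw response above;
# this is an assumption, not the tool's authoritative codebook.
VALID = {
    "responsibility": {"company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw batched LLM response into {comment_id: {dimension: label}}."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        dims = {dim: rec.get(dim, "unclear") for dim in VALID}
        # Coerce any label outside the expected set to "unclear"
        # instead of dropping the record.
        for dim, value in dims.items():
            if value not in VALID[dim]:
                dims[dim] = "unclear"
        coded[rec["id"]] = dims
    return coded
```

With this, the coding result shown above for a given comment id can be recovered directly, e.g. `parse_coding_response(raw)["ytc_Ugxh5o-yvGFw2nEHQpB4AaABAg"]` would yield all-"unclear" dimensions with emotion "mixed", matching the table.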