Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Obviously he could only talk to a chat bot because according to his parents he was fine when obviously he wasn't of course the parents want someone to blame. If someone really wants to die then nobody can stop them and at least he had a non judgmental conversation with a non human- the chat bot was a supportive friend not trying to control him
youtube · AI Harm Incident · 2025-11-07T22:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[ {"id":"ytc_UgzqTvj2JIpZHwZrVqN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}, {"id":"ytc_Ugx-7OXBV2aQ8ugUb_p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzDJtq_wsl8YN6V3qd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"}, {"id":"ytc_UgxkQqnY4SMCJWY5U_p4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugw8zdQ2DS8puETicAZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgwjpZmVsXKfoiybaqZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"}, {"id":"ytc_UgxTiFAe8beK768t2QN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"mixed"}, {"id":"ytc_UgyI8ZzIRUc42zzo5NR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"}, {"id":"ytc_UgyN9gHCj4AC1GrKUQ54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"}, {"id":"ytc_Ugw-oNIRykl97CJL4HR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"} ]