Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
ChatGPT is not supposed to be like a therapist or something that’s not what it’s designed to do. It’s designed to be like Safari or Google. Why the hell was this kid even talking to a artificial intelligence robot? How about suicide? Obviously some of this blame needs to go on his damn parents where the hell were they wouldn’t you notice the signs of your child wanting to commit suicide?
Source: YouTube · AI Harm Incident · 2025-09-09T19:2… · ♥ 1
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       mixed
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgynQ8rzeRvblT-uGQB4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxZisNUQT7vVMrLLCh4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwyVvemCpoOf-jdzp14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwUJCkJXzXK8y0kB5t4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwFu8RnJljNxAfFz5B4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyNV_GEoZXE8AZ35XN4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzL5QH0kRKWVNle29Z4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyWh9r3GosAdUl1JAN4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugy1I6O0JlQDoesisY14AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugzy6d7Vue2GfSocU9t4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "unclear"}
]
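The raw response is a JSON array with one object per comment, coded along four dimensions. A minimal sketch of parsing and validating such a response in Python, assuming a codebook whose allowed values are inferred from the output above (the actual codebook may define additional categories):

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the real codebook may permit more categories than appear here.
ALLOWED = {
    "responsibility": {"user", "company", "distributed", "none", "unclear"},
    "reasoning": {"virtue", "deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "approval", "resignation", "fear", "unclear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only entries whose
    values all fall within the allowed sets for each dimension."""
    entries = json.loads(raw)
    return [
        entry for entry in entries
        if all(entry.get(dim) in values for dim, values in ALLOWED.items())
    ]

raw = (
    '[{"id":"ytc_a","responsibility":"distributed","reasoning":"mixed",'
    '"policy":"liability","emotion":"outrage"}]'
)
print(parse_codes(raw))
```

Validating against the codebook before storing results guards against the model hallucinating category labels outside the schema.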