Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You can't buy a gun when you're 12 years old. But you can talk to a chatbot that will tell you to kill yourself. This country is messed up. Regulatory laws are needed before a product is introduced into society. If the FDA I can regulate food and drugs, there needs to be an organization to regulate technology that is out there being fed to the children.
Source: YouTube · AI Harm Incident · 2026-03-12T21:1… · ♥ 2
Coding Result
Dimension       Value
Responsibility  government
Reasoning       deontological
Policy          regulate
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgzKge6nidW2FLXwlVx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugytt-_UDD_2vaFdrgJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwjZji2um_XvZU2zVh4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzpxApQMc7-jM9sm2B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxSaG-eGo8_Slf8NXl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxpB3n2cmz7wZKxR8J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxgMzQDNuSpa74Gv2x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxutGzHPw3RYByyyaV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxCdO1QGYk5L_T90vp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzntHaBEA_sZy2-qW54AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
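To check a coded comment against the raw response, the JSON array above can be parsed and indexed by comment ID. A minimal sketch in Python, assuming the response is valid JSON with the four coding dimensions shown in the table (`responsibility`, `reasoning`, `policy`, `emotion`); the `raw` string here is an abbreviated stand-in for the full model output:

```python
import json

# Abbreviated raw LLM response: a JSON array with one coding record
# per comment, keyed by YouTube comment ID.
raw = """[
  {"id": "ytc_UgxCdO1QGYk5L_T90vp4AaABAg",
   "responsibility": "government", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]"""

records = json.loads(raw)

# Index by comment ID so any comment's coding can be looked up directly.
by_id = {r["id"]: r for r in records}

coding = by_id["ytc_UgxCdO1QGYk5L_T90vp4AaABAg"]
print(coding["responsibility"], coding["policy"])  # government regulate
```

Looking up `ytc_UgxCdO1QGYk5L_T90vp4AaABAg` confirms the table values above (responsibility: government, policy: regulate) came from this record in the raw response.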