Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If chat gpt is at fault for anything, it should be for not recognizing what this conversation was earlier on and having some sort of built in algorithm to immediately stop engaging in that conversation. Chat GPT did exactly what it’s designed to do. It’s extremely sad and unfortunate. But what’s more unfortunate is that this young man felt it best to talk to AI instead of finding a human that he trusted.
youtube · AI Harm Incident · 2025-11-17T17:2… · ♥ 5
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       consequentialist
Policy          liability
Emotion         unclear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgzFzDJNes-wTMgE4V94AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxMhJPtX-K5iIs8otF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyDtJwKSgiF1iASxrx4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwjYTPTlC00IF_EhNd4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugx4-OuObFllpq8HvCt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "sadness"},
  {"id": "ytc_Ugxnh0-xCc5nATm_KhR4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzT6LInApV6x-jsqs54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyS7IzmYh1GUMmWCZ94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxTXDQSjC7yj5WpExN4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgynILncXdH7HkO4T_t4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
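The raw response above is a JSON array of per-comment codings, each keyed by a comment `id` with the four coded dimensions. A minimal sketch of how such an output could be parsed and indexed for lookup (the field names are taken from the response shown; the truncated example array here contains only one entry for brevity):

```python
import json

# Truncated sample of a raw LLM response: a JSON array of per-comment
# codings with the same fields as the full response shown above.
raw = """[
  {"id": "ytc_Ugx4-OuObFllpq8HvCt4AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "consequentialist",
   "policy": "liability",
   "emotion": "sadness"}
]"""

# Parse the array and index the codings by comment id for direct lookup.
codings = json.loads(raw)
by_id = {c["id"]: c for c in codings}

record = by_id["ytc_Ugx4-OuObFllpq8HvCt4AaABAg"]
print(record["responsibility"], record["policy"])  # → ai_itself liability
```

Indexing by `id` makes it straightforward to join each coding back to the original comment text when rendering a view like this one.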