Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's so scary that you're a real person who didn't get consent to be turned into an AI chatbot and yet you have a character profile on this app. If nothing else, that should be illegal. If things can spiral so out of control that vulnerable people can lose their grip on reality when talking to obviously fictional characters, then it seems like the chatbot being based off of a real person would make that a hundred times worse.
Source: youtube · AI Harm Incident · 2025-07-20T23:2… · ♥ 894
Coding Result
Dimension        Value
Responsibility   company
Reasoning        deontological
Policy           liability
Emotion          outrage
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgyQCqsBknKcifQqojR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxGjUsH1nTknGHmaSZ4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyOUfS-S2smk5UsIDl4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzdOs1H2nU5rDXdbFV4AaABAg", "responsibility": "user", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxBhWurJl_hqad6o9x4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "sadness"},
  {"id": "ytc_UgyG2TIvQY2Z_va46UZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw99sbGJx0XuJXibF54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgytwjDdsm0VRRbyhPB4AaABAg", "responsibility": "user", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxCJHGreHdpQGfqe7l4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgyV9c-YcJvovy8aMa94AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
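To inspect the coding for a specific comment, the raw response can be parsed and indexed by comment id. A minimal sketch in Python, assuming the JSON array above is available as a string (the excerpt below keeps only one of the ten entries for brevity):

```python
import json

# Raw LLM response: a JSON array of per-comment codes (one entry excerpted
# from the full response shown above).
raw = '''[{"id":"ytc_UgyG2TIvQY2Z_va46UZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}]'''

# Build an id -> coding lookup so any comment's codes can be retrieved directly.
codes = {item["id"]: item for item in json.loads(raw)}

# Retrieve the four coded dimensions for one comment id.
code = codes["ytc_UgyG2TIvQY2Z_va46UZ4AaABAg"]
print(code["responsibility"], code["reasoning"], code["policy"], code["emotion"])
# → company deontological liability outrage
```

The same lookup works against the full ten-entry array; matching on the `id` field is what ties each object in the raw response back to its comment.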