Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Can we just kill A.I if the greedy people don’t want to stop it then just stop moving forward just go back to flip phones not wars, unequality, not the bad things just throw it all away I’m sick and tired of us trying to stop it while keeping it JUST KILL IT ITS NOT HUMAN SO DIE USELESS THING “Oh, but it’s useful if you use it right!” MY FOOT we have been living without A.I so why now, we don’t need it, don’t keep it, if we turn crazily stupid again, then we might as well start living caves and hunting, only having enough knowledge to know what is good and bad YOU DONT WANT THAT DO YOU LIVING IN A CAVE REVERTING BACK TO ZERO YOU IDIOT DONT YOU?! Stop being pain in the bums and kill it, it doesn’t feel, see, think, live or anything! ITS PRACTICALLY A CORPSE COPYING A HUMAN KILL IT OFF, IT WONT HURT. (Sorry if it’s quite vulgar I just want to revert or I’m reverting myself)
youtube AI Harm Incident 2025-11-20T10:3…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          ban
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgxmU8XHi7U6HBdNzjR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwuJXQ-SQPnpD-3xWV4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyRW3NITWdL6hw64yF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzh8Fk3whU-4SZdH094AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzCqUr_0yyG1Cu-E0N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwUtPHHoCnVVAZVg3N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzHBNzLJTCyAYTw1IN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwoYSSZt_cw1mRIX0J4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzAMEm4WRd6l4krN8J4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw3Sw7lM87MnMjMVvl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
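The raw response above is a JSON array of per-comment codes across four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of parsing and validating such a batch before storing it; the allowed value sets below are assumptions inferred from the values visible in this dump, not the project's actual codebook, and `parse_coded_batch` is a hypothetical helper name:

```python
import json

# Allowed codes per dimension. ASSUMPTION: inferred from the values seen in
# this response; the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"company", "developer", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "resignation", "approval", "indifference", "mixed"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    Drops any record that is missing an id or uses an out-of-codebook
    value, rather than letting malformed codes into the dataset.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: one valid record, one with an out-of-codebook responsibility code.
raw = (
    '[{"id":"ytc_a","responsibility":"company","reasoning":"consequentialist",'
    '"policy":"ban","emotion":"outrage"},'
    '{"id":"ytc_b","responsibility":"alien","reasoning":"mixed",'
    '"policy":"none","emotion":"fear"}]'
)
coded = parse_coded_batch(raw)  # only the first record survives validation
```

Validating against a fixed code set is what makes a dashboard like this trustworthy: an LLM can emit a plausible-looking but undefined label, and silently storing it would corrupt downstream counts.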