Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
"But it's pretty cool" in reference to AI systems potentially having catastrophic, life-ending errors is something I should be reading in a fucking Cyberpunk-themed parody of society, not something that's happening daily.
youtube 2026-01-03T17:3… ♥ 752
Coding Result
Dimension: Value
Responsibility: unclear
Reasoning: unclear
Policy: unclear
Emotion: unclear
Coded at: 2026-04-27T06:26:44.938723
Raw LLM Response
[{"id":"ytc_Ugwj3hkacot38nXzlV94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz2I-qNeLgpZghsBSB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxy8gHkGRvtbR5KYrR4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugy7kLLeNUS_ssNdWO94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyHFpwIPtJf34KS6Zl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx7MhGV2gupyN0hHkJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxwQYzMqSRdLssmp_J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzfPsO5iitkR8KTM3V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxl2x_aFz0e3wCwFhx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzXk-UQ5HQmckkEsUR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"})
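Note that the raw response closes with `)` instead of `]`, so it is not valid JSON. A strict JSON parser would reject the whole batch, which is consistent with every dimension for this comment being coded as "unclear". A minimal sketch of that failure mode, assuming the pipeline parses the batch with a standard JSON parser and falls back to "unclear" when parsing fails (the function name and fallback behavior here are illustrative assumptions, not the tool's actual code):

```python
import json

# Hypothetical fallback used when the model's output cannot be parsed.
FALLBACK = {"responsibility": "unclear", "reasoning": "unclear",
            "policy": "unclear", "emotion": "unclear"}

def parse_codes(raw: str) -> dict:
    """Map comment id -> coded dimensions; return {} on malformed JSON."""
    try:
        entries = json.loads(raw)
    except json.JSONDecodeError:
        return {}  # caller would substitute FALLBACK for every comment
    return {e["id"]: {k: e[k] for k in FALLBACK} for e in entries}

# Shortened stand-in for the raw response above: opens with "[" but
# closes with ")" instead of "]", so json.loads raises.
raw = ('[{"id":"ytc_a","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"approval"})')
assert parse_codes(raw) == {}

# Replacing the stray ")" with "]" lets the same payload parse cleanly.
fixed = raw[:-1] + "]"
assert parse_codes(fixed)["ytc_a"]["emotion"] == "approval"
```

Under this reading, the per-comment codes in the response were fine; a single bad closing character discarded the entire batch.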