Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Extremely important thing to note: The ai didn't "think bromine and chlorine were the same". the training data was most likely to string those words in that order.
youtube AI Harm Incident 2026-01-09T05:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_Ugy-RekoDvgI-rCbfeh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgwmLBZOytjav2-DwCt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz_PJSxm_4PCc3JWMN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzhRHOI1SB1byVJnQt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzGL_JXmVBhqqcmFNJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzN_NlqYvsDQ_cMe5h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx7aWHk02Nh6tM30P94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzHbXSN-y1ouB8prpd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzGhxeIqzYm3soFtjt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxw0l9IxTYV5WsLcrt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
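The raw response is a JSON array of per-comment code records with four coding dimensions plus an `id`. A minimal sketch of how such a response could be parsed and validated before use, assuming the allowed values for each dimension are exactly those observed in the responses shown here (the actual codebook may define others):

```python
import json

# Allowed values per coding dimension, inferred from the raw responses above;
# the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"liability", "ban", "none"},
    "emotion": {"approval", "indifference", "outrage", "mixed"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response into a list of validated code records."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: invalid {dim!r}: {rec.get(dim)!r}")
    return records

# Hypothetical single-record response, for illustration only.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"mixed","policy":"none","emotion":"outrage"}]')
codes = parse_codes(raw)
```

A record whose value falls outside the observed vocabulary (a common LLM failure mode) would raise `ValueError` here rather than silently enter the coded dataset.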