Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
But... the ai never told him to consume bromide. it suggested it for cleaning. why are we blaming the ai?
Source: YouTube · AI Harm Incident · 2026-04-07T23:2…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugw94st37z9u5eKoSGd4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyPf8Y29nf3fcgFDrl4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwGTj0TuvB0PEI3pGB4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugzgjyg4b5klMLmPv194AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugz4HsBjgaRcV_MH7tZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugzekvc7UN9OPZcNA6J4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyJFcYQHpdYH20IrDF4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzJ1WcQ8YQdi-hkgZh4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwUXTu8j6i2NshJli14AaABAg", "responsibility": "user", "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzGWlhS1HipDO4DsgF4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]
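A minimal sketch of how a response like the one above can be inspected programmatically, assuming the payload is valid JSON with the field names shown (the `raw` string here reproduces only two entries from the dump for illustration):

```python
import json
from collections import Counter

# Raw LLM response: a JSON array of per-comment codes,
# with the same fields as in the dump above.
raw = '''
[
  {"id": "ytc_Ugw94st37z9u5eKoSGd4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwGTj0TuvB0PEI3pGB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"}
]
'''

# Index the coded rows by comment id for direct lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up the coding result for a single comment.
print(codes["ytc_Ugw94st37z9u5eKoSGd4AaABAg"]["responsibility"])  # user

# Tally one dimension across all coded comments.
print(Counter(row["emotion"] for row in codes.values()))
```

Indexing by `id` makes it easy to join these codes back to the original comment text, and `Counter` gives a quick distribution over any coding dimension.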