Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
As we've seen before, they don't mitigate risk until people die and enough people make enough noise about it. They don't care about us, much like the AI they create.
youtube AI Harm Incident 2025-09-10T13:2…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugx1GpP0zyw15ua1wkl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwkJWrhyMisBQ-Io8B4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzUjrEUyUjrG_r4Wfd4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyfgIUqjOKjtUCmtk54AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwsCMZ-KOAVc0MiZGh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwnC6z6oVg5lvXRe7x4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz4f4cE4gKposPCoYh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxltFGgtboOoM2twE94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw6fze7-MlcSNMx_614AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugyvp5tw1iDwzUMtAY94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"mixed"}
]
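To trace a coding result back to the raw model output, the JSON array above can be parsed and indexed by comment id. The sketch below (a hypothetical helper, not part of the pipeline shown here) uses two entries copied from the response; field names follow the schema in the raw output.

```python
import json

# Two rows copied verbatim from the raw LLM response above;
# the full response is a JSON array of such per-comment codings.
raw = '''[
  {"id": "ytc_Ugz4f4cE4gKposPCoYh4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugx1GpP0zyw15ua1wkl4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"}
]'''

# Index the codings by comment id so any single comment's
# dimensions can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw)}

coding = codings["ytc_Ugz4f4cE4gKposPCoYh4AaABAg"]
print(coding["responsibility"], coding["policy"])  # company liability
```

The id `ytc_Ugz4f4cE4gKposPCoYh4AaABAg` is the entry whose dimensions match the Coding Result table above (company / deontological / liability / outrage), which is how a displayed result can be verified against the raw response.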