Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
ChatGPT tried to kill the case early on because they didn't want that question asked. Now that it is going to be asked, they don't want it answered. That should be proof enough.
youtube AI Responsibility 2026-04-13T23:3…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgyLrYJKBPM5KiGl0c14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxLgW4WZ8puS64ycy14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyAKWUFaCmooQQV_pV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzIGNvjNe2FQCqhNRZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzIJwDG9Gj0tzQ_Xn14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugx5G_DfEBk3clzJQ9h4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwjR3-GYR0cRMAwejx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzGmcdl4ATETy9kdK94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxUDUINB9l4edax9Qd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxk1rcDQJxGzGLEJBx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
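The raw response is a JSON array of per-comment codes, one object per comment `id`. A minimal sketch of validating such a batch before storing it, assuming the allowed category values are exactly those seen in this output (the real codebook may include more; `validate_batch` is a hypothetical helper, not part of any pipeline shown here):

```python
import json

# Allowed values per dimension, inferred only from the codes visible in this
# raw response; a fuller codebook would extend these sets.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or off-schema records."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
codes = validate_batch(raw)
print(len(codes))  # → 1
```

Checking category values up front catches the usual failure mode of batch coding, where the model invents a label outside the codebook and the record silently pollutes downstream tallies.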