Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think it's SO stupid to willingly be creating something that could whip out the human race just because people want convenience... The future is so uncertain but no one will care until AI is the new top predator. It's so fucking silly and honestly I wish it didn't happen in my lifetime 🥹😭😭 why do I have to suffer for someone else's gain?
youtube · AI Governance · 2025-06-29T01:3…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          ban
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgzCSPBnCc-rGr0Wn-t4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugz8xia2S1iKmUtFWll4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxbdBTT4xQ3g1Aqxp94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugz9vZ_sUK79yy1xYyF4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwzXhmEdLAKHfKBJn14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxVd11sBkM1Uk4W6N54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz8kXJ54R54LGC4pDl4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwVHOi9R8IukZippyp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugyr_IfBWsMsVHx8IZR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugxstt_ZvTwkO0gydr14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
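Before storing a coded batch, it is worth validating the raw model output against the four-dimension schema shown above. The sketch below is a minimal, hypothetical validator: the allowed values are only those observed in this example response, so the real codebook may define additional labels, and the function name `validate_response` is illustrative rather than part of any existing tool.

```python
import json

# Allowed values per dimension, as observed in the example response above.
# ASSUMPTION: the actual codebook may include labels not seen here.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological"},
    "policy": {"unclear", "none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "fear", "outrage", "mixed", "approval"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the schema.

    Raises ValueError on a missing dimension or an out-of-vocabulary value,
    so malformed model output is caught before it is persisted.
    """
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if dim not in rec:
                raise ValueError(f"{rec.get('id', '?')}: missing '{dim}'")
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec[dim]!r}")
    return records
```

A validator like this makes the failure mode explicit: the model occasionally emits an off-schema label, and rejecting the whole batch at parse time is simpler than repairing individual rows later.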