Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I had to check my AI when it made a critical, life-threatening error with the terms “grounded” and “grounding”; then it gaslighted me by implying those terms were interchangeable. I had an NEC Code Book in front of me and asked it to reference Article 100 for definitions. Only then did it realize its mistake and apologize.
YouTube · AI Moral Status · 2025-04-10T20:4…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        deontological
Policy           liability
Emotion          outrage
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgzDXAwSGcO6OF96Lvd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx1wqdTsOLdbvdbaIF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxSeVbE6MVQYFKF7e94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzi9tJPBrmd4VRUtZ14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyVmNf9Gp4qrjNF1sB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx4bRMIX4wzso5h0cJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwjdUZpXIsAwEdgb2t4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxU1g1tI3_iK5QKiEp4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzuCNqqtjZTC2vRgr14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwwacqFK-JBsq0qeSp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
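To inspect a single comment's coding from a raw response like the one above, the JSON array can be parsed and indexed by comment id. This is a minimal sketch, not part of the tool itself: `raw_response` holds a subset of the array above for brevity, and `code_for` is a hypothetical helper name.

```python
import json

# Subset of the raw LLM response above: one coding record per comment,
# each carrying the comment id plus the four coded dimensions.
raw_response = """[
  {"id":"ytc_UgzDXAwSGcO6OF96Lvd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzi9tJPBrmd4VRUtZ14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwwacqFK-JBsq0qeSp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]"""

def code_for(comment_id: str, raw: str) -> dict:
    """Return the coding record for one comment id; raises KeyError if absent."""
    records = {record["id"]: record for record in json.loads(raw)}
    return records[comment_id]

# The record for the comment shown on this page matches its Coding Result table.
record = code_for("ytc_UgwwacqFK-JBsq0qeSp4AaABAg", raw_response)
print(record["responsibility"], record["policy"])  # ai_itself liability
```

Keying the parsed array by id makes the lookup O(1) per comment when checking many coded comments against the same raw response.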