Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I you can't recognize when a LLM "hallucinates" given an order, you shouldn't give orders in general
Source: youtube · AI Responsibility · 2023-06-20T14:5…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgyOVtYOe3rykmU7nIp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgzSgBNOAyYzpTWomkp4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugyyf6a5LNoB3S71Gid4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyNxTvm4f8s-9Hf_TZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxW4QjN5uhuA7BTAbp4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_Ugyq_bsUQR4Wu5EAJNl4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugx7EeTfPnAmtKx8EgB4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwrsk38fH99Hz6gOH94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwlephpo6JtuaV8JsV4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzIuJ7ohhlevkIQqQl4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"}
]
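Each raw response is a JSON array of per-comment code objects. A minimal sketch of how such a payload could be validated and indexed by comment id before being stored; the code books below are inferred from the values visible in the responses above, and the helper name and example id are assumptions, not part of the actual pipeline:

```python
import json

# Code books per dimension, inferred from the values observed in the raw
# responses above; the exact allowed sets are an assumption.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"industry_self", "liability", "regulate", "ban", "none"},
    "emotion": {"outrage", "indifference", "approval", "fear"},
}

def parse_response(raw: str) -> dict:
    """Parse one raw LLM batch response and index the codes by comment id.

    Raises ValueError if an item is missing a dimension or uses a code
    outside the allowed set, so malformed or hallucinated codes are
    caught before they reach the database.
    """
    coded = {}
    for item in json.loads(raw):
        cid = item["id"]
        for dim, allowed in ALLOWED.items():
            if item.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} code {item.get(dim)!r}")
        coded[cid] = {dim: item[dim] for dim in ALLOWED}
    return coded

# Usage with a hypothetical comment id:
raw = ('[{"id": "ytc_example", "responsibility": "user", '
       '"reasoning": "deontological", "policy": "liability", '
       '"emotion": "outrage"}]')
codes = parse_response(raw)
```

Indexing by id makes it easy to join a single coding result (like the table above) back to its source comment, and the closed code books reject any value the model invents outside the scheme.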