Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I don't want to condition myself into acting as if the chatbot were an actual person. Hence "please", "could you", or "thank you" are not used when writing to a chatbot. In my view it is a machine that can follow commands. Pretending that it is anything else is kind of perverted. However, you have to be very specific with your language to get good answers.
youtube AI Moral Status 2026-03-13T08:5…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugx52u0MjaJgBgwz5ZR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgzArkAonqzMGHdvQd54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyg_yYXhNfGEAGzwn14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwldolSgpmENyJn9el4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwgRhXhPwoGiLCu-YF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzQ03Zk68LnYS8wr8R4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwycZZFwFPBm-TEh754AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy1NSUvAcEqPrqUQyN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwLhHQrzECI7WcdgiF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxAtZNXDkBXg0xVLSh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"}
]