Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Tbh I think you have to draw the line at the fact that a robot's simulated consciousness is just that, simulated. You can remove bits and then realise they're not conscious again. The only reason they can seem conscious is because they are replicating human behaviour
Source: YouTube · AI Moral Status · 2022-04-17T17:5… · ♥ 1
Coding Result
Responsibility: none
Reasoning: consequentialist
Policy: none
Emotion: indifference
Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgyBo2zjNMeQ9Ck2T8h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzSXFR7BTWYMg5MWyZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy4cMygtzq_TC7mZeF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz0Arn2F-BVDe0CJAV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyuW5OasCPfXQ_6lbF4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxrNO_EKV4QDi2A9gt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxxoplc8U-xt3R5k7l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy4e8oGV9YvDSjr3xJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugx67u6m0mM8NboC7ap4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwkEtom8EoLfXurhoB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
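The raw response is a JSON array of per-comment records keyed by comment id. A minimal sketch of extracting the coded dimensions for one comment (the ids and field names are taken from the array above; the truncated `raw` sample here keeps only two records for brevity):

```python
import json

# A subset of the raw LLM response shown above (two records for illustration).
raw = '''[
  {"id":"ytc_Ugz0Arn2F-BVDe0CJAV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwkEtom8EoLfXurhoB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

records = json.loads(raw)

# Index the records by comment id so any coded comment can be looked up directly.
by_id = {r["id"]: r for r in records}

coded = by_id["ytc_Ugz0Arn2F-BVDe0CJAV4AaABAg"]
print(coded["reasoning"])  # → consequentialist
print(coded["emotion"])    # → indifference
```

The lookup reproduces the Coding Result shown above: the record for this comment carries responsibility "none", reasoning "consequentialist", policy "none", and emotion "indifference".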