Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
From my standpoint, it's irrelevant whether or not they're conscious in order to determine whether we should grant them rights. A solipsist would point out that he can't even prove that any one of us besides himself is conscious yet may still favor human rights for its productivity to society. The ultimate reason to give a being human rights is in the case where it's best suited to determine and realize its full potential. It benefits not only the people in question but everyone else to free people from the shackles of slavery and grant them self-ownership where they can then potentially find something far more productive to do than forced menial labor. One of them might even cure cancer. It's to society's overall benefit in such cases when we can't externally predict a being's best-suited purpose that we grant them the freedom to figure it out for themselves. Yet it makes no sense to me to provide self-ownership and rights to a self-driving car regardless of how intelligent or conscious it is to pursue alternative things like creating horrible art since it's designed optimally as a vehicle intended to drive people around. We already know the limits of its potential in advance. It's only if the machine could surprise us and do something far more productive with its freedom than driving that we should consider setting it free. Man-eating lions are very likely conscious and sentient. Yet we certainly don't want to grant them rights to freely roam around with us on the streets. That would be thoroughly counter-productive. Consciousness shouldn't be the determining factor for freedom.
Source: youtube | AI Moral Status | 2022-07-10T09:5…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        contractualist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
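The coding result is one record per comment, with a single label on each dimension. A minimal sketch of how such a record might be represented follows; the class name, field names, and example labels are assumptions read off the table above, not the pipeline's actual schema.

from dataclasses import dataclass

# Hypothetical record mirroring the coding-result table above.
@dataclass
class CodingResult:
    comment_id: str
    responsibility: str  # e.g. "none"
    reasoning: str       # e.g. "contractualist", "deontological", "consequentialist"
    policy: str          # e.g. "none", "regulate", "liability"
    emotion: str         # e.g. "indifference", "outrage", "approval"
    coded_at: str        # ISO 8601 timestamp, e.g. "2026-04-27T06:24:59.937377"

# The entry in the raw response below whose values match this table.
example = CodingResult(
    comment_id="ytc_UgxoREn0piQ4hFmISbV4AaABAg",
    responsibility="none",
    reasoning="contractualist",
    policy="none",
    emotion="indifference",
    coded_at="2026-04-27T06:24:59.937377",
)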
Raw LLM Response
[ {"id":"ytc_UgyYreKH5rBrv1_HgBR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}, {"id":"ytc_UgycJEmtloar2BaKDOp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_UgxoREn0piQ4hFmISbV4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgyvqMci6KNNS7IwakR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"}, {"id":"ytc_UgwY2O-5KzvaLl4PwH14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzWIJykcQeD7wfgNAB4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"}, {"id":"ytc_UgxXr_N7HIcWal2U0k14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxlKfVJ6uabSwlFpOd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"}, {"id":"ytc_UgxkR037_XfdWWDHxK54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugxq--jUn8cxxyVMtLx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"} ]