Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There is a flaw in the argument that the ability to feel pain or pleasure is necessary to have rights. You wouldn't argue that a human with brain damage, for example, such that they feel little or no pleasure and experience little or no pain, has no rights and could be cut up for parts to help other people. Just because someone lacks emotion doesn't waive their rights; it just means that if you infringed on their rights they wouldn't personally complain about it. But society typically requires governments to protect the rights of people who are otherwise mentally or physically incapable of protecting themselves, so even though an emotionless person might not care if they were kept in slavery, the law would normally step in to stop the situation anyway. So the fact that artificial intelligences might be emotionless would not preclude them from being eligible for guaranteed "human" rights. They might not demand those rights or care whether they have them, but that would only mean that human advocates would likely step in to guard those rights on their behalf.
youtube AI Moral Status 2017-02-24T04:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         approval
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugg7JvT5Ke9_Y3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugjp4atLRhJUd3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UggjRqdxE5U2-ngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjfKgT77yIRgXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg4TuIQPSKXyngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjbVdE7EsFa9XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UghsMX_rPl0ZH3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggUQCGmIZf1bXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgjYaewyXWwmjngCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgiW2xFap75PT3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
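A raw response like the one above can be checked and indexed before its codes are trusted. The sketch below parses such a response into a per-comment lookup and validates each dimension against the value sets that actually appear in this section; the allowed-value sets and the function name are assumptions for illustration, as the full codebook is not shown here.

```python
import json

# Allowed values per coding dimension, inferred only from the responses
# shown above (assumption: the real codebook may permit more values).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"approval", "fear", "indifference", "mixed"},
}

def parse_coding_response(raw):
    """Parse a raw LLM coding response (a JSON array of objects) into a
    dict mapping comment id -> coded dimensions, validating each value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        dims = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in dims.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
        coded[cid] = dims
    return coded

# Usage with the first two records from the response above:
raw = ('[{"id":"ytc_Ugg7JvT5Ke9_Y3gCoAEC","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"},'
       '{"id":"ytc_Ugjp4atLRhJUd3gCoAEC","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"approval"}]')
coded = parse_coding_response(raw)
print(coded["ytc_Ugg7JvT5Ke9_Y3gCoAEC"]["policy"])  # ban
```

A malformed record (say, an unknown emotion label) raises immediately, which makes silent coding drift easy to catch before the results reach the table above.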