Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I will never believe Robots need rights. As humans it hurts to see any being in pain, So the simulation of the pain of a robot will undoubtably hurt our hearts, But the key word is being. Consciousness is highly debated, The majority of the world believes consciousness stems from having an actual Soul. Something intangible. Something that surpasses our understanding, The very thing that tells us our wires, chemicals, and brain are not the end all be all of what we are. The soul allows for many forms of understanding of our very being. Without the soul you are the brains functions without the conciseness, You can copy the brain into a computer but the computer version will never be You and will be able to mimic every emotion and seem very real but that doesn't make it couscous. There is an argument to be made that organic matter is vastly different from machinery when it comes to conciseness. We as humans though have even felt bad for cursing at Siri, Just watch your grandparents interact with A.I So this indeed will be a sticky situation. Regardless humans WILL fight over this, as you will have those fighting over the existence of a soul V.S the soul being imaginary, Before you fight me on this, try to understand that this is part of the human understanding of the universe, And the maker of this very video even brings it up at the beginning. It is a valid question and one that shouldn't be mocked because of different world views. I'm sure we all would fight for robot rights if we knew for a fact the soul does not exist. But this is something intangible and only something an individual can conclude and others ignore.
youtube AI Moral Status 2021-09-01T02:0… ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgwgiFcxuZxVHuBFKKJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz2SHr_2eVJkS7jgmJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxY4Ku-30PKjPR9eSF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwhPg6-mgyukqWB8yt4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzxfW1EmDqGrRUBpyx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyVmkumjMeKmGvkWdl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxKZHzggq984o8NuUt4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzjX79gJSSIK0QjgPd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy3Ysm9UAzlfOmWqW94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwSiv3kzfsOTDc_1El4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
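The raw batch above is a JSON array of per-comment records, each carrying the same four dimensions shown in the coding result. A minimal sketch of how such a batch could be parsed and validated before use — the allowed value sets below are inferred only from the codes visible on this page, and the function name `parse_llm_batch` is hypothetical; the real codebook may define additional categories:

```python
import json

# Allowed values per dimension -- inferred from the codes that appear in
# this export (assumption: the real codebook may include more categories).
ALLOWED = {
    "responsibility": {"none", "developer", "distributed", "ai_itself"},
    "reasoning": {"unclear", "virtue", "deontological",
                  "contractualist", "consequentialist"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"mixed", "approval", "outrage",
                "indifference", "resignation"},
}

def parse_llm_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index the records by comment id.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the allowed set, so malformed model output fails loudly
    instead of silently entering the coded dataset.
    """
    records = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        records[rec["id"]] = rec
    return records
```

With a batch parsed this way, the coded result shown for a comment can be looked up by its `id` and cross-checked against the displayed table.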