Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The lack of human rights creates dysfunctional societies, so giving rights is not intended only to benefit the individual but also to benefit the group. The lack of robot rights currently doesn't disturb society because people can claim property damage. But if robots become capable of waging war against humans (with or without any sense of self-defense, like pain and emotion), that would change the game. And if robots become superhuman, then this discussion is as useful as cats deciding whether humans should have rights - we would be the cats. There's one problem with this discussion: why does everyone assume that future robots would be friendly to one another, that they would become an organized collective with a single objective? Humans don't, so why should they? As free thinkers and free agents, they are likely to diverge in their (superhuman) ideals too.
Source: youtube · AI Moral Status · 2017-02-23T15:3…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           regulate
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[ {"id":"ytc_UggD3ftR4rVvoHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"}, {"id":"ytc_UgjHDsa6X9WSA3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"mixed"}, {"id":"ytc_UgjHNgX2PLTXdXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_Ugg_-pM4pNajdXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}, {"id":"ytc_UgiRwHOTbYP9qHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"}, {"id":"ytc_UghDAQ9vzeYcbngCoAEC","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UghI9gxfJBEJK3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}, {"id":"ytc_UgjmvxyKpyiwLHgCoAEC","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"fear"}, {"id":"ytc_Ughf3kv_0SxIWXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgjldUSsX4ZuyngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"} ]