Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I wish you had gone further than looking at consciousness. I think the question of free will is a more interesting one. A robot might be sentient, but without the cognitive capability for free will, I think there is little basis for robot rights. Instead of the example of a robot feeling pain, I think it'd be more interesting to examine a robot who is prohibited from exercising its free will. Furthermore, what if humans in the future forbid robots to choose for themselves because of the potentially disastrous consequences of a powerful robot feeling remorse or anxiety associated with that freedom to choose?
youtube AI Moral Status 2017-02-23T17:5…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           unclear
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UggjZy8rcjGNm3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Uginb6UDLQhk_ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgicvaZjhMX6BHgCoAEC","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugh9eNuF4VTjWHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UggntMP2kdIoWHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgjEH-SjZ2pMMHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiDLalxifsbkngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgiXwA6zw6dnqHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugh2D6_lDm1Rc3gCoAEC","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UghxYPawvqCsbXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
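A raw response like the one above can be parsed and sanity-checked before it is written to the coding table. The sketch below is a minimal, non-authoritative example: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response itself, but the allowed value sets are assumptions inferred from the values seen above, and the function name `parse_coding_response` is hypothetical.

```python
import json

# Allowed categories per dimension. These sets are ASSUMED from the
# values visible in the raw response above; the real codebook may
# define additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "government", "user",
                       "developer", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological",
                  "mixed", "virtue"},
    "policy": {"unclear", "regulate", "liability", "none", "industry_self"},
    "emotion": {"indifference", "fear", "outrage", "approval", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array of coded comments)
    and reject any value outside the known category sets."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UggjZy8rcjGNm3gCoAEC","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
print(parse_coding_response(raw)[0]["responsibility"])  # → none
```

A check like this catches the common failure mode where the model invents a category label outside the codebook, so a bad record fails loudly instead of silently entering the results.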