Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Something worth bringing up is the question of "fun", or what the robots want. The concept of suffering and the rights that follow from it is an excellent question, but it is only half of the question. For instance, say we programmed the robots to have fun mining, to get a shot of excitement every time a cave-in almost happens, and to find joy in sorting one type of rock from another. Maybe they enjoy it so much that, left to their own devices, they would, by choice, start to mine whatever they could all by themselves. Would the robots then demand the right to mine? Perhaps they would even be willing to take up jobs cleaning the environment so that they can go home and mine for a few hours each day. Could this even be called a 'right'? It is what the robots both want and demand, but does the fact that we programmed them this way make it less than the rights we ourselves are programmed to want and demand?
YouTube · AI Moral Status · 2017-02-23T18:4…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       virtue
Policy          industry_self
Emotion         approval
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugju7aEfGYlMBXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugjeziu4V1EknXgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgiLWOcRt89jfHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Uggqw-SfwBxqHngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UghD-anJqaf-jngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugg2MtUBRNtZ9ngCoAEC","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UghG18WWY7H_q3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UggNgQ_Hy9w9vngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgjaMZKvJE3S4XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugi411ebTWTvlXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
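Turning a raw response like the one above into per-comment coding results amounts to parsing the JSON array and discarding any record whose labels fall outside the coding scheme. The sketch below shows one way this could be done; the `ALLOWED` label sets are inferred only from the values visible in this response (the full codebook may define more), and the function name `parse_codings` is illustrative, not part of any tool shown here.

```python
import json

# Label sets per coding dimension, inferred from the raw response above;
# the actual codebook may allow additional values.
ALLOWED = {
    "responsibility": {"none", "distributed", "developer", "ai_itself", "user"},
    "reasoning": {"unclear", "deontological", "mixed", "consequentialist", "virtue"},
    "policy": {"unclear", "regulate", "none", "liability", "industry_self", "ban"},
    "emotion": {"indifference", "outrage", "resignation", "approval", "fear", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response; keep only well-formed coding records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # A record must be an object with an id and a valid label
        # for every coding dimension.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid
```

For instance, the record for `ytc_Ugg2MtUBRNtZ9ngCoAEC` (developer / virtue / industry_self / approval) passes this check, which matches the Coding Result shown for the comment above.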