Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We are very close to give robots human rights, but not for they are smart or have feelings, but due to moral and the new-moralist movements that are called Feminists, SJW (Social Justice Warriors) and PC (Politically Correct), the argument goes a bit like this. Sex robots are immoral and sexist against women and therefore shall not be accepted as a sex toy. But when other pointed out that robots have no more feelings than a battery driven vibrator they changed the argument to "Robots that looked like children are immoral" and at this point no one want to debate anymore, for if someone still say "It is just a machine without feelings" they will be accused for pedofilia for we we want to identify the robot as a human child in this case.

So the moral level look like this:
Toaster = It have no feelings and we can do whatever we want with it without moral consequences.
Male robot = It have no feelings and we can do whatever we want with it without moral consequences.
Female robot = It is immoral to even ask Siri sexual question and it more sexist to even suggest they can be sex toys.
Child like robot = Shall be protected with same laws as normal human children and even have some rights.

So I think that it is our moral that will do that we in near the future will see the first rights written for robots and AI, the point that if if feel, think and understand will be of less important, humans are a moral creature and not a logical creature.

6:34 It's over 9000!
Source: YouTube, "AI Moral Status", 2017-02-23T18:4…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        deontological
Policy           regulate
Emotion          outrage
Coded at         2026-04-27T06:26:44.938723
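
Read as a record, the result above has the shape sketched below in Python, assuming string-valued codes throughout. The type name CommentCoding is hypothetical, and the values listed in the comments are only those observed in the raw response that follows; the full codebook may allow others.

from typing import TypedDict

class CommentCoding(TypedDict):
    # Shape assumed from the coding result and the raw response below.
    id: str              # comment id, e.g. "ytc_Ugjeziu4V1EknXgCoAEC"
    responsibility: str  # seen in this batch: none, distributed, developer, ai_itself, user
    reasoning: str       # seen in this batch: unclear, deontological, mixed, consequentialist, virtue
    policy: str          # seen in this batch: unclear, regulate, none, liability, industry_self, ban
    emotion: str         # seen in this batch: indifference, outrage, resignation, approval, fear, mixed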
Raw LLM Response
[ {"id":"ytc_Ugju7aEfGYlMBXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_Ugjeziu4V1EknXgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgiLWOcRt89jfHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}, {"id":"ytc_Uggqw-SfwBxqHngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"}, {"id":"ytc_UghD-anJqaf-jngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_Ugg2MtUBRNtZ9ngCoAEC","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"}, {"id":"ytc_UghG18WWY7H_q3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UggNgQ_Hy9w9vngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"}, {"id":"ytc_UgjaMZKvJE3S4XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"fear"}, {"id":"ytc_Ugi411ebTWTvlXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"} ]