Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The idea of granting robot rights is completely at our hands and our choice. We are the ones who created robots and we are the ones who continue to improve the intelligence of robots while well aware of the possibility of sentience, so I think a good answer is it all depends on what you want. If you don't want a world where robots have rights, you don't gotta have one. It is our choice to make a robot that is sentient enough to demand rights so it is also our choice to avoid that and simply make robots very intelligent but not to the point of sentience and freedom. We can easily make robots who act human but only within the limits of their programming. It is our choice to make robots who act human because they are not withheld by the limits of their programming and, like a human brain, they are expanding their own programming independently without the aid of humans which allows them to have sentience. So it is our choice to make sentient robots. If robots become sentient and kill us all, that would be our faults. We did not have to make those robots sentient, but we chose to program them to be sentient and therefore kill us.
Source: youtube · AI Moral Status · 2017-04-17T00:2…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       contractualist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
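For reference, the four coded dimensions and the labels observed in this batch can be summarized as a typed record. This is a sketch inferred only from the output shown on this page; the name CommentCoding is hypothetical, and the project's full codebook may allow labels beyond those seen here.

```python
from typing import Literal, TypedDict

# Sketch of the coding schema, inferred from the labels observed in this
# batch; the real codebook may define more values per dimension.
class CommentCoding(TypedDict):
    id: str
    responsibility: Literal["none", "developer", "user", "distributed"]
    reasoning: Literal["deontological", "consequentialist",
                       "contractualist", "unclear"]
    policy: Literal["none", "ban", "regulate"]
    emotion: Literal["indifference", "approval", "fear", "outrage"]
```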
Raw LLM Response
[ {"id":"ytc_UghYexzMOt3HZHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugi0UdVbvS94CXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UghXMjd6iMIlc3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}, {"id":"ytc_UghveoVOf9sGxHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}, {"id":"ytc_UgjmPVGmp27jk3gCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugit6t1GkeUGMngCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgiD3MXHTAvZB3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"}, {"id":"ytc_UgjZZuoWAcawn3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgiLNSy2wGiwwngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"}, {"id":"ytc_UghFQ5fZR_jhr3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"} ]