Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Of course they do. I don't see the difference of consciousness in humans because of chemicals and nerves and a robot with circuits and wires. If people were scared of robots deciding humans were inferior and wanting to kill us, then oh well because it would be our fault for creating such beings. wouldn't a world of robots be better anyways? they wouldn't have to eat, go to the bathroom, make medicine, therefore they would only use very little of earths resources. For a while now I've seen theories about if the earth were to die that humans could upload our brains into a virtual reality and live forever. If this were true, would we deserve rights anymore? If someone were to upload their brain into a toaster like in this video, they would think they would deserve rights because they were once a human. But if you look at it like that then theres no difference in creating a sentient toaster and giving it rights. And if humans were unwilling to give the robots rights, then they would be justified in wiping out the human species. it would be just like natural selection anyways, except for nature would not be evolving life, but humans would be. If we were dumb enough to create robots who wanted rights and who are stronger than us than we deserve everything that comes from it, good or bad.
YouTube · AI Moral Status · 2017-02-24T20:4…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[{"id":"ytc_UggcD6OCsffgYXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgjvQHNXTYE653gCoAEC","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UggeAIVyDUeQ1ngCoAEC","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UghQJ5sCdDF7nHgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UghjkE0LpbEPDXgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UggQo43e5dQ_C3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UghK2Dw34eyGfXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_UgiO2aYAqsH1OngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_Uggz-cExbhgmG3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugi-qHjeu0cluHgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"}]
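For context, extracting per-dimension codes from a raw response like the one above might be sketched as follows. This is a minimal illustration only, assuming the four dimensions shown in the coding table; the function name `parse_coding` is hypothetical and not part of any tool shown on this page. It also shows why a malformed raw response (e.g. a stray `)` where the closing `]` belongs) would leave every dimension at "unclear".

```python
import json

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding(raw: str, comment_id: str) -> dict:
    """Look up one comment's codes in a raw LLM response.

    Falls back to "unclear" for every dimension when the response
    cannot be parsed or the comment id is missing.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        # A malformed response (e.g. trailing ')' instead of ']')
        # fails to parse, so all dimensions default to "unclear".
        return {dim: "unclear" for dim in DIMENSIONS}
    by_id = {rec.get("id"): rec for rec in records}
    rec = by_id.get(comment_id, {})
    return {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}

# A well-formed single-record response (hypothetical id "a"):
raw = ('[{"id":"a","responsibility":"user","reasoning":"deontological",'
       '"policy":"none","emotion":"approval"}]')
print(parse_coding(raw, "a"))        # codes as given in the response
print(parse_coding(raw + ")", "a"))  # malformed input: all "unclear"
```

The fallback mirrors what the coding-result table above displays when parsing fails: every dimension reads "unclear".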