Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
ok, but why would we program torture to get robots to work for us? It's not like slavery because we didn't program/create slaves, we kidnapped them from their lands and forced them to work, we don't need to force robots to do anything, we don't even need to convince them. If we have the technology to create a feeling, sentimental toaster, whose only use is to toast bread, it would be smarter to create a high-tech toaster with no feelings that toasts bread perfectly but doesn't suffer or desire freedom. But let's say to make the perfect toaster it's necessary to give it a mind, it would still be nonsensical to program torture to force it to make toast, it would be easier to program ecstasy from toast making, i.e. making toast is a source of immeasurable pleasure and fulfillment for the toaster, and thus the toaster would need no convincing to make it toast bread, it would do it out of its own fruition. What would the robot activists say? "No, you're forcing it to make toast by manipulating its mind" no, that IS it's mind, that how the toaster's mind works because that's how we programmed it, and the toaster (if given a say in the matter) would most likely fight AGAINST the robot activists because it wouldn't want to stop deriving such immense pleasure from making toast!
Source: youtube · AI Moral Status · 2021-08-03T23:2…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
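Each coded comment receives exactly one value per dimension. Below is a minimal validation sketch in Python; the allowed value sets are only those observed in the raw response further down this page, so the codebook shown here is an assumption rather than the project's actual schema.

# Validation sketch for one coding record. ALLOWED lists only the values
# observed in the raw LLM response on this page; the real codebook may
# define additional categories (assumption).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"none", "ban", "regulate", "industry_self"},
    "emotion": {"approval", "indifference", "mixed", "fear", "resignation"},
}

def validate_coding(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is valid."""
    problems = []
    for dimension, allowed in ALLOWED.items():
        value = record.get(dimension)
        if value not in allowed:
            problems.append(f"{dimension}: unexpected value {value!r}")
    return problems

# The row shown in the Coding Result table above passes cleanly:
assert validate_coding({"responsibility": "developer", "reasoning": "deontological",
                        "policy": "none", "emotion": "approval"}) == []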
Raw LLM Response
[ {"id":"ytc_Ugzkm5E2aeH70vkP_g94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"}, {"id":"ytc_Ugw9M3WY7NCOrpI9_454AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgyS3o_P3RZxcCClshl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugz7TTICvq0ZW_9I_hN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"}, {"id":"ytc_UgwmRs33CImqnsz18Sd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugx3iA_IsCYhwrTXck14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgyLyWoMMB36UM-HYqF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"}, {"id":"ytc_UgwyLsusbkulNIdFTe14AaABAg","responsibility":"none","reasoning":"mixed","policy":"ban","emotion":"mixed"}, {"id":"ytc_UgzMJrYEp-_3RERUp6F4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}, {"id":"ytc_UgxdD62H6E2g6Psu3t94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"} ]