Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What if robots actually _want_ to have a machine do hard labor rather than having a human do it? Like, you can harden a robot miner against space radiation and hard vacuum for a fraction of what it would take to make a place livable for humans, and you feed your robot workers with the same energy that feeds the rest of the base. A human? A human has to eat and drink and sleep and shit and basically waste almost half of its time keeping itself in working condition. To a robot, sending human miners to an asteroid would be like having your toaster do your taxes. You can swap some hardware, rewrite some software, spend good time programming the right algorithms into it and maybe even divide the task between a few of them so as to not overtax them. A robot or AI created/programmed/born to calculate taxes will still kick its ass in every way that matters besides the curiosity value of watching a monkey fingerpaint. Perhaps rather than demand rights, future sentient AI will push to see humans taken care of, kinda like your grampa who can't lift heavy stuff like he used to. Why make grampa lift these heavy boxes or spend precious time doing his taxes on paper and pen when his youthful machine son can do it in a fraction of the time? Let grampa Human relax; he worked hard to make the house we AI were born and raised in - some may think - so it is only right we take care of Human now that we are able to work ourselves.
youtube AI Moral Status 2022-11-24T00:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          unclear
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgwwSRK1vI-yuESCN7V4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwjenTgklvsk_3GnEp4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwoiV7HRHy_RXfvr554AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyGrQOGfBGJlqzJ4o94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxiLOkTXF8qynPUHJp4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy0DBso3jQ8Eu3Unod4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_Ugw78axLufJAhMXm4HJ4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugx-clDoo_v5myFwrGh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwYMXrM90f8j2nMheJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyqBL65dUt8vrb9W7t4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
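The raw response above is a JSON array of per-comment codings, one object per comment id, with the four dimensions (responsibility, reasoning, policy, emotion) as string labels. A minimal sketch of how such a response can be parsed back into a single comment's coding result follows; the comment id and field names are taken from the response shown, while the helper name (`coding_for`) and the fallback-to-"unclear" behavior are illustrative assumptions, not the tool's actual implementation.

```python
import json

# Abridged stand-in for the raw batch response shown above; only the
# entry for the displayed comment is included here.
RAW_RESPONSE = '''[
  {"id": "ytc_UgyGrQOGfBGJlqzJ4o94AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "approval"}
]'''

# The four coding dimensions present in every entry of the response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(raw: str, comment_id: str) -> dict:
    """Return the coding dict for one comment id, or raise KeyError.

    Missing dimensions default to "unclear" (an assumption; the actual
    pipeline may handle malformed entries differently).
    """
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            return {dim: entry.get(dim, "unclear") for dim in DIMENSIONS}
    raise KeyError(comment_id)

coding = coding_for(RAW_RESPONSE, "ytc_UgyGrQOGfBGJlqzJ4o94AaABAg")
```

Looking up the displayed comment's id yields exactly the values shown in the coding-result table above (responsibility: none, reasoning: consequentialist, policy: unclear, emotion: approval), which is how the per-comment view and the raw batch output line up.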