Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What Tyson misses is that we live in a capitalist world. AGIs (or AIs) are a new factor in production processes and their economics. AGIs, combined with robots that will soon be as nimble as humans at manipulating things (or, most likely, better at it than we will ever be), can do everything humans can do in any production or office process. But they are much cheaper: they don't argue, don't sleep, don't take vacations. It's a no-brainer (pun intended) to replace the human with an intelligent robot. In fact, capitalist economics demands it, first to gain a competitive edge, and later simply to stay competitive. AGIs together with robotics will replace humans in virtually all jobs. But those machines don't pay taxes and don't buy stuff. Money will more or less disappear everywhere. We need a replacement for a work- and money-based society and economy, and nobody knows what that could look like or how it would function. What will humans do when there is no work left besides menial chores at home? Will they like a life without work, without purpose? Our capitalist economy will lead to that without fail; it demands it, it is a function of its core clockwork. By doing that, it will end itself. Nobody knows what will come after that. It could be a utopia, but it could just as easily be a dystopia. Will AGIs decide that it is their purpose to feed and pamper us, or will they conclude that humans are now superfluous and a nuisance, and end us? The generation now in school will know the answer to this question.
youtube AI Moral Status 2025-09-16T20:2…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugyc5vrDTGVxQL2ETY54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwm_U6yLNp17RdnOjF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwyik4KWwdmVlJXRdR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"amusement"},
  {"id":"ytc_UgwzANbV3vEqL56t1Sh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwXQ2rU6EU6F1VIuF54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx9VeE7GPBzjeZJoj94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzDvSdx8PYShjxe8Tx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwckWZZ_x6OBm9XGp94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzzohi1Wcm-Ssd0ESt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyicI9pyw57bk7PnQd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
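The raw response above is a JSON array with one coding record per comment id. A minimal sketch of how such a batch response could be parsed and looked up by id, assuming the record shape shown above (the `ytc_abc` id, the `raw` string, and the `parse_codings` helper are illustrative, not part of the actual pipeline):

```python
import json

# Hypothetical sample mirroring the raw LLM response format above:
# a JSON array of coding records, one object per comment id.
raw = '''
[
  {"id": "ytc_abc", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none",
   "emotion": "indifference"}
]
'''

def parse_codings(text):
    """Parse a raw LLM batch response into a dict keyed by comment id."""
    records = json.loads(text)
    return {r["id"]: r for r in records}

codings = parse_codings(raw)
print(codings["ytc_abc"]["emotion"])  # indifference
```

Keying the records by id makes it straightforward to join a coding back to its source comment, as the "Coding Result" table above does for a single comment.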