Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
at some point this is just childsplay, because if we actually achieve ai (and llms are fundamentally not reaching that, they have plateaued for more than a year now and all the advancements are just better utilization of the limited intelligence) then the actual issue is not that they will take over the economy, its that they will have no need for the economy. a human-type intelligent ai, can spawn a million average level scientists to make ai smarter. then it can spawn that 1% smarter AI to make the next AI (1.01)^2 times smarter, then ^3 and so on and in anywhere between weeks and hours AI is so intelligent that it has no need for ... pretty much anything. It can just manipulate humans, it can take over any non airgapped computersystem (and use humans in whatever way it pleases through manipulation) so that is going to be a few weeks to months of money still existing as a way to incentivize humans, and after that, we have basically created a superintelligence with total control of the planet, plans we are unable to comprehend, while it understands everything about psychology (because it can spend a million years of a million psychologists just figuring out how to get humans to do what you want) biology, medicine and so on if we actually achieve AGI we have half a year until none of our problems matter anymore because either AI will have solved it, or AI will have destroyed us. so no, most jobs can't be taken by ai for now and once ai is smart enough to take over most jobs it doesn't matter
youtube Viral AI Reaction 2025-11-29T21:4…
Coding Result
Dimension      | Value
Responsibility | none
Reasoning      | consequentialist
Policy         | none
Emotion        | indifference
Coded at       | 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugxl0PFFFQxShPCIZgV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyWS_zFNL_nTbozYRt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx9NSRhdClkTGV89NR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzFoB49iyhFAvhisCt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzsKe9_LVnKtL-JFN54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyekY4lXTKpooUwFSB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzKWhsGIa6KIn_MDIR4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyBtZA0fqEXAdtKzC54AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwIoWwKSLnq77j9aAR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx--9mbX9WQ6Ck6pMd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
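Because the raw response is a JSON array of coding records keyed by comment id, the coded dimensions for any one comment can be recovered programmatically. A minimal sketch (the variable names are illustrative; the record used is the first entry from the response above, truncated to one element for brevity):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment id.
# Only the first record from the response above is shown here.
raw_response = '''[
  {"id": "ytc_Ugxl0PFFFQxShPCIZgV4AaABAg",
   "responsibility": "none",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "indifference"}
]'''

# Index the records by comment id for constant-time lookup.
records = {rec["id"]: rec for rec in json.loads(raw_response)}

# Pull the coded dimensions for the comment displayed above.
coding = records["ytc_Ugxl0PFFFQxShPCIZgV4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # → none indifference
```

The same lookup generalizes to the full ten-record array: each id maps to exactly one record, so a dict comprehension over `json.loads` is all the indexing the tool needs.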