Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
99% of world population will die off in few decades once the agi arrives. No way companies would not replace employees to increase profit. And, no just because there are no people left to buy stuff will not be the reason companies won't do so. They would just make things target to elites. Millionaires would be the new poors and middle class, because they have least say in how AGI is developed and used. Billionaires the upper class, with much control over ai and the wealth generated by it. Also a new ultra-rich breed would become common, trillionares. Most of us would first be taken off jobs, few top jobs like doctor and scientists may remain for few more decades until they are replaced. Most will run off their savings in 5-10 years. Then mass hunger would come, protests, wars, socialist outbreaks, etc. but all suppressed by robot armies. 30 years in 99% people who belong to poor and middle class would be completely wiped out. Few left may sell themselves to survive or become plaything for rich like hunger games. Btw this is not some dystopian take, this all happened a century ago. Where most peasants were essentially play things for the king
youtube AI Jobs 2025-08-29T05:1…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgzpPWHuBwtrK5Q885h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzsWNYJzeyBZNqsJyN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyJ6QXBlMUDa3T58sJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgxeWH4OSYdZ9fzzK5F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyn9ykPlnRpeFNc3Zp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwWABZGAh3dGPZqq7t4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzMa8UTw_WGDJ1AnHp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgxahWsCECMivLQcMad4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxJwvMbyJEIXDJmwSF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz5rkr0aRHXc-e7H1d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
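The raw response above is a JSON array keyed by comment ID, one record per coded comment. A minimal sketch of recovering the per-comment coding from such a response (the field names come from the JSON above; the raw string here is a two-record excerpt and the variable names are illustrative):

```python
import json

# Raw model output: a JSON array with one coding record per comment ID
# (excerpt of the full ten-record response shown above).
raw_response = '''[
  {"id": "ytc_UgzpPWHuBwtrK5Q885h4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwWABZGAh3dGPZqq7t4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

# Index the records by comment ID for lookup.
records = {rec["id"]: rec for rec in json.loads(raw_response)}

# Look up the coding for the comment displayed on this page.
coding = records["ytc_UgzpPWHuBwtrK5Q885h4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # company fear
```

This is how the "Coding Result" table for the comment above is derived: the record whose `id` matches the comment supplies the Responsibility, Reasoning, Policy, and Emotion values.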