Raw LLM Responses

Inspect the exact model output returned for each coded comment.

Comment
I believe humans are paid for their time not productivity. If I can make 1 widget an hour in 1950 and with productivity gains I can make 100 in 2025, I’m still paid 1 hr of labor. In almost no industry is pay based on productivity. If AI 10x productivity then people will make 1000 widgets and still be paid by the hour. The idea that AI will make us so productive that we only have to work 1 hour a day is naive. History tells us productivity gains have almost never resulted in less work. I’d argue that humans would get too bored if we all only worked 1 hour a day. What does happen is productivity gains result in fewer humans needed if demand can’t keep up. If your workforce can produce 1000 widgets now with AI but you can only sell 500, then you lay off half the force. So the result will be less jobs, more productive per employee, more stuff to buy but less money to do it. Acting like AI will create a utopia is stupid and it’s funny how it’s the people who are going to make trillions on it that are telling us to embrace it because it will make us not have to work and get everything we want without costing much.
Source: youtube · AI Jobs · 2025-12-24T20:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[ {"id":"ytc_UgxW2_B29cK6fYsWIN54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxwDNdtFQk9lCK3CdR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}, {"id":"ytc_Ugx9NMHqQYH8WqUpvu94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxzhGBmjL6LYkTtGZV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"}, {"id":"ytc_UgwTO2pI2AlrhnQRHvV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}, {"id":"ytc_Ugyo8znIHlAOgxGPMEF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgzoZEkfeOcb4iES5cx4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"approval"}, {"id":"ytc_Ugwyzfc4XEij2KKsEu14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgzMD5ghW6XDqjJ_G2J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_UgxYcjY8UnDmDbMBJIp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"} ]