Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Cognitive automation always results in fewer human jobs or lower pay for humans; otherwise it would not be profitable to automate. Automation is expensive, and it also needs to turn a profit. So where does that extra money come from? If the humans kept the same share of the money, the added cost of automation would logically make the whole thing unprofitable. Humans do get the money, but it's not the workers. The capitalist owners of the automation company get a larger and larger chunk of the money as they automate away the human jobs. The workers at the automation companies will not get the bulk of the profit. Money into a system must equal money out, so if automation is expensive, that sliver of money has to come from somewhere: the automation company takes it from the drivers. It most likely goes to the automation company's owners and a bit to its workers. If the amount the automation company had to pay its workers (engineers, repair people, etc.) were larger than what the truckers made (i.e. if this technology created jobs), then the automation company would never be profitable. Automation from here on will always result in fewer human jobs and more profit for the owner class.
YouTube · AI Jobs · 2025-05-29T04:5…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_Ugyu3uor8MRcGBMjFsh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgzFpJjgJQCSZQIVIUx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}, {"id":"ytc_UgwPijGwIQf4mMEwW4F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_Ugxjaz8T4LgSsgP3y3l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_Ugx_EjQswA5Oi88vU-p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytc_UgzsYtIYSFdRibDeS6V4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}, {"id":"ytc_UgyLQ4NV-hcyNdCW9xZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_Ugyg_uNLiRKLtIdVODd4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"regulate","emotion":"approval"}, {"id":"ytc_Ugyz1qfjFywK4-wvrkt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"}, {"id":"ytc_Ugz8DRJV2xUpbaAaIDh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"} ]