Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There's a pretty interesting part of the AI success scenario that isn't really explained much. As millions of jobs are removed from the economy, the very market forces that enabled that will also make it exponentially harder to replace the rest of the jobs. You point out that they're trying to automate jobs that are already incredibly cheap for humans to perform, and in a bad job market that is a sketchy business proposal because people will be willing to do those jobs for even less. But it goes further than that. As millions of people become jobless what we view as low-paid jobs expands greatly. Millions of desk workers who lose their job aren't going to just mope around. They will accept desk jobs for far less compensation, and importantly, they will flood into other sectors and drive down wages everywhere that still employs humans. Eventually it hits an equilibrium, where computers and robotics that potentially could take more jobs aren't produced because people are willing to do those jobs for a dollar a day or something to eat. The price floor for human labor is incredibly elastic if the job market gets bad enough.
youtube AI Jobs 2025-12-27T11:2…
Coding Result
Dimension: Value

Responsibility: distributed
Reasoning: consequentialist
Policy: unclear
Emotion: fear
Coded at: 2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgxHpqczDcxgFRAVy7J4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxKdgrVYheZjSY8wTJ4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzojYTo1_oR7SpsNRN4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugyj3Z07IKPfFQMHJLh4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzNxuY9cn68_I4IHOt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzNDgX_g_bBlMmRLd14AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw9hJQDg-qfpEw3Xzh4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwUQpHGWd8tfgVX4Wt4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxKecgs-Fsv1f7PGoF4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgykFNPsDiJNY_NFDB54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]