Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Many people seem to be worried that AI and robots will "take" jobs from humans, as if work were some blessing from God rather than a necessary evil for survival! The often-stated idea that if AI "takes" our jobs we couldn't afford to buy anything, since we wouldn't earn salaries, indicates, I think, a lack of understanding of basic economics! IF AI and robots, i.e. machines, were to "take" some or all of the jobs humans do today, the only thing that would basically happen is that we would have to work less, or not at all, and still get all the goods, products, and services we have today! Also, if some of our jobs are "taken" by machines, it DOES NOT necessarily mean unemployment has to increase! If working hours are shortened, the number of jobs stays the same, and thus unemployment does not have to occur. Please note that salaries DO NOT have to be reduced, no matter how much working hours are shortened, in order for the companies' profits to stay the same! The alternative is that a certain proportion of people do become unemployed. Companies then save money by not paying those salaries, which means their taxes can be raised enough to pay EVERYONE whose job has been "taken" some form of EQUALLY HIGH government compensation, also WITHOUT reducing the profits of the companies! If ALL our jobs are taken over by machines, NO work has to be done, NO wages have to be paid, and company taxes could be raised enough to pay EQUALLY HIGH citizen salaries to ALL PEOPLE, and, just like in the other cases, WITHOUT lowering the profits of the companies!
youtube Cross-Cultural 2025-09-29T17:2…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgwAVH8qTefN0rXgSFp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx6zmjXMxs3WMDPrcZ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyMa0Ry-D1ccPc5d9R4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwgXeyu15NTrDC-sLt4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugxtma1Wcw54s3MroqR4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwunEHfWuKunD_Q37t4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugza0ON_yNFSVPAdsrN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzTthPa21ih1v9NrCl4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgygF1OWAPfGe3qRZ5p4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugyj-X_JP3I62wzf3ah4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
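A raw response in this shape can be turned into per-comment coding records with a few lines of Python. The sketch below is a minimal, hypothetical helper (the function name `parse_coding_response` and the key-validation step are assumptions, not part of the pipeline shown above); it assumes only that the model returns a JSON array of objects with the five dimension keys seen in the response.

```python
import json

# The five dimensions each coded record is expected to carry,
# matching the keys in the raw response above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse the model's JSON array and check each record has all dimensions."""
    records = json.loads(raw)
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing keys: {missing}")
    return records

# Example using the first record from the response shown above:
raw = ('[{"id":"ytc_UgwAVH8qTefN0rXgSFp4AaABAg",'
       '"responsibility":"none","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')
records = parse_coding_response(raw)
print(records[0]["reasoning"])  # consequentialist
```

Validating the keys up front means a malformed or truncated model response fails loudly at parse time rather than surfacing later as missing values in the coding table.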