Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What many of these predictions overlook is the human response. As people lose jobs and become displaced, there will be a growing number who refuse to purchase or use AI labour out of principle and spite. Products may even start carrying labels such as “Free of AI Labour.” At the same time, some former computer scientists who see their careers vanish could redirect their skills and energy towards disrupting or sabotaging AI itself; their efforts may ultimately prove futile, but I think they could delay some of the AI progress. Eventually there will be a rogue AI robot that does serious harm to humans and scares people too much to allow robots in general into their homes or workplaces. That will slow the progression somewhat too. I also believe some countries will go as far as banning AI across entire sectors of industry to protect their economies from collapse. Since the most advanced AI systems are unlikely to be developed domestically in these nations, there would be little incentive to import an AI workforce that only sends money overseas while worsening problems at home. Of course, countries with corrupt leaders will sell out their nations for personal benefit, but there will be many others that stand opposed.
YouTube · AI Governance · 2025-09-08T20:5…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugzo7YY-INRGjqSi53F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy0oNyyTm8Ze9Lis0V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyH2wJEOq8LRUseP6R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz7Z63hHTXqGswueVp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwx9V_zn2NDLJOVE4B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyNCtH9ElvgIapIt7N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyAze6g3u-17i8_n_d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxGtTv9CFsWPkJvakB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwlZsi6hvOBF_6QA5J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwlnzHpqsBoWLB51Xd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
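A raw response like the one above has to be parsed and checked against the codebook before the coded values are stored. The sketch below shows one minimal way to do that in Python; the allowed value sets are assumptions inferred only from the values visible in this record, not the project's full codebook, and the function name is hypothetical.

```python
import json

# Allowed values per coded dimension. NOTE: these sets are inferred from
# the values seen in this one record; the real codebook may define more.
ALLOWED = {
    "responsibility": {"none", "distributed", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "approval", "fear", "outrage",
                "mixed", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coding objects) and
    keep only rows whose every dimension holds an in-codebook value."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Example: one well-formed row and one with an out-of-codebook value.
raw = (
    '[{"id":"ytc_a","responsibility":"distributed",'
    '"reasoning":"consequentialist","policy":"none","emotion":"resignation"},'
    '{"id":"ytc_b","responsibility":"aliens",'
    '"reasoning":"unclear","policy":"none","emotion":"mixed"}]'
)
kept = parse_codings(raw)
print([row["id"] for row in kept])  # only the valid row survives
```

Rows that fail validation are dropped rather than repaired here; a real pipeline might instead queue them for re-prompting or manual review.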