Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Here is my point: A Foundational Universal Income or Citizen’s Dividend for All is not just a solution for what MAY happen with Automation and AI replacement— it is a DIRECT SOLUTION for HUGE problems such as poverty that have existed for a very long time and exist still and every day we debate and argue why we need it and what we need it for is another day in vain for those suffering that simple implementation of would instantly improve the well-being and happiness levels of a vast majority of people in the USA and would make them better equipped to migrate through the worst of the inevitable AI wave hurdling towards us. It is a solution for problems aside from AI and Automation that we needed yesterday. So, why anyone would argue against an irrefutable solution like UBI is just pure insanity and shouldn’t be heard. UBI is a MULTI-PURPOSE solution for problems that are not being solved and only getting worse day after day ASIDE from AUTOMATION that is on the horizon that is going to make things EVEN WORSE. The point is: you don’t wait until it gets to rock bottom to then try and fix what was supposed to be fixed in the moment. How many dying people from suicide and mass depression and financial stress and physical emaciation and starvation deaths do we need until we implement a solution that gives everyone a foundation to build upon—WHILE they are healthy so that they can make BETTER choices to prevent them from slipping and not having ANYTHING to get back up with.
Source: youtube · posted 2026-03-22T18:2…
Coding Result
Dimension        Value
---------------  --------------------------
Responsibility   none
Reasoning        consequentialist
Policy           regulate
Emotion          approval
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytc_Ugy3tGnf860LB52WF9F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgwQNf47TP_TJuLawap4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgyMmuUohqtM61e0OEJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},{"id":"ytc_UgxoutvQ53XEaaTc3h14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},{"id":"ytc_UgzetDtddYDl9oQqzwB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},{"id":"ytc_Ugw2WyEfDJ7MuceFcbR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_Ugz9C-wpLVh54OKxvnZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgzmVA9Lwu-e0ZS39394AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},{"id":"ytc_UgxdaAYUTXXAqoVDRxN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytc_UgyP5KWz50lTumnrPI14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]