Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You forget one thing, bread and circuses, rich and powerful people understand that it's better for them to give people UBI, give them something to lose, something to keep them distracted. yes it reduces their wealth in the short term, but they vastly reduce the risk of being assassinated. unless they have seriously unscrupulous AI and are comfortable with genocide, people with nothing to lose will be a much greater threat than a placated population of people with video games, VR, "corn", etc. all made super cheap. rich people don't need all the land on earth after all, they don't care if some of their nearly unlimited robot army makes a few colonies of people with food, shelter, health care, and entertainment. and ultimately, whilst it's unfair people have more and some have less when none of the, are working anymore, at the end of the day everything provided will be enough to achieve happiness in life. it's more realistic imo, even if still quite pessimistic compared to e.g. star trek or the imagination of some AI bros. what a bigger threat imo is that many of the rich will have genuinely evil views, like forcing Christian laws onto everyone in an even more undemocratic way than currently, and use their dominance for that. and maybe have a rolling door of scape goats who suffer abuse and propaganda to further reduce chance of an uprising. besides, imo after we get one AI that's genuinely better than every human at once at any mental task within reason, I don't think it'll even be a month before we hit ASI, so I think all these circumstances are unlikely, we'll either all be killed, all be tortured for eternity, or all be cared for by a an artificial "parent" that abolishes the economy as we know it and fosters as much fairness as it can whilst operating billions of physical machines as one "mind".
Source: youtube · Viral AI Reaction · 2025-11-23T21:3…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          liability
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugw3NggXDMRM3in86KV4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgximSBe6DiYmtgzRcR4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "mixed"},
  {"id": "ytc_UgwFi2WS1DmKIHK8FsV4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugzr7Eib6FAbL6W2Kix4AaABAg", "responsibility": "user",        "reasoning": "deontological",    "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgwrE0ShyxVTqY0DU594AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgxzXRoPAB19W4LThyJ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"},
  {"id": "ytc_UgyIGNhYvgMPUS578tV4AaABAg", "responsibility": "none",        "reasoning": "mixed",            "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgzjG3YkXvoOhndHzPV4AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgwKPrtiA8mXqqiOlgp4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UgyPM7vnUHW2x8MWYp14AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"}
]
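The raw response is a JSON array of per-comment code objects, so recovering the coding for any one comment is a simple lookup by `id`. Below is a minimal Python sketch of that lookup; the function name `codes_for` is illustrative (not part of any pipeline shown here), and the two sample records are copied from the raw response above.

```python
import json

# Two records copied verbatim from the raw LLM response above; in practice
# `raw` would be the full response string.
raw = """[
  {"id": "ytc_UgwFi2WS1DmKIHK8FsV4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugzr7Eib6FAbL6W2Kix4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]"""

def codes_for(raw_json, comment_id):
    """Return the coding dict for comment_id (minus the id), or None if absent."""
    for record in json.loads(raw_json):
        if record.get("id") == comment_id:
            return {k: v for k, v in record.items() if k != "id"}
    return None

# Look up the coding for the comment shown on this page.
result = codes_for(raw, "ytc_UgwFi2WS1DmKIHK8FsV4AaABAg")
```

Here `result` carries the four coded dimensions (responsibility, reasoning, policy, emotion), matching the Coding Result table above for this comment.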