Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It could get much worse, in a techno feudalist dystopia. The only human thing maybe many of us can provide is our bodies or the ability to suffer, so a horrible future could involve most of us having truly awful "occupations" to amuse the wealthy. If the meaning economy becomes our main way to contribute, I do agree that not everyone will want to or be able to. And even the greatest poets, for example, may be surpassed by AI in quality. A good AI might be able to make a million plays worthy of Shakespeare in a minute or so. That could have profound consequences for our understanding and enjoyment of art and creativity. But hopefully many will contribute to the meaning economy in much less grand ways, just by being human. I can see running a TTRPG, or walking with the elderly, or teaching a kindy class for 1 day a week about things you love, or moderating a forum on people's favourite cat memes as all being meaning economy work that we do. The point about this possible future is that it requires us all (globally) to share the wealth of AI, and then our "jobs" in the meaning economy are really just our preferred activities to give ourselves purpose. I agree that if we are stuck having to actually be economically productive compared to an AI in the future, most of us are likely in a lot of trouble without massive changes to our societies and economies.
Source: YouTube · AI Moral Status · 2025-07-24T14:5… · ♥ 4
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_UgxToSmdUI55Ar7oCyN4AaABAg.AKzeOlz8MfYAL1hz6Eq0gN", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxToSmdUI55Ar7oCyN4AaABAg.AKzeOlz8MfYAL4D4tDlnkC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgxCPSoh3LipNk7QAet4AaABAg.AKyWbPbFs_dAKykwDf9bo1", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_Ugxc4B6bCl5g9HaoPgl4AaABAg.AKyKkxQx2txAKyL7qS3Dmr", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgyWjAIrzOVVmRWx1Qh4AaABAg.AKy-ksyCO6wAKz9w5j_mYj", "responsibility": "company", "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgyWjAIrzOVVmRWx1Qh4AaABAg.AKy-ksyCO6wAKzC6B_qDAH", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgyWjAIrzOVVmRWx1Qh4AaABAg.AKy-ksyCO6wAKziGO9VFoI", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgyWjAIrzOVVmRWx1Qh4AaABAg.AKy-ksyCO6wAL0I-irX6yR", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgxwELpb3zk4KZ5kEjJ4AaABAg.AKxu9YWA1gNAQUtA5Tkv5p", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_UgzO4z7a2S9DzFtR3dt4AaABAg.AKxplsvWF7tAKy9M7E7PSU", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
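A raw response like the one above can be turned into per-comment coding records with a small parsing step. The sketch below is a minimal, hypothetical example: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) follow the JSON as printed, but the sample `id` value and the shape check are illustrative assumptions, not the project's actual pipeline.

```python
import json

# Illustrative sample input; the id here is made up, not a real comment id.
raw = '''[
  {"id": "ytr_example", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Keys every coding entry is expected to carry (assumed from the output above).
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw_json: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding dict}.

    Raises ValueError if any entry is missing an expected key, so
    malformed model output fails loudly instead of silently.
    """
    entries = json.loads(raw_json)
    codings = {}
    for entry in entries:
        missing = EXPECTED_KEYS - entry.keys()
        if missing:
            raise ValueError(f"entry {entry.get('id')!r} missing keys: {missing}")
        codings[entry["id"]] = {k: entry[k] for k in EXPECTED_KEYS - {"id"}}
    return codings

codings = parse_codings(raw)
print(codings["ytr_example"]["emotion"])  # fear
```

Keying the result by comment id makes it easy to join each coding back to its source comment, as in the "Coding Result" table above.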