Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
A lot of people claim that a general AI would be able to do so much that a universal income system would be developed. But it seems more likely that whoever develops the AI first will consolidate more wealth and power than anyone else on earth, and it will always be difficult to convince the majority of politicians that universal income can be paid for by taxing those ultra-wealthy who get rich off of all the people who can barely afford basic living expenses. I don't know if the sci-fi examples of a singularity AGI are even possible, and I don't think intelligence can be measured the way many tech CEOs describe it. Anyway, Neil had a physicist on StarTalk just a day or two ago who had a lot to say about this, and it was an excellent episode. I don't remember his name, but his book had a long title that started with More Everything..., and it's all about critiquing the rise of AI and other big-tech billionaire goals, like going to Mars.
Source: YouTube — AI Moral Status (2025-12-08T23:5…) · 1 like
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           liability
Emotion          fear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_UgxToSmdUI55Ar7oCyN4AaABAg.AKzeOlz8MfYAL1hz6Eq0gN", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxToSmdUI55Ar7oCyN4AaABAg.AKzeOlz8MfYAL4D4tDlnkC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgxCPSoh3LipNk7QAet4AaABAg.AKyWbPbFs_dAKykwDf9bo1", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_Ugxc4B6bCl5g9HaoPgl4AaABAg.AKyKkxQx2txAKyL7qS3Dmr", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgyWjAIrzOVVmRWx1Qh4AaABAg.AKy-ksyCO6wAKz9w5j_mYj", "responsibility": "company", "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgyWjAIrzOVVmRWx1Qh4AaABAg.AKy-ksyCO6wAKzC6B_qDAH", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgyWjAIrzOVVmRWx1Qh4AaABAg.AKy-ksyCO6wAKziGO9VFoI", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgyWjAIrzOVVmRWx1Qh4AaABAg.AKy-ksyCO6wAL0I-irX6yR", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgxwELpb3zk4KZ5kEjJ4AaABAg.AKxu9YWA1gNAQUtA5Tkv5p", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_UgzO4z7a2S9DzFtR3dt4AaABAg.AKxplsvWF7tAKy9M7E7PSU", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
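A raw response like the one above is only usable downstream if every record parses and every dimension value falls in the expected vocabulary. The sketch below shows one way to validate such a response; it is a hypothetical helper, and the allowed-value sets are inferred from the values visible in this export, not from an official codebook.

```python
import json

# Allowed values per coding dimension, inferred from the sample above
# (assumption: the real codebook may define more labels).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "distributed", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed",
                "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject out-of-vocabulary rows."""
    records = json.loads(raw)
    for rec in records:
        # Every comment id in this export carries the ytr_ prefix.
        if not str(rec.get("id", "")).startswith("ytr_"):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Minimal usage example with a synthetic record (id is illustrative).
sample = ('[{"id":"ytr_abc","responsibility":"distributed",'
          '"reasoning":"consequentialist","policy":"liability",'
          '"emotion":"fear"}]')
rows = parse_coding_response(sample)
print(rows[0]["emotion"])  # fear
```

Failing fast on malformed or out-of-vocabulary rows keeps a single bad LLM response from silently skewing the aggregated coding results.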