Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think the solution to AI is not UBI but UBC (Universal Basic Capital). With UBI, taxes are levied on the class that owns the automation and the wealth is redistributed to the general populace (who cannot work, as the comparative value of human labor is zero). With UBC, the taxes levied on the owners of automation are used to create a fund in which every citizen holds an equal share. That minimum share in the automation fund should be inalienable (it cannot be sold or taken away for any reason) and should produce enough income to live a decent (potentially even luxurious!) life.

If automation completely replaces all human activity, then very interesting things are going to happen. The main one is that the cost of producing goods and services will fall massively, and fulfilling the primary and even secondary needs of all people on Earth (or at least all people in a polity that fully embraces UBC) will be trivial. Provided that resource and energy production keeps up, which is entirely possible thanks to billions of AI engineers constantly devising new technology, we could see the size of the global economy double every few years and global poverty eradicated. The point about AI driving technological progress is especially important. In my opinion, AI doesn't have to be sentient to be intelligent, which makes the ethics of it all much easier.

Wealth inequality will increase, but that is not a bad thing per se. The problem with wealth inequality is not that the rich are much richer than everyone else; it is that they are rich while others are poor. If automation eradicates poverty, then wealth inequality isn't really a problem.
youtube AI Harm Incident 2024-11-26T23:0…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[{"id":"ytc_UgyjmE6BF0sZdIuTg314AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_Ugw7wkX3wuBYUg59O4N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}, {"id":"ytc_UgzVQWCxw3G4CQ8EAMZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}, {"id":"ytc_UgwhXhgbLVwsTM8iILx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"}, {"id":"ytc_Ugz36gGw1cMvK_vcWZd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"}, {"id":"ytc_Ugxo4-UI3AV4u3F0oPl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugy4KCfeVpJMvjztG8l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgwWd10YKJqideuwrAZ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"}, {"id":"ytc_UgxfjdIt6vTDYYS0ez14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgweOZFJZTfctwmvR3R4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"})
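Note that the raw response above is not valid JSON: the array opens with `[` but is terminated with `)` instead of `]`. A strict `json.loads` call on such output fails, which would leave every dimension for the comment coded as "unclear" even though the model did emit codes. The sketch below is a hypothetical illustration of a defensive parse; the function name `parse_coding_response` and the repair heuristic are ours, not part of any coding pipeline described here.

```python
import json


def parse_coding_response(raw: str) -> list:
    """Parse a raw LLM coding response into a list of code dicts.

    First attempts a strict parse; if that fails and the text ends
    with a stray ')' where the closing ']' belongs (as in the raw
    response above), repair that one malformation and retry.
    """
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        repaired = raw.strip()
        if repaired.startswith("[") and repaired.endswith(")"):
            repaired = repaired[:-1] + "]"
        return json.loads(repaired)  # re-raise if still malformed


# A malformed example in the same shape as the raw response above
# (hypothetical id, same dimension keys).
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"mixed","policy":"none","emotion":"mixed"})')
codes = parse_coding_response(raw)
print(codes[0]["emotion"])  # mixed
```

With a repair step like this, the codes in the raw response could be recovered and matched to comment ids instead of falling back to "unclear" across the board.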