Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I don't think these tech billionaires want more power and wealth. They want to maintain what they have in the corporate competitive landscape. It is more their nature; they are not bad people. AI feels like the last technological revolution to them; if they don't win it, they will be left behind. They are trying to win. Additionally, the US isn't the only player in this race, so it is important that we participate in it.

We need a way to support people who become less productive members of the economy, such as those who lose their jobs in the short term. However, UBI is expensive: 50 million people (14.4% of the US population) * $25,000 per year is $1.25 trillion. That is more than double OpenAI's market value. That is a lot. We should not print it; otherwise, that would result in inflation, essentially a tax on regular people. We can tax companies for some of this money, but then products will remain at the same cost. If we don't tax companies, we could see more market pressure, but most likely (since companies have profit incentives), product prices will remain the same, and their stock will increase as the company becomes more profitable.

I agree that we need a robot tax, perhaps more specifically an artificial autonomous intelligence tax (for any AI tool running without a human controlling it). We will need money to offset the substantial change. The funding should go into UBI, support for new businesses, and an overhaul of our public education. The tax probably should include the hours of autonomous actions as a variable in the tax equation: something like revenue * (hours of autonomy / some value) * tax multiplier. Easy to calculate, fair to businesses. I see this as totally reasonable; otherwise, people will suffer.
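The commenter's arithmetic and proposed tax formula can be sanity-checked with a short sketch. The normalizer and tax multiplier below are hypothetical placeholders for the commenter's unspecified "some value" and "tax multiplier", not values from the comment or any real policy.

```python
# Back-of-the-envelope check of the commenter's UBI arithmetic, plus a
# sketch of the proposed autonomy tax. Parameter defaults are illustrative.

UBI_RECIPIENTS = 50_000_000   # ~14.4% of the US population
UBI_PER_YEAR = 25_000         # dollars per person per year

annual_ubi_cost = UBI_RECIPIENTS * UBI_PER_YEAR
print(f"Annual UBI cost: ${annual_ubi_cost / 1e12:.2f} trillion")  # $1.25 trillion

def autonomy_tax(revenue, autonomous_hours, normalizer=8760, multiplier=0.1):
    """Commenter's formula: revenue * (hours of autonomy / some value) * multiplier.

    normalizer=8760 (hours in a year) and multiplier=0.1 are guesses;
    the comment leaves both unspecified.
    """
    return revenue * (autonomous_hours / normalizer) * multiplier

# Example: a firm with $1B revenue running AI agents autonomously half the year.
print(f"Tax owed: ${autonomy_tax(1_000_000_000, 4380):,.0f}")  # $50,000,000
```

Under these placeholder parameters the tax scales linearly with both revenue and autonomous hours, which matches the "easy to calculate" property the commenter wants.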
youtube AI Jobs 2025-10-08T04:0…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       virtue
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_UgzV7UJdFSmd1DWukWl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},{"id":"ytc_Ugxc106Nbi8iLwgGBeV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_Ugy_60bky7M7qAW-j3x4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"indifference"},{"id":"ytc_UgzG6bd267pEwBZL2Q94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},{"id":"ytc_UgxrKPISHuxDeJkexrh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},{"id":"ytc_UgyrIiA0egQg8eibma54AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},{"id":"ytc_UgwGvvVAwNDU--XWHhh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},{"id":"ytc_Ugz14O8ga3VYFuXBUFt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},{"id":"ytc_Ugxzudi04yq8jfezzhJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},{"id":"ytc_UgzhkevzaMJCKoC9hUp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]
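Because the raw response is a JSON array of per-comment records, it can be parsed and tallied directly. A minimal sketch, assuming the field names shown above; the `raw` string here is abbreviated to two records from the full response for brevity.

```python
import json
from collections import Counter

# Two records copied from the raw response above (the full array has ten).
raw = (
    '[{"id":"ytc_UgzV7UJdFSmd1DWukWl4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"outrage"},'
    '{"id":"ytc_Ugy_60bky7M7qAW-j3x4AaABAg","responsibility":"company",'
    '"reasoning":"virtue","policy":"none","emotion":"indifference"}]'
)

records = json.loads(raw)

# Tally each coding dimension across the batch.
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    counts = Counter(r[dim] for r in records)
    print(dim, dict(counts))
```

The same loop works on the full ten-record array; each dimension's `Counter` then gives the distribution of codes for the batch.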