Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The monster of AI is the energy expenditure, its economical impact, the fact that they are spending trillions on data centers that will have global terrible impact on electricity, the shortage of clean energy, water, the shorts of chips (ram has gone up 500% in the last 4 months) and politicians and everyone at the top are so hyped on the power of AI to control people and do mass surveillance no one is stopping to think if they should spend TRILLIONS on something that will cause climate change to go into hyperdrive...we haven't figured out clean energy, we already have economicaly issues, and AI doesn't produce anything, so this isn't something like the industrial revolution which increased productivity to food and supplys in mass, no this isn't even profitable, it a giant ponzi scheme, and we keep investing! The monster is the billionaires pushing for it! The monsters are the trump admin that just did an executive order saying its illegal to regulate AI...these data centers are going to destroy gaming making it unaffordable, they are going to destory our electric grid, people are going to starve and be homeless but no its fine because these assholes in charge are excited for the propaganda and surveillence and the automation so they can devalue jobs. I'm not afraid of AI, I'm afraid of the psychotic coked up billionaires shoving it down our throats. It needs to be banned, or at least heavily regulated.
youtube AI Moral Status 2025-12-15T22:3…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgwGCzQBM6B_4VSO-N14AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwjU3yNkzmWRaJjf9Z4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgykZxNMUbv4jGCt82d4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzuJ-cpOh_FvZ9295p4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw2CJh8oPw9TgFp-Dp4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxElQDcm_NAUNqlM2F4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwoYN1GPqrbFN-uCs54AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgwSSBWsxmZP9XxuGOl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugw0Hm35ZZnDJwLTHX94AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy9iityQ0p0S42Mqut4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"}
]
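A raw response in this shape can be parsed and looked up by comment id. The sketch below assumes only what the response above shows: a JSON array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys. The function and variable names are illustrative, not part of the actual pipeline; the fallback to "unclear" for missing keys mirrors the value the coder itself uses.

```python
import json

# A minimal raw LLM response in the same shape as the one above
# (one entry shown here for brevity).
raw = '''
[
  {"id": "ytc_UgwSSBWsxmZP9XxuGOl4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "outrage"}
]
'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_response: str) -> dict:
    """Parse the raw response and map each comment id to its coding,
    filling any missing dimension with "unclear"."""
    codings = {}
    for entry in json.loads(raw_response):
        codings[entry["id"]] = {
            dim: entry.get(dim, "unclear") for dim in DIMENSIONS
        }
    return codings

coded = index_codings(raw)
print(coded["ytc_UgwSSBWsxmZP9XxuGOl4AaABAg"]["policy"])  # regulate
```

Indexing by id rather than by array position keeps the lookup robust if the model returns the entries in a different order than the comments were submitted.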