Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What surprises me most is that almost no one (especially on mainstream American podcasts) addresses the massive ecological costs of AI—its surging energy use (data center power demand up 72% since 2019), resource extraction, water consumption for cooling, and the growing stream of e-waste—all accelerating climate impacts. Framing AI as a simple “horses to cars” style transition ignores the crucial difference: our planet’s ecological limits. This technosolutionist narrative assumes society can always adapt, but oversimplifying like this risks greenwashing the scale of the problem when AI infrastructure could use up to 4% of the world’s electricity by 2030, with data center emissions potentially tripling in the next 5 years. I would genuinely like to see real scientific rigor and investigative lens when it comes to these environmental costs—not just the economic disruption. Isn’t it essential, especially in public debates about the future, to confront whether AI’s benefits outweigh the risks to planetary health? If “adaptation” depends on infinite energy and resources, isn’t it time we demand clearer oversight, enforceable standards, and a systemic rethink—before climate damage becomes yet another unintended consequence?
Source: youtube · AI Governance · 2025-08-26T15:3… · ♥ 1
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          regulate
Emotion         outrage
Coded at        2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id": "ytc_UgwJdj-t5X00D4XYjHR4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "none",     "emotion": "fear"},
  {"id": "ytc_UgzvM8QbtHoTHwmM5Bd4AaABAg", "responsibility": "ai_itself",   "reasoning": "virtue",           "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgwiPF5sD2k-OclQya54AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugw-75RWbImxxx2sO9F4AaABAg", "responsibility": "user",        "reasoning": "deontological",    "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzdWwlEsq-Ji1ssVjV4AaABAg", "responsibility": "user",        "reasoning": "virtue",           "policy": "none",     "emotion": "resignation"}
]
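Since the raw response is a JSON array covering a whole batch of comments, the coding for one comment has to be matched back by its id. A minimal sketch of that lookup, assuming only Python's standard json module and using the record ids that appear in the response above:

```python
import json

# Raw batched LLM response, copied verbatim from the dump above.
raw = """[
  {"id":"ytc_UgwJdj-t5X00D4XYjHR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzvM8QbtHoTHwmM5Bd4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwiPF5sD2k-OclQya54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw-75RWbImxxx2sO9F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzdWwlEsq-Ji1ssVjV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]"""

records = json.loads(raw)
# Index the batch by comment id so any single coding can be looked up.
by_id = {r["id"]: r for r in records}

# The coding shown in the table above belongs to this id (third record).
coding = by_id["ytc_UgwiPF5sD2k-OclQya54AaABAg"]
print(coding["responsibility"], coding["policy"], coding["emotion"])
# → distributed regulate outrage
```

This is only an illustration of the id-based join; the actual tool may parse and store the batch differently.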