Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You don't need to tax the AI's. As AI and automation accelerate, we're facing a fundamental question: what happens to our economy when machines can do most human jobs better and cheaper? Traditional responses like "retraining" miss the scale of this challenge. When AI can replace marketing managers, accountants, and even software developers in months rather than decades, we need a new economic framework entirely. That's where energy-based economics comes in. The core insight is elegantly simple: energy is the foundation of all economic activity. Everything we produce, transport, or consume requires energy. Unlike dollars that can be printed infinitely, energy follows physical laws - a kilowatt-hour represents the same real value whether it comes from solar panels in Arizona or wind turbines in Texas. In this system, instead of chasing ever-scarcer jobs, we'd use energy units (like kilowatt-hours) as currency. Your economic security would come from your community's energy production capacity, not from competing for employment that machines can do better. This isn't utopian theory - it's a practical blueprint from someone who's spent 40 years building the automation systems that are transforming our economy. Kelly Balthrop combines deep technical knowledge with real-world economics to show how we can transition to abundance rather than fighting over scarcity. The book "Energy-Based Economics" launches on Amazon in just a few weeks. It explores how ordinary communities can start building this foundation today, using existing technology and proven cooperative models.
Source: youtube · AI Jobs · 2025-09-09T02:3…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[ {"id":"ytc_UgzOku4LwvWjC4vnatp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgzBIn594z-Vtx36nkB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwYna2wvWvqihMi-bl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"}, {"id":"ytc_Ugzg-SqwFnU-6VaDKbl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_Ugw2ak0wbgr12CLDLtV4AaABAg","responsibility":"government","reasoning":"mixed","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgycMqPA4zZdqlv8xJp4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"}, {"id":"ytc_UgxtXbYlvLjgTEweftd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzVKR5Yw_m_fpJaRQh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"sadness"}, {"id":"ytc_UgyIXUkH9b7CqF75tJB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgxF7CPeDy6TGuvVWah4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"} ]