Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
"Scott Galloway has highlighted a massive, unsustainable "gold rush" in data center construction driven by AI, which he warns is on the verge of a "narrative shock" or bubble burst due to severe infrastructure, power, and economic constraints.

Key Takeaways on Data Centers from Galloway:
The Power Crisis: Galloway notes that AI's energy requirements are doubling every 100 days. He estimates that, if left unchecked, the AI boom will require 20% of current U.S. electrical capacity—equivalent to 250 nuclear power plants.
Grid and Infrastructure Constraints: It currently takes five to eight years to connect a new data center to the grid, creating a massive bottleneck.
"Jazz Hands" Economics: Galloway describes the rush of capital and promises, particularly involving OpenAI and Oracle, as potentially built on "jazz hands" (all sizzle, no steak). He warns that the demand for data centers is overestimated, leading to a "bubble" in AI infrastructure investments.
Limited Job Creation: He argues that data centers do not create many jobs, comparing the average headcount to that of just two "bar & grill" restaurants.
The "Big Lie": Galloway suggests that the rapid, massive expansion promised by AI firms is often a "big lie" because the necessary physical infrastructure cannot be built fast enough.
Environmental Impact: Data centers are placing immense strain on local resources, including water for cooling, and driving up electricity costs for consumers.

Predictions for 2026:
Galloway sees the "data center gold rush" stalling as the reality of power constraints sets in. He warns of a "nowhere to hide" scenario in the markets if the AI story, specifically regarding companies like Nvidia, fails to deliver on its massive valuation. He advises that, in the context of this boom, Amazon is the best-positioned tech stock for 2026 due to its strength in logistics and infrastructure."
youtube 2026-01-31T02:5…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugy219rC-og7pinCmZJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxEqinG7eun4in5hBJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxBYo-8OWrc5juPLBZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxqKdt1ouO75JPXazd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzzKUI1_9HysB1VR214AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgxVVFpVLEI00kb3esd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwBFq_y9umEH8Y7sQB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx6Rc7WHtuzsbWWj5p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxjIvKNn9DKZCRbDGV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw01LdLUruHgDA16rV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
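A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is illustrative, not the tool's actual implementation: the allowed category sets are inferred only from the values visible in this response, and the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the response shown
# above (assumption -- the actual codebook may define more categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "resignation",
                "approval", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings.

    A coding is kept if it is a JSON object with an "id" field and a
    recognized value for every dimension; malformed records are dropped
    rather than raising, so one bad record does not poison the batch.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical single-record response, for illustration only.
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"fear"}]')
print(len(parse_codings(raw)))  # → 1
```

Records with an unrecognized category (or a missing dimension) are silently discarded here; a production pipeline would more likely log them, or fall back to "unclear" for each unreadable dimension, as the table above suggests this tool does.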