Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Many opinion pieces argue that the hype surrounding AI, particularly its promised productivity gains, is largely a delusion driven by self-deception and exaggerated expectations: a fresh round of market hype from high-tech moguls. The commenter cites a METR study showing that programmers using AI tools were 19% slower, despite believing they were 20% faster, suggesting people overestimate AI's benefits out of a desire to believe in its success and in the moguls' hype. The comment also points to other evidence: an IBM survey indicating that 75% of AI projects fail to deliver ROI, and studies from Carnegie Mellon and Salesforce showing AI agents fail at 65-70% of tasks. Gartner likewise notes that current AI models lack the maturity needed for complex business goals. The commenter highlights real-world examples, such as Klarna rehiring humans after replacing them with AI, and attributes some AI-related layoffs to cost-cutting PR rather than technological breakthroughs. The comment also critiques the cycle of hype fueled by benchmark-beating models like xAI's Grok 4 (comparisons against humans being, in the commenter's words, "B.S."), which still suffer from issues like hallucinations and reasoning failures. The commenter concludes that such models have no feelings, no intuition, no consciousness, no reasoning, and no ability to predict the future, only "B.S." thoughts parroted from books via search.
YouTube | AI Harm Incident | 2025-07-24T14:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  company
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgwgH4WZxT3jZyN1eFt4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",          "emotion": "resignation"},
  {"id": "ytc_UgySxnlQY5nMmThEFwd4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "unclear",       "emotion": "fear"},
  {"id": "ytc_Ugzt6efwIq1c_yO0G4J4AaABAg", "responsibility": "ai_itself",   "reasoning": "unclear",          "policy": "unclear",       "emotion": "mixed"},
  {"id": "ytc_UgwbZNxlul3xltS-cJN4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none",          "emotion": "fear"},
  {"id": "ytc_Ugxmah8pFjN5Ymins-x4AaABAg", "responsibility": "user",        "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgzHjO_3Fa7H1_IJq6d4AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "regulate",      "emotion": "mixed"},
  {"id": "ytc_UgxH55bEZO889l9e3b14AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgxRSPVGc5soIdxb2sN4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgyYN1ZwYxats9DCkbp4AaABAg", "responsibility": "government",  "reasoning": "consequentialist", "policy": "regulate",      "emotion": "outrage"},
  {"id": "ytc_UgxN6B47Zh8iKmVHoaF4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "liability",     "emotion": "outrage"}
]
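A raw response like the one above can be consumed programmatically. The following is a minimal sketch, not the project's actual pipeline code: it assumes the LLM returns a JSON array whose objects carry exactly the five fields seen in this log (`id`, `responsibility`, `reasoning`, `policy`, `emotion`), parses it, drops malformed entries, and tallies one dimension. The two abbreviated sample rows are taken from the response above.

```python
import json
from collections import Counter

# Abbreviated sample of a raw LLM coding response (two rows from the log above).
raw = '''[
 {"id": "ytc_UgwgH4WZxT3jZyN1eFt4AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
 {"id": "ytc_UgxH55bEZO889l9e3b14AaABAg", "responsibility": "company",
  "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

# The five coding dimensions every entry must carry (assumed schema).
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw LLM coding response, keeping only well-formed entries."""
    rows = json.loads(text)
    return [r for r in rows if REQUIRED <= r.keys()]

codings = parse_codings(raw)
# Tally one dimension, e.g. the emotion distribution across coded comments.
emotions = Counter(r["emotion"] for r in codings)
print(emotions["indifference"])  # -> 1 in this two-row sample
```

In practice a real pipeline would also validate each field against its allowed code list (e.g. `emotion` in `{"fear", "outrage", "indifference", ...}`) and log rejected rows rather than silently dropping them.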