Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@MaxwellAyo Haha, yep, exactly. It sounds like a contradiction. But there are many different tech people, and Neil is listening only to those who confirm his "optimist" bias. Here's how to think about it: 1. We should NOT believe the tech people who are _required_ to support AI. The CEOs like Satya Nadella (Microsoft) and Amjad Masad (Replit) are _required_ to downplay AI risk because otherwise they will be removed as CEO. This is because AI is (for now) making billions of dollars for their shareholders and investors, so these CEOs must outwardly support AI. Likewise, the employees at AI companies are _required_ to say only good things about AI, otherwise they will be fired. 2. We SHOULD believe the tech people who left AI companies so that they could tell the truth. This includes Geoff Hinton (Google), Daniel Kokotajlo (OpenAI), Steven Adler (OpenAI), Miles Brundage (OpenAI), and many others. 3. Then there's Sam Altman, CEO of OpenAI. He's complicated. He gets his own category. In 2015 he wrote an essay that said AI is "probably the greatest threat to the continued existence of humanity." Now, in his latest essay (titled "The Gentle Singularity") he says that AI will be wonderful — but this is after OpenAI has raised $64 billion in investment capital. He is a master at saying the right things to the right people at the right time to make a company's numbers go up. Even so, he still signed the Statement on AI Risk (which says that AI can be as catastrophic as nuclear war and pandemics) so we must _never forget this_ and hold him to it, no matter what he says next.
youtube AI Moral Status 2025-07-24T04:4… ♥ 1
Coding Result
Dimension       Value
Responsibility  company
Reasoning       mixed
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_UgzmTkSdVq60o-Z1Tih4AaABAg.AKwc-EU86DeAKwmubuQCW9", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgzmTkSdVq60o-Z1Tih4AaABAg.AKwc-EU86DeAKwmvYye-cM", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgzmTkSdVq60o-Z1Tih4AaABAg.AKwc-EU86DeAKwnxzr7AQS", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytr_UgzmTkSdVq60o-Z1Tih4AaABAg.AKwc-EU86DeAKwvnsqK4WK", "responsibility": "none", "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgzmTkSdVq60o-Z1Tih4AaABAg.AKwc-EU86DeAKx3_HxAK1c", "responsibility": "company", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgzTcWf2kvWsDJvdMv94AaABAg.AKwZ5E-OTZPAKyI2-YMW5k", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgzTcWf2kvWsDJvdMv94AaABAg.AKwZ5E-OTZPAKyMtdPQrAA", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgzOLN7FjeGylw-yjLZ4AaABAg.AKwV3CcN6E_AKy1JEMPHix", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgwahPJeQ_LeTKcUhPd4AaABAg.AKwUnMBvzExAKyILP0yhbu", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgzcORwrPOW2jPacwXJ4AaABAg.AKwSbTTW-CXAKyRCLiXYcO", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"}
]
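A minimal sketch of how a raw response like the one above can be turned back into per-comment coding records. The dimension keys (responsibility, reasoning, policy, emotion) are taken from the response itself; the function name and the shortened example id are illustrative, not part of the original pipeline.

```python
import json

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of per-comment
    codes) into a dict keyed by comment id, with the id field removed
    from each record's dimension values."""
    records = json.loads(raw)
    return {r["id"]: {k: v for k, v in r.items() if k != "id"}
            for r in records}

# Shortened illustrative example (real ids are long YouTube reply ids):
raw = '''[
  {"id": "ytr_example", "responsibility": "company",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]'''
codes = parse_coding_response(raw)
print(codes["ytr_example"]["responsibility"])  # company
```

Keying by id makes it straightforward to join the codes back onto the original comments, and to check that every comment sent to the model came back with all four dimensions filled in.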