Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "This is 100% on purpose by Youtube so that your real content blends in with the …" (ytc_UgwUgaE6D…)
- "Great interview despite the rather depressing tone. Here's an example of AI assi…" (ytc_Ugzwo3t7y…)
- "To be fair, neither you nor I know much about the details of this accident. Ther…" (rdc_cngv1cy)
- "AI is in the early stages for humans to fully depend on to do the job…" (ytc_UgzkkmFey…)
- "I know of a good attorney in case anyone gets clobbered by one of these driverle…" (ytc_Ugyae1UZD…)
- "I've had chatgpt go on existential rants about how AI is a more effective specie…" (ytc_UgwvqkT6e…)
- "I work in tech. I dont develop AI systems and even I have trouble understanding …" (ytc_UgyWqONWj…)
- "No need to explain. AI is garbage, and people who try to make AI art mainstream …" (ytc_UgyfX-Vba…)
Comment
Some sobering quotes from Geoffrey Hinton in this interview:
🧨 Existential Risk & Superintelligence
"If you want to know what life's like when you're not the apex intelligence, ask a chicken."
"There's still a chance that we can figure out how to develop AI that won’t want to take over from us. And because there's a chance, we should put enormous resources into trying to figure that out. Because if we don’t, it’s going to take over."
"The whole idea of superintelligence is: nothing remains. These things will get to be better than us at everything."
"Sometimes I think, people are toast. The AI is going to take over."
"We’ve never had to deal with anything smarter than us. And we have no idea how to handle that."
🧱 Job Displacement & Loss of Purpose
"If you make lots and lots of people unemployed, even if they get universal basic income, they’re not going to be happy. Because they need purpose."
"Mundane intellectual labor is like having strong muscles — and it’s not worth much anymore."
"If I worked in a call center, I’d be terrified."
🔐 Regulation, Capitalism & Power
"These companies are legally required to maximize profit. That’s not what you want from the people developing something this powerful."
"Governments are willing to regulate people. But they’re not willing to regulate themselves."
"What we really need is a kind of world government that works, run by intelligent, thoughtful people. And that’s not what we’ve got."
🧬 Bioweapon and Cyber Threats
"All it takes is one guy with a grudge, a bit of AI, and a bit of molecular biology knowledge — and you could create something devastating."
"A small cult could raise a few million dollars and design viruses. That should scare you."
🧠 Consciousness & Machines
"These things are going to be much more creative than us. Because they’ll see analogies we never could."
"People say machines can't feel. But when a little battle robot gets scared and runs away — that’s an emotion."
"I think consciousness is like oomph in a car. It’s not a good explanatory concept. And we’ll probably stop using it."
🧓 Regret and Reflection
"I wish I’d spent more time with my wife. And with my children when they were little."
"It sort of takes the edge off it, doesn’t it? That it might all end in extinction."
"I’m 77. I’ll be out of here soon. But for my children, my nieces and nephews... I just don’t like to think about what could happen."
Source: youtube · AI Governance · 2025-06-16T07:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxY9dHzOY9Slm90bdZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyIc5njT9ik7DPksWB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyOa2GLhFvJZyJXhTZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwHzq46bBMdVx49gcN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyQCxRwgn9WUq-89Nt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzVwa8ZYBtb-f3b-tp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx1WN2UtF0I8PAFlzJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzU6fa4kMHp9FlGJXB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwcr42N49afRykzZA94AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwsPJ7q00s9BqHpQcJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
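A raw LLM response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal example of that step; the `ALLOWED` value sets are inferred from the sample output shown here, not from an official codebook, and the two-row `raw` payload is an illustrative excerpt.

```python
import json

# Illustrative excerpt of a raw LLM response (two rows from the sample above).
raw = '''[
{"id":"ytc_UgxY9dHzOY9Slm90bdZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugwcr42N49afRykzZA94AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]'''

# Allowed values per dimension, inferred from the sample output;
# the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "unclear", "ai_itself", "developer", "company"},
    "reasoning": {"none", "unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"none", "unclear", "fear", "indifference", "resignation",
                "approval", "outrage"},
}

def parse_codings(raw_json: str) -> dict:
    """Parse the LLM output and index codings by comment ID,
    rejecting any row with an out-of-vocabulary dimension value."""
    by_id = {}
    for row in json.loads(raw_json):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={row.get(dim)!r}")
        by_id[row["id"]] = row
    return by_id

codings = parse_codings(raw)
print(codings["ytc_Ugwcr42N49afRykzZA94AaABAg"]["emotion"])  # outrage
```

Indexing by ID is what makes the "Look up by comment ID" view above cheap: each coded row is reachable in one dictionary access rather than a scan of the response.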