Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click one to inspect):

- "For AI there is literally no effort whatever involved. For the human, the amount…" (ytc_UgwdGmmyr…)
- "You used AI. You are part of the problem. We should completely abolish it, or el…" (ytc_UgwT9iH5R…)
- "What if the false pretense had something to do with the support they'd be given …" (rdc_cjooja7)
- "… I told people for years not to go to that stuff and to not listen to that man …" (ytc_Ugypq_9rp…)
- "AI is a huge power grab, and worst yet, it's a lie. A program, programmed by pr…" (ytc_Ugyk0BSkY…)
- "I've had imagery created in MidJourney with faint watermarks and corner signatur…" (ytc_UgxYy4y7y…)
- "Low brow conversation - Neil too optimistic and not realizing that AI can do so …" (ytc_Ugxl4NJ9t…)
- "An AI bro complaining about his work being stolen is like a burglar complaining …" (ytc_UgwAdOOcU…)
Comment
46:04 Umm, wtf not?
There’s this fear of being labeled a tin foil hat guy in academia IMO. They’re also too old IMO.
They should add a few members to their panel consisting of AI researchers who left frontier labs for these concerns. Maybe add one who each specialized in a specific AI risk- Jan Leike, Daniel Kokotajlo, etc. These guys would carry more weight than say Eliezer to the general public- just optically.
Great interview, and I appreciate his open mindedness and risk acceptance, but damn I left this interview realizing how little the doomsday clock scientists know about AI.
youtube · AI Governance · 2026-02-26T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxPkIgsNXF30aWQnJt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyQgFB3FIL2RHGZ4jV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"curiosity"},
{"id":"ytc_Ugycd9SlpC2nIiSmjx54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz2BsgSvA7qkOT81a54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx20P5Xzj0BvwMxSqF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyZDYgTm0gbtBFBLsl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx8NvYRWnRxewpYUPB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzgZVrn7axdpU5BL0d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwRDdsg5tnsuygonbB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyBV9QmRgTXrm2zEcl4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"unclear","emotion":"mixed"}
]
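The raw response above is a JSON array with one object per comment, each coded along the four dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step, assuming the response parses as plain JSON (the two sample rows are taken verbatim from the response above):

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# These two rows are copied from the response shown above.
raw_response = """[
  {"id": "ytc_UgwRDdsg5tnsuygonbB4AaABAg",
   "responsibility": "none", "reasoning": "mixed",
   "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzgZVrn7axdpU5BL0d4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]"""

# Index the codings by comment ID so a single lookup
# retrieves all four coded dimensions for that comment.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgwRDdsg5tnsuygonbB4AaABAg"]
print(coding["emotion"])    # -> outrage
print(coding["reasoning"])  # -> mixed
```

The dict comprehension assumes every row carries a unique `id`; a duplicated ID would silently keep only the last row, so a production coder would want to check for collisions.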