Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- @Tijaxtolan what is life? How do you define it? Its artificial intelligence. If … (`ytr_Ugy6RxlkE…`)
- Well i can live with this only if ai is not going to become terminator 💀… (`ytc_Ugxxi7Zps…`)
- Yeah, he’s speaking just to hear himself talk. Ironically, ChatGPT would have gi… (`ytc_Ugxc02bXW…`)
- So for everyone here, I'm really curious - Let's say I wanted to make an AI for… (`ytc_UgyXy4S1A…`)
- ChatGPT and similar programs should be illegal except for select purposes (eg sc… (`ytc_UgxyeO4Dv…`)
- In the US the bottom 30% sum up to $0. Having a networth of $0 puts you around t… (`rdc_d7ktz9z`)
- The moderator posed the debate question as "will AI make work obsolete", but it… (`ytc_UgxdaAYUT…`)
- Boss: Enough, you're fired. AI, finish their tasks [AI: `WHY` `DON'T` `YOU` `… (`rdc_ofihknj`)
Comment
As a big AI skeptic who works in big tech, this conversation left me really wanting more substance... Most of what this guy said was just some version "well I'VE been scared of superintelligence for 22 years, so listen to me!" without really making many real solid arguments as to why. The segment with him "roleplaying" as natural selection was baffling, I'm glad Ezra kept pushing back. Haven't read his book so maybe there's more solid arguments and he's just not a great verbal communicator, but he did not really come prepped for this conversation imo
youtube · AI Governance · 2025-10-15T11:3… · ♥ 68
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwgVNJgSLMJDLUBU8R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgynkSjGpEQy8-Kc8zh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-4krbJQUYK77HCYJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxVP508yV27MiLU79V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugya6dVxuaLTosbbcnN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyx5P5xveaUApfBTcp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxSFDbALHW82C1XitF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwj_ej0JjPm54ddYm14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugxcc7er26T-uw7x5YJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxYuB6uVGN6VsEjMuF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
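A raw response like the one above can be parsed and sanity-checked before its codes are stored. The sketch below is a minimal validator, assuming the four dimensions shown in the Coding Result table; the allowed label sets are inferred only from the values visible in this sample and are likely incomplete relative to the real codebook.

```python
import json

# Label sets per dimension, inferred from the sample response above.
# Assumption: the actual codebook may define additional labels.
ALLOWED = {
    "responsibility": {"none", "unclear", "mixed", "ai_itself", "developer"},
    "reasoning": {"none", "unclear", "mixed", "consequentialist"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"none", "unclear", "mixed", "fear", "indifference",
                "approval", "resignation", "outrage"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and verify every coded comment row."""
    rows = json.loads(raw)  # raises ValueError on malformed JSON
    for row in rows:
        if not row.get("id"):
            raise ValueError("row is missing a comment id")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
    return rows

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')
print(len(validate_response(raw)))  # 1
```

Validating eagerly like this surfaces truncated or hallucinated labels at ingest time, rather than letting them silently skew the coded dataset.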