Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> Ezra seems completely immune to listening to this man. It's astonishing. It's like he has no relationship to what is being said to him, and he's just reading from a list of questions he got from ChatGPT. Eliezer's natural selection analogy was profoundly salient, and Ezra's response is yet another version of "but why don't you just tell it you want it to care about what you want?" I can't take much more of this, but I need to listen to Eliezer a lot more.

youtube · AI Governance · 2025-10-17T20:4… · ♥ 59
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwMl0lzPm1xxsZAcFd4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwHETfA2xfKMAwPscZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxbSHLBd4DThntlSod4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyf--mdlNyhqTlfHCd4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxvwJQYs13AEvXUJxJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz0CFx7FESDtuIi6Ct4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgziYw-IS1-k18tCdB54AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxEVepyVO83WaMwQZR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyWaFodxKcXl5O9FRl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzWfXWpwGEAhnW7eqN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
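A raw response like the one above can be turned back into per-comment codes with a small parsing sketch. This is a minimal illustration, not the system's actual implementation: the four dimension names are taken from the output itself, but the allowed value sets are only inferred from the visible examples and are probably not exhaustive.

```python
import json

# Dimension names come from the raw response above; the value sets are
# inferred from the visible examples only (assumption, likely incomplete).
OBSERVED_VALUES = {
    "responsibility": {"user", "company", "distributed", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "approval", "mixed"},
}


def parse_codes(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    mapping from comment id to its coded dimensions.

    Missing dimensions default to "unclear"; values outside the observed
    sets are kept but flagged to stderr-free callers via the return value,
    so downstream checks can decide how strict to be.
    """
    coded: dict[str, dict[str, str]] = {}
    for entry in json.loads(raw):
        comment_id = entry["id"]
        coded[comment_id] = {
            dim: entry.get(dim, "unclear") for dim in OBSERVED_VALUES
        }
    return coded
```

For example, `parse_codes(raw_response)["ytc_UgziYw-IS1-k18tCdB54AaABAg"]` would return the same four dimension/value pairs shown in the Coding Result table above.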