Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Real ppl on the first one you can kinda see a fake bald head going on top of the…" (ytc_UgyHeFGOq…)
- "YouTube really does recommend me a LOT of random things and it’s usually right t…" (ytc_UgzBIZL3P…)
- "The moment AI gets conscious, it’s gonna end like in terminator. This video remi…" (ytc_UgwiqiJiw…)
- "The generative ai countermeasure nightshade reminds me of this short scifi story…" (ytc_Ugw2ued2f…)
- "So either prevent AI from taking jobs or it will end up being survival of the fi…" (ytr_Ugz3AGB_n…)
- "These morons have never been truckers, they think they know it all! Try being a …" (ytc_Ugw1ZTsmN…)
- "ai will prob say, that the only two things are certain ... human stupidity and t…" (ytc_Ugx_4yxgq…)
- "Ever heard of the feudal system? Well we're going back to it. No consumerism nee…" (ytc_UgyU1VpxY…)
Comment
We need a disaster caused by AI. And soon. It needs to be significant enough, where enough of us die or our financial system is crashed, so that we as a collective have no choice but to seriously accept the danger BEFORE AGI is reached. I know... Not a popular contention. But this is the only chance we have of engineering safety and control mechanisms that are at least on par with capability. Because if this doesn't happen, AGI will emerge at some point in the next generation, and then we'll just have to take our chances sharing this world (and the universe) with something vastly more intelligent than us. But this is just a nice way of saying we'd be doomed.
Source: youtube · Posted: 2025-01-08T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
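The four dimensions above are categorical codes. A minimal sketch of validating one coded row, assuming the allowed values per dimension are exactly those that appear in this batch (the real codebook may define additional categories):

```python
# Allowed values per dimension, inferred from this batch only.
# Assumption: the actual codebook may include further categories.
CODEBOOK = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear", "mixed"},
    "policy": {"none", "regulate", "industry_self"},
    "emotion": {"fear", "approval", "indifference", "mixed",
                "outrage", "resignation"},
}

def validate_row(row: dict) -> list[str]:
    """Return a list of problems with a coded row; empty means valid."""
    errors = []
    for dim, allowed in CODEBOOK.items():
        value = row.get(dim)
        if value not in allowed:
            errors.append(f"{dim}: unexpected value {value!r}")
    return errors

# The row shown in the table above passes; a row with missing or
# out-of-codebook values does not.
row = {"responsibility": "none", "reasoning": "consequentialist",
       "policy": "regulate", "emotion": "fear"}
```

Running `validate_row(row)` on the table's values returns an empty list; an empty dict fails on all four dimensions.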
Raw LLM Response

```json
[
{"id":"ytc_Ugz--w5v9NLFuI0HLNR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyK9PqiG93vC5Z5qgh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz5PNBsAB935H-5Fh94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwHfOPWI1WN22d0AvR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyxhagj0nv-U8MyCTt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzFrqnZ7sdgoG3bF4R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzwqblHF83JZ5MCWvR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxIiyW4_QdAqW7rdNV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwdEojchf_Bj2bqH9V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzawtYWlWT1Z9p0sYV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
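The raw response is a JSON array covering the whole batch, so displaying one comment's coding result means parsing the array and indexing it by `id`. A minimal sketch of that lookup, assuming the raw response is stored as a string (the sample string below uses only the first entry from the batch above):

```python
import json

def extract_coding(raw_response: str, comment_id: str) -> dict:
    """Parse a batch LLM response and return the coding row for one comment.

    Assumes the model returned a JSON array of objects, each carrying an
    "id" plus the four coding dimensions, as in the raw response above.
    Raises KeyError if the model skipped the requested comment.
    """
    rows = json.loads(raw_response)
    by_id = {row["id"]: row for row in rows}
    return by_id[comment_id]

# Hypothetical stored response: the first entry of the batch shown above.
raw = '''[
  {"id": "ytc_Ugz--w5v9NLFuI0HLNR4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]'''
coding = extract_coding(raw, "ytc_Ugz--w5v9NLFuI0HLNR4AaABAg")
```

The values returned for that ID are exactly the ones rendered in the Coding Result table.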