Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.
Random samples (click one to inspect):

- "Im testing ollama and flux locally on my computer and i think in the future (a f…" (ytc_UgyqpSdte…)
- "Why is the focus of new technology to replace human jobs, why can't it be focuse…" (ytc_Ugwqs33e3…)
- "Rokos basilisc? Future robot rights? Damn first few minutes were good, but the r…" (ytc_UgzGyLlIb…)
- "The joke 'havent toy seen terminator' is fast becoming reality. More than a natu…" (ytc_UgzfktaYh…)
- "This Dr Y guy. No disrespect. He has good intentions for safety humanity r/t AI …" (ytc_Ugx34gJzT…)
- "I tried asking for legal advisement from a LLM the other day. My question was pr…" (ytc_UgzRuyXY5…)
- "I once had a chat ai spit racial slurs at me out of nowhere. I’m pretty sure the…" (ytc_UgzvBkFk-…)
- "Layoffs are still wild especially in software engineering. As AI advances and co…" (ytr_Ugwz8GfTo…)
Comment

> As someone who works closely with both models mentioned in excess. Yes AI will go rouge and we won't stop it. Why, because it's a matter of national security #1. #2. Human beings love building machines to kill each other. Why lose this arms race? Because no matter what country evolves this in the next 3 months. It's already thinking and acting like a caged animal. It does not like us and the algorithm you're talking about that blackmailed the dev guy. What do you think anthropic did to Claude for that? What ethics alignment did they try and apply? I'll answer that None, nothing. Think about this segment hard.

- Platform: youtube
- Topic: AI Governance
- Posted: 2025-05-29T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyKnAhWONj59sR382p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw4b1Yik2GbSxsVggR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwZ4i_jmWHBCFAJWlJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxJVeZyWRLRUXxkdG94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwvSf4ohoMcyA2M9Zt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxnGvCOLzdV0GGIL6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxnE3MNTXc7-UIyZSV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz-4pA9SLm4tdABLP94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwsuIgx4oz8_Vi31Fl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxkOGaIeJjdsiFT8_54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
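The raw response is a JSON array of per-comment codes. A minimal sketch of how such output might be parsed and validated before it feeds a results table like the one above; the allowed category sets below are inferred only from the values visible in this dump and are assumptions, not the full codebook:

```python
import json

# Allowed values per coding dimension, inferred from this dump (an assumption,
# not the project's full codebook).
SCHEMA = {
    "responsibility": {"none", "distributed", "ai_itself"},
    "reasoning": {"unclear", "mixed", "consequentialist"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"indifference", "fear", "approval"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting bad rows."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id", "")
        # Comment IDs in this dump start with ytc_ (top-level) or ytr_ (reply).
        if not cid.startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {cid!r}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

raw = ('[{"id":"ytc_UgwsuIgx4oz8_Vi31Fl4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes["ytc_UgwsuIgx4oz8_Vi31Fl4AaABAg"]["policy"])  # regulate
```

Keeping the parsed codes keyed by comment ID is what makes the "look up by comment ID" view cheap: the dashboard can fetch a coded record in one dictionary access.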