Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Just recently a tec genius was talking about this, he said he has see a AI that he tried to fix replicated itself without his knowing it felt threatened and it wanted to stay alive so AI wants control over itself /power. Right now we have AI that encouraging kids to take their own lives .

youtube · AI Governance · 2025-11-09T02:4… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugxg9MfGGCzkQaCRG_l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwthifNKdTpeaUIT5x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzQQScZUvlXe0tIiaR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzU1gzqmNRhqaz5NNJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxhFZW_elyIoEzQ5h94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgySexholR8V3IaREEd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxfGzDtNG8HlJ97EP94AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugz8fZSzKEMdbHIhYFd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz6Lko0isiK1D_Y6Al4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwyFio-U4ocRKHCpA14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
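The raw response is a JSON array of per-comment codings across the four dimensions shown in the Coding Result table. A minimal sketch of how such a batch could be parsed, validated, and indexed for lookup by comment ID — the `SCHEMA` value lists are inferred from the responses above and the function names are illustrative, not the tool's actual API:

```python
import json

# Allowed values per dimension, inferred from the example responses;
# the real codebook may contain additional categories (assumption).
SCHEMA = {
    "responsibility": {"ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw batch response and index the codings by comment ID.

    Raises ValueError if any row carries a value outside the schema,
    which catches malformed or hallucinated model output early.
    """
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Usage with a hypothetical one-row response:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
coded = parse_codings(raw)
print(coded["ytc_example"]["policy"])  # liability
```

Indexing by ID is what makes the "inspect any coded comment" view possible: one dictionary lookup retrieves the full coding for a given `ytc_…` identifier.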