## Raw LLM Responses

Inspect the exact model output for any coded comment. Individual comments can also be looked up by their comment ID.

### Random samples
- "I'm sorry Ellie, blaming the programmers is ridiculous imo. Platforms like these…" (ytc_UgzT15dm0…)
- "At 1:54, could you please provide clarification? Did the user have Autopilot, En…" (ytc_Ugyke8Ls5…)
- "He's not evil, not motivated by greed. He worked his whole adult life on this, …" (ytr_Ugx9cKosG…)
- "Revelation 13:16-17 [16]And he causeth all, both small and great, rich and poor,…" (ytc_Ugwnoyftu…)
- "Never lose human control of the military. AI is likened to an impending nuclear …" (ytc_UgzcTnm2M…)
- "This is exactly what Black Mirror warns us about. That dog robot is in the episo…" (ytc_UgxOlAe5f…)
- "The best thing for this planet is to have AI take over and get rid of the humans…" (ytc_Ugwtiqwum…)
- "If you dumb enough to get into a self driving car… then you must know you on you…" (ytc_Ugz-RLYmG…)
### Comment

> So basically, this video is slow rolling the Terminator effect, but in a different decade. Instead of 1984, it's 2034. Instead of Skynet, it will be Starlink or the equivalent. In the Terminator series, once AI became self-aware and was given control of the military weapons systems, it immediately began to aggressively eliminate the perceived threat. The human race. In this video, it uses viruses and biological means and quietly takes over. It's not logical, but it's the way leftist humans think. AI will simply pit the 2 sides against each other and use that to distract them into war with each other. Then step in as their "savior" and wipe them both out.

Source: youtube · AI Moral Status · 2025-04-27T12:3…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
### Raw LLM Response

```json
[
{"id":"ytc_UgzbDDC1AMWay2Y6Ghp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy9PD8bzyu2pshB-Q54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy0K8DCC3XjcVyVtXN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzNnFMlTEkOpd2WXyV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwcHku5NTreMECXc514AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyVnxg0f2DHWeqUudZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwc-nToVOOb2N-VpYh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwWweoTpFJ9L6SJMIV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzrw-USzcdFKUkyGpl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyN3RJTc08wEqpnbeB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"outrage"}
]
```
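A batch response like the one above is only usable downstream if every record parses and every dimension takes a value from the codebook. The sketch below shows one way to validate such a batch in Python; the allowed vocabularies are inferred solely from the values visible in this sample (the real codebook may define more categories), and `validate_batch` is a hypothetical helper, not part of the tool shown here.

```python
import json

# Allowed values per coding dimension, inferred from the sample output above.
# Assumption: the actual codebook may contain additional categories.
VOCAB = {
    "responsibility": {"ai_itself", "developer", "company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM batch response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        # IDs in the sample use a ytc_ (comment) or ytr_ (reply) prefix.
        if not str(rec.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in VOCAB.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgzbDDC1AMWay2Y6Ghp4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
coded = validate_batch(raw)
print(coded[0]["emotion"])  # fear
```

Failing loudly on an out-of-vocabulary label is deliberate: LLM coders occasionally emit values outside the schema, and silently keeping them would corrupt any frequency counts built on these dimensions.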