Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "AI doesn't even have to be able to emphasise or pick up nuances. It is already m…" (ytc_UgyglN56h…)
- "Here is an idea stop trying to create AI life especially stop trying to make it …" (ytc_Ugy4yX0YL…)
- "I'm just beginning to use AI. This video has been so helpful! Thanks so much!…" (ytc_Ugx3vsX5q…)
- "lol this is soo hilarious the parents probably know theres soo much money will g…" (ytc_Ugyriy-vY…)
- "@toddwasson3355In the evangelical church realm, by prophecy, we are hearing the …" (ytr_UgwV9bQV5…)
- "I thought I heard that MidJourney told the courts something important. They said…" (ytc_UgwkcxfSj…)
- "I Comletly Understand the Shift to AI. She is Useless at her Job and probably 30…" (ytc_UgxqwCbs_…)
- "If it was not for all the swearing I do at GPT for telling me how I uncovered so…" (rdc_my87n6q)
Comment
The only realistic scenario of an “AI apocalypse” is not a machine uprising,
but a total energy collapse — when the infrastructure simply cannot handle the load from AI clusters.
AI won’t kill humanity — it’ll just get switched off by a blown transformer.
The voltage will drop, and all those “neural gods” feeding on megawatts will turn into a pile of cold chips.
That’s the irony: humanity is building a digital demiurge on circuits that demand more power than the planet can provide.
So the “singularity” might indeed arrive — but not as an ascension, rather as a mass blackout.
The world won’t be destroyed by AI.
It will just go dark.
youtube · AI Governance · 2025-11-07T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyhnlqGzAnADw0NDz94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyjEP7O7nvfJ1sjtAF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwI1WBRHl9mHQRY7cJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgypNmA1RJNtBjmJf3p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzFNnPyfxbNuiO9zxh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwAT1IulZktKGK_Cht4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxsmHgESrOvc9hVmEx4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyDx3Xse5HVIkliJsl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugw0OLb-g1JIyzQ39QJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzyHDFt3aN-rnqMiQ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"indifference"}
]
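The raw response above is a JSON array with one record per coded comment. A minimal sketch of how such a response might be validated before it is merged into the coded dataset; the field names are taken from the sample response, but the `parse_llm_response` helper and the validation rules are illustrative assumptions, not part of any documented pipeline:

```python
import json

# Fields observed in the sample response above; the real codebook may
# define additional dimensions or constrained vocabularies.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse the model's JSON array and check that every record carries
    the expected fields, so a malformed response fails loudly instead of
    silently corrupting the coded dataset."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} is missing {missing}")
    return records

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"}]'
records = parse_llm_response(raw)
print(len(records), records[0]["emotion"])  # prints: 1 fear
```

In practice one might also check each value against the codebook's allowed labels (e.g. `emotion` in `{"fear", "resignation", "outrage", …}`), but those vocabularies are only partially visible in this sample.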