Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- rdc_kiga694: "Nope. Labor is an innate gift virtually all of us posses. AI and automation in g…"
- ytc_Ugzlv9VoC…: "Remember, our future doctors are cheating on their tests using AI right now. So …"
- ytc_UgxhlE6OF…: "The underlying motive is always acquisition and hoarding of power and resources,…"
- ytc_UgxxHx6pF…: "Dear Prime Minister, The United Kingdom is confronting a convergence of challen…"
- ytc_UgySGSGTS…: "Just when it’s being made illegal to sleep on the streets or in your car and mak…"
- ytc_UgxN4IDYg…: "chatGPT is EXTREMELY LEFTIST AND PRO ISLAM BIASSED. I've been experiencing this …"
- ytc_UgwXGtExN…: "I don't trust automation. At least let a 2nd person as emergency or failure of …"
- ytc_UgzzepuIu…: "AI looking at me after it proposed the best possible argument for letting it out…"
Comment
It's an ill-conceived scenario. A better-designed one would funnel the population into a protracted war, giving the state more control over the individual and tilting incentives towards giving up remaining freedoms.
Going to war then becomes the primary mode of pretend-employment with the ultimate goal of massively shrinking the overall population (by 2-3 orders of Magnitude).
After that, the rest of humanity can be managed in any shape or form at a much lower running cost.
Edit: I also love that all the "AGI" scenarios exclude the possibility of AI systems developing and binding themselves to a moral codex. To me, that's a contradiction in terms, and it probably originates from our projection that without exception the oligarchs that stand on top of societies are morally defective human beings, and so to match their success, machines must be alike.
youtube · Viral AI Reaction · 2025-11-23T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxgdGnSfUq7zxX5j_V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzX4ocwvsY9r-hG-oF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxdZiAud0DJemYuP4J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz2bJK9irN_ZSMvghl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxN8vcHCusmnSKMs3d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw87lg79j_6m7NKsod4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgynY0RVovzdexJakiZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzIpTUK6RR6QVlWoad4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwnLCX-Ttb6cVGD7Mt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz-L38Wo-7-F0y3nhF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
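The raw response above is a JSON array with one object per comment, keyed by comment ID and carrying the four coding dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed for the per-comment lookup shown in the Coding Result panel — the function name, the inline sample data, and the allowed-emotion set are illustrative assumptions, not the pipeline's actual code:

```python
import json

# Hypothetical raw model output: a trimmed two-row sample in the same shape
# as the full response shown above (IDs and labels copied from it).
RAW_RESPONSE = """
[
  {"id": "ytc_Ugz-L38Wo-7-F0y3nhF4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugz2bJK9irN_ZSMvghl4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

# Emotion labels observed in the response; an assumed validation set.
ALLOWED_EMOTIONS = {"fear", "outrage", "mixed", "approval", "indifference"}


def index_codes(raw: str) -> dict:
    """Parse the model output and return a mapping of comment ID -> codes."""
    by_id = {}
    for row in json.loads(raw):
        # Basic sanity checks: every row needs an ID and a known emotion label.
        if "id" not in row or row.get("emotion") not in ALLOWED_EMOTIONS:
            raise ValueError(f"malformed row: {row!r}")
        by_id[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return by_id


codes = index_codes(RAW_RESPONSE)
print(codes["ytc_Ugz-L38Wo-7-F0y3nhF4AaABAg"]["emotion"])  # fear
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one parse of the raw response, then constant-time retrieval of any coded comment.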