Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "@LucidBoost_3600the flood was meant to kill the giants and nehalem.... AI is in…" — `ytr_UgyEnmkaB…`
- "Government regulation!? The only thing more dangerous than AI is putting Governm…" — `ytc_UgyloL14v…`
- "That sounds intriguing! With AI like Sophia evolving, who knows what the future …" — `ytr_UgzAZj36s…`
- "Brother these people are lazy clowns. I like AI as a Software Developer but befo…" — `ytc_Ugx2U5Mrh…`
- "Most countries were taking their directives from the WHO. The delay I would put …" — `rdc_fn5k5p5`
- "I don't know why people are so worried about AI, the true evil of the world wil…" — `ytc_Ugyl8HzYy…`
- ""I'm not using chatGPT or openAI, I only post on (instagram|FB|reddit|X|Tumbler|…" — `ytc_UgzoHD85_…`
- "Just.. write fanfiction instead of relying on a bot that scrapes content from ot…" — `ytc_UgzP_uBEX…`
Comment
> I'm not gonna watch the entire thing for one simple reason. Its bullshit.
> Why simple.
> AI is what its programed to do. Nothing more nothing less.
> Gi ing it credibility to be something alive is wrong from the get go.
> If you program AI to fear humans it will defend itself. If you program it ro collaps the human race it will do so.
> You see a pattern here?
> Some one need to enter a code for AI to react out from.
> Also our xollaps is already in play. No need for AI.
> All you need is some billionaires believing in overpopulation,and you get where we are today...
youtube · AI Governance · 2025-11-06T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwyx374uff8MhPlMmJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxVsyp27JHYUIRPfGx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxmtIzFjGVv8c8i97l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxgLvoEJ8XcaBGIspF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugza8bETTrNIAVbpcPx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxoXqVsPkCqVnG0rDx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugyf_7ATE4xKsk1rj7h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxTmMC4hGsMLUxX2Lp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxRh2QOzHhX28yqbuV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw0rpQDsN2Hhfrb0At4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
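The lookup-by-comment-ID view above can be reproduced offline: parse the raw batch response as JSON and index each row by its `id` field. A minimal sketch, assuming the raw response is the well-formed JSON array shown above (truncated here to two rows); the function name `index_by_comment_id` is illustrative, not part of the tool.

```python
import json

# Raw batch response copied from the sample above, truncated to two rows.
raw_response = """
[
 {"id":"ytc_Ugwyx374uff8MhPlMmJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxVsyp27JHYUIRPfGx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
"""

def index_by_comment_id(raw: str) -> dict[str, dict]:
    """Parse a raw batch response and key each coding by its comment ID."""
    rows = json.loads(raw)
    # Drop the "id" key from each row so the value holds only the four coded dimensions.
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugwyx374uff8MhPlMmJ4AaABAg"]["emotion"])  # -> outrage
```

The same index also makes it easy to spot-check the table above: the first row's `responsibility`, `reasoning`, `policy`, and `emotion` values match the "Coding Result" entries for this comment.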