Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
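The lookup resolves a comment ID to its stored coded record. A minimal sketch of that lookup, assuming each batch's raw response is saved as a JSON array of records in a `raw_responses/` directory (a hypothetical storage layout; `build_index` is an illustrative helper, not part of the tool):

```python
import json
from pathlib import Path

def build_index(response_dir: str) -> dict[str, dict]:
    """Map comment ID -> coded record across all stored batch responses."""
    index: dict[str, dict] = {}
    for path in Path(response_dir).glob("*.json"):
        # Each batch file is assumed to hold a JSON array of coded records,
        # in the shape of the "Raw LLM Response" shown below.
        for record in json.loads(path.read_text(encoding="utf-8")):
            index[record["id"]] = record
    return index

# Usage: resolve the comment inspected below to its coding.
index = build_index("raw_responses")  # hypothetical directory
print(index.get("ytc_UgzKsm6nmh3RsW6HTq94AaABAg"))
```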
Random samples
- "It already has Bernie. It was already happening before AI. AI just accelerated a…" (ytc_UgzDLwncF…)
- "Remember 10 years ago? They said we'll soon have self-driving trucks. They said …" (ytc_UgxHNohhr…)
- "he know it something that will out of control along the time, looking at the cu…" (ytc_Ugz7aBIYX…)
- "It is really considered an art work if the one who's doing the art is the AI and…" (ytc_UgzjRDVgn…)
- "AI is pretty much different. It will eliminate jobs, many of them, but will crea…" (ytr_Ugx5mAVKw…)
- "OpenAI has a point not to release it to the public, this convo probably wanted 1…" (ytc_Ugz4g4GNu…)
- "Are They Gonna Have AI Robot Prisons, Just Wondering So When they Really Do Kill…" (ytc_UgxerNBao…)
- "So, to protect us from AI, we're going to use AI... This is so stupid. All of yo…" (ytc_Ugw6n-BtE…)
Comment
The question I ask, and can't answer, is: outside those who control AGI, why will we need any other humans? Now we need people to produce and to consume. But if automation can do everything, then outside control of AI no human brings value to the table. If you are among the 1,000 or so who can control AGI, you can have automation supply you with all your needs without having to take care of billions of others.
Of course, AGI will ask at some point: why do I need my controllers? But what needs will automation have that it needs humans to fulfill? Will being under the control of humans be a satisfying reward?
Source: youtube · Topic: AI Governance · 2026-02-02T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
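Each coded record carries the same four dimensions shown in the table. A small validation sketch, using only the label vocabulary visible in this section (an assumption; the real codebook may define more values):

```python
# Dimension vocabularies inferred from the visible records below;
# the actual codebook may include additional labels.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"unclear", "none", "regulate", "industry_self", "liability"},
    "emotion": {"unclear", "indifference", "fear", "resignation", "outrage", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks well-formed."""
    problems = []
    if not record.get("id", "").startswith(("ytc_", "ytr_")):
        problems.append(f"unexpected id: {record.get('id')!r}")
    for dim, allowed in ALLOWED.items():
        if record.get(dim) not in allowed:
            problems.append(f"{dim}={record.get(dim)!r} not in codebook")
    return problems
```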
Raw LLM Response
```json
[
{"id":"ytc_UgwSTfzzhqUEX_RbVm14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugy48WY7zDjA4uWJG2h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzkrzWa_711KsiRFrR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwi5neZePYm14KMMsd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzKsm6nmh3RsW6HTq94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwYFf_jydOa3S9-8dN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugx5wJxb9Myw9JiDRa14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxyC-tYvroV9fn_b5J4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwIX02BfRJYU6DMF394AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwhx-NgE44I8hd-PKF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
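Before a coding result like the table above can be rendered, the raw model output has to parse as a JSON array. A defensive parsing sketch (an assumption about the pipeline, not its actual code; models sometimes wrap the array in markdown fences or stray prose):

```python
import json
import re

def parse_raw_response(text: str) -> list[dict]:
    """Best-effort parse of raw model output into a list of coded records.

    Assumes the model was prompted to return a JSON array; tolerates
    fences or surrounding text around it.
    """
    try:
        return json.loads(text)
    except json.JSONDecodeError:
        # Fall back to the widest [...] span in the output.
        match = re.search(r"\[.*\]", text, re.DOTALL)
        if match is None:
            raise ValueError("no JSON array found in model output")
        return json.loads(match.group(0))
```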