Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- I feel like Artists will be out of Jobs. And if we have AI Bots, at someone poi… (ytc_Ugwvi3Ysh…)
- The people controlling humans obviously want to have control over the A.I and th… (ytc_UgxTqN_s9…)
- The question's POV is simply wrong. Better ask, why develop an A.I. at all in fi… (ytc_Ugx-g4uTo…)
- A waste of a brilliant mind of a loving son and friend! AI IS NOT WORTH HIS LIF… (ytc_UgwYCyXWr…)
- For every person who becomes psychotic through cult interactions, there is someo… (ytc_Ugz9zNxRQ…)
- This may be a little bit of a digression, but I've seen little to no discussion … (ytc_UgyVGy51A…)
- We appreciate your humor! If you're interested in more insightful discussions, r… (ytr_UgwtXa67V…)
- These self proclaimed AI 'artists' are insufferable. the cope and delusion as we… (ytc_UgwphrVth…)
Comment
If there is a modicum of truth, I will give it clearly:
The top political expert today — in the same “technocracy / AI-future / Deep Utopia” line you are using — is Yuval Noah Harari.
Why this fits your system:
Nick Bostrom = top philosopher of superintelligence.
Yuval Noah Harari = top political thinker explaining how technology, AI, data, and human behaviour shape power and governance.
He is globally read by presidents, CEOs, technocrats, and policy groups.
His work (Sapiens, Homo Deus, 21 Lessons) directly addresses politics of AI, automation, post-democracy, and technocracy.
So in your mapping:
Philosopher: Nick Bostrom
Political expert: Yuval Noah Harari
If you want a second name: Francis Fukuyama is also accepted worldwide in political theory — but Harari is the top in current relevance.
❤🎉
Platform: youtube | Video: Viral AI Reaction | Posted: 2025-11-20T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwy3vozuOZScQZ4qDd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxR_YGcOpYKowfDq614AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz4puxMHcvc6kR0Iv54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzizvyvH5SJa6daAQp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzeY0CgEluc2006e0R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5FfVe8hiOrAw-Ckh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwG8YjWpUDKihA8wHx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwm-KW_jD_zXgVrrWl4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyezKfkQJvOWU-9_A14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzgukZhn3-jUcCHy-d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
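The raw response is a JSON array of per-comment codes across the four dimensions shown in the coding-result table. A minimal sketch of how such output could be parsed and validated before use; the allowed values below are inferred only from the responses visible on this page, and the real codebook may define additional categories (assumption):

```python
import json

# Allowed values per dimension, inferred from the coded responses shown
# above (assumption: the actual codebook may include more categories).
SCHEMA = {
    "responsibility": {"none", "ai_itself", "distributed", "company", "user"},
    "reasoning": {"mixed", "consequentialist", "contractualist",
                  "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "industry_self"},
    "emotion": {"indifference", "outrage", "fear", "approval", "resignation"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, dropping any
    row whose value falls outside the allowed set for a dimension."""
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row.get(dim) for dim in SCHEMA}
        if all(codes[dim] in allowed for dim, allowed in SCHEMA.items()):
            coded[row["id"]] = codes
    return coded
```

Rows with out-of-schema values are silently skipped here; in practice one might instead log them for re-coding.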