Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- “There will not be a pause. Game theory and prisoner’s dillemma is in full effect…” (ytc_UgyGOTZHV…)
- “I’m going to risk being condemned here. But I use ai art to help me paint. I can…” (ytc_UgywegPce…)
- “Trusting an AI is the fundamental problem. Even the name AI is misleading - it s…” (ytc_UgxLqXLXT…)
- “Some people who use ai are alright it’s just the people that feed children’s wor…” (ytr_Ugx9j8hal…)
- “Ai isnt replacing your job, Its creating New Jobs. The MEDIA is the Liar, the te…” (ytc_Ugyp5kZY-…)
- “You'd think people would listen to someone as smart as Elon. Nope nobody listene…” (ytc_UgzYn_kKF…)
- “Ok first of all it's LLM and not AI but anyways: As a programmer who can't draw …” (ytc_UgwAKoMiZ…)
- “Bro I'm not even surprised if he used AI to come up with what to copy and paste …” (ytc_Ugx32LpkL…)
Comment

> I think what is missed here in the line of questioning is what will happen. If Superintelligence is confined and still a slave to mankind we will be fine. But if there is an AGI that becomes super intelligent at some point, that General Intelligence will simply wipe out mankind unless one steps away from civilization to a place where there isn't enough technology and even then the odds of getting wiped out are pretty high. Maybe an island not connected to the internet will survive, somewhere in the middle of no where where no one reads or learns anything.
> Fact is there are NO jobs in 5 years, most of us will be starving including the CEO of this podcast. The only way out is spiritual improvement and recreating civilization from scratch independant of AI. Yes we lose all knowledge, but that's expected
youtube · AI Governance · 2026-02-18T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
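The coding result above could be carried as a typed record rather than a loose table row; a minimal Python sketch, where the field names are assumptions read off the table and not the tool's actual schema:

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class CodingResult:
    """One coded comment; fields mirror the dimension table (assumed names)."""
    responsibility: str  # e.g. "ai_itself"
    reasoning: str       # e.g. "consequentialist"
    policy: str          # e.g. "none"
    emotion: str         # e.g. "fear"
    coded_at: datetime   # timestamp the coding was produced


# The example values shown in the Coding Result table above.
result = CodingResult(
    responsibility="ai_itself",
    reasoning="consequentialist",
    policy="none",
    emotion="fear",
    coded_at=datetime.fromisoformat("2026-04-26T23:09:12.988011"),
)
print(result.emotion)  # fear
```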
Raw LLM Response
```json
[
{"id":"ytc_UgxkeNDMXtbPijAttV94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwTPoh-QPj3qPPkkyx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz1UmXow4W-2336UO14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzbdvGDZ67AohYivOd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxH1Ba9ufiwtUJkPPF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyzVbxuWfYt_2-wmPd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzyQrLssUQmiXhJhWl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyGqulGF4_qvNvD1d54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxITGIhQwHBneNT2Cp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgypAAVXd0Tm7mwtt1d4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
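The raw response is a JSON array keyed by comment ID, which is what makes the "look up by comment ID" feature possible. A minimal Python sketch of parsing one such batch, indexing it by ID, and checking each row against the value vocabularies seen in the samples (the vocabularies are assumptions inferred from this page, not the full codebook):

```python
import json

# Controlled vocabularies inferred from the responses shown above
# (assumption: the real codebook may define additional values).
RESPONSIBILITY = {"none", "ai_itself", "company", "developer"}
REASONING = {"consequentialist", "deontological", "virtue", "mixed"}
POLICY = {"none", "regulate"}
EMOTION = {"fear", "indifference", "resignation", "outrage"}

# One row from the raw response above, stood in for a full batch.
raw = """[
  {"id": "ytc_UgxkeNDMXtbPijAttV94AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]"""

codings = json.loads(raw)

# Index by comment ID so a single coding can be looked up directly.
by_id = {row["id"]: row for row in codings}

# Reject the batch if any row uses a value outside the vocabularies.
for row in codings:
    assert row["responsibility"] in RESPONSIBILITY, row
    assert row["reasoning"] in REASONING, row
    assert row["policy"] in POLICY, row
    assert row["emotion"] in EMOTION, row

print(by_id["ytc_UgxkeNDMXtbPijAttV94AaABAg"]["emotion"])  # fear
```

Validating before indexing means a malformed or hallucinated label fails loudly at ingest time instead of silently skewing downstream counts.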