Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- How will voice acting even work if its ai generated? The mouth movements will be… (`ytr_UgyJIOJbg…`)
- With respect. This is the only AI art that should be allowed to be copywritten. … (`ytc_Ugxx2fNTm…`)
- duuuude ai bs aside (totally agree with u btw) ur art got soooo frickin good jus… (`ytc_Ugz1S4xfA…`)
- The problem will solve itself. They start using AI slop to train the next AI and… (`ytr_Ugwn5xrmU…`)
- "AI cant exist without human artist. " Oh man, this statement is going to age li… (`ytc_UgzWgxI03…`)
- the one and only reason I’ll ever use AI art is as a placeholder while I train m… (`ytc_UgzwijzQw…`)
- I think companies have over invested massively in AI and it will end in a DOTCOM… (`ytc_Ugw-XLDZj…`)
- The only AI I know is Allen Iverson. Can we talk a out practice now?… (`rdc_l57e0ut`)
Comment

> I really like Steven but this episode really highlights just how out of touch with reality he is, AI is great, it will make plenty of people very wealthy but my concern is what about the rest of the world? Like the people who’s jobs will be taken over and the Jeff bezo’s of the world won’t have to worry about paying humans. Ok so if you’re not super rich what happens? I feel like desperation will set in crime very violent crime will skyrocket, the world will be super wealthy and brutal poverty. No in between. I see the world becoming borderless in the next 10-15 years so then what? This guest brings great points unfortunately it’s not stopping it’s only getting bigger. For the first time I think we will extinct ourselves in maybe 30-40 years.

Source: youtube · Topic: AI Governance · Posted: 2026-01-14T05:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxlTf6Bh_TnZL5Enmp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxUNhLNHBmnSZedMoJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzPslccHQRft-fGy6h4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzWHjjq1r8KldtG_VR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugym25fS9C-A7ANMiJJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgxzA4Oji-LyKoaF1lR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw6Al9RDdaL9oDYoKp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy75zseYlt_4EeFHOV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx0Zl9dKrgKOHlG6xB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzJ5_wtZ2tbkkZO3Ul4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
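A raw response like the one above is a JSON array of coding records, one per comment, each carrying the four dimensions shown in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and indexed for look-up by comment ID — function and variable names here are illustrative, not part of the tool; the two sample records are copied verbatim from the response above:

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# These two records are taken directly from the response shown above.
raw_response = '''[
  {"id":"ytc_Ugw6Al9RDdaL9oDYoKp4AaABAg","responsibility":"company",
   "reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy75zseYlt_4EeFHOV4AaABAg","responsibility":"distributed",
   "reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]'''

# The four coding dimensions from the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index the codings by comment ID.

    Records missing the ID or any coding dimension are skipped, so one
    malformed record does not discard the whole batch.
    """
    by_id = {}
    for rec in json.loads(raw):
        if "id" in rec and all(dim in rec for dim in DIMENSIONS):
            by_id[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_Ugw6Al9RDdaL9oDYoKp4AaABAg"]["policy"])  # regulate
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: one parse per response, then constant-time dictionary access per inspected comment.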