Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated previews with comment IDs):

- `ytc_Ugy3y2FOQ…`: "Always wonderful to hear that we should not even question the safety or viabilit…"
- `ytc_Ugx-CTX76…`: "Hey Tom, when your millions are converted to tokens and AI searches at what you …"
- `ytc_UgxRfBCji…`: "It’s not a robot. It’s a bunch of junk somebody put together and is trying to ma…"
- `ytc_Ugyjic9ms…`: "Company: So we investigated our own company and we did not find any error or iss…"
- `ytc_UgyTGz0YV…`: "Who cares if it's AI or not anymore it doesn't matter to many artists out there …"
- `ytc_UgwXjwWA3…`: "Im thinking the irony of asking AI to code and program itself. And more than lik…"
- `ytc_UgzCYag_9…`: "Damn its like these ppl watched the movie Accepted and took that model and put i…"
- `ytr_Ugw2uArkd…`: "Personally, I agree with you as I simply can't imagine giving up on art / Havin…"
Comment
John you are missing a lot of opportunities, If you would have invited famous / creadible doomers - stuart russel / elizer yudkowsky and interviewed them again you'd have a better and more updated perspectives on Artifcial intelligence world and especially doomer narratives. You were emailed a lot of times when you used to reply about the suggestions. Even non AI general podcasts capitalized on this, This seems very unprofessionally non creative. In the future experts like these might not even consider coming on this podcasts or give their best. Yous missed quite a lot of basically powerful opportunities.
youtube · AI Governance · 2025-10-25T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzruFdGKogQLIYPCMJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxQ40-uhkdC4XbXIoZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzpOqXoFeh7QJgdnON4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwl3SotOIqGBDS4rjN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwCsTJyFbQ-c0A3Q014AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyrwOAXyC0YQMHXHDh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwM-5qGwns9wZZah3V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyDE5iB6BuOtLRg2ap4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxclOIgpoUuHPzMei14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxNSXKikXDsV8u3CC54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
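As a sketch of how a raw response like the one above can be parsed into per-comment codes, here is a minimal Python example. The four dimension names come from the Coding Result table; the two sample rows are copied from the response above, and the "unclear" fallback for missing keys is an assumption, not part of the original pipeline.

```python
import json

# Dimensions taken from the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Two rows copied verbatim from the raw LLM response above.
raw = '''[
{"id":"ytc_UgzruFdGKogQLIYPCMJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxNSXKikXDsV8u3CC54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]'''

def parse_coding_response(text: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: value}}."""
    rows = json.loads(text)
    coded = {}
    for row in rows:
        # Keep only the expected dimensions; a missing key falls back to
        # "unclear" (an assumption about how gaps should be handled).
        coded[row["id"]] = {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

codes = parse_coding_response(raw)
print(codes["ytc_UgzruFdGKogQLIYPCMJ4AaABAg"]["emotion"])  # fear
```

A real pipeline would also want to guard `json.loads` with a try/except, since model output is not guaranteed to be valid JSON.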