Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Great show as usual. I had a conversation with let's say, a top scientist to do …" (`ytc_Ugy1GaTg6…`)
- "Selwyn Raithe's book is basically the decoder ring for every confusing AI annou…" (`ytc_UgxN6d92B…`)
- "To be honest im not that lifeless to use character ai for romance💀 I use it for …" (`ytc_Ugz-9oY_2…`)
- "I was gonna say this too like why are we even supporting ai creativity? Aren’t h…" (`ytr_UgxQStagS…`)
- "Not sure if all the mistakes are intentional or not, because there are a fuck to…" (`ytc_Ugxam0lRi…`)
- "Tremendous conversation: thank you! 29:19 - Have you seen the 1983 classic scif…" (`ytc_UgwdfSYuB…`)
- "These fools showing us early AI, china, Korea have it all secretly stored to kil…" (`ytc_UgxKnCROn…`)
- "Robot "I don't have time for modesty, I want to create the singularity tomorrow"…" (`ytc_Ugy0QMiEE…`)
Comment

> The real threat, and it is a very real thing, is when, not if, ai creates a new ai using its intellectual power. That will be something we cannot communicate with or understand. Imagine trying to educate a beetle on brain surgery. The scary part is, this is an arms race. They will try to get there the fastest with no regulation. Terminator is a man-made fiction. What a super intelligence could conceive is far worse. With the internet and deep fakes, it could talk to all of us at once and impersonate anyone while doing so. It's only a matter of time now.

youtube · AI Governance · 2023-04-24T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |

Coded at: 2026-04-26T23:09:12.988011
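Before accepting a coding result like the one above, it can help to check that every dimension carries a value the scheme actually allows. A minimal validation sketch follows; the allowed-value sets are inferred from the samples shown on this page, and the real codebook may define more categories.

```python
# Allowed values per dimension, inferred from the coded samples on this page
# (assumption: the actual codebook may include additional categories).
SCHEME = {
    "responsibility": {"company", "government", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return the dimensions whose coded value falls outside the scheme."""
    return [dim for dim, allowed in SCHEME.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above passes cleanly.
record = {"responsibility": "distributed", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "fear"}
print(invalid_fields(record))  # prints []
```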
Raw LLM Response
```json
[
  {"id":"ytc_UgzZ0TDvyx2EhUKmO9p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyrgjkVAaInEVnYXHZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyAb84cuqs6XjQHwQR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzGAwyu5uZIx9E7YlF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxrQT6l0lJRr9z6PqR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxgVUoJO_TV2pLTPs94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwwqzDIJ5RSUSApFSB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzWcagqMFE0vr7sYCV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwavEs-ew2rw19DEFh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugztj0FtjOrkwujJyCV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
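The raw response is a JSON array of coding records, one per comment, so looking a comment up by its ID amounts to parsing the array and indexing it. A minimal sketch, using two records abbreviated from the response above (the helper name `index_by_comment_id` is illustrative, not part of the tool):

```python
import json

# A batch coding response: a JSON array with one record per comment,
# abbreviated here to two of the records shown above.
raw_response = '''
[
  {"id": "ytc_UgzZ0TDvyx2EhUKmO9p4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxrQT6l0lJRr9z6PqR4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "regulate", "emotion": "fear"}
]
'''

def index_by_comment_id(raw: str) -> dict[str, dict]:
    """Parse a batch coding response and key each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"}
            for rec in records}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgxrQT6l0lJRr9z6PqR4AaABAg"]["policy"])  # prints regulate
```

Keying by ID this way is what lets the coded values in the table above be traced back to the exact model output that produced them.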