Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "These people are assholes. But the video is making fun of the cop. We don't kno…" (ytc_UgzgbmSQa…)
- "Three thoughts: 1. If AI could ensure that no nuclear weapons could ever be depl…" (ytc_Ugy-EzZvX…)
- "Chinese just stop selling rare earth to TSMC to end ALL US semi, AI, Tesla, Mili…" (ytc_UgzvpwnXc…)
- "Fr like noone is gonna look at my shit like oh yeah no thats ai generated lile💀…" (ytr_UgyKcL0L7…)
- "lmao imagine only having a job that can easily be automated, with the same quali…" (ytc_Ugxl7BcIY…)
- "Why would you even do that? Its not even the true her just a deepfake. Whoever d…" (ytc_Ugyb2lcBx…)
- "I believe this dude way more than I believe Google and am surprised to learn tha…" (ytc_UgzA4RKCW…)
- "This guy doesn’t understand that privacy is already gone. The nsa has all of it.…" (ytc_UgxbGxfhs…)
Comment
Kurzgesagt, this is another masterpiece! The way you visualize the jump from narrow AI to the concept of an Intelligence Explosion and ASI [12:28] is both fascinating and deeply unnerving. It really makes you pause and think about the speed of progress and the profound risks involved. The comparison of ASI to a 'God in a box' [14:14] perfectly encapsulates the magnitude of this challenge.
Thank you for consistently delivering such high-quality, thought-provoking content that pushes humanity to confront its future. I love the historical context you provided about human intelligence too. To the community: Do you think AGI will arrive in just a few years, or take many decades? Let me know below! 🤯🔬🚀
Source: youtube · 2025-11-16T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy0jhJJqDnWC-lzoQd4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzaBmZXcsbea_lTrXx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzzdaMkAmQP0UmC2xN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx5yPvbayTrEKtw0Nl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxH9C_5NME0A_Qz8dp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy4QE9eeY8YdLeK2ml4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy4LCn__Tmd_IMOB4R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx1J9sSDfkjfpF51cd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwwm3R7aVVzO0UDCg54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugyp6ZXk7UwhtpKd2np4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"}
]
```
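Each raw response is a JSON array of coded records, one per comment, with an `id` plus the four coding dimensions shown in the table above. A minimal sketch of how such a batch could be parsed and indexed by comment ID (the `index_by_id` helper and its validation are illustrative, not part of the tool; the two records are copied from the response above):

```python
import json

# Two records copied from the raw batch response above, shortened for the example.
raw = """[
{"id":"ytc_Ugx5yPvbayTrEKtw0Nl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxH9C_5NME0A_Qz8dp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]"""

# The four coding dimensions every record is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse a batch response and map each comment ID to its coded dimensions."""
    coded = {}
    for record in json.loads(raw_json):
        missing = [d for d in DIMENSIONS if d not in record]
        if "id" not in record or missing:
            # Reject records that drop the ID or any dimension.
            raise ValueError(f"malformed record: {record}")
        coded[record["id"]] = {d: record[d] for d in DIMENSIONS}
    return coded

coded = index_by_id(raw)
print(coded["ytc_Ugx5yPvbayTrEKtw0Nl4AaABAg"]["policy"])  # prints "regulate"
```

Indexing by ID makes the lookup-by-comment-ID flow a dictionary access, and the validation step catches responses where the model omitted a dimension.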