Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytc_UgxKyzEDG…: I'd just be like, "Go for it, but be sure to send me screenshots of their reacti…
- ytc_Ugz22LNJH…: But what happens if one of these trucks hits another car, or a bike, or a person…
- ytc_Ugx20rCVR…: Act like a critical thinking partner. Your role is to avoid blindly agreeing wit…
- ytc_UgykprnJx…: Again, why do people trust government, corporate media or corporate medicine ? W…
- ytc_Ugzp2gduG…: You guys AI poison art? I ai poison everything, my discord pfp, my personal pict…
- ytc_UgzdrvhtA…: 7:45 in the original version the version Mononoke looks as in she's in deep con…
- ytc_UgydUQrlA…: AI isn't perfect, actually far from it. The human brain is irreplaceable. Common…
- ytc_Ugzsxd9Ra…: Lex, Would love to see Slovoj Zizek on your pod. Zizek has some intriguing tho…
Comment
Preventing the creation of superintelligence will have costs, and _we must pay them._ If Dean Ball thought that his life and the lives of all of his loved ones was at stake in the next 5 years, he would not be making any of these points. He would have an existential crisis and then do everything he can to shut down AI development, _no matter the cost._ To do anything else would be deeply disturbed and not in line with the values of the vast majority of humanity.
The real crux here is that Dean is confused about the nature of the risk, and has epistemics that will prevent him from being significantly less confused until bad things happen in the world that are sufficient to dislodge his emotional state.
youtube · 2025-11-20T22:4… · ♥ 11
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxhev4NGxygLF8oZMF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxDuXnyJgV4hXBoO4B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwJ_Utk815mESSL_xd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxf2ysfrcjwOnYW4F54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxmcFJw3kKLiERMxy54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxL0H9m8rS1m5QivgV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgyFFyGyzTCs1cqrp2N4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwMyuR03RQrhVBnhxp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxcR4GjDOFwp7z_kMd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgzG-M5F4kw2zM21MZh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
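Each raw response is a JSON array of per-comment coding objects, so retrieving the coding for a particular comment ID amounts to a parse plus a dictionary index. A minimal sketch (field names and IDs are taken from the response above; the variable names are illustrative):

```python
import json

# `raw` stands in for a stored raw LLM response in the array format
# shown above (two entries reproduced here for illustration).
raw = '''[
  {"id": "ytc_UgwJ_Utk815mESSL_xd4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxDuXnyJgV4hXBoO4B4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Index the coding objects by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw)}

coding = codings["ytc_UgwJ_Utk815mESSL_xd4AaABAg"]
print(coding["policy"], coding["emotion"])  # ban fear
```

The same index can be built once over all stored responses, which is what makes lookup by comment ID cheap even across many batches.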