Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
@MorbidEel when did they start driving? I ask because a lot of places have put i…
ytr_UgwiJM4xA…
When AI puts massive number of people out of job, govt bear the brunt of huge so…
ytc_Ugwf668-Y…
As an artist, I say
Fuck the writers no one gave a fuck about artist
Let the A…
ytc_Ugza7jH6-…
It's ok. AI is still learning from the content available. But when available con…
ytc_Ugyf7XBet…
I love it! By the way, 6:12 this might be the reason why LLMs are so valued thes…
ytc_UgwiuCw9M…
> **It's a safety feature to help you stay awake and alert.**
Yeah, Ford Mus…
rdc_grkpp7v
I had to stop in the middle. I feel how Neil talking about AI/AGI and it's effe…
ytc_UgwYndSuB…
Because it threatens their livelihood. A lot of these people live off commission…
ytr_Ugx1WOfJb…
Comment
The problem even with Opus 4.6 is that 1) it doesn't give an f about conventions (unless explicitly explained), and 2) it never refactors. Both of these lead to the codebase getting more and more entangled over time, and ultimately the LLM can't keep up with all of the hacks it has created; cracks begin to appear, and because the control flow and execution paths are so overly complex, there's no really easy way out except rewriting the whole thing from scratch. And I'm in no way saying that this is a bad process (software development would be iterative anyway, with LLMs or not). But I'm saying that there's simply no way to get rid of the human developer just yet. Except if you're developing something trivial, like a todo app or a simple flight simulator that no one will ever play.
youtube
AI Jobs
2026-02-08T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxyTDzUWxd-iuB8RlV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyzJVQ3LXdxwSOeAAB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwsVLZCEsuftd1bOSN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx8HqC9MjFBs0Wz5lF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxFHzsdu0J1uxNv7Vh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyJNlghgFdOBHTFcdx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy3qiOmab-UcwxR4lR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxJvCF8me8wOVNoAZV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyq3uw8erjOwOyconZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"frustration"},
{"id":"ytc_Ugz4XNlFd5depzlP0i94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
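The raw response above is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch (this is an illustration, not the dashboard's actual code) of how such a batch response can be parsed and indexed for per-comment lookup, using two rows copied from the response above:

```python
import json

# Batch coding response: a JSON array of coded comments, one object per ID.
# These two rows are taken verbatim from the raw LLM response shown above.
raw_response = """[
  {"id": "ytc_UgxyTDzUWxd-iuB8RlV4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxJvCF8me8wOVNoAZV4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"}
]"""

rows = json.loads(raw_response)

# Index by comment ID so a single coding result can be retrieved directly,
# which is what the "Look up by comment ID" view does conceptually.
by_id = {row["id"]: row for row in rows}

coded = by_id["ytc_UgxJvCF8me8wOVNoAZV4AaABAg"]
print(coded["responsibility"], coded["emotion"])  # ai_itself indifference
```

Indexing by ID also makes it easy to detect when the model dropped or duplicated a comment: compare `by_id.keys()` against the IDs that were sent in the batch.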