Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Context Engineering...cool new buzzword. Only problem is learning to do Context Engineering is basically like learning a new and VERY VERBOSE high-level programming language designed for non-CS people. That kind of thing will make non-CS people more productive, to be sure, but it will NEVER make software engineers/developers more productive because they already KNOW how to program in better, faster, considerably more concise programming languages than your "Context Engineering".
The average result will be people getting code with a lot of minor bugs that doesn't run and then iterating like crazy with the LLMs to fix the bugs but without knowing what causes them.
youtube · AI Jobs · 2026-03-09T12:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxL5vie7bp2xryNyi94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwI8G115HUkrRrfWIh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyxLVjgpjuKQAiUxiN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxgrIZ7VjUKc0409Fp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwkGAZwrMJkM7fOYy54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwIb4oQntO6mtRZY5R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRv0ZqJoiB00cDgAp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgycckSaeYxTxwnZw-p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzqkFFED4QkXDG_r-h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwHlqrM5oaU2ziUjp94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
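The raw response is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of how such a response can be parsed and a single comment's coded dimensions looked up by ID (the `lookup` helper and the in-line `raw` string are illustrative assumptions, not part of the tool itself):

```python
import json

# Two rows copied from the raw LLM response above, stored as a JSON string.
raw = """[
{"id":"ytc_UgxL5vie7bp2xryNyi94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzqkFFED4QkXDG_r-h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]"""

# Index the array by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id):
    """Return the coded dimensions for one comment, or None if the
    ID is absent (e.g. the LLM dropped it from its response)."""
    return codes.get(comment_id)

row = lookup("ytc_UgzqkFFED4QkXDG_r-h4AaABAg")
print(row["emotion"])  # outrage
```

Indexing by ID also makes it easy to detect comments the model silently skipped: any input ID missing from `codes` was not coded.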