Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "It really all changes with quantum computers. Right now AI runs on basically wh…" (ytc_UgxDFYA6B…)
- "Typical humanity....we can't really control it and what happens might be bad but…" (ytc_UgyAJsLls…)
- "China 100% will regulate it even more than the U.S. China is a authoritarian cou…" (ytr_Ugwk5-sDt…)
- "There is one huge problem with such lists. It only takes into account the work …" (ytc_UgzMq4Z77…)
- "The issue is that AI steals the art of human artists to make souless works that …" (ytr_UgzOZHjaw…)
- "ok how exactly is AI gonna do something to the tribes in the Amazon that don’t …" (ytc_UgyLTRBkp…)
- "I understand your concern! The dialogue highlights the balance between AI effici…" (ytr_UgydtIYOL…)
- "That's what I hate about AI, the way humans use it. Just to say, ai is neither g…" (ytc_Ugz6UhB6S…)
Comment
One comment you made caught me by surprise. You said something along the lines of "we will still have large code trees that programmers will need to navigate". I disagree there. I don't think a programmer's job (in the long run) will be to look at the code the AI made and "fix" any issues with it. Instead I think coders will become more like business analysts that understand the requirements and maybe a little bit of code, but don't actually do any coding. Once you go there, you will no longer need a programming language that's readable by humans. Things like OOP, multiple files in a tree, and even assembly itself will become unnessesary and possibly a useless complication. The AI can more easily generate some sort of byte code that is executed directly or in a runtime instead of text in a file.
youtube · AI Jobs · 2024-01-17T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
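The four dimensions in the table take values from a small closed set. A minimal sketch of a sanity check on one coded record follows; the allowed value sets below are inferred only from the responses shown on this page and are assumptions, not an authoritative codebook.

```python
# Assumed value sets per dimension, inferred from the raw responses on
# this page -- a real codebook may include values not observed here.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "ban", "unclear"},
    "emotion": {"indifference", "approval", "fear"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return the dimension names whose values fall outside the assumed sets."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coding shown in the table above.
record = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "none", "emotion": "indifference"}
print(invalid_fields(record))  # -> []
```

A record that passes returns an empty list; any misspelled or out-of-vocabulary value is reported by dimension name.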
Raw LLM Response
```json
[
  {"id":"ytc_UgwR60PDnaiWnx-ljJx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzTsA7W6T-ATyUL9Kd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy6xYwhRM_1DGv0hqZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyf1pd0c_Fo8acHMB14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzwQZ-rD_MrTC_QgLt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzj_e--UqYRk3wfj2Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxsm3ZVt_l2uCx57Jt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwDI2OH_Eaz3Fos8U94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxaW59kXLpIWrSkCPx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxD0jkN5QdxTHZP0ep4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
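The raw response is a JSON array of one record per comment, so looking up a single coding by comment ID is a dictionary build away. A minimal sketch, using two records copied from the response above; the variable names are illustrative and not part of any real API.

```python
import json

# Two records copied verbatim from the raw response above; a real lookup
# would load the full array returned for the batch.
raw_response = """
[
 {"id": "ytc_UgwR60PDnaiWnx-ljJx4AaABAg", "responsibility": "developer",
  "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
 {"id": "ytc_UgxaW59kXLpIWrSkCPx4AaABAg", "responsibility": "government",
  "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]
"""

# Index the batch by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

row = codings["ytc_UgxaW59kXLpIWrSkCPx4AaABAg"]
print(row["policy"])   # -> ban
print(row["emotion"])  # -> fear
```

Because each record carries its own `id`, the batch order does not matter; the same index also makes it cheap to verify that every coded comment in a batch actually appears in the model output.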