Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Look at fictional stories. Take Star Wars. If you look at Star Wars and all the …
ytc_UgzcDIEG-…
This is a perfect case where limitation rather than excess drives innovation, la…
rdc_m9gdvan
Having no workers...as in replacing drivers with A.I/robots is actually good. Wh…
ytc_UgzH5hB5t…
This fellow speaking is the prototype of the superintelligent AI with his speil …
ytc_Ugy4smT1J…
There are only three possible cases:
1. Benevolent ASI takes over - Utopia (50%)…
ytc_UgxDTJErI…
"SO IF GROK BRINGS NEW PHYSICS IN THE NEAR FUTURE,IT MEANS THAT GROK WILL BE THE…
ytc_UgzMBoxEw…
"Form of collage" that's just not true. That argument started on social media an…
ytc_UgyQCA9Uf…
lots of rmrk company functionality has been replaced by AI, the thumbnail of thi…
ytr_UgwEhu5cr…
Comment
I have a prediction.
I think that coding will adapt to the constraints of AI and the result will be very interesting. For example I imagine small codebases that are very modular, so that the AI could always fully contextualize. Something like functional programming basically constructing a large application out of many small ones.
I am theorizing currently how to optimize a standard Terraform codebase to work seamlessly with AI and unit testing, and that's exactly where my mind goes - small almost or fully independent chunks, automated to work together.
youtube
2025-03-12T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwyDCMH30kGqHFdJRF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyw39MhL7MblnXhoph4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwBQ4CCTJxaIPIGd7B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx2fxktcWSc3zGfvpF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw8URsZJD1xOiKplxJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgypeNZ8b-cvL2Bp3DN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyV4YjUP4E-Ky7TYkt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwn-Wn2cjP3ZeUj5Bd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwztBAjbt3luKApKU94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwTOdj0llFDtxHvMsJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
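A raw response like the one above is a JSON array of per-comment codes, so the "look up by comment ID" workflow reduces to parsing it and indexing by `id`. A minimal sketch (the field names and two sample rows are taken from the response above; the variable names are illustrative, not from any particular tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codes with the fields
# id, responsibility, reasoning, policy, emotion (as in the example above).
raw = """[
  {"id": "ytc_UgwyDCMH30kGqHFdJRF4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwBQ4CCTJxaIPIGd7B4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]"""

codes = json.loads(raw)

# Index the rows by comment ID so any coded comment can be inspected directly.
by_id = {row["id"]: row for row in codes}

print(by_id["ytc_UgwBQ4CCTJxaIPIGd7B4AaABAg"]["emotion"])  # approval
```

In practice the parse step would also validate that each row carries all four coding dimensions before it is stored, so a malformed model response fails loudly rather than silently dropping a code.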