Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- `ytc_UgzZxzbS-…`: LLM will not bring superintelligence if you are interested in news and not just …
- `rdc_jrrr8f1`: But well that doesn't help 1 iota though. People like me are not listened to. Mo…
- `ytc_Ugyn6InL3…`: There is no driver shortage!!! There is a wage problem a respect problem not a d…
- `ytc_UgwPwHm-D…`: 5:36 It's telling that all Shad's examples of his own digital art look absolutel…
- `ytr_UgyjHUEpt…`: Imagine calling yourself "godly" after typing in some words to an A.I generator …
- `ytc_Ugw2BvxQS…`: I don't remember where I heard this but I heard it 10 or 15 years ago and it's a…
- `ytc_UgyPogvzK…`: It's i robot now screw this their gonna take our jobs thand kidnap us than turn …
- `rdc_le7yyf7`: In the late 1990s we had a similar issue with non-academics getting in on the In…
Comment
I only use AI for:
1. Single-line autocomplete
2. Simple repetitive tasks, like translating/restructuring a large JSON or rewriting a simple template into a different templating language
3. Generating boilerplate
4. Sometimes, doc comments
That's all I've ever found it useful for. The common examples of what coding AI can do like simple algorithms and such are not something that I write for real-world codebases, and for anything specific it writes something between a suboptimal solution and complete nonsense about 99% of the time.
Also, I thought it was obvious that if you don't check the code and just assume your AI wrote it correctly, you will definitely spend more time debugging your code than it would've taken you to write it yourself.
youtube · AI Jobs · 2024-06-15T01:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxskXcdCu_2uOcu__J4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyXLBGJSGG5SQRgBuV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyGOGKyGPg5SbkHEFt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMHlI0WF1DeQMIqZx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxWq-_Wo5JEzmrQ5b94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzgO6JckVaR-nCJ1bp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyFM31JXwiQc8R5eWd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgztffuPX7mtnNb1duB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx14M1hbd7Gz3yZLi14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzEUZBVOGudLiOenCF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
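A raw response like the one above can be validated before its labels are stored. The sketch below is a minimal, hypothetical validator: the allowed label sets are inferred only from the values visible in this sample (the real coding scheme may define more), and `parse_coding_response` is an illustrative helper, not part of any shown tooling.

```python
import json

# Allowed labels per coding dimension, inferred from the sample response
# above — hypothetical; the actual scheme may permit additional values.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"approval", "indifference", "outrage", "resignation", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, rejecting malformed records."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of records")
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
records = parse_coding_response(raw)
print(records[0]["emotion"])  # approval
```

Rejecting out-of-vocabulary labels at parse time keeps a single hallucinated value (e.g. an emotion the codebook never defined) from silently entering the coded dataset.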