Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "You kind of skimmed over why the open ai employees sided with Altman over the bo…" (ytc_Ugz6oVjkT…)
- "This is the new future where robots and artificial intelligence are taking over …" (ytc_Ugy0cWQxV…)
- "Tech Bros: “You can barely tell which one is real… AI is the future bruh.”…" (ytc_UgylZ0pk0…)
- "Seriously concerned on advancements in artificial intelligence and the governmen…" (ytc_UgzkiKUbK…)
- "If all jobs are taken, then whoever employs the ai will have to pay a significan…" (ytc_UgyEstcFT…)
- "I feel sorry for whoever thinks this is real 😔. They replaced the human with a r…" (ytc_Ugzi94u3I…)
- "Na I would never trust AI with my health An AI diagnosis of cancer while I just…" (ytc_UgwmvcBFv…)
- "I think the fundamental misunderstanding people have about AI is that it's just …" (ytc_Ugyo-4332…)
Comment
Considering the fact that when AI tries to produce a picture it can’t ever get it “right” enough to be believable I’m not surprised that it’s not all that great at producing elegant, maintainable code. I was a maintenance programmer for years. If code isn’t easily maintainable it’s a f’n nightmare when it breaks due to changes elsewhere in the system. Bad code can increase the time needed to research, fix, test, and load exponentially depending on how bad it is.
youtube | AI Jobs | 2026-02-07T04:0… | ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgycIDPUkjfHLO3cT7J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgycdeGqsZfgO5eFhB54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz68YjJNnnn1rTJvGd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUn6XevoRdTEBWAcx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxpeXbyEGCEJziGJgB4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgwpeX738iUFeI6Irsh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzW7T-r73nExDI1xOB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxC25fCkGCR2os_IyZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxih-JBBw9uSXfaRup4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzlrbbuWo283uU7HJd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
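The raw response above is a JSON array in which each record carries the same four dimensions shown in the coding table (responsibility, reasoning, policy, emotion) plus a comment `id`. A minimal sketch of how such a response could be parsed and sanity-checked is below; the allowed values are inferred only from the records shown here, so the real codebook may contain additional categories.

```python
import json

# Allowed values inferred from the sample responses above; the full
# codebook may define more categories (an assumption of this sketch).
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability"},
    "emotion": {"approval", "indifference", "outrage", "fear"},
}


def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
    return records


# One record copied verbatim from the raw response above.
raw = (
    '[{"id":"ytc_UgzlrbbuWo283uU7HJd4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]'
)
records = validate_coding(raw)
print(len(records), records[0]["emotion"])  # 1 fear
```

Validating every batch this way catches the common failure mode of structured-output coding, where the model invents a label outside the codebook and silently corrupts downstream counts.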