Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment

> yeah you really don't want half ass optimized, bloated, error prone shit quality code in your applications. Ai is fine until you're using it as a tool to generate boilerplate or making something small scaled and explain errors but don't use it as a crutch and take control over your codebase cus i promise you it will F up real bad down the line and you will be stuck with those horrible errors. Anybody who has prior experience making production grade application knows ai can't do it all, specially if it's unsupervised or a codebase is huge.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Jobs |
| Timestamp | 2025-03-05T16:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy95f8zWzRxxzCYTBl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxvb1PHxPGeeRivN2V4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz4_jvlN4z1JhX0gOx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxa7yiLY5SwpxU_05F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzupzA4Pu6Ftr19whp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzJQ4Na1b8h7k4u6rF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyutzMTP_NYKY-2SCJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxva4NY5voCWbhswxt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxH9db-1C8k9V3M7IZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxHSsrP8yznJAugBPZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
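The raw response above is a JSON array of coded records, one per comment, keyed by comment ID. A minimal sketch of how such a batch response could be parsed and looked up by comment ID (the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` come from the response shown above; the helper function itself is hypothetical, not part of the pipeline):

```python
import json

# Two records copied from the raw batch response shown above.
raw_response = """
[
 {"id": "ytc_Ugxa7yiLY5SwpxU_05F4AaABAg", "responsibility": "user",
  "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
 {"id": "ytc_Ugy95f8zWzRxxzCYTBl4AaABAg", "responsibility": "developer",
  "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw batch response and index its records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
rec = codes["ytc_Ugxa7yiLY5SwpxU_05F4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # prints "user outrage"
```

Indexing by the `id` field is what makes the "look up by comment ID" inspection cheap: after one parse, each coded comment is a constant-time dictionary access.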