Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "I'm so sick of hearing about losing jobs "people find meaning in." You know wha…" (ytc_UgwyaX5PZ…)
- "tldr: You probably have a choice to not use AI. Don't use it. I dont think mos…" (ytc_Ugy4m2gpG…)
- "GOOD overcomes ALL evil. Only good AI in JESUS CHRIST'S MIGHTY, VICTORIOUS, NAME…" (ytc_UgxRTeYPT…)
- "hha i thouth the same but its actually make sense when u understand it chatgpt i…" (ytc_UgzIFTAG6…)
- "While 2001: A Space Odyssey filmed in (1968) did not specifically know about Lar…" (ytc_UgwjY-dXw…)
- "🙄😶 Prejudice is built into the "systems" and ai uses the ssme data that is biase…" (ytc_UgwviAm1U…)
- "AI cannot think. These people are scaring the people because they want all the p…" (ytc_UgwMFn23C…)
- "Once AI AND robotics are working good enough, humans are not needed anymore. 90-…" (ytc_UgywIlQjQ…)
Comment
The idea of not having developers there that know the ins and outs of a codebase is the stuff of nightmares for any programmer. It'll work at first for awhile but something will break and something will go wrong and if you you used A.I to make everything you will have zero clue how to fix it at all. If you need to add specific changes or upgrades you're screwed. Even if there's A.I generated documentation. Having millions and millions of dollars riding on something that can fly off the rails with no ability to stop it is what we are looking at.
youtube
AI Jobs
2026-02-06T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgybcnRXnzKODOcDA5R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxghy9C8n0QtmVYolt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx3kiAvAgk6W-xX_rl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxK2XDiSTlY0v0wPFx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyAutQLxGZWCefkXAB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugz5voaIEUF07yRv--B4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx7m1xorNA3ZBpYBAV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgykhDs1K5t4RxzP7Ox4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxFbsjWnbWegGMe0d54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxt8HLhvgUzOpGDBwx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
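A response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal, hypothetical validator: the allowed value sets are assumptions inferred only from the values visible in this sample (plus "unclear", which appears in the Coding Result table), not the project's actual codebook, and `parse_codings` is an illustrative helper name, not part of any real pipeline.

```python
import json

# Assumed vocabularies, inferred from this one sample -- NOT the full codebook.
OBSERVED_VALUES = {
    "responsibility": {"developer", "company", "ai_itself", "none", "distributed", "unclear"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "industry_self", "ban", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval", "mixed", "indifference", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and flag rows with unexpected values.

    json.loads raises ValueError on malformed output, e.g. the stray ')'
    that a model sometimes emits instead of the closing ']'.
    """
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in OBSERVED_VALUES.items():
            if row.get(dim) not in allowed:
                print(f"{row.get('id', '?')}: unexpected {dim}={row.get(dim)!r}")
    return rows

# Usage with a one-row sample in the same shape as the response above:
sample = ('[{"id":"ytc_example","responsibility":"developer",'
          '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
rows = parse_codings(sample)
print(len(rows))  # 1
```

Validating each dimension against a closed vocabulary catches both malformed JSON and off-codebook labels before they reach the database, which is why every row is checked rather than trusting the model output wholesale.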