Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- I think the answer isn't black or white. This whole subject has so many facets. … (ytc_UgyNWYgWK…)
- Humans do not create art when using generative AI--a machine creates it. Regardl… (ytc_UgxVMkDya…)
- Just had one. It was a man making a robot girl. My indicator was the weird blur … (ytr_UgzBhMi-4…)
- @ Basically, it slightly changes each and every pixel of the art. The difference… (ytr_UgwQ8UjJg…)
- AI drivers comparing digital art, which has the same intentional effort, time, and… (ytc_UgwiUA8hs…)
- A quarter of a century ago my children had a similar doll, "Smart Nastya". Unlike… (ytc_UgwyZRfZQ…)
- It's not art when it's made with emotionless AI, it's just an image… (ytc_Ugw4fb5gu…)
- AI cannot be aligned with humans because those who are creating AI are not align… (ytc_UgxE4Lys0…)
Comment
These systems are probabilistic prediction engines trained on human data, not thinking agents. They don’t understand architecture, intent, or system tradeoffs. They generate likely patterns. That makes them insanely useful, but as tools, not replacements for engineers.
This is closer to Search Engine 2.0 than “digital coworker.” Instead of links, you get synthesized knowledge. Great for speed. Dangerous if treated like an autonomous decision-maker.
Companies didn’t get burned because “AI is fake.” They got burned because they misclassified a tool as an employee. That’s how you get tech debt, brittle systems, and seniors stuck babysitting output.
LLMs won’t replace engineers.
Engineers who understand LLMs will replace engineers who don’t.
youtube
AI Jobs
2026-02-05T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzpBK0EcJBxTVEIezV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwkAqPgbJsRmhdiwVB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzBMIxAySK0bpRivJB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy_kRRZcQ8zh__Xg5B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwIaFVa755lq1QVl2V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy_QnIN7sbTViF-d4p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzY2dvk8rXL9YNtoZN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxXoq8pp-F54re52rt4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwDM2ap5hI8pVBiSxR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz1fhq5W85nyf7ofOR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
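A batch response in this shape can be parsed and sanity-checked with a few lines of Python. This is a minimal sketch, not the tool's actual loader; the allowed category values below are only those observed in the samples above (the full codebook may permit more), and `parse_codes` is a hypothetical helper name.

```python
import json

# Category values observed in the sample responses above.
# The real codebook may allow additional values (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"none", "consequentialist", "deontological", "virtue"},
    "policy": {"none"},
    "emotion": {"approval", "indifference", "outrage", "mixed"},
}

def parse_codes(raw: str) -> dict[str, dict[str, str]]:
    """Parse a batch coding response into {comment_id: codes},
    dropping any entry with a missing id or an unexpected value."""
    out = {}
    for entry in json.loads(raw):
        cid = entry.get("id", "")
        codes = {dim: entry.get(dim) for dim in ALLOWED}
        if cid and all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            out[cid] = codes
    return out
```

For example, feeding the raw response above through `parse_codes` yields a lookup keyed by comment ID, so an individual result (like the Coding Result table) can be retrieved with `parse_codes(raw)["ytc_…"]`; malformed entries are silently skipped rather than crashing the inspection view.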