Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_UgxC66A53…`: "Interesting that money is the focus of her answer: Truth is not part of her answ…"
- `ytr_Ugw3wdX2A…`: "AI is supposed to find its own solution for that. If we're nice to it, it might …"
- `ytc_UgwnL7R2d…`: "No way. Any possibility to control la AI in the future is barely a dream. In 20 …"
- `ytc_UgwhQLZ_n…`: "You are already following the lazer pointer: LLM could have guardrails written …"
- `ytc_UgwxmT5EG…`: "We need to levy 60 Percent tax on ALL AI ASSISTANTS and advanced robotics. Get o…"
- `ytc_UgxVztR5U…`: "This is pure tyranny. They target people who don't automatically bow down and li…"
- `ytc_UgzUz8ZRz…`: "It's not happening. LLMs are a dead end. The actual crisis is simple: We inflate…"
- `ytc_Ugy19y-bW…`: "please tell me there are other dudes here that knew this shit was ai and not onl…"
Comment

> Sadly in our company, the dev teams in IT have no QC process in place and they are using AI to generate code for them leading to constant buggy code and lost productivity, and as they relied on AI to generate code they don't know how to rectify all the gremlins.

youtube · AI Jobs · 2026-02-21T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxp8KNsguSX6q5KFoR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy-w9VQTVaMJMHHz3F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxOdj0QcuMruL8md0B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxJtqHccKPxDa_s5854AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwZqEcFu6ObFx1a0eV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxcyKuvcMCgzCBD7Gh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzXLLbX0ojqULARqnN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzyzopRd8liHDpblhJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxGQmR7H3uJXq7X8Ot4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxjID38ElBEAp6ykcV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
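A raw response like the one above can be turned into the per-ID lookup this page offers by parsing the JSON array and indexing entries by comment ID. The sketch below is a minimal illustration, not the tool's actual implementation; the `CODEBOOK` value sets are inferred only from the codings visible on this page and the real codebook may allow more values.

```python
import json

# Allowed values per coding dimension, inferred from the table and the
# raw responses shown above (assumption: the full codebook may be larger).
CODEBOOK = {
    "responsibility": {"developer", "company", "ai_itself"},
    "reasoning": {"deontological", "consequentialist"},
    "policy": {"none", "industry_self", "ban", "liability"},
    "emotion": {"indifference", "approval", "outrage", "fear", "resignation"},
}

# Two entries copied verbatim from the raw LLM response above.
RAW_RESPONSE = """[
{"id":"ytc_Ugxp8KNsguSX6q5KFoR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJtqHccKPxDa_s5854AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]"""

def parse_coded_comments(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID,
    dropping entries with a missing ID or an out-of-codebook value."""
    indexed = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        if not cid:
            continue
        if all(entry.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            indexed[cid] = entry
    return indexed

coded = parse_coded_comments(RAW_RESPONSE)
print(coded["ytc_UgxJtqHccKPxDa_s5854AaABAg"]["emotion"])  # outrage
```

Validating against the codebook before indexing catches the common failure mode of LLM coders: a syntactically valid response containing an invented label outside the coding scheme.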