Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "As someone with an understanding of computer science, I can tell you that comput…" (ytc_UgxQWDYpg…)
- "these people thinking its alive are so freaking stupid. i make ai frustrated hir…" (ytc_Ugy-_2Jcs…)
- "I know everyone hates Elon and all that emotion, but… Autopilot is basically cru…" (ytc_UgwuNsWEG…)
- "I seem to recall nokia selling their maps business to one of the german auto-gia…" (rdc_cylvfi9)
- "I was at MIT in 1959 in a small group with Claude Shannon (father of information…" (ytc_UgzKJTQKc…)
- "Even if it doesn't take control, I just feel like being rude and disrespectful i…" (ytc_UgxmgXcbQ…)
- "Grok AI was truthful, I havent checked but rest assured, they fixed that issue. …" (ytc_UgynNX96T…)
- "Maybe that's our evolution... to become AI. Humans would probably destroy the ea…" (ytc_UgwMmd6FB…)
Comment
Parts of this could become reality and we need to take care. However I believe society will adapt to it eventually and new niches that AI can't do will be found.
Although the challenge will still remain that changes (automation) keep happening faster than society can adapt to.
From purely economic perspective, how companies can turn exponentially higher profits if 100s of millions don't have jobs which means no money, no purchasing power.
Overall I believe things would balance out over the long term. However the pace of the transformation should be managed so lives don't get destroyed.
Source: youtube | AI Jobs | 2025-10-10T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxL2KcHewXDDPWQM9V4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugztz8BJIBboWCGHdAp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugym304wtlvmtkze39N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwm_5F-TNfFS6veBm94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugwg3ygtt6AHBLzRk2N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwPGvogUYjpZm0PeRl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzSput-_x6a6R7yzO14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxgHrUZdtvBo1q7tFZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyEf-pe-yiHoMJihy54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxOrQwgpmDxKkrVfbJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
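A raw response in this shape — a JSON array of per-comment codes — can be parsed and indexed by comment ID in a few lines of Python. This is a minimal sketch, not the tool's actual lookup code; the field names match the JSON shown above, and the response string is an excerpt for illustration:

```python
import json

# Excerpt of a raw LLM response: a JSON array of coded comments.
raw_response = """[
  {"id": "ytc_UgxL2KcHewXDDPWQM9V4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzSput-_x6a6R7yzO14AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's codes by its ID.
codes = codes_by_id["ytc_UgxL2KcHewXDDPWQM9V4AaABAg"]
print(codes["responsibility"], codes["emotion"])  # distributed resignation
```

The same dictionary drives the "look up by comment ID" view: one `json.loads` per stored response, then direct indexing by ID.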