Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "I can predict later in 10 - 20 years AI completely relace jobs that human do, ex…" (ytc_UgwXnDHjI…)
- "How I think about hallucinations: if you ask an AI to generate a photograph of a…" (ytc_UgyJOTqlM…)
- "If AI gen is so good why not go sell in it in their own con…" (ytc_Ugxv6HpnE…)
- "28:20 what was she expecting? Waymo to just wait behind while she bikes? I don’t…" (ytc_UgwL6IdCl…)
- "This format is rare. Most AI jobs discussions online end up dominated by whichev…" (ytc_UgyyuWyKx…)
- "Practically all roadkill to date is from humans not autonomous vehicles so do no…" (ytc_Ugy9ewkIf…)
- "Im all set for a mini robot pet that i can raise he can run around ill give him …" (ytc_Ugwn75618…)
- "Thank you for bringing up the DoorDash SA controversy. It was an AI smear campa…" (ytc_UgxzpmK7P…)
Comment
I would reckon AI will replace some jobs, not all. It all has to do with reliability and liability. AI hallucinates too much, that needs to be fixed before we automate jobs with AI. Managers like to have legal cover, with people you can blame an employee for mistakes. If an AI is to be blamed, the manager only has itself to blame, Plus companies need to have their own in-house AI technology bc they can't allow their data to become open source or be used as training data for open source. Because AI companies are not gonna become closed source. This all takes time. 10 years before mass hysteria sets in.
Platform: youtube
Timestamp: 2026-01-12T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwJMg1KVor17YnYxhV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzXV2DoU8xnCD9NFYh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyBKNh-BJ8WurUWpAh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwyhVHYVmEJ5Jr7Q4B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzdwTFU6nQtIHbuCuZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwR_q7lREvpRWzfMxh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwVxWDrQpTP3LjKpNV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwD0ZCzQXQZ_pk-yMZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwv4dVgrKZQeNXM8P14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugxh8_n_e592HcUT8QR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
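The raw response above is a JSON array of per-comment codes, one object per comment ID. Before storing a batch, it helps to validate every record against the codebook. Below is a minimal sketch; the allowed values are inferred only from the sample output on this page (an assumption — the real pipeline's codebook may define more categories), and `validate_codes` is a hypothetical helper, not part of any library.

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# ASSUMPTION: the actual codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "government", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "approval", "fear", "outrage", "mixed", "resignation"},
}

def validate_codes(raw: str) -> list:
    """Parse a raw LLM response and check every record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset are prefixed "ytc_" (YouTube comments).
        if not str(rec.get("id", "")).startswith("ytc_"):
            raise ValueError("bad comment id: %r" % rec.get("id"))
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError("%s: invalid %s=%r" % (rec["id"], dim, rec.get(dim)))
    return records

raw = ('[{"id":"ytc_UgwJMg1KVor17YnYxhV4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
print(len(validate_codes(raw)))  # → 1
```

A check like this catches the common failure mode where the model invents an off-codebook label (e.g. `"emotion":"anger"`), so malformed batches are rejected before they reach the results table.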