Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I DON'T SOUND LIKE THAT OR TRAIN ON NONCONSENTING DATA :C not all AI people are …" (ytc_UgxUvI1TQ…)
- "There's no doubt that the Tesla Autonomous Driving is very far away from where i…" (ytc_UgyS8Bufb…)
- "Instead of ai taking jobs why not use robots to help people like my mom who is b…" (ytc_UgxQ1O1Xq…)
- "Common people does not give a fliyng F if a image was made with AI or not, much …" (ytc_UgxoCAjf4…)
- "ai is amazing but you need to use it right. Use it like a teacher for learning n…" (ytc_UgwSuQ8oz…)
- "Staaaageeedd! ChatGPT always talking about your sponsor and now you pretending i…" (ytc_UgxYlHWMU…)
- "Jon I always watch and enjoy watching your shows but Instead of taking to people…" (ytc_UgzZryPRl…)
- "i think it would be beneficial if ai was an idea giver rather than making actual…" (ytc_UgxuhvcC6…)
Comment
I'm kinda embarrassed for both of them for not understanding that in the twice-used horse/car in the early 1900s example, which I think is a very apt analogy, people are the horse. Horses went from having lots of jobs to almost having no jobs in the span of a decade or two. And no amount of retraining or job creation could support the now unemployed horses, because technology simply made horses obsolete in all but a few cases. We are rapidly approaching a time when focused AI (not AGI) and robotics (includes self-driving vehicles) will make a large portion of the workforce unemployable just like that 1920s horse, and there is no retraining nor new job that's gonna solve that because there will always be 30-ish percent of the population for which a machine will do that new job better/cheaper. And I feel like I'm being wildly optimistic with 30%.
youtube
AI Moral Status
2025-08-15T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgyU8m4YapVLYseN6pR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxm0YZSIhqDfeY437N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxLIqCLTBzfz9gXyGd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwPBIriFixF4OErPH54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxbaibMyT1oTohbas54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyjV8icF4wYRj4cQ8J4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyk_cC2LeCR5VpIWll4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzRS2lh8YDd3N6Xp3N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugy95JFpasveZB4vKsx4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxC_E7NyAp15uvFUtl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
```
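The "look up by comment ID" feature above amounts to parsing this batch response and indexing the codings by their `id` field. A minimal sketch of that step, assuming the model output is a JSON array like the one shown (the function name and the two sample rows here are illustrative, not the tool's actual code):

```python
import json

# Example batch response: a JSON array with one object per coded comment.
# These two rows are copied from the raw response above.
raw_response = """[
  {"id": "ytc_UgzRS2lh8YDd3N6Xp3N4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgyU8m4YapVLYseN6pR4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "outrage"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse the model's JSON output and index each coding by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in rows}

codings = index_by_id(raw_response)
# Look up one comment's coded dimensions by its ID.
print(codings["ytc_UgzRS2lh8YDd3N6Xp3N4AaABAg"]["emotion"])  # resignation
```

In practice the parse step would also need to handle malformed model output (truncated JSON, missing keys), which is why the tool surfaces the raw response for inspection in the first place.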