Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by its comment ID.
Random samples — click to inspect:

- ytc_UgxEaDTDM… — "Ai art is art in the same sense that me taking a shit on the carpet is art…"
- ytc_UgxQwHtQM… — "Thank you for this conversation! I learned so much from it ❤. Very important les…"
- ytc_UgxIOStxd… — "Ok, but like, the dude asking for genuine help with AI promts and this guy being…"
- ytr_Ugxl8YkBX… — "I wholeheartedly see absolutely no reason to care about deepfakes. Everything ab…"
- ytc_UgxeoWvO6… — "More time with your family’? That’s a fantasy. With all the greedy players out t…"
- ytc_UgyU-cv02… — "This video has a point and some probably moral...but. Start of video outright ca…"
- ytc_UgwkouBVA… — "i once asked AI how they would take over humanity and it gave me a pretty distur…"
- ytc_UgzhuzM5S… — "I'm a disabled artist! I have a couple of neurological disabilities that cause s…"
Comment
Companies won't even pay people to be good at their jobs, they'll fire them if their talent becomes inconvenient, or if their awareness of more than the compartment of the business they are a part of grows too large, or, God forbid, they do anything like calling out a societal ill being propagated by the company.
They already don't want educated, wise, capable, competent critical thinkers. You can tell, because they will unironically post job vacancies aimed at graduates with stipulations like "at least five years industry experience" for entry positions.
AI in a world run for the good of every person on it is fine. AI in a world run for the benefit of about 5000 people out of 8 Billion, the rest of whom are exploited by those few thousand, is dystopian and can never be anything else.
Source: youtube · Video: AI Moral Status · Posted: 2025-10-30T19:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz7To3N3bTqWHRXAWd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzg3My9h6MiHmdkDD54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzS6P_qp6JJzzMBB394AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzLgdhp4_xZ5n82po54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgxMJlOHwQNVVDW5kz14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwMu7jkPZ781oZvapV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugxo6c3EvZkZGen8eaN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz0MG1VkiFCZxQxg794AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx3nSuDFDjpcBaDBdF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUrlFSrmKEOxF9n-N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
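The lookup-by-ID flow above can be sketched in a few lines: the model returns a JSON array of coded records, and a comment's row in the "Coding Result" table is just the array element whose `id` matches. This is a minimal illustration, not the tool's actual code; the `lookup` helper and the two-record sample batch are hypothetical, with records copied from the response shown above.

```python
import json

# Hypothetical sample batch: two records taken verbatim from the raw
# response above. The real batch contains one object per coded comment.
RAW_RESPONSE = """
[
 {"id":"ytc_Ugxo6c3EvZkZGen8eaN4AaABAg","responsibility":"company",
  "reasoning":"virtue","policy":"liability","emotion":"outrage"},
 {"id":"ytc_Ugz0MG1VkiFCZxQxg794AaABAg","responsibility":"none",
  "reasoning":"unclear","policy":"none","emotion":"resignation"}
]
"""

def lookup(raw: str, comment_id: str):
    """Return the coded record for comment_id, or None if absent."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

coded = lookup(RAW_RESPONSE, "ytc_Ugxo6c3EvZkZGen8eaN4AaABAg")
print(coded["policy"])  # → liability
```

With the matching record in hand, the four dimension fields map directly onto the rows of the "Coding Result" table.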