Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Sadly there are a lot of tech bros defending AI art just like they defend NFTs…" (ytr_UgwahWPou…)
- "The thing is most of the best jobs will be gone. So much for that college degree…" (ytc_Ugz6pJyly…)
- "So, another fear morgering slop piece with less merit to it then most AI art.…" (ytc_UgzhVy8Y6…)
- "This is why I refuse to use AI generated art. I couldn't look at myself in the m…" (ytc_UgwY75hEr…)
- "I think AI is used the same way as nuclear weapons; it is an arms race we are in…" (ytc_UgwrS9gHe…)
- "its kinda basic though. if they can feel and think like we do.. well they're not…" (ytc_Ugh1IdnbN…)
- "how is anyone going to "own the AI"? only large corporations can train and run t…" (ytr_UgwL1qSBD…)
- "Good by all human desk job also enjoy a robot taking care of your loved ones. Bu…" (ytc_UgxXFrnMt…)
Comment
Don't forget that the AI model can only create meaningful output if there already exists input generated by humans. Once AI displaces human beings, it eventually will displace itself as a technology. The only reason Anthropic can develop AI agents for software development is because it is not illegal for Anthropic to take other humans’ intellectual properties and resell this intellectual property back to humans. Once the hype of AI settles, everyone will understand the truth about this technology. AI will become a great tool to manage vast amounts of data; however, replacing human beings will be shown to be problematic to say the least. We still need humans to generate the input that AI needs to generate its output. Remove humans from the equation, and AI will eventually fail as a technology. See it as a symbiotic eco-system. Humans alongside AI technologies.
youtube · AI Jobs · 2026-04-03T01:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwbSvAnZVPob9KFTfJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzb6zv2wCRE03_LsMx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxY8wquW0rR1qWPNgh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyFv4rPK4lHY0YbwXB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyOGng9H0MmnDpbMp54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxz8BgHxqQCjU8vUBN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwP9f7fey83CzZ3ePx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzaRzrrOkaD_fzXKxl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxQ6NrglUZK9WoKL-F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwajlmq7ZBDVUShj7d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
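The "look up by comment ID" step above can be sketched in Python: parse the raw LLM response (a JSON array of per-comment codings) and index it by `id`. The variable names and the two-row sample payload below are illustrative, not part of the tool itself.

```python
import json

# Illustrative two-row excerpt of a raw LLM response like the one shown above.
raw_response = '''
[
  {"id": "ytc_UgwbSvAnZVPob9KFTfJ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugzb6zv2wCRE03_LsMx4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

# Build an index from comment ID to its coding dict.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for a single comment by its ID.
coding = codings["ytc_UgwbSvAnZVPob9KFTfJ4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # company outrage
```

If the model's output were wrapped in prose or a markdown fence rather than bare JSON, the array would first need to be extracted before `json.loads` succeeds.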