Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "i think if some disasters strikes by AI, this video will be part of some edits o…" (ytc_UgwKBaUlY…)
- "A reflection of humankind's unfortunate portrait... What a sad pathetic plague w…" (ytc_Ugwj0I3QG…)
- "i have tendonitis and am potentially going color blind and also know someone who…" (ytc_UgyBkxTnY…)
- "I feel nothing for these people. You created this brave new world (probably for…" (ytc_UgxIoCnSs…)
- "Pausing a 8:01 answer your questions. What happened before people had "employmen…" (ytc_Ugz82fhaG…)
- "@andyreactsI'm not so concerned about how much money it takes (as I don't have …" (ytr_UgyaG5iKv…)
- "If robots and AI are taking over people's jobs are they going to buy houses and …" (ytc_Ugz8T-W9C…)
- "There are companies in Germany that are already contracted with other countries …" (ytc_UgwHpANOi…)
Comment
Artificial intelligence, when viewed solely through the lens of panic, becomes a mere conspiracy theory. Humans don't need money, power, atomic bombs, or artificial intelligence. What they truly need is health, shelter, food, and clothing.
Money is merely a last resort that no one truly needs; it serves only as collateral for the exchange of ready-made products. Artificial intelligence can be considered a weapon, but if it doesn't serve humanity, it becomes useless—even if it has the potential to contribute to the world.
Consider the atomic bomb: it is capable of controlling the world, but it is useless because, if used, it will kill all humans. Therefore, it is an important instrument of control, but at the same time useless, since it does not serve humanity—it only eliminates it.
If artificial intelligence reaches the point of dominating humans, it will be an important, powerful, and useless creation, as it is harmful to humans.
youtube · AI Governance · 2025-12-15T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwU-0SOkR6ksE0nM-h4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxWBJTZ8DuvAxz3IfV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw5rcPt-gEJfFEzdC94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz1fyKr1krzgW1fwgl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxnh_ruGZIiDyvoTql4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwvNwotJjR9EGppcwF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugyo81l7McnXnYwFvpB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwFpycqILSOd5e8pm14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwW-4uI15aTpqT2mBx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxJ9qQM71yNQ0VCLDZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
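A response like the one above can be checked programmatically before the codings are stored. The sketch below is a minimal validator, assuming a codebook whose allowed values are inferred only from the sample output shown here (the `CODEBOOK` sets are illustrative, not an official schema); rows carrying an unknown code in any dimension are dropped.

```python
import json

# Hypothetical codebook: allowed values per dimension are inferred from the
# sample LLM output above, not from any official coding manual.
CODEBOOK = {
    "responsibility": {"government", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"approval", "fear", "outrage", "resignation", "indifference", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose codes are all known."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every dimension must be present and hold a value from the codebook.
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid

# Example: one well-formed row passes through unchanged.
raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"}]'
print(validate_codings(raw))
```

In practice one might also cross-check the `id` fields against the comment IDs that were actually sent in the prompt, since LLM coders occasionally hallucinate or drop identifiers.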