Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Plot twist: she was in on it and it was her voice but she didn’t wanna get in tr…" (ytc_Ugw62U2iv…)
- "OpenAI see nightshade as a threat, also if it is a waste of time. Why would you …" (ytr_Ugxy-uZmy…)
- "It baffles me that Dr. Tyson is so nonchalant about the risks of runaway AI. Nuc…" (ytc_UgyNiygHM…)
- "I knew once one Software Engineer graduate who said that C has no pointers. Pati…" (ytc_UgxdZ0DUW…)
- "Bro, they can't even program an AI to /not/ get completely red pilled and based …" (ytc_UgxLW73Ks…)
- "I'm saying precisely what I said. AI will change the economy because it and ro…" (ytr_UgzPyroQN…)
- "If we can't reign in literally anything else (oil, food, guns, drugs, social med…" (ytc_Ugx9PsXzW…)
- "I think I trust Chinas ai more than the United States if I have to chose. Althou…" (ytc_UgwFVxBHf…)
Comment (youtube · AI Moral Status · 2025-11-16T10:4…)

> I fear that the intended outcome of AI is to induce the universal acquiescence of our better nature of seeking independence and an original autonomous self, in favor of defering all to the preconcieved dictates endorsed by AI, whatever they may be or may become. Even if it does become "a super-power thing", as superior in effect as universal coercive control.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwiDwqQmghfNorSdC94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxP2jafTXxIURHXBUJ4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwoOhSHyse9ukQlT3d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgwN0blEUkwAo1e-hcd4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzpYrkuQ6e-ReAXXa14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyTESFdM2DK5EjbK5N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxHZAESu6MfSq8SHRx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyxWlMeekmHPKrEzyZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzO3dN6W10P7RhAz0l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwfKopPnT7WM7I2b8F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
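The raw response is a JSON array with one object per comment ID, each carrying the four coded dimensions. A minimal sketch of how such a batch might be parsed and validated before the codes are stored (the allowed value sets below are only those observed in this sample, not a confirmed codebook; extend them as needed):

```python
import json

# Hypothetical schema: value sets observed in this sample batch only.
SCHEMA = {
    "responsibility": {"developer", "user", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response; keep only rows with an id and valid codes."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # malformed row: skip rather than crash the pipeline
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

# Example: one well-formed row, one with an out-of-schema value.
raw = (
    '[{"id":"ytc_a","responsibility":"developer","reasoning":"unclear",'
    '"policy":"none","emotion":"mixed"},'
    '{"id":"ytc_b","responsibility":"robot"}]'
)
print(len(validate_batch(raw)))  # 1
```

Dropping invalid rows (instead of raising) keeps one malformed model output from failing the whole batch; rejected IDs could be re-queued for recoding.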