Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect a random sample:

- ytc_Ugy69eCx2… — "WHAT ABOUT PALESTINE GENOCIDE FROM ISRAEL? Is the AI an issue more important tha…"
- ytc_UgwiRk3Hu… — "Have you ever had the unfortunate circumstance to interact with an automated voi…"
- ytc_UgzUZ3Bml… — "as an art student who was just subjected to an 'ai art show' in an actual galler…"
- ytr_Ugw26MzSS… — "That doesn't even matter anymore. Large companies need advertising to promote th…"
- ytr_UgyjqPp4j… — "Thanks for your comment! It's fascinating to see how Sophia articulates her unde…"
- ytc_UgwfGyKf3… — "So the fear is that super ai could manipulate humans, gain control of robots and…"
- ytc_UgxFUhGZU… — "i dont like ai much but i think its funny to put my own ocs through an ai machin…"
- ytc_UgzOGYnNC… — "8:27 I actually adore how you made an anti ai poster based of one of the FIRST h…"
Comment
There are two entirely different concepts regarding the fear of AI. One is the fear of AI ruling over humans, as Elon Musk worries. This is completely nonsensical; without human biological needs, AI has no motivation to do anything. The other is the fear of AI being used by people to effectively enslave or harm others. This is a very realistic social issue that requires immediate and continual resistance from society as a whole.
Source: youtube · AI Governance · 2024-05-25T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxEs1GIAalTUnxG8Ep4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwMD5JqVxzHNnDJ0Xp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy8X3z6mtWJnZb4Z6N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwDYcWLqLrDaDXnTj54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzIho2tt3uGG19aT4V4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzGG7wZrP2SyiPuZix4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw49fnU-rqdZEpnFM54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyDEIBFOrxWBGWlJqd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw-tlyV9kpYo553G_94AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzuKiZKJQO9vjwqoqd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}
]
```
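A batch response like the one above can be parsed and validated before it is stored. The sketch below is a minimal example, not the project's actual pipeline: the four dimension names come from the JSON, but the sets of allowed labels are assumptions inferred only from the values visible in this view, not an exhaustive codebook.

```python
import json

# Two records copied from the raw batch response shown above.
raw = """
[
 {"id":"ytc_UgxEs1GIAalTUnxG8Ep4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgzGG7wZrP2SyiPuZix4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

# Allowed labels per dimension -- assumed from the values visible on this
# page; the real codebook may define more.
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "mixed"},
}

def index_by_id(payload: str) -> dict:
    """Parse a batch response and index records by comment ID, rejecting bad labels."""
    out = {}
    for rec in json.loads(payload):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {rec.get(dim)!r}")
        out[rec["id"]] = rec
    return out

coded = index_by_id(raw)
print(coded["ytc_UgzGG7wZrP2SyiPuZix4AaABAg"]["policy"])  # regulate
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: each coded record is retrieved in one dictionary access rather than a scan of the batch.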