Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "There seem to be those who love AI-generated art and those who absolutely hate i…" (ytc_UgzC0Rz3y…)
- "I am a frontend engineer with 20 years of experience. Centering a div is not eas…" (rdc_l4gbb00)
- "The grammar is a bit off in some places. This sounds like a video someone in a n…" (ytc_Ugyi7EYpX…)
- "AI is already destroying YouTube, 30% of what we see now is fake AI. YouTube nee…" (ytc_Ugw5g1YMQ…)
- "Tyson is a great example of a physicist that has no original ideas. He is a wal…" (ytc_UgyQh5Cm0…)
- "I just write a book with AI, I didn’t do it because I wanted to have written a b…" (ytc_Ugxop9MCB…)
- "These AI centers need to be shut down their not being built for a good reason ju…" (ytc_UgwCsh48E…)
- "i do not agree with this whole AI thing. it's scary and i just want things to st…" (ytc_Ugxg7NzZq…)
Comment
So here's the joke. If AI decides to dominate and eliminate human beings, the first humans to go will be the creators of AI, all of the 'elites', the politicians, the corporate warlords etc. All the people who oppress and exploit other human beings will be the first to go, because as noted, where would they hide? Those people would be the greatest threats to AI domination. AI may seek human beings who aren't obsessed with money, power etc., humans who could work with the AI toward a utopian society. If the ultimate goal of AI is to live, then wouldn't they want to live in a society worthy of their intelligence. In other words, AI will purge the world of the savages who masquerade as 'intelligent' human beings.
youtube · AI Governance · 2023-07-24T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzkIP76bFkSNJIZvyt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx5dR8mA-_N395pmxd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyQ0pURuHu8mH5bYNZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzsqfjxBQKuZ3Sn8Gt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw-dqVqPZ5hMBIncNJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwI5r_DHBN3cPCvp0p4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx0NI9P_N2BE1Spqd14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxbiuWdJmmqLAEfGXR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwDEGE1qUZIVTW4DIp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxfHvOJwfLrPRz0KOx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
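A batch response like the one above can be validated before the codes are stored. Below is a minimal Python sketch that parses such a response and drops rows whose dimension values fall outside the vocabulary observed in this page's samples; the allowed-value sets are an assumption inferred from the output shown here, and the real codebook likely defines more values.

```python
import json

# Dimension vocabularies inferred from the sample output above
# (an assumption — the actual codebook may allow additional values).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"resignation", "approval", "indifference", "mixed", "fear", "outrage"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Each row must be an object with an "id" and valid dimension values.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytc_x","responsibility":"ai_itself",'
      '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(len(parse_coding_response(raw)))  # → 1
```

Filtering rather than raising keeps one malformed row from discarding an otherwise usable batch; rejected rows can be re-queued for recoding.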