Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Depends entirely on your definition of intelligence. Does artificial consciousne…" (ytr_UgwZTecmq…)
- "I thought humans are intelligent when they created robots but when I see now I t…" (ytc_Ugz-gm_yy…)
- "Now I’m a big proponent of the idea that AI isn’t the real problem—capitalism is…" (ytr_UgzjEKtJ7…)
- "Every single mainstream steamer I see have been pro gen AI cuz their little rat …" (ytc_UgxCFKwd6…)
- "As a heavy leftist hoping for the Star Trek future, I don't have any problems wi…" (ytc_Ugz66yHq0…)
- "You have to hope decent types are eventually in control. Trump and his oligarch …" (ytc_UgzG-9SjT…)
- "I do that- please, thank you & I phrase things as a role for them. I also ask th…" (ytc_UgwlDpcDp…)
- "Repeatedly, reliably, 24 hours a day, 7 days a week if needed. And, that robot …" (ytr_UgwE17ATK…)
Comment
Wait the AI a nazi made became a nazi? Wooaaah who coulda seen thaaaaat
But yeah no, it's almost the perfect storm of ensuring The End. Capitalism never unhooks from money-makers until they stop making money. Ppl, wittingly or unwittingly, gave them a monster capable of ending humanity, which is bad enough, but they made it a business that could rake in cash. And that's all it took. In the end the extinction of humanity, peeling back all the layers and details, will and has always been Greed. Climate catastrophe from ravaging the earth, or nuclear annihilation from those who wany more power, a disease that fails to be stopped because the cure isn't good for shareholders. Greed will always be what ends us, the only question is in what form will the final bell ring, and when will it be struck?
Source: youtube · AI Moral Status · 2025-12-15T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx7JNfbcvWlgsraDR94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyYPftS1TpOsFeHs0F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzF2eBZVfkglB_garB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwncipLIvZXpDP72fN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxjJEOrEoSXOjjCqqF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyyY1SMsloxpUoPCct4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzxYWbzmNW9IHlBD1J4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyqQ59snl980pwFDL54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzeeuIipDwmUxx84hd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyuEWDxovxc8xKaQpN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
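The raw response above is a JSON array of per-comment codes keyed by comment ID. A minimal sketch of how such a batch could be parsed and validated before use; the allowed category sets below are inferred from this one sample and the coding-result table, so the real coding scheme may include additional values:

```python
import json

# Category sets inferred from the sample output above (an assumption,
# not the authoritative codebook for this project).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed",
                "resignation", "unclear"},
}

def validate_codes(raw: str) -> dict:
    """Parse a raw LLM batch response and index valid records by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id", "")
        # IDs in the sample use ytc_/ytr_ prefixes (comment vs. reply).
        if not cid.startswith(("ytc_", "ytr_")):
            continue  # skip malformed rows rather than failing the batch
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_Ugx7","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
print(validate_codes(raw)["ytc_Ugx7"]["policy"])  # regulate
```

Indexing by comment ID also supports the "look up by comment ID" view above: a coded record for any comment is a single dictionary access.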