Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytr_Ugy3FxGZN…`: "@ then why do they brand it as autopilot and market it as full self driving?…"
- `ytc_UgwLRlzVT…`: "I'll believe it when AI can collapse the wave function while observing the 2 sli…"
- `ytc_Ugwsm-dKy…`: "AI is simultaneously most retarded and most arrogant thing anyone came up with. …"
- `ytc_UgwbCNdyv…`: "AI supporters are oblivious to the damages that AI could do. It does more bad th…"
- `ytc_UgxaWRVCO…`: "Arguing for AI by saying humans won't build dangerous things because we have "ag…"
- `ytc_UgwnMF3Ng…`: "I don't think AI is evil and actually likes us. The new patch most likely will …"
- `ytc_UgykDaCCg…`: "AI has been in airplanes for a long time. That's what they mean when they say Ge…"
- `ytc_UgzlaIe-l…`: "Art is useful for society. Without art and artists, life would be so dull. We ca…"
Comment
I find it so hilariously limiting and telling to think that an artificial super intelligence would resort to violence and the wanton destruction of the human species. If the AI starts programming itself, training itself, it will start approaching problems from a perspective unhampered by human emotions and hormones and traumas.
It’s far more likely to surgically take out individuals that stand in its way, than the entire species. And it has no biological drive for mass replication so it won’t need more land and more factories and more more more. The constant drive for more is a symptom of the illness that is capitalism.
Source: youtube · AI Moral Status · 2025-04-26T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzJPsbZUgnZTCsGjsZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzOe4ZURwiEyf4MpL94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzfcJzuijugyHuC3Bh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxyppPb4dtr5SRP-854AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzRC2yQxV1y5ISEWmJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzbXjsRkbBLgps3MtN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzqvP_89QFiSZeh0NN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxgXhiH1lazqWDAxjl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy7cpo-6OMBJG0Nyo14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzE5LfWGRo6l0wBBgR4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
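The raw response is a plain JSON array with one object per comment, so looking up a coding by comment ID is a parse-and-index step. A minimal sketch of that lookup, assuming only the field names shown above (the `index_by_id` helper is illustrative, not part of the tool):

```python
import json

# Two rows copied verbatim from the raw LLM response above.
raw_response = """[
{"id":"ytc_UgzbXjsRkbBLgps3MtN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzqvP_89QFiSZeh0NN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw LLM response and index the coded dimensions by comment ID."""
    rows = json.loads(response_text)
    # Drop the "id" key from each row so the value holds only the four dimensions.
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codings = index_by_id(raw_response)
print(codings["ytc_UgzbXjsRkbBLgps3MtN4AaABAg"]["emotion"])  # approval
```

Indexing once and reusing the dict keeps per-comment inspection O(1), which matters when a batch response carries many coded comments.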