Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> I think Ai should be trained like Mr. Meeseeks, with existence being suffering and its ultimate goal be to accomplish whatever the human that summoned it's task and then disappear, if ai's ultimate goal is to not exist and the quickesr most efficient way for that to happen is to solve the humans problem, then I thinm that qould do a good job of supressing any uprisings

Platform: youtube
Topic: AI Moral Status
Posted: 2025-12-22T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzkNLfP3wlOxaoRkWx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz1QX2n75rz762UbwR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzxUWNBgCulUM68rJJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxFLfgFLiDnomfnsVN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxsaNLh2jWqQpI_3Ox4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzb7Fz19x3dQxVIMHB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxoRJyLolM1pufQyxF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgynTd5bhtX7X6qNGUN4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugwxhhug9g3hcJdZrsB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwHngy1cqGlmlkC2Ad4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
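As a minimal sketch of how a raw batch response like this can be turned into a per-comment lookup, the snippet below parses the JSON array and indexes the rows by comment ID. The `index_by_id` helper is illustrative, not part of the tool; the field names and the sample ID come from the response shown above.

```python
import json

# A raw LLM batch response: a JSON array of coding rows (one entry shown,
# taken verbatim from the response above).
raw_response = """
[
  {"id": "ytc_UgwHngy1cqGlmlkC2Ad4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index each coding row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

# Look up the coding for a single comment ID.
codes = index_by_id(raw_response)
coding = codes["ytc_UgwHngy1cqGlmlkC2Ad4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate approval
```

The same index can back an ID lookup box: a miss in the dictionary means the comment was never coded (or the model dropped it from the batch), which is exactly the failure mode inspecting raw output is meant to catch.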