Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID, or inspect one of the random samples below:

- "I agree we are in a horrifying point in time where AI is not fiction anymore @S…" (ytr_UgyRCwPB2…)
- "I dont really mind self driving cars, but I really hate the idea of remote contr…" (ytc_Ugx4mLw6P…)
- "bro I got banned from a server because apparently the work i have done on my sch…" (ytc_UgyfMptqz…)
- "Has no Boundaries. AI without limits is going to destroy humanity. this is th…" (ytc_UgxuKFDU_…)
- "Really stupid discussion. OMG - big bad AI will cause someone to contemplate sui…" (ytc_Ugz8y3i_c…)
- "i don't see future advancement or new technologies to give new tech software job…" (ytr_UgxZ2EDhg…)
- "Penrose is the big BOZZ. He knows. AI is and will be by its architecture and pri…" (ytc_UgxFE72xe…)
- "Let’s keep the masses in absolute fear. AI, impending wars, climate change, et a…" (ytc_UgxuASs_n…)
Comment
No thanks. Don’t believe it will be used for good, only evil. You can make it funny all you want. Just like they say they want to put our consciences into AI’s so we can live forever. We already are going to live forever, it just depends on who you serve, on where you will spend eternity. Everyone trying to be God. The only way we make a better world is to get back to God our Father, the creator, accept his son Jesus as our savior and have a relationship with him.
youtube · AI Moral Status · 2022-06-03T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxcgfdm3B6EzWMFOQN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyV1AlXI0tXtOzCZt54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgydUZ9O12R0qSMcByl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwCyXFWQ7G5O0LksU54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzEqVDDlOcTfBGPYpp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxoU7tqeSn2ggq87g54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwB2FhHW3fMUwv08Nx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzqUJFQqFkC0kohbyt4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz_l2So5Qjz2rMqDa54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwgKEPdVmDxQ3auBwF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
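Each raw response is a JSON array of per-comment records with four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and sanity-checked before it is stored: the allowed value sets below are inferred only from the records shown on this page (the project's actual codebook may define more categories), and `validate_batch` is a hypothetical helper, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from this sample batch;
# the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "developer", "user", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear", "regulate", "ban"},
    "emotion": {"indifference", "fear", "outrage", "mixed", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record needs a string comment ID ...
        if not isinstance(rec.get("id"), str):
            continue
        # ... and a known value for each of the four dimensions.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = (
    '[{"id":"ytc_x","responsibility":"developer","reasoning":"deontological",'
    '"policy":"ban","emotion":"outrage"},'
    '{"id":"ytc_y","responsibility":"robot","reasoning":"unclear",'
    '"policy":"none","emotion":"fear"}]'
)
print(len(validate_batch(raw)))  # 1 — second record has an unknown responsibility value
```

Dropping (rather than repairing) malformed records keeps the check simple; a production pipeline might instead flag them for manual re-coding.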