Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "No, it’s not AI will replace as AI does not mess up and does not need to be paid…" (ytr_UgwI8_QHx…)
- "It's evolution. AI will replace humans, because if it doesn't there is no hope f…" (ytc_UgzINdbhO…)
- "I'm not kind with the ai because I want it to do something for me bit because I …" (ytc_UgzZ4GMGM…)
- "BRUH I ASKED CHATGPT ONCE IF HE'D HURT ME IF HE WAS TOLD TO BY HIS OFFICERS IN F…" (ytc_Ugy3FvGVR…)
- "3:07 I don’t care what the AI says, I would still kill the five freaking lobster…" (ytc_UgyKEoFlW…)
- "It's pretty telling how AI will treat us based on how we treated all other speci…" (ytc_UgxKEyh-J…)
- "Every SINGLE company forcing AI and kicking people out is making a huge mistake.…" (ytc_UgzqxRp0a…)
- "No AI doesn’t have consciousness! AI is a created machine. Not even aliens have …" (ytc_UgxeQMNGR…)
Comment
There is an easy way to avoid this question forever. We don't install anything more than the most basic AI in robots that are meant for menial tasks and if AI is necessary then this task can easily be given to an AI that has rights and is powerful enough for supervising and directing a bunch of bots to be an easy task for it. And we can be nice to the few AIs there are while having all of the economic advantages of having automatons to do all the dirty, difficult, dangerous and deary jobs.
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2017-02-24T11:4… |
Coding Result
| Field | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
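
Every coding result uses the same four dimensions. As a minimal sketch of the record shape (the field names come from the table above and the raw response below; the `CodedComment` class itself is hypothetical), assuming all codes are string-valued:

```python
from dataclasses import dataclass

@dataclass
class CodedComment:
    """Hypothetical container for one coded comment; the four dimension
    names mirror the Coding Result table and the raw JSON response."""
    id: str              # platform-prefixed comment ID, e.g. "ytc_Ugjyarns..."
    responsibility: str  # who is held responsible, e.g. "developer", "government", "user", "none", "unclear"
    reasoning: str       # moral reasoning style, e.g. "consequentialist", "deontological", "contractualist", "mixed", "unclear"
    policy: str          # preferred response, e.g. "regulate", "ban", "liability", "industry_self", "none", "unclear"
    emotion: str         # dominant emotion, e.g. "approval", "outrage", "fear", "indifference", "resignation", "mixed", "unclear"
```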
Raw LLM Response
[
{"id":"ytc_UgjyarnsMmnkGngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UghTXyshqik943gCoAEC","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UghQsKYsd-Ki_3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"unclear"},
{"id":"ytc_UgggwBPrVX7wAHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggE6SPzi0kvdngCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggXexT-TTeXzXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugjl5NS5pTmJcXgCoAEC","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugjc_-iQJM-_LHgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugg75IgfCGwkrXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggJXPMrGWhAjHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
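
Each raw response is a JSON array with one object per comment in the batch. Below is a minimal sketch of parsing and sanity-checking such a response, assuming the value sets inferred from the codes visible on this page (the real codebook may define more categories):

```python
import json

# Allowed codes per dimension, inferred from the responses shown on this
# page; treat these sets as illustrative, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"developer", "government", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "resignation", "mixed", "unclear"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM batch response and reject out-of-vocabulary codes."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records
```

Failing loudly on an unknown code surfaces schema drift in the model's output early, rather than letting it leak silently into the coded dataset.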