Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “I support artists, but AI is not something we can put back into the tube. I sugg…” (ytc_UgxFlW58N…)
- “It’s such a bizarre capitalist race… to socialism. If there’s no Labour, then t…” (ytc_UgzbsUPKQ…)
- “AI should never have a psychical body. Once it develops self-preservation, it me…” (ytc_UgxgWc2fn…)
- “Does the Harvard Business School really believe that AI will replace every job?…” (ytc_Ugyc-QTjV…)
- “I think the premise of the video is flawed. A surprising amount of the industria…” (ytc_UgyLuozTl…)
- “Maybe now you could do an interview with Emily M Bender, author of ‘The AI Con: …” (ytc_UgxZrgBzr…)
- “I’d rather be ass at art(which I am) then use ai and judge people who actually s…” (ytc_UgxD6ZSVD…)
- “@ProjectBitricus your sad desperate attempt at making a comeback argument is so …” (ytr_UgxypoqU3…)
Comment
4:29 This is a one-way argument: it isn't 'morally repugnant' to put a hurt on an automaton, but why it isn't to put a hurt on something in the first place? After all, it's just a biological thing that we meat-brains evolved to be used to, even though it can be at times a completely irrational behavior. Not trying to be Greenpreachy, but on purely logical grounds, people that use this automoton argument should be mindful about this detail, if they're planning to win any debates on this thing and get their slavebotz to harvest their sugar cane
Source: youtube
Video: AI Moral Status
Posted: 2017-02-23T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UghhKmo9_JfX43gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UggEgJ7EC9Fn7HgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgiwGuogXeiLSngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UggyedImQiwnlXgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugja_3SpDlGXR3gCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UggF-8GTaBLZY3gCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj-XchBuRYUSHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgilfWcbkrPkAngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgiQOIvmeQsqZngCoAEC","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UggSPSoLSSifkngCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
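The raw response above is a JSON array with one coding object per comment. A minimal sketch of the ID lookup in Python, using two entries copied verbatim from the array (how the tool itself parses the response is an assumption; the field names come straight from the JSON):

```python
import json

# Raw LLM response: a JSON array of coding objects
# (two entries copied verbatim from the response above).
raw_response = """[
  {"id":"ytc_UghhKmo9_JfX43gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UggyedImQiwnlXgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]"""

# Index the codings by comment ID for direct lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

row = codings["ytc_UggyedImQiwnlXgCoAEC"]
print(row["responsibility"], row["reasoning"], row["policy"], row["emotion"])
# prints: unclear mixed unclear mixed
```

The printed values for that ID match the Coding Result table shown for the selected comment (responsibility: unclear, reasoning: mixed, policy: unclear, emotion: mixed).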