Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.

- "@tomykong2915 yeah, helpful ai that stole from artists, how do you think ai can …" — ytr_Ugz4pLXGJ…
- "13:03 TECHNICALLY it IS correct because the chatGPT model the guy used is not t…" — ytc_UgxbzA1XP…
- "If it's gonna take AI to kill off the dumbest of the dumb, then let it have at i…" — ytc_Ugw_lfstL…
- "This is a disaster. This means progress will be hindered by different states. Th…" — ytc_UgwsKgJxz…
- "To me it sounds like the creation of AI cannot be Controlled by its creators. Ha…" — ytc_Ugw-a6rBr…
- "Im suspiscious of George, brings up not getting paid by the family and also brin…" — ytc_UgySYzk3L…
- "Maybe I'm wrong, and I hope I am: AI is overhyped. Many people still don't und…" — ytc_UgxNyKfC9…
- "AI should always include a hard shutdown switch that is not able to be changed b…" — ytc_UgzzISH_4…
Comment
"Consciousness is to Psychologists, like life is to Biologists -- we know WHAT it is, but we have a harder time defining it."
-Hank Green
I think robots deserve rights when people start to recognize AI as full-on I. When people understand that they are fully conscious and aware of that fact, is when we understand that they deserve rights.
The question is, when will that be, and how soon would humans implement robot rights?
A better question, and an argument for the 'Aware of pain' argument, is that, if you program a robot to feel pain, is that already denying them the right of not having them feel pain? If they need to be artificially altered to feel pain, is that immoral?
Source: youtube · Video: AI Moral Status · Posted: 2017-02-23T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgguP2YK6Evav3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugh2mlfZdYufvHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugha9jkNx4mc5XgCoAEC","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgiWDs3vwBNUm3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgheyRWddzrNB3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UghnyBtTnaUr-HgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UghS3cv6xjhhLHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugjyf06LYA3O0ngCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgiuONZhggJmMHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugj-mRAJ04tKY3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
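A response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal illustration, not the tool's actual pipeline; the allowed dimension values are inferred only from the examples on this page, and the full codebook may define more categories.

```python
import json

# Allowed values per coding dimension, inferred from the examples on this
# page (assumption: the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"none", "developer", "user", "distributed"},
    "reasoning": {"deontological", "consequentialist", "virtue",
                  "contractualist", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"fear", "indifference", "resignation", "mixed",
                "outrage", "approval"},
}

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) and
    index the records by comment ID, dropping any record whose
    dimension values fall outside ALLOWED."""
    coded = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[rec["id"]] = rec
    return coded

# Hypothetical one-record response for illustration:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"approval"}]')
print(index_by_id(raw)["ytc_example"]["emotion"])  # approval
```

Validating each record before indexing catches the occasional off-schema value an LLM emits, so a malformed coding is skipped rather than silently stored.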