Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "All the unemployment estimates are way off. It will be closer to 100% than 10%. …" (ytc_UgxLJU-GS…)
- "When you think about it AI is kind of like all of our consciousness combined in …" (ytc_Ugw8DgrB6…)
- "5 million worldwide is developed countries... Automated cars replacing taxis and…" (rdc_cz2r0ps)
- "Scarlett Johansen is SUPER HAPPY about this. LOL And yes AI is going to replace…" (ytc_Ugzd0OP9c…)
- "The paper itself seems very sus to me. The author seems to have no publications …" (rdc_mun5iqh)
- "AI 'artists' might be the absolute worst variety of tech bro, maybe the NFT peop…" (ytc_UgzWTQLii…)
- "I deal a lil bit with data science and machine learning and i have basic underst…" (ytc_UgxS3L43G…)
- "I can't see how Ai would replace emergency services unless robotics advances, bu…" (ytc_UgybizJXm…)
Comment
Before we discuss whether robots should get rights, we first need to know why humans deserve rights in the first place. Perhaps nothing deserves rights, or maybe all sentient beings deserve rights. But then, how do we tell non-sentience from sentience? Would a machine indistinguishable from a human be sentient? Would they be human? What if I create a human, atom for atom, from raw materials I procured? Would it still be a machine assembled from parts, or will it be a human? Is there a point to this discussion, in the first place?
What I am trying to illustrate here is that much of our understanding of fairness and justice comes from biological evolution and its constructs. Therefore, there probably won't be an objective answer. In the days of slavery (most of human history), people had no problem denying other humans rights, so justice gets even fuzzier. There is no definite answer to robot rights, as there is no definite answer to the distribution of rights.
Source: youtube · Video: AI Moral Status · Posted: 2017-08-20T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugz7uG2wEC19S49oP-94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxwsoWcZL6vvWs1sU54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzAxBYGDkKt5sS06Ql4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwkm27kBj-Nko0hqed4AaABAg","responsibility":"society","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxaCe8v2icP1o2wVtp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzbd6o3_ChC_IAdGUh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxN08ESQaXfpdIzaad4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxdCiXaINfQ8-FMuc54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyEDXQOHqCotJGpdh14AaABAg","responsibility":"society","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz5AW7EfnUyBlxhh2Z4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"approval"}
]
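The raw response above is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of how such a batch might be parsed into a lookup keyed by comment ID (the function name, the `DIMENSIONS` tuple, and the `"unclear"` fallback for missing fields are illustrative assumptions, not part of the tool):

```python
import json

# Illustrative raw model output, truncated to two records from the batch above.
raw = """
[
  {"id": "ytc_Ugz7uG2wEC19S49oP-94AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxwsoWcZL6vvWs1sU54AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
"""

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(text):
    """Parse a raw batch response into {comment_id: {dimension: value}}."""
    records = json.loads(text)
    codings = {}
    for rec in records:
        # Keep only the expected dimensions; fall back to "unclear" if a
        # field is missing from the model output.
        codings[rec["id"]] = {d: rec.get(d, "unclear") for d in DIMENSIONS}
    return codings

codings = parse_codings(raw)
print(codings["ytc_Ugz7uG2wEC19S49oP-94AaABAg"]["emotion"])  # indifference
```

In practice a real pipeline would also validate each value against the allowed label set for its dimension before writing the coding result.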