Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- I always laugh when I see people who think they can get away with just AI tools … (ytc_UgzLlX84y…)
- Social media scoffing at these Tesla drivers should relax. The drivers that cras… (ytc_UgyjRd-Ob…)
- telling an artist to make ai art is like telling a stone sculptor to use a 3d pr… (ytc_Ugw6n6NYk…)
- The great thing about artists is they can respond to this the way they’ve respon… (ytc_UgyoqeQ36…)
- I love seeing you battle these entitled Art thieves. I really wanted to join the… (ytc_UgzJACiML…)
- If i hear someone proposing to have sufficient understanding of so called AI , … (ytc_UgwQrNRVC…)
- Ironically, you can use AI to inform you of media's authenticity and likelihood … (ytc_UgwZrj7_V…)
- I don't want A.I to take over, but I think I'd rather the future where A.I takes… (ytc_Ugw7-1JwW…)
Comment
I wish you had gone further than looking at consciousness. I think the question of free will is a more interesting one. A robot might be sentient, but without cognitive capability for free will, I think there is little basis for robot rights.
Instead of the example of a robot feeling pain, I think it'd be more interesting to examine a robot who is prohibited from exercising its free will. Furthermore, what if humans in the future forbid robots to choose for themselves because of the potentially disastrous consequence of a powerful robot feeling remorse or anxiety associated with that freedom to choose?
youtube · AI Moral Status · 2017-02-23T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UggjZy8rcjGNm3gCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Uginb6UDLQhk_ngCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgicvaZjhMX6BHgCoAEC", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugh9eNuF4VTjWHgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UggntMP2kdIoWHgCoAEC", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgjEH-SjZ2pMMHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiDLalxifsbkngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgiXwA6zw6dnqHgCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugh2D6_lDm1Rc3gCoAEC", "responsibility": "developer", "reasoning": "virtue", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UghxYPawvqCsbXgCoAEC", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
```
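The "look up by comment ID" step described above can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: it assumes the raw LLM response is a JSON array of objects with an `id` key plus one key per coding dimension (`responsibility`, `reasoning`, `policy`, `emotion`), matching the table shown above; the function name `index_by_id` and the "unclear" default are assumptions.

```python
import json

# Dimensions from the coding-result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# A two-record excerpt of a raw batch response, in the same shape as the
# JSON array shown above (truncated here for brevity).
raw_response = """[
  {"id": "ytc_UggjZy8rcjGNm3gCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgicvaZjhMX6BHgCoAEC", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codes by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Keep only the expected dimensions; a missing one falls back to
        # "unclear", mirroring the table's placeholder value (an assumption).
        coded[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

codes = index_by_id(raw_response)
print(codes["ytc_UgicvaZjhMX6BHgCoAEC"]["policy"])  # regulate
```

Indexing by ID up front makes each inspection lookup a constant-time dictionary access rather than a scan over the whole batch.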