## Raw LLM Responses

Inspect the exact model output for any coded comment. Look up a comment by its ID, or click one of the random samples below to inspect it.
- `ytc_UgyPh1Th7…`: “We just have to make politicians the targets of deep fake videos, and we'll have…”
- `ytr_UgxpiGY9c…`: “Well, the problem here is that most people don’t truly understand what radiologi…”
- `ytc_UgxE50Hfj…`: “There needs to be a social program to support humans. It's the idea of Sam Altma…”
- `rdc_fwi43vw`: “Oh, I haven’t seen this sub. Nice. I feel like joining it would make me (more) b…”
- `ytc_UgwobtF0n…`: “Great video! I think it’s important to remember that AI-generated art is still a…”
- `ytc_UgiACXM3r…`: “Are we sure this video was not created by AI to see our reaction to the idea of …”
- `ytr_UgyB_4KKA…`: “Naw, the robot was following its pre-assigned path. It's in an area enclosed wi…”
- `ytc_UgxS5lxoV…`: “not in a million years would i take a humanoid or pretty much any kind of autono…”
### Comment

> the question is how would they earn there right, would we treat them like a child until they are 18 or do we give children rights to even the boundaries, because you can argue that you need a robot to "learn" for that time but at the same time if robots got instant rights as if they were "old' enough then that would also raise the question why does a robot get more rights than a human child

youtube · AI Moral Status · 2017-02-24T00:1…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
### Raw LLM Response

```json
[{"id":"ytc_Ugizh8nsOE91DngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Uggyl3hVgRsJR3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugg3NvlXnGLtkHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjIf01qQSO2LXgCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghC_fBL9IzRwngCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiW6UEDZs8n7XgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgjZjn-YzcpzIngCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Uggbivfnf2X5BHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggKtzN8-y1cSHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UghTBvSlrH_EcXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
```
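Each raw response is a JSON array covering a whole batch of comments, one object per comment ID, with the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`) as fields. Looking up a single comment's coding then reduces to indexing the parsed array by ID. A minimal sketch of that lookup, assuming the field names shown above (this is illustrative, not the viewer's actual implementation):

```python
import json

# A shortened example of the batch output format shown above:
# a JSON array with one object per coded comment.
raw_response = """[
  {"id": "ytc_Ugizh8nsOE91DngCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Uggyl3hVgRsJR3gCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]"""

# Index the batch by comment ID so a single coding can be retrieved.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Uggyl3hVgRsJR3gCoAEC"]
print(coding["policy"])   # → ban
print(coding["emotion"])  # → outrage
```

Indexing once and reusing the dict keeps per-comment lookup O(1), which matters when the same batch response backs both the ID search box and the random-sample view.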