Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples

- "you didn't the Ai does, that's the point. we can see that it looks the same but …" (ytr_Ugx-2I5wv…)
- "Everyone can be a producer, editor, or director and make their own actors now wi…" (ytc_Ugxdio4re…)
- "The scenario involves an automatic car that does not keep enough distance for th…" (ytc_UgiIRvaFL…)
- "If only people could own AI agents then they would own their income and be finan…" (ytc_UgwllG9Jc…)
- "using digital drawing could be easier than traditional art, but it still takes s…" (ytc_UgyonR5NH…)
- "Sir I want to do artificial intelligence and machine learning and deep learning …" (ytc_Ugy0lJsKS…)
- "The whole shit looks so stupid.., one day these robots will destroy humanity.., …" (ytc_UgwgmqpMv…)
- "It takes art with (usually) no consent and trains itself. Also, an ai cannot tec…" (ytc_Ugx4ibPmC…)
Comment

> So no one gets it? The argument was more towards the robot rights. Fact that humans can deny consciousness for their self interest is possible. Then it flipped with the fact what makes us so special to deserve rights. The video was a trip

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2020-01-23T14:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | contractualist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy0gjdV9fhwM-m-7zp4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzaDKLbMavJLtCFuLV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxXjpnmOtWG7JYTYnp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzM7D55jKAMLucgup14AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzazQbmVdwYL00S9SV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzYSi9YA1viwzY-opV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy4hiZR9Il2L3m5d2t4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxm9eAGlUKJCQJKt5N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyFEwB2G_UoLk2bF_R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxMG6x2mRHVwqZBgvJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
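The raw response above is a JSON array of per-comment codes along four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed by comment ID, as the lookup above does; the field names come from the records shown, but the sets of allowed values are assumptions inferred from the examples and the coding-result table, so the real codebook may differ:

```python
import json

# Allowed values per coding dimension. These are ASSUMED from the
# values visible in the examples above; the actual codebook may
# contain additional categories.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist",
                  "mixed", "unclear"},
    "policy": {"ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "indifference", "resignation", "mixed"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM response and index its records by comment ID,
    skipping any record with a missing or unexpected value."""
    coded = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[rec["id"]] = rec
    return coded

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
codes = index_codes(raw)
print(codes["ytc_example"]["policy"])  # ban
```

Validating against an explicit value set means a malformed or hallucinated code from the model is dropped rather than silently stored, which keeps downstream tallies of the four dimensions clean.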