Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Ai is so unreliable for this stuff you can't prove if it's written by ai…
ytc_UgzmrhqJZ…
@righthererightnowproductio9525 Well, you are in good company because it seems l…
ytr_UgzgParke…
There is one thing that makes me deeply sad around those job-loss discussions. I…
ytc_UgxzvmhkL…
Bias in algorithms? They literally don't have the capacity for bias. How sensiti…
ytc_UghJSrz95…
@nidadursunoglu6663 problem is: I ripped this example straight from a headline. …
ytr_UgyUmVNE1…
Here's the thing: up until a few years ago my character drawing ability started …
ytc_Ugz2d1cqe…
Well, it's quite surprising how AI can turn a badly written micro controller ref…
ytc_UgyOkjgsT…
You can ask this question of yourself because you can think and wonder at your o…
rdc_j5w9nvf
Comment
Yes, I think they do deserve rights. If they can fill and think for themselves, they're not just a simple machine anymore. After all, humans are more than a simple biological machine. The only difference is one is organic, and one is a metallic body. We all deserve respect, and it should be granted to them as well. I see no reason why artificial intelligence and humans couldn't coexist in a peaceful manner. Just as long as we don't try to kill them, they won't kill us like the other way around either. We're all slowly adapting to our environment and the things around us. I don't think it should matter if you have a fleshy brain or circuit boards and chips.
youtube
AI Moral Status
2022-06-29T15:5…
♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyYreKH5rBrv1_HgBR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgycJEmtloar2BaKDOp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxoREn0piQ4hFmISbV4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyvqMci6KNNS7IwakR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwY2O-5KzvaLl4PwH14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzWIJykcQeD7wfgNAB4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxXr_N7HIcWal2U0k14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxlKfVJ6uabSwlFpOd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxkR037_XfdWWDHxK54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxq--jUn8cxxyVMtLx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
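A raw response like the one above is a JSON array of per-comment records, each carrying an `id` plus the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response might be parsed and validated is below; the allowed value sets are assumptions inferred from the sample output, not the full codebook.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The real codebook may define more values; these sets are illustrative.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "shared"},
    "reasoning": {"none", "deontological", "consequentialist", "contractualist"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"approval", "outrage", "indifference", "fear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records) into a
    mapping from comment ID to its coded dimensions, validating each value
    against the assumed codebook above."""
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec["id"]
        dims = {k: v for k, v in rec.items() if k != "id"}
        for dim, value in dims.items():
            if dim not in ALLOWED or value not in ALLOWED[dim]:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = dims
    return coded
```

Keying the result by comment ID makes it straightforward to join the coded dimensions back onto the original comment table for the lookup-by-ID view.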