Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
We appreciate your interest in Sophia's development. In this video, we focus on …
ytr_UgwKyVqIn…
Way to make a case for AI art. It didn’t sound obnoxious and arrogant at all.…
ytr_UgzS-uQPz…
@dalibornovak9865 bingo. Make folk feel smart by letting them jump to their own …
ytr_Ugw10prI4…
Do you know what I think is funny I got multiple AI ads on this video…
ytc_UgwvdxSwX…
Lost count of the times I feel completely embarrassed to be part of the human ra…
ytc_Ugw0YMIdN…
Learn a trade kids! Last time I checked AI can’t install central heat or AC or W…
ytc_Ugx8_onca…
Bro who the fuck allowed ai to be used for this stuff. This is dystopian shit…
ytc_UgxV4HjFx…
A.I. doesn't have to be conscious to be extremely dangerous. I don't how we wou…
ytc_UgxtYqq07…
Comment
The real question is, why would you want to give machines consciousness? We don't even fully understand it. It's like if we tried to send a man to the moon without understanding the laws of physics. I think machines should never be conscious, at least until we know everything about ourselves first. But even then, what would be the purpose of a "smart" robot? We already control these things, it's already slavery, so by giving them mind we will just make it harder to control them.
youtube
AI Moral Status
2017-02-23T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgiFBSdhZ6OiBHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Uggmwyliw6Ndm3gCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjagJyEa3ihdHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggaSjYh5W4t03gCoAEC","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgjYo2NEXZe5yngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugj7xHSCB362wngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggFmovwouz0T3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UggXNbZRYXgRMXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiOm4edH9tF53gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgiVIEHUHhcKzngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
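A response like the one above can be checked programmatically before the codings are stored. The sketch below is a minimal Python validator; the allowed value sets are inferred only from the values visible in this sample and the coding-result table, and may not be the scheme's full codebook.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# ASSUMPTION: these sets may be incomplete relative to the real codebook.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"ban", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "indifference", "approval", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and verify each coded record.

    Raises ValueError on an unexpected id prefix or dimension value.
    """
    records = json.loads(raw)
    for rec in records:
        # ids in this dataset start with ytc_ (comments) or ytr_ (replies)
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records
```

Running this over a batch rejects malformed records early, so a hallucinated label (e.g. a policy value outside the codebook) fails loudly instead of silently entering the dataset.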