Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Give it 5 years, AI will be able to replicate the human experience through art f…
ytc_Ugyg_XYAz…
You are all delusional if you think AI has achieved conscientiousness. AI is ju…
ytc_UgwFvyLd4…
Sorry, the correct answer is Option C) To program an intelligent machine. The go…
ytr_Ugx2odTed…
I've found Humanlike Writer to be super reliable for creating content that passe…
ytr_UgwT9AOZE…
Character AI is so good…(I have Poly AI, Talkie, Chai AI and Character AI…all fo…
ytc_UgxKCULRC…
ChatGPT said: Hypothetically speaking—and based solely on what's been publicly r…
ytr_UgxaQtBry…
Lol. Id replace her with a robot in a heart beat too. Atleast AI isnt lazy, enti…
ytc_UgzhtdWWA…
static builds up and is stored in so many things here on earth. i have to keep a…
ytc_UgwjArOBv…
Comment
I buy the argument that if you built a learning, self-aware, pain-sensing robot with a fear of death and all that, it would be sentient enough to have rights. But under no circumstances would it ever make sense to make device that had the abilities and rights to defy its owner and transcend its original function. It defeats the purpose of making a machine. Who would make a stove that wouldn't turn on if it wasn't in the mood or a gun that won't shoot because it developed a new moral code algorithm?
youtube — AI Moral Status — 2017-02-23T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgiFBSdhZ6OiBHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Uggmwyliw6Ndm3gCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjagJyEa3ihdHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggaSjYh5W4t03gCoAEC","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgjYo2NEXZe5yngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugj7xHSCB362wngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggFmovwouz0T3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UggXNbZRYXgRMXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiOm4edH9tF53gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgiVIEHUHhcKzngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
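The raw response is a JSON array with one coding record per comment, keyed by comment ID. A minimal sketch of parsing such a batch response and looking up a single comment's coding (the helper name `index_codings` is illustrative, not part of the tool; the two records are copied from the sample above):

```python
import json

# Raw batch response: a JSON array of per-comment codings,
# in the format shown above (two sample records).
raw_response = """[
  {"id":"ytc_UgiFBSdhZ6OiBHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjYo2NEXZe5yngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and index coding records by comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

codings = index_codings(raw_response)
print(codings["ytc_UgiFBSdhZ6OiBHgCoAEC"]["reasoning"])  # prints: deontological
```

Indexing by ID is what makes the "look up by comment ID" inspection possible: each comment's dimensions (responsibility, reasoning, policy, emotion) can then be read directly from its record.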