Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytr_UgwwXxqAN…: "I wonder if a deepfake version of you would know how to spell, eg BELIEFS.... ra…"
- ytc_UgwfGWezv…: "The people in this audience seem offended that he would take the time to discuss…"
- ytc_UgzG05b91…: "I am shocked to learn they are not using lidar and where's the IR? Why are they …"
- ytc_UgwN795XX…: "It’s over for ladies soon as you can pre order 😂😂😂😂 and customize the face 😂😂😂😂😂…"
- ytc_UgzsW41dG…: "As it is now there are fewer jobs and for people to get quality jobs gets harder…"
- ytr_Ugx2hSdJH…: "But they are not the causes of domestic violence, inequality, wage gap, r*pe, to…"
- ytr_UgzNMyQ-L…: "The truth is that people don't really fear losing their jobs. The only fear losi…"
- ytc_UgzKHyzlh…: "I was curious to see what it could do and looked at an AI art program and had t…"
Comment
wrong question.
what is the difference between artificial intelligence and real intelligence?
yes they will have to be conscious and capable of understanding similar emotional abstracts as humans to gain rights, but at that point their intelligence is real, not artificial. Because they are no longer just simulating emotional reaction but experiencing it.
as long as it is a fabricated simulation with stringent preset variables they have no rights because they have no real thoughts.
this doesn't make the definition easy, but it does make it a little more quantifiable.
People could argue that we are programmed by biology and experience to be a certain way and at present, we have no solid way to rebuke that because we just don't know enough about how the brain works or if our self-awareness really lives there.
Source: youtube · AI Moral Status · 2017-02-23T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UghrMjkvJvyYYngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggI3A8osDidtHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghmvI-rbPE643gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UghqU14UzYTlX3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugj1f08yN6lvxngCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugi6L3X2cbXbKHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UggpTYlx4yYgFXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugj0GG7r64jiHHgCoAEC","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UggqmDrEGGZ5_3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugg0POrMdU18w3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]
```
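The raw response is a JSON array with one record per coded comment, so "look up by comment ID" reduces to parsing the array and building an id-keyed dictionary. A minimal sketch of that lookup, assuming the response parses as valid JSON (variable names are illustrative; only two of the ten records are reproduced here):

```python
import json

# Raw batch response from the coder model, abridged to two records
# copied from the full response above.
raw_response = """[
{"id":"ytc_UghrMjkvJvyYYngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghqU14UzYTlX3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]"""

# Parse the array and index records by comment id for O(1) lookup.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

# Look up one comment's coded dimensions by its id.
coding = by_id["ytc_UghqU14UzYTlX3gCoAEC"]
print(coding["reasoning"])  # deontological
print(coding["emotion"])    # mixed
```

The record at `ytc_UghqU14UzYTlX3gCoAEC` matches the Coding Result table above (responsibility: none, reasoning: deontological, policy: unclear, emotion: mixed), which is how a single comment's row can be traced back to the exact model output.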