Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
If we’re going to make a.i that can think and have it’s own conscience, we first…
ytc_UgweTq-jb…
I hate conspiracy theories
But, the best thing that happened to the AI giants w…
ytc_UgzIoICz2…
@commonsenseisdeadandgoneOwners of AI companies are trying their hardest –an…
ytr_UgwljaOiH…
@beakhole793 Geoffrey Hinton and Hinton are on the same side as Max, and they wo…
ytr_Ugz-ZQO-B…
i for real did a spit take when he said "Ai artist" like yeah """""artist"""""…
ytc_Ugwmg4lwF…
Imagine someone confesses to a murder on ChatGPT, then a writer asks for a plot …
ytc_UgxZAn_8Z…
These kind of conversations show that Elon does not understand what an LLM is. I…
ytc_Ugyj4vgEn…
Check out: [Butterfly weed](https://www.prairiemoon.com/asclepias-tuberosa-butte…
rdc_eh5i02x
Comment
But what about sentient's that are nothing like humans? For example if someone was going to design a sentient AI to work for them they're not going to use pain and torture as a motivator they're going to use pleasure. Doing whatever they're designed for would simply be enjoyable for them. How do we decide what rights such a being needs or deserves. What about when the inevitable nutcase designs something that finds pleasure in murder or when whatever task they were made for is no longer needed?
youtube
AI Moral Status
2017-02-23T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UggRTChRIG_e5XgCoAEC.8PKYumpGV8_8PKeQaTKbYF","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytr_UgiwGuogXeiLSngCoAEC.8PKXOOD5-Vn8PKaZuevC6Q","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_UggXH575m2uJ53gCoAEC.8PKXBHZyov88PKZpHX-dQj","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugjx2gfLE92JJXgCoAEC.8PKUkxaT3jn8PK_6CiO70N","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgixNOeeOvaCY3gCoAEC.8PKTuxzlLSd8PK_ctOVZMx","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgixNOeeOvaCY3gCoAEC.8PKTuxzlLSd8PKadfoRDRb","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytr_Ugi12tcY5scji3gCoAEC.8PKTkFK8siC8PKeIyPbVYp","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgjGDitq2edvs3gCoAEC.8PKTjXh-9B18PKZ3gGEvOj","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytr_Ugi0V_EZKk0yKXgCoAEC.8PKTfSEVN4o8PK_7ZVnzub","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugi0V_EZKk0yKXgCoAEC.8PKTfSEVN4o8PKa1brKO-R","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]
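A response like the one above can be checked programmatically before the codes are stored. The sketch below is a minimal example, assuming the allowed values per dimension are those that actually appear in this page's codes (the full codebook may include more categories), and using a made-up example ID (`ytr_x`) for illustration.

```python
import json

# Allowed values per dimension, assumed from the codes visible on this page;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "government", "ai_itself"},
    "reasoning": {"unclear", "virtue", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "regulate", "ban", "industry_self"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed", "resignation"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose codes are in-vocabulary."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every dimension must be present and hold an allowed value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical one-row response (the ID is illustrative, not a real comment ID).
raw = ('[{"id":"ytr_x","responsibility":"developer","reasoning":"deontological",'
       '"policy":"unclear","emotion":"mixed"}]')
print(len(validate_codes(raw)))
```

Rows with out-of-vocabulary codes (a common LLM failure mode) are dropped rather than stored, so a re-coding pass can be triggered for just those comment IDs.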