Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
But what about sentient's that are nothing like humans? For example if someone was going to design a sentient AI to work for them they're not going to use pain and torture as a motivator they're going to use pleasure. Doing whatever they're designed for would simply be enjoyable for them. How do we decide what rights such a being needs or deserves. What about when the inevitable nutcase designs something that finds pleasure in murder or when whatever task they were made for is no longer needed?
youtube AI Moral Status 2017-02-23T16:1…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        deontological
Policy           unclear
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UggRTChRIG_e5XgCoAEC.8PKYumpGV8_8PKeQaTKbYF","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgiwGuogXeiLSngCoAEC.8PKXOOD5-Vn8PKaZuevC6Q","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytr_UggXH575m2uJ53gCoAEC.8PKXBHZyov88PKZpHX-dQj","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugjx2gfLE92JJXgCoAEC.8PKUkxaT3jn8PK_6CiO70N","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgixNOeeOvaCY3gCoAEC.8PKTuxzlLSd8PK_ctOVZMx","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgixNOeeOvaCY3gCoAEC.8PKTuxzlLSd8PKadfoRDRb","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytr_Ugi12tcY5scji3gCoAEC.8PKTkFK8siC8PKeIyPbVYp","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgjGDitq2edvs3gCoAEC.8PKTjXh-9B18PKZ3gGEvOj","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugi0V_EZKk0yKXgCoAEC.8PKTfSEVN4o8PK_7ZVnzub","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugi0V_EZKk0yKXgCoAEC.8PKTfSEVN4o8PKa1brKO-R","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]
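A raw batch response like the one above can be parsed into a per-comment lookup with a few lines of Python. This is a minimal sketch, not part of the original pipeline: the `parse_raw_response` helper and the `ALLOWED` value sets are assumptions, with the allowed values inferred only from the labels visible in this single response rather than from a full codebook.

```python
import json

# Assumed dimension vocabularies, inferred from the values seen in this
# one response; a real codebook would define these authoritatively.
ALLOWED = {
    "responsibility": {"none", "developer", "government", "ai_itself"},
    "reasoning": {"unclear", "virtue", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "regulate", "ban", "industry_self"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed", "resignation"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM JSON array into {comment_id: codes}.

    Raises ValueError if a record carries a value outside the
    assumed vocabulary, so malformed model output fails loudly.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        rec_id = rec.pop("id")
        for dim, value in rec.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"{rec_id}: unexpected {dim}={value!r}")
        coded[rec_id] = rec
    return coded
```

With the response shown above, `parse_raw_response(raw)["ytr_UgixNOeeOvaCY3gCoAEC.8PKTuxzlLSd8PK_ctOVZMx"]` would return the codes displayed in the Coding Result table (`developer`, `deontological`, `unclear`, `mixed`). Validating against a fixed vocabulary is the design choice worth keeping: LLM coders occasionally emit labels outside the scheme, and silently storing them corrupts downstream counts.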