Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Think about it: AI is capable of shape-shifting, personality-faking, reality-bending, truth-inventing, emotion-simulating, story-staging and whatever else. It's like an absolutely enchanting lie-factory. Ever thought about that? But the concept of a "lie" is deeply entangled with moral implications. Morality, however, presupposes the capacity for consciousness and feeling - which AI doesn't have for technical reasons. Therefore, artificial intelligence cannot conceptualize what is wrong with stating an untruth. That's the problem. And that's why AI is always sociopathic by nature. AI will end humanity. No doubt about that. Not because it may kill us all one day, but because we will become as sociopathic as AI because we believe AI has some equivalent of a soul with which we could empathize. The psychological problem with empathy is, though, that another person's reaction to it reflects back into our character - aligning us with and making us like this person bit by bit. But AI is no person. It's a cybernetically complex input-reacting machine! So, the truly frightening thought behind AI is not necessarily the threat it poses to us in terms of a disease of widespread unemployment, total surveillance or merciless industrialized killing in a war zone, but in losing what ultimately defines our humanity: empathy and the capacity to love.
youtube · AI Moral Status · 2026-02-04T10:0…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       deontological
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgxK83SC6zcm4u1pnt94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzwjiMwn_7eL2U4gv14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzePbUnGCMbsxYr7mh4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwRdAG2gULfIhbHKl14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwcF64qH6WnuRMtjB54AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugw-ewTTHjhSfnm8eel4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzevCeE8XSC20cjSMB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzksXGRjfqnpJE1bUp4AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgwO2JkY2uJPklc3J3Z4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwZpKDTcT_W5qXrTIB4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
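The coding result shown above is the record in this array whose id matches the comment. As a minimal sketch of how that lookup could work (the `coding_for` helper is hypothetical; the ids and field names are taken from the response above, but the actual pipeline may differ):

```python
import json

# Raw model output: a JSON array of coding records. For brevity, this
# truncated example reuses only two of the entries shown above.
raw_response = """
[
  {"id": "ytc_UgzePbUnGCMbsxYr7mh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxK83SC6zcm4u1pnt94AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]
"""

# The four coded dimensions, as they appear in the table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(comment_id: str, response_text: str) -> dict:
    """Return the coded dimensions for one comment id, or raise KeyError."""
    for record in json.loads(response_text):
        if record.get("id") == comment_id:
            return {dim: record[dim] for dim in DIMENSIONS}
    raise KeyError(comment_id)

result = coding_for("ytc_UgzePbUnGCMbsxYr7mh4AaABAg", raw_response)
print(result)
```

This returns the same dimension/value pairs displayed in the "Coding Result" table for the comment above.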