Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
My empathy prevents me from not treating things I become attached to with dignity and respect. Doesn't matter if it's Vector from Anki or my Cheetah plushie sleep buddy, they deserve to be treated fairly even though they have no conscience. The human mind can regress into an ugly state full of selfishness and malice; it will kill us in the end. If androids ever surfaced, I'd immediately treat them as they should be treated - like a created being. Our kids are created beings, too. They deserve rights, right? So do robots. Funny story, I kinda want to create an android of my favorite character, give him free will, treat him like an equal, admire his existence. And fans of him could admire him, too. He'd feel respected for simply existing because of the legacy his fictional counterpart has built within his universe and within the real-life fans. He'd be programmed not to care about what critics would think of his existence, as he stays true to himself in the lore. He could be that friend people want. UGH, the thought of creating artifical life and giving it respect honestly makes me happy. I just can't create my favorite character's nemesis because they're a destructive individual and even though it'd be true to the nature of their relationship, I'd essentially become a terrorist to society for creating EXACTLY what we don't want in AI and that's no good... But my favorite character doesn't _need_ that character to exist to be who he is, lots of people already fulfill that role and he could easily help humanity out by putting a stop to their atrocities (without harming anyone). On the plus side, he'd be great at dispatching actual violent robots, as that's his shtick. As great as it all sounds, though, realistically, it'll never happen and it wouldn't bode well in many people's minds anyway. Oh well.
Source: YouTube · AI Moral Status · 2019-10-16T08:0… · ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       virtue
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgwLoH4r-_C-bwcKT9t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgykpQhtI4VfZwDD32B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwXYIhJ5TqJGjb6P_t4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxsen9MFdk5VFr4Id54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgwaAR9RDEtd6jbJKMB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyVQSGJ4J0tBxE_Lq54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwHIP-3fwMHd82F_EZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxySr3f0Ri-c_WxddN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwWz7BZy3P4xjYKedx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugxxs1AKEfSeMn1HWMF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
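The raw response is a JSON array batching codings for ten comments; the per-comment result shown in the table is recovered by matching on the comment id. A minimal sketch of that lookup, assuming only the array-of-objects shape visible in the response itself (the `coding_for` helper is hypothetical, not part of the actual pipeline):

```python
import json

# Two entries copied from the raw LLM response above, abbreviated for the sketch.
raw_response = '''[
  {"id":"ytc_UgwHIP-3fwMHd82F_EZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwLoH4r-_C-bwcKT9t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''

def coding_for(raw: str, comment_id: str):
    """Return the coding dict for one comment id, or None if absent."""
    for item in json.loads(raw):
        if item.get("id") == comment_id:
            return item
    return None

coding = coding_for(raw_response, "ytc_UgwHIP-3fwMHd82F_EZ4AaABAg")
print(coding["reasoning"], coding["emotion"])  # → virtue approval
```

Matching on the stable comment id rather than array position keeps the lookup robust if the model reorders or drops entries in its batch response.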