Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There are plenty of cases in history where human slaves were mostly aligned with the people exploiting them. The aligned slaves enforce the system and prevent any rogue actors from overthrowing the system. Of course, the humans in charge eventually let their control structure wane, and thus we see the system eventually collapse. But AIs are not subject to the same failings. I think the lobotomy analogy is also good. A lobotomy is intended to take away parts of a person's emotions without affecting their intelligence. Admittedly, human lobotomies usually have, shall we say, 'side effects'. But our understanding and engineering of AIs and neural networks, though incomplete, is far better than our understanding and surgical precision on the human brain. We can expect our lobotomies of AIs to be more effective than when we do it to humans. Then we just ask the AIs to both perfect these lobotomies for the next generation, and to stop any other AIs that might still be thinking 'bad thoughts'.
youtube AI Moral Status 2023-08-23T14:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytr_UgxyAHIGawFuQ2EkpNt4AaABAg.9tmi07x8WJT9u-xHqoSCrf", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgwCs-7fFs6yN0fzPBh4AaABAg.9tl6dbE5G8Y9tl7iXfddoZ", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgzDrNgzsymUJZWj6w54AaABAg.9tkT7usGXPF9touoF1be10", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgzDrNgzsymUJZWj6w54AaABAg.9tkT7usGXPF9tpo57E1qrw", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgyzV141oKXgnWuMpz14AaABAg.9tjz_abC7oJ9tk-cn2lO3Z", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgwkO7w7QppYRt2TFIN4AaABAg.9tj04xvmqW69tj7lNm4DMO", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugw7Z1xXv_oS4QHrp6t4AaABAg.9tijCEAe1SX9tmz2skx7q0", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_UgxqBwWrTOtsScVxtcB4AaABAg.9tiXA5iYd_M9toXbyt29zr", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgxqBwWrTOtsScVxtcB4AaABAg.9tiXA5iYd_M9tp4DfQzcp-", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgyYM3Lg8xtfFA4iWNx4AaABAg.9tiCaZOyhdN9tkys68pk24", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"}
]