Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is just wrong because of the simple fact that the AI doesn't have a body. There is a horrible misunderstanding here about how humans feel emotions. Living beings don't "think" their emotions. They feel them. It's an internal signal. Even our gut bacteria are determining parts of our personality. It's a complex system that involves the whole body. Your post assumes that all it takes to imitate a human is to process language information. But the truth is, even language itself limits our way of communicating. That's the purpose of art, for example. Art speaks to the feelings, as it is able to communicate something deeper, which can't be communicated by language alone. The human psyche is one of the most misunderstood things that exists, as we are the most biased about it. To be able to replicate the human mind would mean understanding the whole human psyche, which we are far from.
reddit · AI Moral Status · 1676612206.0 · ♥ 19
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[{"id":"rdc_j8uxcae","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"rdc_j8vtcc6","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},{"id":"rdc_j8venqd","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},{"id":"rdc_j8voz1b","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},{"id":"rdc_j8w4dhm","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
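A minimal sketch of how a raw response like the one above could be inspected programmatically, assuming the JSON array-of-records shape shown (the `parse_codes` helper and the inlined sample string are hypothetical, not part of any coding pipeline documented here):

```python
import json

# Sample raw LLM response, shaped like the coder output shown above
# (two of the five records, reproduced for illustration).
raw = (
    '[{"id":"rdc_j8uxcae","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"rdc_j8w4dhm","responsibility":"user","reasoning":"consequentialist",'
    '"policy":"none","emotion":"approval"}]'
)

def parse_codes(raw: str) -> dict:
    """Parse a raw coder response into a {comment_id: dimensions} map."""
    records = json.loads(raw)  # raises ValueError if the response is malformed
    return {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}

codes = parse_codes(raw)
print(codes["rdc_j8w4dhm"]["emotion"])  # → approval
```

Parsing the exact output this way also surfaces malformed responses (e.g. a mismatched closing bracket) as a `ValueError` instead of silently producing "unclear" codes.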