Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We have laws for humans to make society better. The reason we don’t live in a perfect utopia is the flaws of human nature. But that’s a factor we have to accept, because that’s our nature. When you create AI, however, how can you say that it is its nature (good and bad), when a person can look at the AI’s script and point to the reason for the malfunction? The difference is that humans are stuck with what they are, while AI is not; there is always room for change. The assumption that robots can be held liable implies that there is no room for improvement, which is a strange thing to say about technology in general. And it would be even less acceptable if robots held humans back rather than improved our situation. So humans don’t have to prove their worth, while robots would have to. Holding them to the same standard is therefore quite difficult while the technology is not yet perfected.
reddit AI Moral Status 1524974753.0 ♥ 2
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        deontological
Policy           liability
Emotion          mixed
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_dy5c2nm", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "mixed"},
  {"id": "rdc_dy4s4e2", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "rdc_dy4gvcs", "responsibility": "none", "reasoning": "contractualist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_dy4jakb", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "rdc_dy4h89a", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]
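The raw response is a JSON array coding a batch of comments, one object per comment id. A minimal sketch of how such a response could be parsed and the coding for a single comment looked up (the variable names here are illustrative, not part of any tool shown above; this assumes the model returned valid JSON):

```python
import json

# Raw LLM response as displayed above: one JSON object per coded comment.
raw_response = """[
  {"id": "rdc_dy5c2nm", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "mixed"},
  {"id": "rdc_dy4s4e2", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "rdc_dy4gvcs", "responsibility": "none",
   "reasoning": "contractualist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_dy4jakb", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "rdc_dy4h89a", "responsibility": "user",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]"""

records = json.loads(raw_response)

# Index the batch by comment id for direct lookup.
by_id = {rec["id"]: rec for rec in records}

# Retrieve the coding for the comment shown in this section.
coding = by_id["rdc_dy5c2nm"]
print(coding["responsibility"])  # developer
print(coding["policy"])          # liability
```

In practice a parser like this would also validate each dimension against the codebook's allowed values before accepting the record, since LLM output is not guaranteed to stay within the schema.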