Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
They generate the most probably response based on their training data. They do not have thoughts. They do not have awareness. They do not have sentience.

>All that was needed was neural networks to be similar to the human brain

Resemblance != equivalence. Neural networks MIMIC *aspects* of neurons, but they don't have (and that doesn't give them by just association btw) the underlying mechanisms of consciousness - which we don't even know much about as is. We don't even have a concrete definition for 'AGI' yet.
reddit · AI Moral Status · 1739920305.0 · ♥ 5
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         indifference
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_mdiqoe3", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_mdis7v9", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "rdc_mdjkew7", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "rdc_mdinqug", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_mdirvo7", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]
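The raw response above is a JSON array of per-comment coding records, each keyed by a comment `id`. A minimal sketch of how such a response could be parsed to look up the codes for one comment (the `code_for` helper name is hypothetical, not part of the original pipeline; the abbreviated `raw` string below reuses two records from the response above for illustration):

```python
import json

# Two records copied from the raw LLM response above, for illustration.
raw = (
    '[{"id":"rdc_mdiqoe3","responsibility":"none","reasoning":"mixed",'
    '"policy":"none","emotion":"outrage"},'
    '{"id":"rdc_mdinqug","responsibility":"none","reasoning":"mixed",'
    '"policy":"none","emotion":"indifference"}]'
)

def code_for(comment_id, raw_json):
    """Return the coding record for one comment id, or None if absent."""
    records = json.loads(raw_json)
    return next((r for r in records if r["id"] == comment_id), None)

record = code_for("rdc_mdinqug", raw)
print(record["emotion"])  # → indifference
```

Looking up `rdc_mdinqug` recovers exactly the dimension/value pairs shown in the coding-result table above (responsibility: none, reasoning: mixed, policy: none, emotion: indifference).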