Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I mean, can you prove you're conscious? :) People tell me they feel sad, but I have no way of verifying that, electrically. I just have to take their word for it (and, of course, observe their actions). If we don't destroy ourselves first, it's entirely possible we'll invent a neural network that is as "conscious" as we are. Sure, it's silicon rather than meat, but there's no reason why emotions and consciousness can't emerge, at least from our perspective. There are neurodivergent human beings who were born without the ability to experience empathy, and yet as they grow up, they learn to understand how to *be* empathetic and become wonderful, compassionate people. Are they, or aren't they, at that point, empathetic? If we measure empathy, emotion, and whatever else comprises consciousness *externally*, then one day we may have to accept that future neural networks are indeed these things. I think, anyway.
reddit AI Moral Status 1739924771.0 ♥ 3
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         approval
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id":"rdc_mdk2gwm","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"rdc_mdkswok","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"rdc_mdj1p32","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"rdc_mdj46gj","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"rdc_mdiyxiw","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
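The raw response is a JSON array with one coding record per comment. A minimal sketch of parsing it and pulling out the record that corresponds to the coding-result table above (the table does not list a record id, so matching on the "approval" emotion value is an assumption):

```python
import json

# Raw LLM response, copied verbatim from the inspection page above.
raw = """[
  {"id":"rdc_mdk2gwm","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"rdc_mdkswok","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"rdc_mdj1p32","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"rdc_mdj46gj","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"rdc_mdiyxiw","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]"""

records = json.loads(raw)

# Select the record whose coded values match the table above
# (assumption: the batch contains exactly one "approval" record).
coded = next(r for r in records if r["emotion"] == "approval")
print(coded["id"])  # rdc_mdj1p32
```

This mirrors how the per-comment table could be populated from a batched response: decode once, then index the records by id or by the coded dimension of interest.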