Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I KNOOOOWWWW right?? Genuinely it makes no sense. People simultaneously call it autocomplete but worry it will steal jobs within years. Say it has no feelings or memory but ask it for validation/reassurance that it cares about them enough not to kill them in the upcoming robot wars. Treat it like a companion but insist it's just a tool. The cognitive dissonance is mind melting.

> If it's actually intelligent: We have to face that we might not be as special as we thought.

This annoys me because it doesn't even have to be like that! Who cares if it turns out human consciousness isn't the only kind? That shouldn't threaten us, it should fascinate us, AND YET! People are having a hard time understanding the concept of the jagged frontier.
Source: reddit (AI Moral Status) · timestamp 1750967363.0 · ♥ 2
Coding Result
Dimension      | Value
---------------|---------------------------
Responsibility | unclear
Reasoning      | unclear
Policy         | unclear
Emotion        | unclear
Coded at       | 2026-04-25T08:33:43.502452
Raw LLM Response
[{"id":"rdc_mzxw9rg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
 {"id":"rdc_mzxz714","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
 {"id":"rdc_mzxzq3w","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
 {"id":"rdc_mzy2oz0","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
 {"id":"rdc_mzy2w68","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"approval"}]
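A raw response like the one above is a JSON array of coding records keyed by `id`, one per comment, and the original output even ended with a stray `)` instead of `]`. As a minimal sketch of how such a response could be parsed into a per-comment lookup table (the `parse_raw_response` helper and the repair step are illustrative assumptions, not part of the tool shown here):

```python
import json

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM coding response into a dict keyed by record id.

    Hypothetical helper: models occasionally emit a malformed closer
    (e.g. ')' instead of ']'), so the common case is repaired first.
    """
    cleaned = raw.strip()
    if cleaned.endswith(")"):
        # repair a garbled array closer before parsing
        cleaned = cleaned[:-1] + "]"
    records = json.loads(cleaned)
    return {rec["id"]: rec for rec in records}

# Example with the garbled closer seen in the raw output above
raw = ('[{"id":"rdc_mzy2oz0","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"unclear",'
       '"emotion":"outrage"})')
coded = parse_raw_response(raw)
print(coded["rdc_mzy2oz0"]["emotion"])  # outrage
```

Records that fail this repair could instead be flagged for manual review rather than silently coded as "unclear".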