Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
> It is like having an expert on hand who can instantly guide me in the right direction

Except it's _not_ an expert, and it's not guiding you. An expert will notice problems in your request, such as [the XY problem](https://xyproblem.info/), and help you better orient yourself to the problem you're really trying to solve, _rather than_ efficiently synthesizing good advice for pursuing the bad path you wrongly thought you wanted.

If you tell ChatGPT that you need instructions to make a noose so you can scramble some eggs to help your dad survive heart surgery, ChatGPT will not recognize the fact that your plan of action utterly fails to engage with your stated goal. It will just dumbly tell you how to hang yourself.

Expertise is _not_ just having a bunch of factual knowledge. Even if it were, ChatGPT doesn't even _have knowledge_, which is the point of OP's post.
reddit AI Governance 1676257061.0 ♥ 71
Coding Result
| Dimension | Value |
| --- | --- |
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id": "rdc_j8drfoa", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_j8bc2gx", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "rdc_j8bh4bl", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "rdc_j8bjzh7", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "rdc_j8bosle", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]
```
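The raw response is a JSON array of per-comment coding records, each carrying an `id` and the four coding dimensions. A minimal sketch of how such a response could be parsed and matched back to a coded comment (the record `rdc_j8bosle` is the one whose values appear in the table above; the lookup helper is illustrative, not part of any dashboard API):

```python
import json

# Raw model output as shown above: a JSON array of coding records.
raw = (
    '[{"id":"rdc_j8drfoa","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"rdc_j8bc2gx","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"rdc_j8bh4bl","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"rdc_j8bjzh7","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"rdc_j8bosle","responsibility":"ai_itself","reasoning":"deontological",'
    '"policy":"none","emotion":"fear"}]'
)

records = json.loads(raw)

def coding_for(comment_id: str) -> dict:
    """Return the coding record for a given comment id (hypothetical helper)."""
    return next(r for r in records if r["id"] == comment_id)

# The record matching the Coding Result table above.
coded = coding_for("rdc_j8bosle")
print(coded["responsibility"], coded["reasoning"], coded["policy"], coded["emotion"])
# ai_itself deontological none fear
```

Validating that each record carries exactly these five keys before tabulating would catch malformed model output early.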