Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI isn't a parrot, it doesn't copy, that's not how it works. Neural networks process complex patterns to perform tasks requiring contextual understanding. Like take gpt 4 for example, it generates text by synthesizing context, not just repeating data. Or translation tools, those need to handle grammar and idioms, for which they need linguistic reasoning beyond just word substitution. Or when a user applies AI to math theorems, which require abstract reasoning. This is great example of why AI will never fully replace human jobs. This person is confidently wrong about a tech they've never bothered to research. If they run a company and buy an enterprise version of some AI to manage the company's data, they wouldn't know how to troubleshoot it because they fundamentally don't understand it. Plenty of jobs will require AI assist going forward, sure, but you'll need a human to both guide and correct the stuff AI spits out. That's not a bug, that's just how this tool is used.
reddit AI Moral Status 1746979628.0 ♥ 37
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           unclear
Emotion          approval
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_mrrnk9i", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "rdc_mrsci5v", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "rdc_mrrwpmc", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "rdc_mrrpufj", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "rdc_mrv5g24", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
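The single coding result above can be derived from these five raw codings by a per-dimension majority vote. The actual aggregation rule used by the tool is not stated here, so the sketch below is an assumption; in particular, the tie-break between "approval" and "mixed" (2 votes each) is a hypothetical first-seen rule that happens to reproduce the displayed result.

```python
import json
from collections import Counter

# The five raw LLM codings shown in the Raw LLM Response above.
raw = """[
  {"id": "rdc_mrrnk9i", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "rdc_mrsci5v", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "rdc_mrrwpmc", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "rdc_mrrpufj", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "rdc_mrv5g24", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]"""

codings = json.loads(raw)
dimensions = ["responsibility", "reasoning", "policy", "emotion"]

# Majority vote per dimension. Counter.most_common sorts stably, so on a
# tie the label encountered first wins -- an assumed tie-break rule.
result = {
    dim: Counter(c[dim] for c in codings).most_common(1)[0][0]
    for dim in dimensions
}
print(result)
# -> {'responsibility': 'none', 'reasoning': 'unclear',
#     'policy': 'unclear', 'emotion': 'approval'}
```

Under this rule the unanimous dimensions (responsibility, reasoning, policy) carry over directly, while the split emotion votes resolve to "approval", matching the Coding Result table.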