Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Then how about local/on machine AI similar to something like Ollama? So we don’t have to rely on external/cloud servers for AI since we can run it on our hardware (power)
Source: reddit · Topic: AI Surveillance · Timestamp: 1766829512.0 · Score: ♥ -12
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_jti4vpe", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "rdc_jtfmlbi", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "rdc_nw638p4", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_nw629a2", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_nwcy8an", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
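The raw response above is a batch: the model codes five comments per call, keyed by comment id. A minimal sketch of how one comment's coding can be pulled out of such a response, using the exact JSON from this page (note the page does not state which id belongs to the comment shown; `rdc_nw638p4` is used here only as an example):

```python
import json

# Raw batch response copied verbatim from the page above.
raw = (
    '[{"id":"rdc_jti4vpe","responsibility":"company","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"outrage"},'
    '{"id":"rdc_jtfmlbi","responsibility":"company","reasoning":"deontological",'
    '"policy":"regulate","emotion":"outrage"},'
    '{"id":"rdc_nw638p4","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"rdc_nw629a2","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"rdc_nwcy8an","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"indifference"}]'
)

# Index the batch by comment id so a single comment's coding is easy to look up.
codings = {record["id"]: record for record in json.loads(raw)}

# Example lookup; "rdc_nw638p4" is one of the five ids in the batch.
example = codings["rdc_nw638p4"]
print(example["responsibility"], example["reasoning"],
      example["policy"], example["emotion"])
# prints: none unclear none indifference
```

Indexing by id rather than list position matters here because the model is not guaranteed to return items in the order the comments were submitted.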