Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I can't wrap my head around that. How does it not make sense that if AI imagery was legal it'd reduce the number of real crimes? If people are out there looking for images and they go for AI which is "safer" to produce, is this not in direct competition to real abusers? (Isn't everyone complaining that AI is taking everyone's jobs??) It's very unlikely you would not be able to figure out if an image is real or not and if there's 1 real photo in 1000 then it's better than 1000 real photos... unless you wanna argue for "acceptance" of pedophilia but even then... Let people be pedos all they want if this reduces harm to children. Also AI can easily make these photos without real references.
reddit · AI Harm Incident · 1730122163.0 · ♥ 12
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        utilitarian
Policy           none
Emotion          mixed
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id":"rdc_lu68v8o","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"rdc_lu5vzkj","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"rdc_lu6d6l6","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"rdc_lu6870d","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"rdc_lu6mzxq","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
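A raw response like the one above can be turned back into per-comment codes with a small parsing step. The sketch below is a minimal example, not the tool's actual pipeline; the `parse_codes` helper and the assumption that every record carries an `id` field plus the four coding dimensions are illustrative, based only on the JSON shown here.

```python
import json

# A trimmed copy of the raw LLM response shown above: a JSON array of
# per-comment codes, where "id" identifies the comment in the batch.
raw = '''[
  {"id":"rdc_lu68v8o","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"rdc_lu6870d","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]'''

def parse_codes(raw_response: str) -> dict:
    """Map each comment id to its coded dimensions (hypothetical helper)."""
    records = json.loads(raw_response)
    return {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}

codes = parse_codes(raw)
print(codes["rdc_lu6870d"]["emotion"])  # mixed
```

Keeping the `id` as the dictionary key makes it easy to look up the coding for any single comment, which is what the inspection view above does for `rdc_lu6870d`.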