Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I believe even people with hundreds of real CSAM images on their hard drive have gotten less than this guy creating deepfakes. I guess it raises the question of whether a deepfake can be considered rape; by definition it is already involuntary pornography. If you took regular (clothed) images of young kids and hand-drew explicit things around them, would that already fall into the same category as this guy using 3D rendering/AI software? 20 years ago I don't think people considered cheap photoshopped fake nudes a real harm. But now, with photorealistic AI fakes, it all gets much trickier: people losing jobs, friends, reputation
reddit · AI Harm Incident · 1730133423.0 · ♥ 24
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          liability
Emotion         mixed
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_lu8cz6b", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_lu77nap", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "rdc_lu6f3j5", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_lu6ws3g", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "rdc_lu7h4ha", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
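A minimal sketch of how a raw batch response like this could be matched back to one coded comment, assuming the model output parses as a JSON array of records with an `id` field; the helper name `find_record` and the inlined two-record sample are illustrative, not part of the tool:

```python
import json

# Illustrative subset of the raw batch response shown above (valid JSON assumed).
raw = (
    '[{"id":"rdc_lu8cz6b","responsibility":"user","reasoning":"deontological",'
    '"policy":"liability","emotion":"outrage"},'
    '{"id":"rdc_lu77nap","responsibility":"user","reasoning":"consequentialist",'
    '"policy":"liability","emotion":"mixed"}]'
)

def find_record(raw_response, record_id):
    """Parse the raw LLM response and return the coding record for one comment id,
    or None if the id is absent from the batch."""
    records = json.loads(raw_response)
    return next((r for r in records if r["id"] == record_id), None)

# Look up the record whose dimensions appear in the Coding Result table above.
rec = find_record(raw, "rdc_lu77nap")
print(rec["reasoning"], rec["emotion"])  # consequentialist mixed
```

In practice a wrapper like this would also catch `json.JSONDecodeError`, since raw LLM output is not guaranteed to be well-formed JSON.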