Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
See [perpetual lineup](https://www.perpetuallineup.org). Highlights:

> **One in two** American adults is in a law enforcement face recognition network.

> Law enforcement face recognition networks include over 117 million American adults.

> Police face recognition will disproportionately affect African Americans. Many police departments do not realize that. In a Frequently Asked Questions document, the Seattle Police Department says that its face recognition system “does not see race.” Yet an FBI co-authored study suggests that face recognition may be less accurate on black people. Also, **due to disproportionately high arrest rates, systems that rely on mug shot databases likely include a disproportionate number of African Americans.** Despite these findings, there is no independent testing regime for racially biased error rates. In interviews, two major face recognition companies admitted that they did not run these tests internally, either.

> Face recognition may be least accurate for those it is most likely to affect: African Americans.

Edit: formatting
reddit AI Harm Incident 1562235012.0
Coding Result
| Dimension | Value |
| --- | --- |
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id": "rdc_esq4wj1", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_esrfaqq", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "outrage"},
  {"id": "rdc_esrs754", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "rdc_esqk7hk", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_esqhhj7", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
```
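The coding result shown above corresponds to a single record (`rdc_esrs754`) inside this batched JSON array. A minimal sketch, in Python, of how one coded record might be pulled out of such a response; the variable names here are illustrative, not the pipeline's actual code:

```python
import json

# Raw LLM response: a JSON array with one object per coded comment
# (trimmed to two records for brevity).
raw = '''[
  {"id": "rdc_esq4wj1", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_esrs754", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]'''

# Index the batch by record id for direct lookup.
records = {r["id"]: r for r in json.loads(raw)}

# Fetch the record for this particular comment.
coded = records["rdc_esrs754"]
print(coded["responsibility"], coded["policy"], coded["emotion"])  # government regulate fear
```

Indexing by `id` rather than by list position keeps the lookup stable even if the model returns the records in a different order than the comments were submitted.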