Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Not really true. The hospital would get sued in the first case by vicarious liability, not the radiologist. It gets sued in the latter case anyway if the AI they use misses something that could've been flagged had the hospital used some reasonable process such as a radiologist or an AI with a higher tolerance So even though I've obviously not looked into the study, I would assume that the AI is told to be lenient because the hospital still gets sued if it fucks up
reddit · Cross-Cultural · 1577922465.0 · ♥ 3
Coding Result
Dimension      | Value
Responsibility | company
Reasoning      | consequentialist
Policy         | liability
Emotion        | indifference
Coded at       | 2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_fcsgjxh", "responsibility": "none",    "reasoning": "unclear",          "policy": "unclear",   "emotion": "indifference"},
  {"id": "rdc_fcssdy9", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_fcss70u", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "rdc_fct0byw", "responsibility": "none",    "reasoning": "unclear",          "policy": "unclear",   "emotion": "indifference"},
  {"id": "rdc_fcsqe3w", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"}
]
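
The raw response is a JSON array of per-comment records. A minimal sketch of how the coded dimensions for one comment could be recovered from it, assuming the `id` fields are the record identifiers used by the viewer (the choice of `rdc_fcss70u` below is illustrative):

```python
import json

# Raw batch response, copied verbatim from the viewer above.
raw = '''[
 {"id":"rdc_fcsgjxh","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"rdc_fcssdy9","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"rdc_fcss70u","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
 {"id":"rdc_fct0byw","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"rdc_fcsqe3w","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}
]'''

# Index the batch by record id so any coded comment can be looked up directly.
records = {r["id"]: r for r in json.loads(raw)}

coded = records["rdc_fcss70u"]  # hypothetical id for the comment shown above
print(coded["responsibility"], coded["reasoning"], coded["policy"], coded["emotion"])
# → company consequentialist liability indifference
```

Note that this record's values match the coded dimensions in the table above, which is the kind of cross-check this view is meant to enable.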