Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Yeah, these LLMs were trained on books and social media posts. Even if psychological textbooks and studies are included in its training data, that's to give answers to questions about psychology, not to diagnose and treat psychological issues. How many teenagers post suicide fantasies to forums or write them into fanfic? How many suicides in books are portrayed as noble or romanticized? From Hamlet on down to today.

> To be, or not to be, that is the question, Whether 'tis nobler in the mind to suffer The slings and arrows of outrageous fortune, Or to take arms against a sea of troubles, And by opposing end them? To die: to sleep; No more; and by a sleep to say we end

As soon as desperate people started turning to LLMs, this was entirely predictable.
reddit · AI Governance · 1762544880.0 · ♥ 18
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          industry_self
Emotion         fear
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_nnosw5d", "responsibility": "government",  "reasoning": "consequentialist", "policy": "regulate",      "emotion": "outrage"},
  {"id": "rdc_nnjrfzl", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability",     "emotion": "outrage"},
  {"id": "rdc_nnl9bv2", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",          "emotion": "indifference"},
  {"id": "rdc_nnnfa6u", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "fear"},
  {"id": "rdc_nnl6jgt", "responsibility": "developer",   "reasoning": "virtue",           "policy": "none",          "emotion": "mixed"}
]
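To inspect the exact model output for a single coded comment, the batch response can be parsed and indexed by id. This is a minimal sketch assuming the raw response is exactly the JSON array shown above and that each comment's code is identified by its `id` field (the comment above was coded under `rdc_nnnfa6u`):

```python
import json

# Raw batch response from the model: one JSON array, each element coding
# a single comment along four dimensions.
raw_response = '''[
  {"id":"rdc_nnosw5d","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"rdc_nnjrfzl","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"rdc_nnl9bv2","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"rdc_nnnfa6u","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"rdc_nnl6jgt","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"}
]'''

# Index the batch by comment id so one comment's codes can be looked up.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

# Pull out the record that produced the coding result above.
code = codes_by_id["rdc_nnnfa6u"]
print(code["responsibility"], code["reasoning"], code["policy"], code["emotion"])
# → developer consequentialist industry_self fear
```

Indexing by id rather than by position makes the lookup robust if the model returns the array in a different order than the comments were submitted.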