Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You're opening a giant can of worms here.
1. The problem of humans losing their rights when they become mentally incompetent, or if they're born mentally disabled. These people still have rights, but the law allows other people to act on their behalf in some cases. Throwing this in will really make a mess.
2. The definition of "intelligence". Computer scientists have their Turing Test, lawyers have various legal tests of competence, psychologists have their own standards. Each has merit. But getting all of these parties to agree PLUS the Supreme court and lawmakers would be like herding cats.
3. If an AI could be created with human abilities, say, an "intelligent" android, it would likely be expensive, therefore it would likely be created by large corporations. They could then use it to do their bidding, vicariously enjoying some of the rights that this amendment was intended to take from them. They could give it money and have it do all their lobbying and corporate speech for them.
None of this is to say that we won't eventually encounter this problem one day, and that constitutional remedies won't be necessary.
Source: reddit · AI Governance · 1291842581.0 · ♥ 3
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        mixed
Policy           unclear
Emotion          fear
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_f1w4svu", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_f1wyehh", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_c18cgb0", "responsibility": "government", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "rdc_c18b7u7", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "rdc_c18c0lk", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"}
]
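The raw response is a JSON array with one coding record per comment, keyed by comment id. A minimal sketch of how such a batch response might be parsed back into per-comment coding rows (the variable names here are illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codes, as shown above.
raw = """[
  {"id": "rdc_f1w4svu", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_f1wyehh", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_c18cgb0", "responsibility": "government", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "rdc_c18b7u7", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "rdc_c18c0lk", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"}
]"""

# Index the batch by comment id so each coding result can be looked up directly.
codes = {rec["id"]: rec for rec in json.loads(raw)}

# The comment above corresponds to the last record in the batch.
record = codes["rdc_c18c0lk"]
print(record["responsibility"], record["reasoning"], record["policy"], record["emotion"])
# → unclear mixed unclear fear
```

Indexing by id rather than by position makes the lookup robust if the model returns the records in a different order than the comments were submitted.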