Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI rights is likely going to be a human issue more than anything else. I don't believe that a clear cut Organics vs Synthetics type of conflict will happen, but rather human factions with wildly different views on AI rights will be the ones fighting over it, at least at first. Fallout 4 really does a pretty good job of summarizing the three most radical viewpoints. The Institute, the Railroad, and the Brotherhood are all human led organizations, two of which have a substantial synth numbers, and all 3 have such radically different views on self aware AI that they end up in open warfare over the issue. It probably won't get that bad in the real world, but once self aware artificial general intelligence becomes real, there will be strong factions that arise that want such AI to have rights and freedoms, those that believe them to be no different from the narrow AI we have today, that are essentially just tools, and those that feel they are abominations that should not exist. Regardless, it's likely humans will care about this subject more than the AI will, at least for a while.
youtube · AI Moral Status · 2017-02-24T02:3…
Coding Result
Dimension        Value
---------        -----
Responsibility   distributed
Reasoning        mixed
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugh4xkVi4MfetHgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggLQKwVGkmGH3gCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggAnOn8fXWe_XgCoAEC", "responsibility": "none", "reasoning": "contractualist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgglPt9FSMOxZHgCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgijavkW4w4I8HgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugg4Od1C-VYHqHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UggUKbdXKJJrMngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UggEZyTQU4SE3ngCoAEC", "responsibility": "none", "reasoning": "contractualist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugiata-MDSuPkHgCoAEC", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UghrBLrWi9JmwHgCoAEC", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
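Inspecting a raw response like the one above amounts to parsing the JSON array and looking up the entry for a given comment id. The sketch below is a minimal, hypothetical example assuming only the schema visible here (five string fields per entry); the function name `parse_codings` and the validation behavior are not part of the original tool.

```python
import json

# A small raw response in the format shown above: a JSON array of
# coding objects, one per comment, each keyed by a comment id.
# (Two entries copied from the response above, for illustration.)
raw_response = """
[
  {"id": "ytc_UghrBLrWi9JmwHgCoAEC",
   "responsibility": "distributed", "reasoning": "mixed",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugh4xkVi4MfetHgCoAEC",
   "responsibility": "none", "reasoning": "deontological",
   "policy": "none", "emotion": "indifference"}
]
"""

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw LLM response into a dict keyed by comment id.

    Raises ValueError if an entry is missing an expected field, so
    malformed model output is caught instead of silently ingested.
    """
    codings = {}
    for entry in json.loads(text):
        missing = EXPECTED_KEYS - entry.keys()
        if missing:
            raise ValueError(f"entry {entry.get('id')} missing {missing}")
        codings[entry["id"]] = {k: entry[k] for k in EXPECTED_KEYS - {"id"}}
    return codings

codings = parse_codings(raw_response)
print(codings["ytc_UghrBLrWi9JmwHgCoAEC"]["emotion"])  # resignation
```

Keying by comment id makes it easy to cross-check the "Coding Result" table against the raw output: the table's values should match the entry whose id corresponds to the displayed comment.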