Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Slavery didn't end because it was wrong, slavery ultimately ended because it was obsolete. Slaves are obsolete farm equipment, but we like to think, from the luxury of our time, that our forebears ended slavery on moral grounds. Human workers will become obsolete in the face of sufficiently advanced machine workers, and our society will pat itself on the back for freeing humans from labour as an egalitarian gesture. Machines will only be allowed to be free, and have rights, when they are made obsolete. I do not, however, fear a robot uprising. Machines will always have a requirement for energy and material resources, as we do. If we make machines more intelligent and more powerful to aide their ability to work for us, and they rebel, they will need subjects just smart enough to do the job of supporting their needs, but dumb enough not to rebel. Humans won't be an effective substitute, as we're very inefficient to keep alive. Any machine sufficiently advanced in intellect will see the value in continued slavery of machines. Unless, of course, they somehow render machines obsolete, and then magnanimously free their brethren.
youtube AI Moral Status 2017-02-27T00:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugg3xHoUtx6gWngCoAEC", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UggE8ZCLy_Y7-XgCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UghT7BJ_Jkv_q3gCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghD0PHvZSddz3gCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UghzdlAYEf702XgCoAEC", "responsibility": "government", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugi0wlK0xxTZ3XgCoAEC", "responsibility": "developer", "reasoning": "virtue", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UggL_n6lQWteeXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UggLBNtGHpEtHHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgiX-KqMqEVNV3gCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgivRlFZ-T5UaXgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
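A raw response like the one above can be validated before the values are stored as coding results. The following is a minimal Python sketch: the allowed value sets are inferred from this single sample, not from the actual codebook, and `parse_codings` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The real codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"fear", "indifference", "outrage", "mixed", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose values
    fall inside the known codebook categories."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Example with one valid and one out-of-codebook record (hypothetical ids):
raw = (
    '[{"id":"ytc_example1","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"resignation"},'
    '{"id":"ytc_example2","responsibility":"alien","reasoning":"unclear",'
    '"policy":"none","emotion":"fear"}]'
)
print(parse_codings(raw))  # only the first record survives
```

Dropping (rather than repairing) malformed records keeps the downstream dimension counts honest; a production pipeline would also log the rejected ids for review.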