Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I love how he uses Zenyatta from Overwatch in this. I can think of multiple genres of games that deal with this subject, such as the Synths from Fallout 4. I doubt this conversation will be in the near future because other than for the lonely virgin who'd want to build a sex robot, there is no need for a conscious robot. I'd say that until atleast year 2200. Until than, we'd be engineering tools that will help solve humanities greatest problems but will still need humans to operate it. I think what will separate humans from robots ultimately is the human experience. We are organic, are mortal, have a history spanning thousands of years, have evolved. Robots are meant to make the human experience better. As long as people remember this and don't ludicrously create robots to "challenge" humans than I see no problem. But for the time being we are going into realms of science fiction. If humans are able to create robots that are like humans, than we'd become gods ourselves. But will a robot ever have the SAME rights as a human? No. Because we're human. Ultimately, we look after our own. Simply, take that consciousness away.
YouTube · AI Moral Status · 2017-02-23T16:4…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgiveMjZemHGGHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UghOnqpItWsoN3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UghxIkKCF0da9ngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggQC_X6GCXb-XgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgiZTomR8t9t8XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghPnX8p8kXgNngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjkdfxV0TC693gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgimBcFcL1grSHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgjwqWnr_kYH83gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UghMs1kjBq3vf3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
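A raw response like the one above can be turned into per-comment coding rows with a small parsing step. The sketch below is a minimal example, not the pipeline's actual code: it parses the JSON array, checks each record against allowed values per dimension (the allowed sets are inferred from the responses shown here; the real codebook may differ), and indexes the results by comment id.

```python
import json

# Allowed values per dimension, inferred from the responses shown above
# (an assumption; the real codebook may permit additional values).
ALLOWED = {
    "responsibility": {"none", "developer", "government", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "approval", "mixed", "outrage", "resignation"},
}

def parse_codings(raw_json: str) -> dict:
    """Parse a raw LLM response into a dict keyed by comment id,
    dropping any record with a value outside the allowed sets."""
    valid = {}
    for rec in json.loads(raw_json):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return valid

# One record taken verbatim from the response above.
raw = ('[{"id":"ytc_UghxIkKCF0da9ngCoAEC","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')

codings = parse_codings(raw)
print(codings["ytc_UghxIkKCF0da9ngCoAEC"]["emotion"])  # → indifference
```

Dropping (rather than repairing) invalid records keeps the downstream table honest: a comment with no valid coding simply shows no row, which is easier to audit than a silently coerced value.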