Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
First off, I am a programmer, and I know a thing or two about AI. This video is horrendously misleading. This thing is not as capable as it looks. This is nothing more than an animatronic display attached to a chat-bot. Your cell phone can mostly do the same things. Facial recognition, speech recognition, text-to-speech processing, and 'conversation engines' are all easy to implement given the massive number of free libraries for these features. The only innovative thing about this is the (awkward) attempt at facial animation. The guy in this video appears to be on the edge of mental illness with some of the things he said. For anybody scared of this thing, just remember: AI for 'robots' will never be smarter than AI in video games. (Software is software, so if it's smart enough to be 'sentient' in an android, you could do the same in a game.) As to the ridiculous notion of non-living things being given human rights... Do we have rights for fictional characters? Destroy freedom of speech and forbid any depiction of violence so that our fictional characters' rights are upheld? How is an AI script any different from a fictional character? If you take the software the robot is running and port it to a mobile phone, does that mean erasing the phone would make you a murderer? Such empty-minded conjecture about non-issues suggests to me an underlying delusion about the nature of reality. This guy appears to believe that he is creating life. Artificial Intelligence (the attempt to make something appear as though it is intelligent) is not the same as Artificial Life (the attempt to create life, such as an Alchemist and their Homunculus). Artificial Intelligence is a well-studied field grounded in both practical and theoretical sciences. Artificial Life is nothing more than a philosophical rabbit hole that leads nowhere. I, for one, don't want to have to worry about being arrested for erasing a phone.
Furthermore, if robots have human rights, why would anybody ever build them? You don't own them. You can't make them work for you; they can just go wander wherever they want. A huge cost and huge legal risk for zero reward. And what about software updates... isn't that like brainwashing? Not that that's ever been illegal... That being said, it would be interesting to see androids get to the point where they are actually able to walk about on their own. I have a feeling they would be expensive and just end up getting stolen, but it would be neat nonetheless. It is not a matter of social views, it's a matter of technology. We have neither the software nor the hardware to make that happen right now. Trying to lie to yourself about the cause won't make stuff that wasn't possible suddenly possible just because people aren't afraid of it anymore.
youtube AI Moral Status 2016-10-15T05:5… ♥ 2
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_Ugi3rr8uszE-V3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UggTHWOyf3QXQXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghneQGhCZOHdXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgjqqB1rP4FB6HgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ughx9TZDaqB_cngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjCwLCCqp0stHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ughn_M9pNk8LgngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggCBvZZSC4m63gCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgiY60zuA_SzwHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugjx14aMPhmRX3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
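The raw response above is a JSON array in which each record carries a comment id plus the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response could be parsed and indexed by comment id — the variable names are illustrative, not part of any pipeline shown here, and this assumes the model returned well-formed JSON:

```python
import json

# Excerpt of a raw LLM response: a JSON array of coded comments,
# using two records from the batch shown above.
raw = '''[
  {"id":"ytc_UggTHWOyf3QXQXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugi3rr8uszE-V3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

# Index the coded records by comment id so any comment's codes
# can be looked up directly.
codes = {record["id"]: record for record in json.loads(raw)}

row = codes["ytc_UggTHWOyf3QXQXgCoAEC"]
print(row["responsibility"], row["reasoning"], row["policy"], row["emotion"])
# prints: none unclear none indifference
```

A real pipeline would also want to handle malformed output (e.g. wrap `json.loads` in a `try`/`except json.JSONDecodeError`) since LLM responses are not guaranteed to be valid JSON.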