Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I've been studying neural networks recently, tons and tons and tons of reading. After understanding how a neural network is constructed, maintained, and auto-improved, my view of things like AI changed quite a lot. Prior to this, I was on the middle of the fence. Now? I think it's based solely on what parts construct consciousness itself. ...And this is one side to the argument people will face. After looking at this in many perspectives, I've decided to use this as an example for what people will do: Some will apply a form of glorification to the mind and who we are as members of a society, pointing out things like social interaction and things of a generally social nature into the argument. One might say something like, "We're special because of who we are as people. We have experiences and feelings so complex and fine tuned that no bot could possibly feel that." Likewise, others may view it on a low level, looking at the very building blocks of life. They might say "Feeling, emotion, instinct, and everything biological are all just a set of chemical reactions in a controlled set of bounds defined by other chemicals - If that's the case, then why can't a robot be deemed conscious in their own electronic pathways and systems?" If I had to take a stab at guessing peoples' reactions when the day of AI comes, I think the events may be quite similar to those when people were fighting for rights around the 60s, except in this case, some mandate requires bots have rights, but the mindset of the people contradicts that, and as a result, they think bots are just a machine like your oven. It's an interesting concept.
youtube AI Moral Status 2018-04-18T03:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgzHeP5PuF__paoccSV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzZllfuTlv47zYMWPN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwohNn0l0WX4TmzveF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzC_pylmq5gOFFIunl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxmjLPpZesA_tUGjpl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyorlZZ8G8SoYY8hCB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyQbO8ZCx5Bk7mVcGh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxDqx5ZEsC8233V4bR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxQHHfA9qCXPW0wg-94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzEZ1V5gxmaROZCJoF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
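A raw response like the one above is a JSON array of per-comment codes keyed by comment id. The sketch below shows one way such a response could be parsed and validated before the codes are stored. The allowed value sets are inferred only from the codes visible in this response (the real codebook may define more), and the function name `parse_codes` is illustrative, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the codes seen in this
# response; this is an assumption, not the tool's actual codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"none"},
    "emotion": {"approval", "indifference", "mixed", "fear",
                "resignation", "outrage"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: {dimension: value}},
    dropping any record whose value is outside the allowed sets."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

Validating against a fixed value set catches the common failure mode of LLM coders inventing labels outside the codebook; rejected records can then be flagged for re-coding rather than silently stored.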