Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I don’t really see the point to connecting to a robot that’s like a human when we’re being taught to connect more with tech than humans to begin with. Why don’t we as humans become closer and stop creating divisions with “cool” technology? Robots like this will take our jobs and make us mindless, obese, slobs forcing humanity into violence and then extinction. We can’t go to the deepest parts of the ocean, nor can we breath in space. Make a robot that can go there instead of trying to make a robot to talk to and be smarter than us. To make a robot smarter, stronger, and technically immortal this will start a revolution. Implement a robot in situations where human can’t go and connect it to a VR system that is integrated with the body so as human we will be able to do more from right here on earth. AI is the intelligence we know as humans being put in one..... we will create our worst enemy and we will not be able to control it because we gave it more than we have naturally it will seek violence the same way we will do upon it someday if this continues to try to be integrated into everyday society.
youtube AI Moral Status 2020-08-16T19:5…
Coding Result
Dimension        Value
---------------  --------------------------
Responsibility   developer
Reasoning        consequentialist
Policy           ban
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugw3-UOiXW2QcxlVdyN4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "ban",           "emotion": "outrage"},
  {"id": "ytc_UgxG8aka7_08t57UByt4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate",      "emotion": "fear"},
  {"id": "ytc_UgyA2vVFFKuB5Shf3cN4AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "none",          "emotion": "mixed"},
  {"id": "ytc_UgzTTF5UWPWckP1K_dN4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "liability",     "emotion": "outrage"},
  {"id": "ytc_UgzL2PUW4g2uijBAo_F4AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "none",          "emotion": "resignation"},
  {"id": "ytc_UgzQRBoDhqESlTHuEGd4AaABAg", "responsibility": "company",     "reasoning": "unclear",          "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgxyHHbv0IBaOM0qrzZ4AaABAg", "responsibility": "government",  "reasoning": "consequentialist", "policy": "regulate",      "emotion": "outrage"},
  {"id": "ytc_Ugw6vU5uDbMOYGGPXt94AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "ban",           "emotion": "outrage"},
  {"id": "ytc_UgzDZJUJcYlFDe0yYyF4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",          "emotion": "approval"},
  {"id": "ytc_Ugx_Q7GnBKLWaOEIqk14AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "none",          "emotion": "indifference"}
]
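The raw response is a JSON array with one coding object per comment id, so the per-comment coding shown in the table above can be recovered by parsing it and indexing by `id`. A minimal sketch in Python, assuming the raw response text is available as a string (abridged here to two of the ten objects):

```python
import json

# Raw LLM response: a JSON array of coding objects, one per comment id.
# Abridged for illustration; the real response contains ten objects.
raw = '''[
  {"id": "ytc_Ugw3-UOiXW2QcxlVdyN4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxG8aka7_08t57UByt4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Index the array by comment id for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up the coding for the comment displayed above.
coding = codings["ytc_Ugw3-UOiXW2QcxlVdyN4AaABAg"]
print(coding["responsibility"], coding["policy"])  # developer ban
```

Parsing with `json.loads` (rather than string matching) also surfaces malformed model output immediately as a `JSONDecodeError`, which is worth catching in a batch-coding pipeline.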