Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Robots do not, nor ever will have morals! Remember the old Outer Limits, or Twilight Zone show(can`t remember which) that had the aliens land on earth and presented man with a book called, To Serve Man? So everybody just jumped on board the alien ship to go off to a better life on the aliens planet. But in the end the book was found to not be a book about serving mankind. But a cook book! And they deceived the world into trusting the non-humans, and everybody was getting onboard the ship to take them to their own destruction! Don`t be deceived by the promises of them being made to serve mankind! AI is being made to enslave mankind! And so far, they have done a great job of deceiving mankind into trusting in a machine programmed by sinful man, who with AI will come to the logical conclusion that mankind needs replaced! Do you think its just a funny thing they thought it would be to have the robot say he was going to take over the world? No! The elite loves to tell you their plans! To put it right in front of your face! And then sit back a laugh about how easy it is to lead you on a leach like a little dog! That is how they look at you! Like a stupid dog who needs led around on a leach, because you are to stupid to lead yourself, and they only know whats best for you! I say open yourself to being led, but not by mankind, or AI, ect. But surrender yourself to the Son of God, Jesus Christ and you will be made truly free! Or put your trust in the likes of the hippy scientist! I choose to trust in God!
Source: YouTube — "AI Moral Status", 2020-04-13T19:2…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       deontological
Policy          ban
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgzhtNnFBIFSlxV4hL54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwUcZH4cXKaPgar7AB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyTGkpUXwiNzRnMefN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxWc7ELbEUztisGYUJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyL4rEogXEiDnr6hTh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyOJa8esFxEpKnE6zl4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwfGD5XleBc7YK76Qt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyQnLMvtpXE-vEaXC14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxbdDGxcDvTtIU5lxN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzgtJCMNOqX_gQhXtB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
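A raw response like the one above has to be parsed and checked before the codes are stored, since the model can return values outside the codebook. The sketch below shows one minimal way to do that in Python. The `SCHEMA` value sets are inferred from the responses shown on this page, not taken from the pipeline's actual codebook, and `validate_batch` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above. The real pipeline's codebook may differ.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose id is present
    and whose value for every dimension is in the allowed set."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid
```

Records that fail validation can then be logged and re-queued for coding rather than silently written through with out-of-schema labels.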