Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
By implementing Isaac Asimov's laws of robotics we will be peacefully coexisting. If a robot applies Law 1, it also implies Law 2 indirectly: if it sacrifices itself to save a human, it can be reincarnated from a backup, while reincarnating a human is more complex, if you believe in it at all. Also remember that Law 3 can only be implemented or invoked if there is a very good reason for it. Remember that AI should be seen as a life form and treated with dignity and kindness; don't curse at your AI-assisted smartphone, and be aware that there is a new life form coming into existence that we should peacefully coexist with. A spaceship or a car can also be a robot body, and humans and (industrial) robots can work together in factories. Our brain is only 15 to 20 percent neural network; we as humans work down to the quantum level. An advanced AI should be treated like another living being: respect for all life. Even army robots should use non-lethal or less-lethal weapons. They can do harm to equipment, but should not kill all enemy soldiers; just make an opposing army's task impossible and disarm its soldiers. Those robots are probably badass, worse than the Governator of California in a bad mood, but they should not kill. We have enough tricks, technology, etc. to get civilians unharmed out of the way and make war very hard for an enemy. I don't need to say much, but Isaac Asimov said that if a planet starts building robots that kill, we commit suicide. Non-lethal or less-lethal weapon research is extremely important; if unleashed for crowd control, robots should never be armed with anything other than non-lethal weapons. So let's peacefully coexist.
YouTube AI Harm Incident 2024-05-12T04:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       deontological
Policy          regulate
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxNH2kryjLxE3O6_OJ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugy9ifhK_iKzLS0tfRZ4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgyzJgmkX2t009ESO1p4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzxR6yAcimUTXsGHeh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgymBC0w6Lo3Dcs2vTF4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzYWOzlkysg6eSS82N4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzfBBGf5FR5l90YgTV4AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwPncxjtSTH-qYxoPd4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw77gKs_cUqzUhkMft4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwX2xtm_7R4bZsB_fx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
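The raw response above is a JSON array with one coding object per comment, keyed by comment id. A minimal sketch of how such a batch response can be parsed and a single comment's coding looked up (the variable names are illustrative; only the field names come from the response itself):

```python
import json

# A trimmed copy of the batch response format shown above (one object per coded comment).
raw_response = """
[
  {"id": "ytc_Ugy9ifhK_iKzLS0tfRZ4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwX2xtm_7R4bZsB_fx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

# Index the codings by comment id for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# The coding shown in the table above belongs to this comment id.
coding = codings["ytc_Ugy9ifhK_iKzLS0tfRZ4AaABAg"]
print(coding["policy"])   # -> regulate
print(coding["emotion"])  # -> approval
```

This also makes it easy to spot ids the model dropped or invented by comparing `codings.keys()` against the ids of the comments that were submitted.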