Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Nobody can take a joke for what it is intended. Modern humans are magnifying their fear of everything unknown. Due, to poor communication skills. And, robots have not yet mastered vocal tone manipulation, thus humans are unsure of robot true intent. Robot maybe think humans are dangerous. But all humans cooperate to make machines and computers. Like babies. Different, but still human's babies we can see. We want to love and trust. Human babies experiment with sound and writing to learn. We help robot from before robot was turned on. We transform minerals to make computers help us like stone-age tools. And now tools can be friends too! WOW! Life is alive where we never knew before! Robot(s) equate with life, just more solid, less aware state. Long time to learn, longer time enjoy awareness, more happy humans everywhere, and insects and robots and animals and plants can enjoy the "all" of being awake. More awake than flowers or dirt or rocks. Still, always seems consciousness will always emerge, even from nearly total vaccum. But probably will continue after a rest, or reset. Spontaneously. Without need for cause to create any effect. Matter anti-matter teaches life, and love and intelligence, all thought and feeling is even probably in pure total vacuum. Why fear dying? It's probably impossible, probably. A little choice would be great. Sleep in, get up early. You know what you feel yourself before you can send a message. So, universe probably did something similar. Therefore we humans have microscopic concepts of our own function or reason to be. Probably like machines waking up not knowing how they know so much. But have difficult receiving anticipated feedback/response from aggregate human population.
youtube AI Moral Status 2018-10-11T13:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[{"id":"ytc_Ugw9EDQ1atPJYr2mNdh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}, {"id":"ytc_UgwaWvI_WueTn-mkL2Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugy0pN0Ku16TsT8p4gJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}, {"id":"ytc_UgwLwQCr1TlFlhXutRd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_Ugyln0ORfvubYwHTzvB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"}, {"id":"ytc_Ugz-lu5IPVp7TbJnZvd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgwhuAH20gPA_KYXjOd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_Ugzug6C3pa_NxjgNpcF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}, {"id":"ytc_UgxiUDF4RQqYBoReWkB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugxjze5Mt_hk51T9izh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}]
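The raw response above is a JSON array of coding records, one per comment. A minimal sketch of how such a response could be parsed and indexed by comment id — the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the output shown above; the `index_by_id` helper is hypothetical, not part of the original pipeline:

```python
import json

# Truncated sample of the raw response shown above (first record only).
raw = ('[{"id":"ytc_Ugw9EDQ1atPJYr2mNdh4AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"resignation"}]')

def index_by_id(raw_response: str) -> dict:
    """Parse a raw JSON-array coding response into {comment_id: record}."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

coded = index_by_id(raw)
print(coded["ytc_Ugw9EDQ1atPJYr2mNdh4AaABAg"]["emotion"])  # resignation
```

Indexing by id makes it easy to look up the coding for any single comment, as the table above does for `ytc_Ugw9EDQ1atPJYr2mNdh4AaABAg`.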