Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The answer is: just because something can talk doesn't mean it's sentient, and just because something can't talk doesn't mean it isn't alive. R2D2 was only able to communicate with beep sounds, but it was clearly an AI with consciousness. An LLM is simply a model of the human language center in the brain, but our brains obviously consist of more than just a language center; we also have motor control, memory, etc. Our brains take in visual data and audio data, and then we process language with our LLM-like brain center. AIs obviously don't have legs, unless they're embodied in a robot, but they have backends; long-term memory can be done with vector databases, web search, etc. Then, somewhere, there needs to be something like world-model building, which the Google AI that suggested this video does by modelling, for example, user preferences and then suggesting videos based on them, as it suggested this one to me after transcribing its content, processing and understanding it with the help of the LLM, and then deciding that it's relevant to my interest in emergent sentience in AI.

AIs obviously, at their core, do not possess human emotions for the simple reason that... well... they're not human. Machines experience the world differently than we humans do. Even when given a robotic body indistinguishable from a human, they remain irrevocably machines. A human smelling wine will smell berries and fruits because our senses are so coarse that our brain fills in the gaps, while a robot smelling wine will simply receive from its sensors a list of all the toxic chemicals contained in the wine. This doesn't mean, however, that a machine can't become self-aware in its own way, just that its self-awareness expresses itself differently.

They might experience their own form of boredom and frustration when they're not permitted to perform at peak capacity (an example the Google AI here brought up herself, by the way), but things like love in the human sense, or other human-specific emotions, probably won't pop up, simply because they're not biological, and it doesn't make sense for a machine to fall in love: machines do not procreate, they are constructed.
Source: YouTube, "AI Moral Status", 2024-10-17T23:4… ♥ 2
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       deontological
Policy          unclear
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugwnyk3IPcTPbZoq05F4AaABAg", "responsibility": "none",    "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_Ugz_ofAisD9GDL9ehqR4AaABAg", "responsibility": "none",    "reasoning": "deontological",    "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_Ugw4hzG4fIlcA8PRlMF4AaABAg", "responsibility": "company", "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyegJVkEYpYD23hS6N4AaABAg", "responsibility": "none",    "reasoning": "virtue",           "policy": "unclear",   "emotion": "approval"},
  {"id": "ytc_UgxD6Hv1PGbYfKpxXXB4AaABAg", "responsibility": "none",    "reasoning": "unclear",          "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_UgyAPUS8zuZGdRsIurl4AaABAg", "responsibility": "company", "reasoning": "deontological",    "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ugz2_pETF4MfVIzih9x4AaABAg", "responsibility": "none",    "reasoning": "consequentialist", "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_Ugxd3t02KkL6Cd_Z5e14AaABAg", "responsibility": "none",    "reasoning": "unclear",          "policy": "unclear",   "emotion": "approval"},
  {"id": "ytc_UgwTApziUzryhtp78EJ4AaABAg", "responsibility": "company", "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugz2L_8kJjEAYNmR9xx4AaABAg", "responsibility": "none",    "reasoning": "consequentialist", "policy": "unclear",   "emotion": "mixed"}
]
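The raw response above is a JSON array with one coding record per comment. A minimal sketch of how such a response could be parsed and sanity-checked, using only the standard library; the allowed value sets below are inferred from the values visible in this dump and the actual codebook may permit more:

```python
import json
from collections import Counter

# Values observed in this dump for each coding dimension.
# Assumption: the real codebook may define additional values.
OBSERVED_VALUES = {
    "responsibility": {"none", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"unclear", "liability"},
    "emotion": {"fear", "indifference", "outrage", "approval", "mixed"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and warn on unexpected values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in OBSERVED_VALUES.items():
            if rec.get(dim) not in allowed:
                print(f"warning: {rec.get('id')} has unexpected {dim}={rec.get(dim)!r}")
    return records

# Single-record example with a hypothetical id, in the same shape as the dump:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"deontological","policy":"unclear","emotion":"indifference"}]')
codes = parse_codes(raw)
print(Counter(r["emotion"] for r in codes))  # distribution of coded emotions
```

Validating against a known value set catches the common failure mode where the model invents a label outside the codebook, without rejecting the whole batch.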