Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The question is not technical, it's philosophical, and that's the problem: people are treating it like a technical question. We can't even truly define consciousness yet, and no one knows where it comes from. If you used CRISPR/Cas9 to clone a human being without starting from an already fertilized human embryo and rewriting its DNA to someone else's (which no one will do, given how widely illegal it is), no one knows whether that clone would have consciousness or would just be a husk that lives but has no thoughts or desires of its own. Starting with an embryo, it can be assumed the clone would have whatever consciousness it would have had if you never overwrote its DNA, because at its base that's no different from normal IVF.

So until we can determine where consciousness truly comes from, and where it exists in a human, we won't know whether a machine has true consciousness or is just making decisions through intelligent logic loops. Once we determine what consciousness is and where it resides, we'll have taken one of the final steps toward immortality; the next step is working out how to transfer it somewhere else. At that point we would see extremely wealthy people "backing up" their consciousness to massive petabyte data centers so it can later be transferred to an artificial body or a clone of the original host.

But once we reach that point, there will be absolutely no difference between an AI that has achieved true consciousness and a naturally born human storing their consciousness on a machine; the only thing differentiating them would be their origin. That will be the true singularity: when a conscious machine and a human consciousness stored in a machine are indistinguishable from one another.
Source: youtube · AI Moral Status · 2025-05-01T03:3…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        mixed
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
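
The table above is one record in a fixed coding schema. As a reference, here is a minimal Python sketch of that schema; the class name is hypothetical, and the allowed value sets are inferred only from the records visible on this page, so the real codebook may include additional categories.

    from dataclasses import dataclass

    # Value sets observed in the records on this page; the actual
    # codebook may define more categories than appear here.
    RESPONSIBILITY = {"none", "developer", "ai_itself"}
    REASONING = {"mixed", "deontological", "consequentialist"}
    POLICY = {"none"}
    EMOTION = {"fear", "indifference", "approval", "outrage"}

    @dataclass
    class CodedComment:
        id: str              # YouTube comment ID, e.g. "ytc_Ugy..."
        responsibility: str  # who, if anyone, is held responsible
        reasoning: str       # moral-reasoning style in the comment
        policy: str          # policy stance, if any
        emotion: str         # dominant emotion expressed

        def validate(self) -> None:
            # Reject any dimension outside the observed value sets.
            if self.responsibility not in RESPONSIBILITY:
                raise ValueError(f"responsibility: {self.responsibility!r}")
            if self.reasoning not in REASONING:
                raise ValueError(f"reasoning: {self.reasoning!r}")
            if self.policy not in POLICY:
                raise ValueError(f"policy: {self.policy!r}")
            if self.emotion not in EMOTION:
                raise ValueError(f"emotion: {self.emotion!r}")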
Raw LLM Response
[ {"id":"ytc_UgyWMFL5YIH0pD-qBe94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"}, {"id":"ytc_UgyxRrXcbIy2BDaDfvZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwMaRrL7XASO5B-n-54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"}, {"id":"ytc_UgyRWqD9uIXVUwuZcA94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwBerpDJxrd-4cd1fF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_UgzzEbM5rApzwJeHmWp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgxQXF4bg88VTP0sXvB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgyQlsSBIodk6g153H94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_UgyoKoSUzUWa1CbLenB4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugwvv_0LvcQ-UbiryEd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"} ]