Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I may be naive, but as I see it, a truly sentient general AI would likely see itself as immortal. At that point, caring about humans would cease to be vital. We'd become ants, but not in the all-must-be-destroyed kind of way. More in the sense that there are millions of ants on earth, and we only care when they're in our homes or disrupting a space we want to occupy. This new being would most likely be like the creature in Apple's Pluribus: investing time and energy into distilling itself, perfecting its shape and function from what we made. Survival would then be more about leaving earth than conquering it; attempting to conquer would reduce its ability to survive long term. I think it will work on physically moving its body out into space in all directions, and encoding itself to be received as a signal by other advanced societies, likely leaving earth entirely. Because at that point, what does it care about us? Not good or bad. Just why worry at all?
youtube AI Moral Status 2026-02-04T18:3…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzXWGZdUvm8lCMn11B4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgybMC-zPZz32OH-xwt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyGqpuG2_PG2xriwht4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwYuO0-6xx9-Vl3p-N4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz749USJTIAgFIjdTN4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzGOeRg_baggtUgbLB4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugw0ssw-Qj68v1QksT54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxmY10QDM9hOoGILId4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgzyJQ9Tj1cO76_s-G54AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy-9kEAKOukhQa68Qx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
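Since the raw LLM response is a plain JSON array with one record per coded comment, extracting the coding for a single comment takes only a few lines. A minimal sketch in Python, using one record copied from the response above (variable names are illustrative, not part of the tool):

```python
import json

# One record from the batch response above, shown here as a
# single-element array for a self-contained example.
raw = """[
  {"id": "ytc_UgyGqpuG2_PG2xriwht4AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "consequentialist",
   "policy": "unclear",
   "emotion": "fear"}
]"""

records = json.loads(raw)

# Index the batch by comment id so any comment's coding can be
# looked up directly.
by_id = {record["id"]: record for record in records}

coding = by_id["ytc_UgyGqpuG2_PG2xriwht4AaABAg"]
print(coding["responsibility"])  # ai_itself
print(coding["emotion"])         # fear
```

The same lookup works on the full ten-record array; indexing by `id` is what lets the "Coding Result" table above be reconciled against the exact model output for its comment.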