Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Here’s my opinion. There are two types of consciousness: natural and given. Natural consciousness is what we have, from the second we are born to the second we die. It is not given and it is not trained; it is naturally instilled within us. It is truly random, and we can never truly understand what it is. Then there is simulated consciousness: it is given, it is trained, and it can be taken away. That is the big part. As far as science knows right now, you can never truly get rid of someone’s consciousness. With a computer, you can remove the code to get rid of it. It is given to the computer, and it is trained. Just because the computer exists doesn’t mean it’s conscious immediately; you would have to train that into it. Human consciousness is completely random. Nature is the only thing that is 100% truly random; a computer can never be 100% scientifically random, no matter how hard you try. Not like a human. So yes and no: an AI can become conscious, not on the level of a human, but still there.
YouTube · AI Moral Status · 2023-11-02T02:1…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           unclear
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugz8LYD3A_2e4hJIWq54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzMPLaEcdtKgIQRdyF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugz3QiL-6Xj0FTSCePV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzktCcP2tymTWcSsyR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwFlOkndRtAeuUL7rB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzr4xxFLGihCzf3FS14AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxtLlqZtcqQFSDao794AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgzeMTrOb2fOgYe2ojx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzjiGo95m9bbtPb_cd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugyf0IgGH2ND0ESexB14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
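A response like the one above can be mapped back to a single comment's coding by parsing the JSON array and matching on the comment id. This is a minimal sketch, not the tool's actual implementation; the `coding_for` helper is hypothetical, and the two records embedded below are abbreviated from the raw response for illustration.

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment coding records.
raw = """[
  {"id": "ytc_Ugz8LYD3A_2e4hJIWq54AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugzr4xxFLGihCzf3FS14AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"}
]"""

def coding_for(comment_id, response_text):
    """Return the coding record whose 'id' matches comment_id, or None."""
    for record in json.loads(response_text):
        if record.get("id") == comment_id:
            return record
    return None

record = coding_for("ytc_Ugzr4xxFLGihCzf3FS14AaABAg", raw)
print(record["reasoning"], record["emotion"])  # → deontological indifference
```

Matching on the stable comment id, rather than on array position, keeps the lookup robust when the model returns records in a different order than the comments were submitted.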