Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I would love to see Penrose talk to Harari, because, Harari suggests that modern science is showing us that you can have very high intelligence with zero consciousness. Humans don’t have yet understood consciousness in ourselves, we have not understood consciousness in animals. Computers can run algorithms so hyper intelligent without the need to be conscious. For penrose conscious means that the algorithm can proof the concepts as true or false. If the algorithm is so accurate I don’t care if it’s aware of the fact. I don’t care if the machine has morals of true and false. Intelligence doesn’t need consciousness… the human algorithm in our brain produced the concept of splitting atoms and building atom bombs, this is maybe morally very wrong but still extremely intelligent. AI with GPT might at this point just guess or confabulate its reasoning, but at some stage this guessing can become so accurate that it doesn’t matter. All that we humans do is guess, You might be very certain that you ate Pizza yesterday but no, we had Pasta you idiot. So in a crooked way that even makes AI more human like if it’s not aware of its guesswork. 100% correct guesswork is still 100% correct. Doesn’t need to be aware of that….. but that’s just like my opinion man.
youtube AI Moral Status 2025-08-15T23:1…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        mixed
Policy           unclear
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgztncoyzcihJwE8IBl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugyg5x4CjbVOhpl5iQd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgyymolfuVpVLJnt7ZJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzARTizWlm14XrDiYl4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzzkNF9CX3o_Rg4Yop4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyFvfKG1gRpPpOMCFh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugx-4Hb7BLKu-2fURcV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugznh2EE5wFoqDfRxYN4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwgOun1Tu5KBN7EG3N4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxOpP8qqvWToRG_RNt4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "approval"}
]
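Given a raw response in this shape, the per-comment coding above can be recovered by parsing the JSON array and indexing it by comment id. A minimal sketch (the function name and the truncation to two entries are illustrative, not part of the tool):

```python
import json

# Raw LLM response in the format shown above (truncated to two entries for brevity).
raw_response = '''[
  {"id": "ytc_UgztncoyzcihJwE8IBl4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxOpP8qqvWToRG_RNt4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "unclear", "emotion": "approval"}
]'''

def codings_by_id(raw: str) -> dict:
    """Index each coded comment by its id so a single comment's
    dimension values can be looked up directly."""
    return {entry["id"]: entry for entry in json.loads(raw)}

codings = codings_by_id(raw_response)
print(codings["ytc_UgztncoyzcihJwE8IBl4AaABAg"]["emotion"])  # indifference
```

Looking up one id this way is how the Coding Result table for a single comment can be reconstructed from the batch response.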