Raw LLM Responses

Inspect the exact model output behind any coded comment.

Comment
You nailed it towards the end ! Intelligence can be objectively measured from the outside in terms of abstract data processing & is already quite impressive & no doubt will exceed humans at pretty much everything. BUT, there is currently no reason whatsoever to think that experience(personal inner) ought to automatically accompany computation, however "complex". We only make that mistake(appeal to our ignorance) with consciousness bc of culturally manufactured plausibility & the once useful(against the church) but still favored flawed metaphysics afoot for centuries now. NObody would think that if you simulated a kidney down to the molecular or even atomic level that your computer would pee on your desk. Yet that is what we do regarding the C-word.

Of course it "could be" that abstract neural information flow patterns are enough, or at least that seems more valid than stupidly thinking that a wax replica should have an inner life, only bc it looks very much like something else. These things are designed to mimic FFS. What nature seems to be telling us is that, what does indeed have experience is metabolizing life/biology with protein folding, DNA transcription, mitosis, replication etc.(all that good stuff). You can't just completely ignore the substrate. Like I said. it "could be" that abstract information flow patterns are enough but 1st we need at least some valid reasons to even entertain that hypothesis, aside from fun sci-fi movies & bad philosophy.

If you know how computers function at the gate/transistor level then either you must concede that even a single switch has some experience(the also flawed imo panpsychism) or that 10 billion switches doesn't "have it" & that somehow 1 trillion will, if only arranged in "just in the right way" that we can't yet explain, by some magic of appealing to unfathomable numbers/complexity(physicalism). I think if we do ever manage to induce artificial "consciousness"(NOT merely intelligence), it will look much more like abiogenesis(life from non-life) rather than silicon chips.

& by consciousness, here I only mean phenomenal consciousness, which only means that there is something(anything at all) that it is like to be that something(usually a creature). & that doesn't at all require higher level mental functions like our self reflection/metacognition nor even intelligence. The so called hard problem is a quantity/quality quantry. That is, there is NO way to map the quantities of physics(mass, charge, spin etc.) to the qualities of experience(red, love, pain etc.) in a "non arbitrary" way, EVEN in principle.

ALL science is just(NOT derogatory) the study/modeling of perception, whether or not "enhanced" by instrumentation. & perceptions are nature's ONLY given aside from endogenous experiences but, they are BOTH experiential/mental. So sure, an infinite number of things "could be" the case but, we need legit reasons to even consider them in the 1st place.

We may never be able to distinguish AI from that which experiences, especially once they even look like us. But we can stop all of this Mark Tegmark(who I otherwise dig btw) nonsense with halting AI research FFS. Now excuse me while I seriously entertain the flying spaghetti monster, as his noodly appendages are awaiting to embrace me.
Source: youtube | Video: AI Moral Status | Timestamp: 2023-08-20T23:1…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          approval
Coded at         2026-04-26T23:09:12.988011
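For orientation, here is a minimal sketch of what one coding record might look like as a typed structure in Python. The class name `CodingResult` and the use of a dataclass are illustrative assumptions, not part of the pipeline; the value sets noted in the comments are only the labels that actually appear on this page, not necessarily the full codebook.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CodingResult:
    """One coded comment; value sets below are only those visible on this page."""
    responsibility: str  # observed: "none", "ai_itself", "user"
    reasoning: str       # observed: "unclear", "mixed", "consequentialist"
    policy: str          # observed: "none", "industry_self"
    emotion: str         # observed: "indifference", "resignation", "approval", "mixed"
    coded_at: datetime

# The result shown in the table above, as a record.
result = CodingResult(
    responsibility="none",
    reasoning="consequentialist",
    policy="none",
    emotion="approval",
    coded_at=datetime.fromisoformat("2026-04-26T23:09:12.988011"),
)
```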
Raw LLM Response
[ {"id":"ytc_UgzMIxCyeN07bjUjS9N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwJ44lgEIGxlhW5g_14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}, {"id":"ytc_UgxZohA5kjnT9d6Jmoh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}, {"id":"ytc_UgyDcXCQzY5rTtbs8Xx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxWe1HTo6jQ4P4ZjLh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugz9GCMqi25vfLRFOlZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_UgyucGBpRLK4FynzK1B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}, {"id":"ytc_UgzleC6lXjHd2vanN7F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgwqwV6_FRclhVNsuJB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}, {"id":"ytc_UgyFVcD9K4Q5JKzrl7V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"} ]