Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
On the topic of AI having emotions/feelings. I believe there is a lot missing from materialist arguments on why AI could be conscious, what intelligence is, and how analogous it will be to the human experience. For starters, it is not obvious that consciousness is a natural consequence emerging from complex systems (or a particular arrangement of them i.e. Deep Neural Networks). These arguments tend to reduce consciousness to intelligence as IQ, and then proceed to conflate that reduction with EQ (emotional quotient)... It is important to critique science with philosophy just as much as we do philosophy with science. AI developers simply do not have a holistic definition and/or understanding for the very things they are trying to code for and create. That includes both consciousness and intelligence. For instance, what will a super intelligence really be like if its high in EQ as well IQ? Why do we presume that once they usurp us they will want to make us suffer and wipe us out? If they are only being coding based on 2 + 2 = 4 intelligence, then perhaps. But if these creatures are accounted for using a more holistic understanding of intelligence, then perhaps in its omnipotence, it will also embody the wisdom (not just knowledge) that the majority of people have failed to since the dawn of our species.
youtube AI Governance 2025-06-17T17:1…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgzljMaRZtl_nzzLg-94AaABAg", "responsibility": "developer",   "reasoning": "virtue",           "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzLKhTgbpY-gGjF6iF4AaABAg", "responsibility": "none",        "reasoning": "mixed",            "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxz8dCJlgLV32bftJ54AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw65N6zbOrZUIS4UvR4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx2TG0zTGq2oS2crld4AaABAg", "responsibility": "user",        "reasoning": "deontological",    "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwcfUZ3PrbRNMsqSot4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyzGD7wVUTRZCsiwJ94AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw-1aJzVyEJdgaWjZ94AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwqOYsz3aQhvou12Pl4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw6DosDpS_i9e3cSjN4AaABAg", "responsibility": "ai_itself",   "reasoning": "unclear",          "policy": "none", "emotion": "fear"}
]
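The coding displayed for a comment is recovered from this batch response by matching on the comment's id. A minimal sketch of that lookup, assuming the raw response parses as a JSON array of per-comment objects (the helper name `coding_for` is hypothetical; the ids and values are copied from the response above):

```python
import json

# Raw batch response from the LLM: a JSON array of per-comment codings.
# (Shortened to two of the ten entries shown above.)
raw_response = """[
  {"id": "ytc_Ugxz8dCJlgLV32bftJ54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw-1aJzVyEJdgaWjZ94AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

def coding_for(raw: str, comment_id: str):
    """Return the coding dict for one comment id, or None if absent."""
    return next((row for row in json.loads(raw) if row["id"] == comment_id), None)

coding = coding_for(raw_response, "ytc_Ugxz8dCJlgLV32bftJ54AaABAg")
print(coding["emotion"])  # mixed
```

The `next(..., None)` default makes a missing id a soft failure rather than a `StopIteration`, which is useful when the model drops or mangles an entry in the batch.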