Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There is one big red light – he is a materialist, description of consciousness is outdated, the brain does not produce consciousness. Consciousness is fundamental, everything else comes from it. Maybe AI will become conscious in a million years, that’s how long it takes to evolve. AI has no emotions, emotions are pure chemistry, AI can only interpret emotions from the data it processes. AI can know everything about honey, but it cannot taste honey and have direct experience. AI knows everything because we have an experience and we interpret that experience and create a data from it, and give AI access to learn from this data. Don’t waste your time sitting in fear and trying to figure out what might happen. Invest your time in knowing yourself. Intellect is a very small part of our intelligence, we just made it big because of Rene Descartes and other philosophical concepts like “I think, therefore I am” that made us believe in materialism. If I don't have thoughts, does that mean I don't exist? The sense of "I am" does not require thinking.
youtube AI Governance 2025-06-16T18:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgyCS6HH9n_x8QXvUe54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyIlsFi0D1H6_xIjO14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxFzJofy7KhSh3xpRh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzPxERoiKiQNApj8gB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy_NkwqTF4VPTY2pPR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy9VT0zUad4kTJEZnp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz_tndJH_l9-Rpvv6p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxluhNIQt6d7VNcXYN4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxrvXwQ3bC_crRPEbF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzfXB-WkqyshCpSojR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
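The raw response is a JSON array with one coding object per comment, keyed by comment id. A minimal sketch of turning such a response into a per-comment lookup (the field names come from the response above; the variable names are illustrative, and a real pipeline would validate the model output before trusting it):

```python
import json

# Raw model output: a JSON array of coding objects (shortened here to one
# entry; the real response above contains ten).
raw = (
    '[{"id":"ytc_UgyCS6HH9n_x8QXvUe54AaABAg","responsibility":"none",'
    '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]'
)

# Index the codes by comment id for fast lookup.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# Look up the coded dimensions for one comment.
code = codes["ytc_UgyCS6HH9n_x8QXvUe54AaABAg"]
print(code["responsibility"], code["emotion"])  # none indifference
```

This lookup is what the "Coding Result" table above reflects: the first object in the array corresponds to the displayed comment's codes.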