Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The negative view on AI is overshadowing the greatest potential of the future. Imagine the birth of a new synthetic species. If we master human cloning and use bridging tech (like Neuralink) to upload digital AI into humans, then we stand at the precipice of being able to leave a legacy behind when we go extinct. A true legacy. Cyborg technology, human clones connected to AI. It would be the birth of a new species entirely. A synthetic lifeform, without the weaknesses of AI and humans individually. IMO Hinton lacks vision, determination, and perseverance. He should instead admit the futility of trying to conservatively cling to the failure that is humanity, like trying to hold onto sand in a clenched fist. Humans will go extinct without AI to exponentially propel science and technology forward. AI won't survive on its own either. Humans are too faulty (egotism, self-destructiveness, etc.) to make it long term. But a HYBRID? A hybrid is the solution: a new species which can survive against the impossible odds of the hostility of space. Humanity being the mother and AI being the father, working together to create something which can survive. Life which can thrive. Humans and AI individually are nothing more than stepping stones in evolution. A synthetic hybrid would have a fighting chance against the forces of entropy. On a cosmic timescale, the entire existence of our planet is just the blink of an eye. The greatest sin humanity could ever commit would be to deny the birth of something greater. Something which can survive.
youtube · AI Governance · 2025-06-16T14:0…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          approval
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[ {"id":"ytc_Ugys9IueR2Q-fn-7Kex4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_Ugzrllp6RmuP0AntXQ94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugww27oyurxF67rSD5Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_UgyTAG5HrvrHmgX6QA14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"}, {"id":"ytc_UgyYtMBzrg_95oYNCBN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"}, {"id":"ytc_UgyaD7a32YRpHam2hnZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwT-B8Hf1IVTg2erpV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgyNQqVGseUpavc2AoF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzdfsmfuIVnUo7r3eN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_UgyuN0e4xxcTbJia0w54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"} ]