Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's crazy that this guy is supposed to be an AI expert, but he doesn't really know what AGI is, or that scientists 20 years ago would NOT have said we have AGI if they saw what we have today. They would see we have complicated self feed back loop algorithms., but not AGI. Anyone can see AI is still totally dependent on data input from humans. No AI currently is anywhere close to being able to do what a human brain can do. They can excel the conscious frontal brain calculation of the human when restricted to figuring out a single problem, but they cannot yet do what the subconscious brain does or match it's power. Perhaps combing quantum computing with AI will achieve this someday. Your subconscious brain runs all the machinery of your entire body, the trillions of individual cells, controls all your autonomic functions, you can drive a car automatically absorbing all that sensory data and reacting to it, all while controlling all bodily functions and you being able to carry on deep conscious thinking at the same time. Your subconscious brain can predict things instantly, and can cross space as if it does not exist. Across the street, across town, across the earth, across the galaxy. It is far faster than the speed of light, it's quantum entanglement. There are more possible connections of neurons in the human brain than there are atoms in the universe. AI has a LONG way to go.
youtube AI Governance 2025-09-07T22:1…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  developer
Reasoning       deontological
Policy          none
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
 {"id":"ytc_UgwKPAWWXeFBhzYK5iR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgzaVKx-X0V5BOCx-gd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugw0LSoRjZp4smnpFS94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwDeEM0RtCxkal29Ah4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzAUD39IGWr7dumf254AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugyur2HVjKWRlAa5hIJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugw6yq8Bff8bw8HY6r54AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgyCFOSjMKw2ZY93VgR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgyDG8chLgOu-Av04oV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_UgwtzPHOpX6Qrpf1o7h4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
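The coding-result table above corresponds to one record in this array. A minimal sketch of how to recover that record from the raw response, assuming the response is valid JSON; the helper `find_coding` is an illustrative name, not part of any tool shown here:

```python
import json

# The raw LLM response shown above: a JSON array of per-comment codings.
raw = """
[
 {"id":"ytc_UgwKPAWWXeFBhzYK5iR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgzaVKx-X0V5BOCx-gd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugw0LSoRjZp4smnpFS94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwDeEM0RtCxkal29Ah4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzAUD39IGWr7dumf254AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugyur2HVjKWRlAa5hIJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugw6yq8Bff8bw8HY6r54AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgyCFOSjMKw2ZY93VgR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgyDG8chLgOu-Av04oV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_UgwtzPHOpX6Qrpf1o7h4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
"""

records = json.loads(raw)

def find_coding(records, **dims):
    """Return every record whose values match all of the given dimensions."""
    return [r for r in records if all(r.get(k) == v for k, v in dims.items())]

# Look up the record behind the coding-result table above.
matches = find_coding(
    records,
    responsibility="developer",
    reasoning="deontological",
    policy="none",
    emotion="outrage",
)
print(matches[0]["id"])  # ytc_Ugyur2HVjKWRlAa5hIJ4AaABAg
```

Note that matching on all four dimensions is needed here: a second record (`ytc_UgyCFOSjMKw2ZY93VgR4AaABAg`) also codes `responsibility` as `developer` and `emotion` as `outrage`, but differs on `reasoning` (`virtue`).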