Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
"Some neuroscientists believe that any sufficiently advanced system can generate consciousness" That belief is also an entirely unfounded misnomer. If it were true, it'd be very likely that our internet would have developed its own consciousness. The networks that connect our computer are so complex and have so much power connected to them (both in networking hardware and in computers connected to the internet) that they should have done so. But they haven't. Neuroscientists that make such claims don't understand the limits of computers. Most people that fear or speculate a singularity also either don't understand those limits or don't realize how incredibly specialized current AI tech is. Even "deep neural networks" can really only be trained for very specialized tasks right now, and are very severely limited by the computing power necessary to run and train any sufficiently large ones. Maybe soft computing will solve some of these issues, but that area of research is limited and mostly theoretical. As of now, we don't have even a fraction of the power we'd need to develop a real consciousness with our computers. General AI is far, far off.
Source: youtube · AI Moral Status · 2017-02-24T09:2…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           unclear
Emotion          outrage
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[ {"id":"ytc_Ugh0c4l23P6EYHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgizmdfK6BHeengCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}, {"id":"ytc_UgiS9-lmbu6FW3gCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"fear"}, {"id":"ytc_Uggg7_XeDnLEkXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}, {"id":"ytc_UgjlRCoviv8l7XgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}, {"id":"ytc_UgiAi7l2Sx79l3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgiM-TwLKWJZ13gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}, {"id":"ytc_UghE_QrjN0MWgHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugi_n0NFADJiGngCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"outrage"}, {"id":"ytc_Uggf753UlzgQ93gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"} ]