Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think it's dangerous when very intelligent and accredited people discuss about things they don't know. I feel this panel should not talk about AI as much of what they are saying is not really accurate nor good. Also Hinton is really biased as he has big investment in various AI companies/startups. Meaning he has all the wrong reason to say that AI is much more advanced than already is. At this point he is an echo chamber for billionaires techbros and companies and should probably not be taken seriously like every other CEO/Company. For isntance, they spoke about many different AI models that are completely different from each other. Not only, they didn't mention the majority of current AI talks are really about LLMs, which have massive problems and is seen by many more as a deadend. I'm not saying he is wrong, but he definitely fall in sensationalism and this is really souring the discussion around AI and benefit none. The talk now should be why we are investing trillions in a technoligy that doesn't really work. This would force us not to use this technology for important stuff like deciding where/when drop nuclear bombs because LLM are erratic, chaotic and extremely unpredictable.
youtube AI Moral Status 2026-03-10T15:2…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       virtue
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgwGepLdmUwO8w81r254AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxshN67CzOXqtqcocB4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyAhNXkZV_U3pbvrHJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw-JlXnSPBgL9Eic354AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzOFOw9-dpol490u5l4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy-KpLzK4-_SSQV8Ox4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxyKHoYm3AfXRkrHnl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxioaXGzVQso_AThJx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz4jhAdS8ZS0a2GquJ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyfBVo4ErOFWdMy1ud4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]