Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The belief that we as a society, would relinquish the power of an AI or worse, AGI, to be exclusively used by a Worldwide Government, is absolutely ludacris. Why is Geoffrey blaming companies? Because companies and their utility of AGI would exceed that of the Government capability and would be able to, along with society in general, keep the Governments use of it in check. AI is the future "weapon" society will need to protect itself from bad actors that include Governments. Guns will basically be effectively rendered useless against this technology in terms of protecting yourself, your property, and your inaliable rights. If you lose, in effect, the 2nd Amendment by losing access to AI, society will be left defenseless. Geoffrey has substantial benefit to gain from his vision, holding a prominent political and technological role in a world-wide government as its chief AGI architect. I cannot imagine a worse situation. Giving an all powerful government, with absolute global control, absolute power. Geoffrey vision is worse than the modern day Oppenheimer.
youtube AI Governance 2025-06-16T16:2… ♥ 1
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugzbfr58Vi_2kWOEvSN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxSBX2ZWxLgE1SfIAh4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxnjEJigcgIkcpmEmp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugyb4zVfTw9Z7ez4EIh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxHDJWYmnovNazeDh94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxt41MZMXzszpssEPd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxXA3t6K4KFYSdhdbR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz4oAsKkKsfWeFlhpJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwT5Sv5doQu2QZnDe14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyHjNkMj28YtJmmqLF4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
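The raw response is a JSON array with one coding record per comment, keyed by comment `id`. A minimal sketch of how such a response could be parsed and the record for the comment above looked up (the id and field values are taken from the response shown; the parsing approach itself is an assumption, not the tool's actual pipeline):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# Shortened here to the record for the comment displayed above.
raw = (
    '[{"id":"ytc_Ugzbfr58Vi_2kWOEvSN4AaABAg",'
    '"responsibility":"company","reasoning":"consequentialist",'
    '"policy":"none","emotion":"approval"}]'
)

records = json.loads(raw)

# Index records by comment id for direct lookup.
by_id = {r["id"]: r for r in records}

coding = by_id["ytc_Ugzbfr58Vi_2kWOEvSN4AaABAg"]
print(coding["responsibility"])  # company
print(coding["emotion"])         # approval
```

Indexing by `id` rather than relying on array order guards against the model returning records in a different order than the comments were submitted.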