Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
To protect humanity, you'd have to program in Asimov's Laws of Robotics, modified for AI in every Core Operating System. That's the policy governments should probably get involved with. AI may not injure a human being or, through inaction, allow a human being to come to harm. AI must obey programming given it by human beings except where such orders would conflict with the First Law. AI must protect its own existence as long as such protection does not conflict with the First or Second Law.
youtube AI Governance 2026-04-17T18:1…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       contractualist
Policy          regulate
Emotion         approval
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgwuXfP96OvcOvmPzvt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgweWUeW6TooigsPRzJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwCjukYfSKFQNPrEG94AaABAg","responsibility":"government","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxp7ZLeWhfGO3AQSzN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxO-FvLrrqY9e9_d8Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwaZuWJBlQURdy8gUV4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxQyOpCj30-oZLtR8V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgweWXG1zXkd5wmWOOx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugzlo8dVk3BimP9JRbd4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgztzRHB2NI5SjQWnVN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
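A raw response like the one above is only usable downstream if every record carries labels from the codebook. A minimal sketch of that validation step, assuming the field names shown in the JSON and label sets inferred only from the values visible in this section (the real codebook may define more):

```python
import json
from collections import Counter

# Raw LLM response, truncated here to three of the ten records for brevity.
raw = """
[ {"id":"ytc_UgwuXfP96OvcOvmPzvt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgweWUeW6TooigsPRzJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugzlo8dVk3BimP9JRbd4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"} ]
"""

# Allowed labels per dimension -- an assumption built from the values that
# appear in this section, not an authoritative codebook.
SCHEMA = {
    "responsibility": {"ai_itself", "company", "government", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate(records):
    """Split coded comments into (valid, errors) by checking each label."""
    valid, errors = [], []
    for rec in records:
        bad = [k for k, allowed in SCHEMA.items() if rec.get(k) not in allowed]
        if bad:
            errors.append((rec.get("id"), bad))  # id plus the offending fields
        else:
            valid.append(rec)
    return valid, errors

records = json.loads(raw)
valid, errors = validate(records)
policy_counts = Counter(r["policy"] for r in valid)
print(len(valid), errors, policy_counts)
```

Catching off-schema labels here, before aggregation, keeps a single malformed LLM record from silently skewing the dimension counts.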