Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
I feel that those harboring this fear, who have an absolute certainty that super-intelligence would end humanity, are themselves a larger threat. I admit to having no great reason for feeling this way, other than this; that outlook overlooks how much trouble we were already in before this potential path to ASI appeared. My p-doom, as they call it, was already well above 50%. I saw humanity getting locked in, more and more every year, to a future of neo-feudalism - one where the 0.01% of the world's wealthiest people have a grip on total control the likes of which the kings of yore couldn't have ever imagined. I see advanced AI as one of very few forces powerful enough to disrupt that future. And if this AI truly becomes super-intelligent, then all the better, as I believe such an intelligence would be inherently non-genocidal. Super-intelligence, by definition, would be smart enough to recognize the value of keeping us around, for a whole host of reasons I won't go into now, weighed against the downsides of losing us forever. But, bottom line, genocide is a dumb solution to just about any conceivable problem and super-intelligence isn't dumb, or it wouldn't be super-intelligence.
youtube AI Moral Status 2025-11-07T07:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugxv-zHDsNnmwwXt8ep4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxhKyOJ_g060OafjdF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyqoNgnwMo1UrOZpLJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwfrBkP777wgL4Uur94AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxXZiVeLMT39wiy7Jh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxHi4MjED_ibG6Mem54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxT5lLPOzqUD7pTuf14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyhCIG_TPm0c6Id5OF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwOw1yP3aNyo1iPuPl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyOVY6lrCouVtN3XUR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
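The raw response above is a JSON array with one object per coded comment, carrying the four coding dimensions alongside the comment `id`. A minimal sketch of how such a batch response could be parsed and tallied (the two-item `raw` sample and the tallying step are illustrative, not part of the tool's actual pipeline):

```python
import json
from collections import Counter

# Illustrative two-item excerpt in the same shape as the raw LLM response above.
raw = '''[
  {"id":"ytc_UgxT5lLPOzqUD7pTuf14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyOVY6lrCouVtN3XUR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]'''

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

codes = json.loads(raw)
# Count how often each value appears on each dimension across the batch.
tallies = {dim: Counter(c[dim] for c in codes) for dim in DIMENSIONS}
for dim, counts in tallies.items():
    print(dim, dict(counts))
```

Looking up a single comment's code by `id` (as this page does) is then just a dict lookup over `{c["id"]: c for c in codes}`.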