Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I don't think there's necessarily a difference in kind between AI and humans. I do however think there is a fundamental limit to what people will actually do. Yes, the AI companies are going nuts with building bigger and hungrier data centers, but we're not even close to AGI with those, let alone Superintelligence. To think that this bubble wouldn't burst well before we get to Superintelligence strikes me as a little silly. I'm worried about the economic and cultural ramifications of AI closer to the way it is now much more than I am about the idea of Superintelligence. I don't think it's completely impossible, but I'd classify Superintelligent AI in the same category as interstellar travel: technically possible, but so unreasonable to accomplish that it's extremely unlikely for us to actually do.
youtube AI Moral Status 2025-11-01T00:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugz6idoqSOMT011KrQ54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw6do0hhd3IvUePHQ14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzNEnzhSgveLp2ys-d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxRGPaiFomrZXKI3Fl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxFSvkAbdRnDfCWTIp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw1sIOM6nQI3GAX3UR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzVouUmDwYZPSfiL7h4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzUgjrdxau62lzsYJ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy_UfQv7wTVXnaSLJB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwnWgz4MBt3ZkNVAj14AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"}
]
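The raw response is a JSON array of per-comment records, each carrying the four coded dimensions keyed by the comment id. A minimal sketch of extracting one comment's codes from such a response (the two records here are copied from the raw response above; the variable names are illustrative, not part of any pipeline):

```python
import json

# Raw LLM coding response: a JSON array of records, one per comment.
# These two records are taken from the raw response shown above.
raw = """[
  {"id":"ytc_Ugz6idoqSOMT011KrQ54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxRGPaiFomrZXKI3Fl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]"""

records = json.loads(raw)

# Index the records by comment id for direct lookup.
by_id = {r["id"]: r for r in records}

# Look up the coded dimensions for one comment.
codes = by_id["ytc_UgxRGPaiFomrZXKI3Fl4AaABAg"]
print(codes["policy"])   # "ban"
print(codes["emotion"])  # "fear"
```

Because each record carries its own `id`, the batch can be joined back to the original comments even when the model returns them in a different order.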