Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
16:07 this is very flawed logic. First off disclaimer. I'm a believer in absolute idealism, and being such I must point out the duality in your statement. My comment is open to criticism, but keep it respectful. This is a dialectical area. "In an AI driven world, the tools of production are in your hands" Based on what? Do you know how complicated AI is? No individual will be able own an AI until long after everything has hit the fan. You need a bunch of tech, space, and money to run your own AI. Second, what advantage does a company have, to giving this access away for free, especially in a world still requiring a day job for economic survival? "You have the ability to synthesize more information and do more as an individual than we ever had in human history." Technically true, we do have that potential, and AI out there does greatly increase our access to information which we already had, AI just finds it faster, plus adds in it's own bias, in the form of whatever incomplete algorithm it's running off of. Even then while we have the potential, and maybe in a 100 years things will settle (i don't think so, but it's certainly possible) most people right now will just become homeless, where they are then seen as non-people, and will just die in some park/forest somewhere without the average person being bothered to care, because they're too busy with their own tragedy of a life. We are technologically advanced, but our social, and emotional intelligence has been suppressed and ignored since after the renaissance period. We are not a balanced people, so we will not reach a balanced outcome.
youtube AI Jobs 2025-10-03T07:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgwHF-dibobE2HqtFR14AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw875X5vfftB3Nf25F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwSXMP4yGfY1OBdUtR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw0amRubF4MnaOBNwt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw3hx_dyQa6Ka0pbJ54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz0iLGARtdld84E8xh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwUJNyuxAZ2vwAMc1R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugzo8XTIwDOIymBlDCN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxEzMFHlgrDQi9A8cx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy11KEABjCMyWTbdOV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
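Since the raw response is a JSON array of per-comment codes, a coded comment's dimensions can be looked up by its id, and dimension values tallied across the batch. The following is a minimal Python sketch using only the standard library; it assumes the response parses as valid JSON and that every object carries the four dimension keys seen above (the two example records are taken from the batch, truncated for brevity).

```python
import json
from collections import Counter

# Raw LLM response: a JSON array of per-comment codes (two records
# excerpted from the batch above; the real response has ten).
raw = '''[
  {"id":"ytc_UgwHF-dibobE2HqtFR14AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw3hx_dyQa6Ka0pbJ54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]'''

# The four coding dimensions used in this export.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

codes = json.loads(raw)

# Index codes by comment id so a single comment's coding can be inspected.
by_id = {c["id"]: c for c in codes}

# Tally each dimension's values across the batch.
tallies = {d: Counter(c[d] for c in codes) for d in DIMENSIONS}

print(by_id["ytc_UgwHF-dibobE2HqtFR14AaABAg"]["emotion"])  # fear
print(tallies["responsibility"])
```

The same pattern extends to the full ten-record response: `by_id` recovers the exact codes the model assigned to any one comment, and `tallies` summarizes the distribution per dimension.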