Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We have far to many that don't even know how to be good human beings in this world. Can we take some of the $ on this & transfer it to that. Since I often have to dumb it down & use analogies think of it the same way as premature e jaculation at this point. Others might be getting off & paying for it but the majority are left unsatisfied. As it is we can't even have safe rules laws & regulations for/on tech. They've been making their own so same will be for AI. Do we really want that. As with the analogy slow & steady together is more satisfying to all. Only selfish people only care about themselves. All the people you mentioned being in the pocket how is that going to help people & those jobs. Don't put the cart before the horse well need infrastructure also. The majority of people are really sick & disgusted with what tech & the overlords have already done & want out of techdom. Sure in some areas it can be good but unleashing a behemoth at such a fast pace I don't think it is wise. If I was raising small children now I would still want them to learn grow, problem solve, learn critical thinking etc on their own & human experiencial way. As with everything one can use it for good or bad the reason we have laws & rules. So many children are getting hurt on tech as it is. Let them learn to be good human beings
youtube 2024-07-25T23:2…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        virtue
Policy           regulate
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzvoB7P6ru4b0aW09h4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwwiVE74T8CLBCTXFx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgygyUPhY8MINKQoU0l4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwg1FzriWdaKBq0wIp4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxQwRLxe8aJu4yY2d14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgztY1U7thSo-r7PC_l4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ugxsj4RIg0qCBerierB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz2NgJAWN0nm4qbGbh4AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgzmFdwH7AYR9C5_Qbh4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzZFYeanO88LHAqbv14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
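A minimal sketch of how a response in this format can be consumed downstream, assuming the raw output is a JSON array of per-comment records with the five fields shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`). For brevity the example excerpts two of the ten records above; the parsing and tallying logic is the same for the full array.

```python
import json
from collections import Counter

# Illustrative raw LLM response: two records excerpted from the
# full ten-record array shown above.
raw = '''[
  {"id": "ytc_Ugz2NgJAWN0nm4qbGbh4AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgzvoB7P6ru4b0aW09h4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]'''

codes = json.loads(raw)

# Index records by comment id so a single comment's codes can be
# looked up, as on this inspection page.
by_id = {c["id"]: c for c in codes}
print(by_id["ytc_Ugz2NgJAWN0nm4qbGbh4AaABAg"]["policy"])  # regulate

# Tally one dimension across the batch.
policy_counts = Counter(c["policy"] for c in codes)
print(policy_counts)
```

A real pipeline would also validate each record (e.g. check that `policy` is one of the expected labels such as `regulate`, `ban`, `liability`, `none`) before trusting model output.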