Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think the argument about whether computers can do things like play chess better than humans is a useless line of inquiry. That's not the point of playing, and it isn't the point of getting better at chess. One thing is a machine; the other is someone using their skill against other people with more or less the same set of skills and trying to be better, both for fun and for competitiveness. I mean, should a professional weightlifter care that a crane can lift way more than they can, or should a professional runner care that cars go faster than they can run? It's just not the point.

As for Cenk's question about whether people would enjoy a movie made by an "AI" or not: it is also the wrong question. The issue comes from the distribution of revenue. The software used people's data to produce whatever it produces, so why aren't all these individuals compensated? Why do we consider that the person who brings the money is the owner of a thing and should get all the revenue from what that thing does? That thing, AI, data-mines through your information and uses it. Why don't we get compensated?

Basically, the shift towards this "AI" technology is not really the issue; the real issue is that it is used in a way that funnels even more money away from workers and towards the few richest people. If the revenue distribution were fair, there wouldn't be a problem. Those who want these programs to create movies or books could get them, and the people who prefer to find actual people to create them could do that instead. But right now that is impossible, because if the big companies get their way, there will be no way to survive on a writer's wage, an artist's wage, or the wage of any other job we try to replace with this software.
youtube AI Jobs 2023-05-08T16:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgwY70R0ZplJyr-LqKF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugyq2oAt6zSgjHaf97B4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxwE-e5lofY1uh5m2V4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzKBu9qRwbFwyzdUc14AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugz5AEu9u2IuLfka24N4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugymr2fpAG9LyCHhUd94AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyoFlvSmPxzabf5GnZ4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgznytY1EmBO2ojxXl94AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyblGswMdCfiTHXtOh4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxHcuw_DZTU__K7pfV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
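Because the raw response is a JSON array of per-comment objects keyed by `id`, the Coding Result shown above can be recovered by parsing the string and looking up the comment's id. A minimal Python sketch (the `raw` string below is abridged to two of the entries from the response above):

```python
import json

# Abridged copy of the raw LLM response shown above (two of the ten entries).
raw = (
    '[{"id":"ytc_Ugymr2fpAG9LyCHhUd94AaABAg","responsibility":"none",'
    '"reasoning":"deontological","policy":"unclear","emotion":"indifference"},'
    '{"id":"ytc_UgyoFlvSmPxzabf5GnZ4AaABAg","responsibility":"distributed",'
    '"reasoning":"mixed","policy":"regulate","emotion":"outrage"}]'
)

# Parse the array and index the coded dimensions by comment id.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up the dimensions for one comment.
row = codes["ytc_Ugymr2fpAG9LyCHhUd94AaABAg"]
print(row["responsibility"], row["reasoning"], row["emotion"])
# → none deontological indifference
```

Indexing by `id` rather than by array position keeps the lookup robust if the model returns the entries in a different order.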