Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
remote piloted vehicles, sure. AI? nope. sorry, AI will never happen, despite what the charlatan Musk seems to think.....AI is just a current buzzword that's repackaged from the 50s so computer programmers can try to stroke their own egos and make more money. Computers don't "think"...they never have, and never will. they simply execute a program, even extremely complicated ones, that can appear as ordered independent thought, but is still none the less just a program. Computers don't in ANY WAY process information like a living being does, and is completely incapable of cognitive reasoning and self awareness. If we ever find a way of making true AI, it will be done by biologists, not computer coders. And even then, why does everyone assume that real AI would be any more intelligent than an actual human brain, or for that matter a faster thinker? Just because a computer processes information fast now, doesn't mean it can do it at the same rate if it had the abilities of real brains. AI has always been a joke to me, and i laugh when people make claims that there have been all these "miraculous" advances on the subject.....AI has been touted as being achievable since the invention of the transistor.....and it still hasn't happened yet.
youtube 2019-05-14T08:4…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        deontological
Policy           ban
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugy3boToPB_xWnwDHgh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyh2gJc47ez_S9dRKN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyIE2I8RCmsT7k9A9Z4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzldFL3hAVhJj1xO9B4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzXHaMIBlkxJAOtj8d4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzFQY3tdogCJuB7cOR4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugz9HDU1etCXTMNqZPJ4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy5xzYdlGdJWiWktBp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgwP-7b0tk7S3HzwoAB4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyyVZ6vNhke3sRzYqV4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
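
The coding-result table above is derived from the raw LLM response by matching the comment's id against the entries in the JSON array. A minimal sketch of that lookup step, assuming the response is valid JSON (the comment id and dimension values are taken from the dump above; the helper name `lookup_coding` is hypothetical, not part of the actual pipeline):

```python
import json

# Two entries copied verbatim from the raw LLM response above (truncated
# to keep the sketch short); the full response contains ten entries.
RAW_RESPONSE = """[
  {"id": "ytc_Ugy3boToPB_xWnwDHgh4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyIE2I8RCmsT7k9A9Z4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]"""


def lookup_coding(raw: str, comment_id: str) -> dict:
    """Return the coding dict for one comment id, or raise KeyError."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            return entry
    raise KeyError(comment_id)


# The id of the comment shown on this page.
coding = lookup_coding(RAW_RESPONSE, "ytc_UgyIE2I8RCmsT7k9A9Z4AaABAg")
print(coding["responsibility"], coding["reasoning"],
      coding["policy"], coding["emotion"])
```

Running this prints `developer deontological ban outrage`, which matches the Coding Result table for this comment.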