Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
A lot of L takes. I get brandon has to make a business decision. He has a large following and those people care about this. He doesn't actually care about AI, because he was using it until he got backlash, this isn't like napster 20 years ago where it was a little more murky about what the technology was doing. He has artists on payroll. He has money to not care about the cost-saving of AI. Unfortunately the cat IS out of the bag, and it's not something as dumb as "if i don't rob him, someone else will." It's a tool that will be used to out compete anyone who doesn't use it. America will never legislate against AI because the rest of the world won't. It's technology. And unfortunately technology is seldom "ethical." AI is here. It's going to get better, and it's going to impact every industry on the planet. The only reason this is controversial is because it started to impact our "precious" artists. No one cared about LLMs. No one cared about self driving cards. It was only when image generators became good that this became a big deal. Before that, it was all fun and games. But now because an artists can't charge an author 1,000$ for a image anymore, it's unethical. So go ahead and take your stances. Meanwhile Disney, Amazon, and every other major company will use AI. Even TOR, which is/was one of Brandon's publishers, uses AI. They got caught and just shrugged their shoulders. This is just the current online witch hunt and will continue to be for the next 2 years.
youtube 2025-06-26T04:3…
Coding Result
Dimension: Value
Responsibility: company
Reasoning: virtue
Policy: liability
Emotion: outrage
Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyxVzRafyvHRPAx9pl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwvnoXSS6G8nVVcBYB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwS02gdgtkIJKWJ8K94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzSLg-QB4WqPBsxLop4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgybQfb9EFGLMCyz_z94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgytxqlBGtxpFyiP7xt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzk_0VSKFtor4BDQzZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzgIH1W4yPK6iXbVct4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyQYluhWtxbxze1NHx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy_WpF22Q7FJa0JMCl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
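A raw response like the one above can be parsed and sanity-checked programmatically before the labels are trusted. The sketch below is a minimal Python example, not part of the original pipeline: the allowed label sets are inferred only from the values that appear in this sample, and the `raw` string is abbreviated to two of the ten records; the real codebook may define more categories.

```python
import json

# Raw model output as captured on this page (truncated to two records
# here for brevity; the full response contains ten).
raw = '''[
 {"id":"ytc_UgyxVzRafyvHRPAx9pl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwvnoXSS6G8nVVcBYB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]'''

# Label sets inferred from this sample only -- an assumption, since the
# underlying codebook is not shown on this page.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "user"},
    "reasoning": {"mixed", "virtue", "deontological",
                  "contractualist", "consequentialist"},
    "policy": {"none", "liability", "ban", "regulate", "industry_self"},
    "emotion": {"indifference", "outrage", "mixed", "approval", "fear"},
}

def validate(records):
    """Return ids of records whose labels fall outside the allowed sets."""
    bad = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                bad.append(rec["id"])
                break
    return bad

records = json.loads(raw)
by_id = {r["id"]: r for r in records}
print(validate(records))  # [] when every label is recognised
print(by_id["ytc_UgwvnoXSS6G8nVVcBYB4AaABAg"]["emotion"])  # outrage
```

Indexing by the `ytc_…` id makes it straightforward to join a validated record back to the comment shown above it on the page.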