Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think this study was written by people living in the past. Good or bad, the old world is quickly disappearing, and it will in any case soon be impossible to know or prove what has been created by human intelligence or AI, and also it will be increasingly irrelevant. Also, if an AI invents something new, it can't be protected - but this will just lead to humans "stealing" the ideas of the AIs and claiming they invented them, i.e. completely unfair. But what happens when a billion bots create music, text and images nonstop every second, in every country? We will all be influenced or "steal" from them. So I think in 5-10 years, we might see a collapse of the current copyright regime, and we'll end up with: you can create whatever way and sell whatever you can sell. Anyone using e.g. Suno today knows that your own input plus detailed prompting plus making 100 versions plus selecting which outputs to use plus merging them plus tweaking them plus mastering them results in a unique product although there is clearly AI involvement. The entire regime will collapse as people sue each other for infringing on "their" ideas, when in fact AIs and humans will create like never before, so just let them. We'll need a citizen salary anyway as most human labor won't be needed as before, so ironically, AI will allow musicians to finally be able to make music and also pay their bills.
youtube 2025-02-06T19:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgyD9ZrqSA5_CK7d-4B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyvSSkub5pQoRtfg2N4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx9iW1wExxO32CqGc54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwItxVvZ6FXpB7Mc394AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwy95wi3OYh1W5RQMF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugy0cim6xbavsbxMsht4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy9SArm5fFCzSV1lVZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwu-WvB0JEMMVYP_gl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw1SXgzi0s_bHZYZsJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzCzGiB9xoXkYKHam14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]