Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Going to preface this with I completely accept that Artists generally hate AI learning especially from their artwork and I do not mean to diminish that nor insult them. I would just like to pose a question that may bring AI to comparison to a human. I would also like to state that I am VERY pro humanoid robot embodied AI so I have immense bias. You mentioned that you are okay with a human studying your artwork and drawing in its likeness. You appreciate when a human spends the time to learn every nuance of your art and find it a compliment. You are happy when a human fan finds inspiration from your artwork. Okay, well technically that is what an AI does. It studies your artwork done to the tiniest detail and nuance and appreciates all of it and does its best to use you as inspiration. So with that said, my question now is how is me asking an AI to draw something inspired by you vs me finding a really good artist that has also studied you to the point of perfect mimicry to draw me something? Is it bad because a piece of silicone did it in 30 seconds instead of an organic existence doing it over hours? After all both of them 'stole' your art to learn to learn it and replicate it. Humans mimicking artists to me is exactly the same as an AI doing it. To me it makes more sense to say I want no existence to mimic me, instead of saying only organic neural nets are allowed to learn from me. I mean this in a good hearted way. Really hoping your poisoning works and gets its likeness out of any AI scraping you. You are the artist and that decision is 100% yours and I am completely behind you on that. I didn't know a tool out there existed for it and glad to see someone make one to make it easier for artists to deter AI scraping.
YouTube · Viral AI Reaction · 2024-11-03T15:2…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       contractualist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugwf_2vlPsgcQxCX0aR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyk1k0xNAuuKYeBx254AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"frustration"},
  {"id":"ytc_UgwLXsQPvkjZf7VNcFF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzBYmgPw1dKB-jfJjd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzfaYYf3M7U0mwEHvJ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzm96A27ELDpf0niXJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx5iUsbwgZYjwPsqwF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"frustration"},
  {"id":"ytc_UgxTX8al1mDJ9FfZ3xN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgzKRJyWHNFjxuX6NyN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugyu0-om2NfATKpbl1R4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]
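The raw response above is a JSON array with one object per coded comment, keyed by YouTube comment id. A minimal sketch of how such a batch could be turned into a per-comment lookup (the ids and field names are copied from the raw response; the surrounding pipeline code is an assumption, not the tool's actual implementation):

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw = '''[
  {"id":"ytc_UgzfaYYf3M7U0mwEHvJ4AaABAg","responsibility":"distributed",
   "reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxTX8al1mDJ9FfZ3xN4AaABAg","responsibility":"company",
   "reasoning":"deontological","policy":"liability","emotion":"approval"}
]'''

# Index the batch by comment id so any single comment's codes can be inspected.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up one comment's coding across the four dimensions.
entry = codes["ytc_UgzfaYYf3M7U0mwEHvJ4AaABAg"]
print(entry["responsibility"], entry["reasoning"], entry["policy"], entry["emotion"])
```

This assumes the model returned syntactically valid JSON; a production pipeline would also need to handle malformed responses and ids missing from the batch.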