Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I really like your approach in comforting people, as you point out that generated images or generated stuff in general isn’t copyrightable. And I see the way you see it. Human art will not disappear.
However, I don’t agree with your comparison to 3D. It’s a different set of skills that you need, but you still need the basics in art (anatomy, lighting etc.). As for image generators, you need none of them. You don’t need any skills and you don’t have to do any work. Prompters simply think they have control while they actually don’t. The only thing you need is a computer and a keyboard. No need for studies in anatomy, no need to observe your environment, nothing because the AI did your thinking and your work. It’s like commissioning an artist. Communication is also done via text. Every person can use it and that’s one of the problems. If they were properly labeled as generated and not handmade, that would be completely different. People aren’t fooled and can decide for themselves if they want to look at generated images. This isn’t just concerning visual art. AI generators have been used for spreading hate by generating images, which show aggressive behavior of foreigners. I’m really glad that lawsuits are being made and I really hope there will be forced visible watermarks as a result. Especially one that can’t be erased. I hope it will also help voice actors, as they are forced to sign away rights on their voice. Or let's say: I hope the lawsuits help those that need said help.
I’m honestly not afraid of AI itself, there are so many uses for it where it will be a huge advantage. Detecting cancer for example. I’m more afraid of humans, as I know by now what they are capable of. And tech bros are some of them. They are not your usual prompter. They are predators. And I really mean it that way. They are harassing and gaslighting artists. They don’t know what consent is and they will take whatever they can get. Unstable Diffusion was one of their projects to create NSFW deep fakes. And they were very open about using around 25 million of cosplay photos. Photos of real people. This is one of the problems that artists face. Not the AI, but humans. We were able to report them wherever we could and pushed them back.
And we adapt to it. By fighting for what we love. Human made art will never disappear, that’s true. That would be horrible for developers of AI generators as they don’t use generated images for training. Only art done by humans. They rely on us. With that knowledge I raised my prices as I now see how valuable my work is. My art might not be good, but it has value. Each one of my pictures is worth so much more than a ton of generated images. Of course, I will continue to draw. It’s my life. But I will still fight for a future in which AI and artists can coexist. If developers had asked artists and only used their work WITH consent, I think this discussion wouldn’t be as big. Bruce Willis consented to training an AI so he could still have a career as actor and that’s how it should have been. With consent.
There are also a few things you can do:
- use noise in your pictures as this can disrupt training (developers add noise themselves to prevent overfitting, but you can use that, too. It already has an effect)
- look for ‘Glaze’. It’s a tool that disrupts training on your style. The noise that you see isn’t the actual disruption, but if tech bros try to erase that noise, the quality of the picture drops significantly. It works well on details and textures.
- look for websites that apply the invisible watermarks used on generated images to your art. Bots skip watermarked images when building datasets, so your work won’t be used for training. Just be careful, as resizing alone can already destroy said watermark.
- change your language:
AI ‘art’ -> AI generated images
AI ‘artists’ -> prompters
AI ‘art’ generator -> AI image generator
Because it’s not art. It’s just based on art.
- if you are unsure whether someone simply generated an image, ask them to draw in a livestream or ask them for a pencil drawing. If they can’t produce the same quality, chances are they just generated the image. Even digital artists can draw on paper; prompters can’t. Just don’t go on a witch hunt and accuse artists of something they didn’t do (this has happened lately).
Those are just some possibilities for protecting your work. They are not perfect, but it’s a start, and they are constantly being improved, which might buy us time.
Source: YouTube | Viral AI Reaction | 2023-04-05T17:0… | ♥ 179
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxLc1aJXBAdW8iun1N4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzuA4ql7hsbR31wrsF4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwfeFtC8MbXzL2rDk14AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwqRk1tzGlO1DDXEuF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy8XdWKn9lPovlaftJ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy1gixC1o4OpqpBSb54AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgwkdB7MQNu308GgnrR4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyV6E7Hs22qj9EWoih4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy9p_1C5FxVfShp1HV4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxsaHocMbfM35DUHkZ4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
```
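Before accepting a batch of codes, it can be worth validating the raw model output against the allowed values for each dimension. A minimal sketch follows; note that the vocabularies below are inferred from this one sample response, not from the project's actual codebook, and may be incomplete.

```python
import json

# Two rows copied from the raw model output above, as a working sample.
raw = """
[
  {"id": "ytc_UgxLc1aJXBAdW8iun1N4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzuA4ql7hsbR31wrsF4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]
"""

# Assumed vocabularies, inferred from the values seen in this sample.
ALLOWED = {
    "responsibility": {"none", "distributed", "company", "user",
                       "ai_itself", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability", "regulate", "industry_self", "ban"},
    "emotion": {"approval", "indifference", "fear", "mixed", "outrage"},
}

def validate(rows):
    """Return (id, dimension, value) triples for out-of-vocabulary codes."""
    errors = []
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                errors.append((row.get("id"), dim, row.get(dim)))
    return errors

rows = json.loads(raw)
print(validate(rows))  # → [] when every code is in vocabulary
```

A row that fails JSON parsing or carries an out-of-vocabulary code can then be flagged for re-coding instead of silently entering the dataset.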