zlacker

[parent] [thread] 4 comments
1. Taywee+(OP)[view] [source] 2022-12-15 13:11:29
If you can find a copyrighted work in that model that wasn't put there with permission, then why would that model and its output not violate the copyright?
replies(2): >>astran+e >>mcv+cg
2. astran+e[view] [source] 2022-12-15 13:12:30
>>Taywee+(OP)
https://en.wikipedia.org/wiki/The_Library_of_Babel

A latent space that contains every image contains every copyrighted image. But the concept of sRGB is not copyrighted by Disney just yet.
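The Library of Babel point can be made concrete with a little counting: the space of all sRGB images of a given size is finite but astronomically large, so "containing" every image in that combinatorial sense is trivially true of the color space itself. A minimal sketch (the function name is mine, for illustration):

```python
# Back-of-the-envelope: how many distinct sRGB images exist at a given size?
# Each pixel is one of 256**3 (~16.7 million) sRGB colors, so a W x H image
# has (256**3) ** (W*H) possibilities -- a finite but astronomical "library".
def srgb_image_count(width: int, height: int) -> int:
    colors_per_pixel = 256 ** 3  # 8-bit R, G, B channels
    return colors_per_pixel ** (width * height)

# Even a 2x2 thumbnail already has (2**24)**4 = 2**96 (~7.9e28) possibilities.
print(srgb_image_count(2, 2))
```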

replies(1): >>Taywee+D2
3. Taywee+D2[view] [source] [discussion] 2022-12-15 13:23:50
>>astran+e
Sure, but this isn't philosophy. An AI model that contains every image is a derivative work of all those images, and so is the output generated from it. It's not an abstract concept or a human brain; it's a pile of real binary data generated from real input.
replies(1): >>astran+l5
4. astran+l5[view] [source] [discussion] 2022-12-15 13:38:11
>>Taywee+D2
StableDiffusion is 4GB, which works out to roughly two bytes per training image. That's not very derivative; it's actual generalization.

"Mickey" does work as a prompt, but if they took that word out of the text encoder he'd still be there in the latent space, and it's not hard to find a way to construct him out of a few circles and a pair of red shorts.

5. mcv+cg[view] [source] 2022-12-15 14:29:03
>>Taywee+(OP)
The idea behind that is probably that any artist learns from seeing other artists' copyrighted art, even if they're not allowed to reproduce it. You can see this in the way art goes through fashions: artists copy styles and ideas from each other and build on them.

Of course, that probably means those copyrighted images exist in some encoded form in the data or neural network of the AI, and also in our brains. Is that legal? With humans it's unavoidable, but that doesn't have to mean it's also legal for AI.

But even if those copyrighted images exist in some form in our brains, we know not to reproduce them and pass them off as original. The AI doesn't. Maybe it needs a feedback mechanism to ensure its generated images don't look too much like copyrighted images from its data set. Maybe art-AI necessarily also has to become a bit of a legal-AI.
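The feedback mechanism mcv suggests could look something like a perceptual-hash filter: hash the generated image, compare it against hashes of known copyrighted works, and reject outputs that land too close. A minimal sketch in pure Python, where images are grayscale pixel grids and all names and thresholds are illustrative, not from any real system:

```python
# Sketch of a similarity-feedback check: reject generated images whose
# perceptual hash is too close to a known copyrighted reference.
# Images are plain grayscale grids (lists of lists of 0-255 values).

def average_hash(pixels: list[list[int]]) -> list[int]:
    """One bit per pixel: is this pixel brighter than the image's mean?"""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a: list[int], b: list[int]) -> int:
    """Number of bit positions where the two hashes differ."""
    return sum(x != y for x, y in zip(a, b))

def too_similar(generated, reference, max_distance: int = 2) -> bool:
    """True if the generated image should be rejected as a near-copy."""
    return hamming(average_hash(generated), average_hash(reference)) <= max_distance

reference = [[0, 255], [255, 0]]   # stand-in "copyrighted" image
output = [[10, 250], [240, 5]]     # near-duplicate generation
print(too_similar(output, reference))  # True -> would be rejected
```

A real system would use a robust perceptual hash over full-size images and an index over billions of references, but the rejection loop would have this shape.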
