For example, I know artists who are vehemently against DALL-E, Stable Diffusion, etc., and regard them as theft, but who view Copilot and GPT-3 as merely useful tools. I also know software devs who are extremely excited about AI art and GPT-3 but are outraged by Copilot.
For myself, I am skeptical of intellectual property in the first place. I say go for it.
I am also not a hypocrite; I do not like DALL-E or Stable Diffusion either.
As a sibling comment implies, these AI tools give more power to the people who control the data, i.e., big companies and wealthy people, while at the same time taking power away from individuals.
Copilot is bad for society. DALL-E and Stable Diffusion are bad for society.
I don't know what the answer is, but I sure wish I had the resources to sue these powerful entities.
I'm not sure I agree, but I can at least see the point for Copilot and DALL-E. But Stable Diffusion? It's open source, and it runs on (some) home-use laptops. How is that taking power away from indies?
Just look at the sheer number of apps building on or extending SD that were published on HN, and that's probably just the tip of the iceberg. Quite a few of them at least looked like side projects by solo devs.
I imagine that Disney would take issue with SD if material Disney holds the copyright to were used in it. They would sue, and SD would have to be taken off the market.
Thus, Disney has the power to ensure that their copyrighted material remains protected from outside interests, and they can still create unique things that bring in audiences.
Any small-time artist who produces something unique will find their material eaten up by SD in time. And then, because of the sheer number of people using SD, the original will soon be surrounded by look-alikes derived from it in some form, and it won't be as unique anymore.
Anyone using SD will not, by definition, be creating anything unique.
And when it comes to art, music, photography, and movies, uniqueness is the best selling point; once something is no longer unique, it is worth less, because something like it can be had elsewhere.
SD still has the power to devalue original work; it just gives normal people that power on top of giving it to the big companies, while the original works of big companies remain safe because of their armies of lawyers.
Are you sure?
I'm not familiar with the exact data set they used for SD and whether or not Disney art was included, but my understanding is that their claim to legality comes from arguing that the use of images as training data is 'fair use'.
Anyone can use Disney art for their projects as long as it's fair use, so even if they happened not to include Disney art in SD, that doesn't fully validate your point, because they could have done so if they wanted. That is, as long as training constitutes fair use, which I think it should: it's pretty much the AI equivalent of 'looking at others' works', which is part of a human artist's training as well.
Yes, I'm sure.
> I'm not familiar with the exact data set they used for SD and whether or not Disney art was included, but my understanding is that their claim to legality comes from arguing that the use of images as training data is 'fair use'.
They could argue that. But since the American court system is currently (almost) de facto "richest wins," their argument will probably not mean much.
The way to tell whether something was in the dataset would be to prompt with the name of a famous Disney character and see what it pulls up. If it's there, then once the Disney beast finds out, I'm sure they'll take issue with it.
And by the way, I don't buy all of the arguments for machine learning as fair use. Sure, for the training itself, yes; but once the model is used by others, you now have a distribution problem.
More in my whitepaper against Copilot at [1].