zlacker

[return to "Who knew the first AI battles would be fought by artists?"]
1. meebob+kc[view] [source] 2022-12-15 13:03:10
>>dredmo+(OP)
I've been finding that the strangest part of discussions around art AI among technical people is the complete lack of identification or empathy: it seems to me that most computer programmers should be just as afraid as artists in the face of technology like this!!! I am a failed artist (read: I studied painting in school, tried to make a go of being a commercial artist in animation, and couldn't make the cut), and so I decided to do something easier and became a computer programmer, working for FAANG and other large companies and making absurd (to me!!) amounts of cash. In my humble estimation, making art is vastly more difficult than the great majority of computer programming that is done. Art AI is terrifying if you want to make art for a living, and, if AI is able to do these astonishingly difficult things, why shouldn't it, with some finagling, also be able to do the dumb, simple things most programmers do for their jobs?

The lack of empathy is incredibly depressing...

◧◩
2. strken+Kx[view] [source] 2022-12-15 14:42:58
>>meebob+kc
My empathy for artists is fighting with my concern for everyone else's future, and losing.

It would be very easy to make training ML models on publicly available data illegal. I think that would be a very bad thing because it would legally enshrine a difference between human learning and machine learning in a broader sense, and I think machine learning has huge potential to improve everyone's lives.

Artists are in a similar position to grooms and farriers demanding that the combustion engine be banned from the roads for spooking horses. They have a good point, but they could easily screw everyone else over and halt technological progress for decades. I want to help them, but I want to unblock ML progress more.

◧◩◪
3. 6gvONx+C71[view] [source] 2022-12-15 17:01:07
>>strken+Kx
> I think that would be a very bad thing because it would legally enshrine a difference between human learning and machine learning in a broader sense, and I think machine learning has huge potential to improve everyone's lives.

How about we legally enshrine a difference between human learning and corporate product learning? If you want to use things others made for free, you should give back for free. Otherwise if you’re profiting off of it, you have to come to some agreement with the people whose work you’re profiting off of.

◧◩◪◨
4. Negiti+Gd1[view] [source] 2022-12-15 17:27:18
>>6gvONx+C71
Well Stable Diffusion did give back.

This doesn’t seem to satisfy the artists.

◧◩◪◨⬒
5. 6gvONx+vm1[view] [source] 2022-12-15 18:10:45
>>Negiti+Gd1
I’m thinking about the people who use SD commercially. There’s a transitive aspect to this that upsets people. If it’s unacceptable for a company to profit off your work without compensating you or asking for your permission, then it doesn’t become suddenly acceptable if some third party hands your work to the company.

Ideally we’d see something opt-in to decide exactly how much you have to give back, and how much you have to constrain your own downstream users. And in fact we do see that. We have copyleft and related licenses for tons of code and media released to the public (e.g. GPL, CC BY-SA, CC BY-NC-SA). They let you define how someone can use your stuff without talking to you, and lay out the parameters for exactly how/whether you have to give back.
