zlacker

[return to "Who knew the first AI battles would be fought by artists?"]
1. meebob+kc[view] [source] 2022-12-15 13:03:10
>>dredmo+(OP)
I've been finding that the strangest part of discussions around art AI among technical people is the complete lack of identification or empathy: it seems to me that most computer programmers should be just as afraid as artists in the face of technology like this!!! I am a failed artist (read: I studied painting in school, tried to make a go at being a commercial artist in animation, and couldn't make the cut), so I decided to do something easier and became a computer programmer, working for FAANG and other large companies and making absurd (to me!!) amounts of cash. In my humble estimation, making art is vastly more difficult than the huge majority of computer programming that gets done. Art AI is terrifying if you want to make art for a living, and if AI can do these astonishingly difficult things, why shouldn't it, with some finagling, also be able to do the dumb, simple things most programmers do for their jobs?

The lack of empathy is incredibly depressing...

2. Alexan+Xh1[view] [source] 2022-12-15 17:47:59
>>meebob+kc
Setting aside questions of whether there is copyright infringement going on, I think this is an unprecedented case in the history of automation replacing human labor.

Jobs have been automated since the industrial revolution, but that usually takes the form of someone inventing a widget that makes human labor unnecessary. From a worker's perspective, the automation comes from "the outside". What's novel with AI models is that the workers' own work is used to create the thing that replaces them. It's one thing to be automated away; it's another to have your own work used against you like this, and I'm sure it feels extra-shitty as a result.

3. wwwest+gr1[view] [source] 2022-12-15 18:32:55
>>Alexan+Xh1
Absolutely this -- and in many (maybe most) cases, there was no consent for the use of the work in training the model, and quite possibly no notice or compensation at all.

That's a huge ethical issue whether or not it's explicitly addressed in copyright/IP law.

4. archon+Ww1[view] [source] 2022-12-15 18:58:54
>>wwwest+gr1
It is not a huge ethical issue. The artists have always been at risk of someone learning their style if they make their work available for public viewing.

We've just made "learning style" easier, so a thing that was always a risk is now happening.

5. wwwest+IH1[view] [source] 2022-12-15 19:48:37
>>archon+Ww1
Let's shift your risk of immediate assault and death up by a few orders of magnitude. I'm sure you'll see that as "just" something that was always a risk, pretty much status quo, right?

Oh, life & death is different? Don't be so sure; there are good reasons to believe that livelihood (not to mention social standing) and life are closely related -- and besides, the fundamental point doesn't depend on the specific example: you can't point to an orders-of-magnitude change and then claim we're dealing with a situation that's qualitatively like it's "always" been.

"Easier" doesn't begin to honestly represent what's happened here: we've crossed a threshold where we have technology for production by automated imitation at scale. And where that tech works primarily because of imitation, the work of those imitated has been a crucial part of that. Where that work has a reasonable claim of ownership, those who own it deserve to be recognized & compensated.

6. archon+aJ1[view] [source] 2022-12-15 19:54:43
>>wwwest+IH1
The 'reasonable claim of ownership' extends to restricting transmission, not use after transmission.

Artists are poets, and they're railing against Trurl's electronic bard.

[https://electricliterature.com/wp-content/uploads/2017/11/Tr...]

7. wwwest+cW1[view] [source] 2022-12-15 20:51:36
>>archon+aJ1
> The 'reasonable claim of ownership' extends to restricting transmission, not use after transmission.

It's not even clear you're correct on your own argument's (limited) terms. "Transmission" of some sort is certainly occurring when the work is fed in as training input. It's probably even tenable to argue that a copy is created within the model's internal representation.

You probably mean to argue that dissemination by the model is the key threshold at which current copyright law might fail to apply, with the transformative nature of the output being the key distinction. But people have already shown that some outputs are far less transformative than others -- and even that isn't the overall point, which is that this is a qualitative change much like those that gave birth to industrial-revolution copyright itself, and it calls for a similar kind of renegotiation to protect the underlying ethics.

People should have a say in how the fruits of their labor are bargained for and used -- including how the machines, and the models that drive them, are used. That's part of intentionally creating a society that's built for humans, including artists and poets.

8. archon+Ce2[view] [source] 2022-12-15 22:27:36
>>wwwest+cW1
I wasn't speaking about dissemination by the model at all. It's possible for an AI to create an infringing work.

Training an AI on data that was obtained legally cannot itself be copyright infringement. That is what I was talking about regarding transmission: copyright gives a rights holder a legal means to restrict the copying of their image for transmission to me. If a rights holder has placed their image on the internet for me to view, then copyright does not give them a means to restrict how I choose to consume that image.

The AI may or may not create outputs that can be considered derivative works or that contain characters protected by copyright.

You seem to be making an argument that we should be changing this somehow. I suppose I'll say "maybe". But it is apparent to me that many people don't know how intellectual property works.
