zlacker

6 comments
1. wwwest+(OP) 2022-12-15 19:48:37
Let's shift your risk of immediate assault and death up by a few orders of magnitude. I'm sure you'll see that as "just" something that was always a risk, pretty much the status quo, right, right?

Oh, life & death is different? Don't be so sure; there are good reasons to believe that livelihood (not to mention social credit) and life are closely related -- and besides, the fundamental point doesn't depend on the specific example: you can't point to an orders-of-magnitude change and then claim we're dealing with a situation that's qualitatively the same as it's "always" been.

"Easier" doesn't begin to honestly represent what's happened here: we've crossed a threshold where we have technology for production by automated imitation at scale. And where that tech works primarily because of imitation, the work of those imitated has been a crucial part of that. Where that work has a reasonable claim of ownership, those who own it deserve to be recognized & compensated.

replies(1): >>archon+s1
2. archon+s1 2022-12-15 19:54:43
>>wwwest+(OP)
The 'reasonable claim of ownership' extends to restricting transmission, not use after transmission.

Artists are poets, and they're railing against Trurl's electronic bard.

[https://electricliterature.com/wp-content/uploads/2017/11/Tr...]

replies(1): >>wwwest+ue
3. wwwest+ue 2022-12-15 20:51:36
>>archon+s1
> The 'reasonable claim of ownership' extends to restricting transmission, not use after transmission.

It's not clear you're correct even on the limited terms of your own argument. "Transmission" of some sort is certainly occurring when the work is given as input, and it's probably even tenable to argue that a copy is created in the model's internal representation.

You probably mean to argue something to the effect that dissemination by the model is the threshold at which current copyright law might fail to apply, with the transformative nature of the output as the key distinction. But people have already shown that some outputs are much less transformative than others -- and even that isn't the overall point, which is that this is a qualitative change much like the ones that gave birth to industrial-revolution copyright in the first place, and it calls for a similar kind of renegotiation to protect the underlying ethics.

People should have a say in how the fruits of their labor are bargained for and used -- including how the machines, and the models that drive them, are used. That's part of intentionally building a society that works for humans, including artists and poets.

replies(1): >>archon+Uw
4. archon+Uw 2022-12-15 22:27:36
>>wwwest+ue
I wasn't speaking about dissemination by the model at all. It's possible for an AI to create an infringing work.

It's not possible for training an AI on data that was obtained legally to be copyright infringement. That is what I was talking about regarding transmission. Copyright gives a rights holder a legal means to limit the creation of copies of their image for transmission to me. If a rights holder has placed their image on the internet for me to view, copyright does not give them a means to restrict how I choose to consume that image.

The AI may or may not create outputs that can be considered derivative works, or contain characters protected by copyright.

You seem to be arguing that we should change this somehow. I suppose I'll say "maybe". But it is apparent to me that many people don't know how intellectual property works.

replies(1): >>int_19+qU
5. int_19+qU 2022-12-16 01:07:43
>>archon+Uw
There's a separate question of whether the AI model, once trained on a copyrighted input, constitutes a derivative work of that input. In cases where the model can, with the right prompt, produce an image near-identical (as far as humans are concerned) to the input, it's hard to see how it isn't just a special case of compression; and, of course, compressed images are still protected by copyright.
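
To make "near-identical as far as humans are concerned" concrete, a crude way to put a number on it is a PSNR comparison between a generated output and the suspected training image. This is just a rough sketch -- the file names are placeholders and PSNR is only a loose proxy for perceptual similarity:

    import numpy as np
    from PIL import Image

    def psnr(a: np.ndarray, b: np.ndarray) -> float:
        # Peak signal-to-noise ratio between two images of the same shape.
        mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
        return float("inf") if mse == 0 else 10.0 * np.log10(255.0 ** 2 / mse)

    size = (256, 256)  # compare at a common resolution
    original = np.asarray(Image.open("training_image.png").convert("RGB").resize(size))
    generated = np.asarray(Image.open("model_output.png").convert("RGB").resize(size))

    # Very roughly, values above ~30 dB start to read as "the same picture".
    print(f"PSNR: {psnr(original, generated):.1f} dB")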
replies(1): >>archon+6L2
6. archon+6L2 2022-12-16 15:03:06
>>int_19+qU
You mean the AI model itself, the weights?

A derivative work is a creative expression, based on another work, that receives its own copyright protection. It's very unlikely that AI weights would be considered a creative expression, so they would probably not be considered a derivative work. At this point, you probably can't copyright your AI weights.

An AI might create work that could be considered derivative if it were the creative output of a human, but since it's not a human, its outputs are unlikely to be considered derivative works, though they may still be infringing.

replies(1): >>int_19+6K4
7. int_19+6K4 2022-12-17 00:55:38
>>archon+6L2
Yes, I mean the weights.

If the original is a creative expression, then recording it using some different tech is still a creative expression. I don't see the qualitative difference between a bunch of numbers that constitute weights in a neural net and a bunch of numbers that constitute bytes in a compressed image file, if both can be used to recreate the original with minor deviations (like compression artifacts in the latter case).
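
To make the analogy concrete, here's a toy sketch (my own, not how any real image model is trained): overfit a tiny MLP so that its weights become, for practical purposes, a lossy encoding of one specific image, which you can later decode back into pixels from the weights alone. The file path and hyperparameters are made up; it assumes numpy, torch, and Pillow are installed.

    import numpy as np
    import torch
    import torch.nn as nn
    from PIL import Image

    # Load an image (placeholder path) as grayscale values in [0, 1].
    img = np.asarray(Image.open("some_image.png").convert("L"), dtype=np.float32) / 255.0
    h, w = img.shape

    # One training pair per pixel: (y, x) coordinate -> brightness.
    ys, xs = np.meshgrid(np.linspace(-1, 1, h), np.linspace(-1, 1, w), indexing="ij")
    coords = torch.tensor(np.stack([ys, xs], axis=-1).reshape(-1, 2), dtype=torch.float32)
    values = torch.tensor(img.reshape(-1, 1))

    # A tiny MLP whose only job is to memorize this one image.
    model = nn.Sequential(
        nn.Linear(2, 256), nn.ReLU(),
        nn.Linear(256, 256), nn.ReLU(),
        nn.Linear(256, 1), nn.Sigmoid(),
    )
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)

    for _ in range(2000):  # enough steps to overfit a small image
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(coords), values)
        loss.backward()
        opt.step()

    # The "copy" now lives entirely in the weights (model.state_dict()) and can
    # be turned back into pixels on demand -- blurry, like a heavily compressed
    # JPEG, but recognizably the same picture.
    with torch.no_grad():
        recon = model(coords).reshape(h, w).numpy()
    Image.fromarray((recon * 255).astype(np.uint8)).save("reconstructed.png")

The point isn't the quality of the reconstruction; it's that "a file of numbers you decompress" and "a set of weights you query" end up playing the same role.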
