zlacker

[parent] [thread] 17 comments
1. wwwest+(OP)[view] [source] 2022-12-15 18:32:55
Absolutely this -- and in many (maybe most) cases, there was no consent for the use of the work in training the model, and quite possibly no notice or compensation at all.

That's a huge ethical issue whether or not it's explicitly addressed in copyright/IP law.

replies(3): >>api+j5 >>archon+G5 >>myrryr+Z5
2. api+j5[view] [source] 2022-12-15 18:57:49
>>wwwest+(OP)
I really think there are likely to be gigantic class-action lawsuits in the near future, and I support them. People did not consent to their data and work being used in this way. In many cases, people have already demonstrated with custom-tailored prompts that these models were trained on copyrighted works that are not in the public domain.
replies(2): >>archon+P5 >>Octopu+wn
3. archon+G5[view] [source] 2022-12-15 18:58:54
>>wwwest+(OP)
It is not a huge ethical issue. Artists have always been at risk of someone learning their style if they make their work available for public viewing.

We've just made "learning style" easier, so a thing that was always a risk is now happening.

replies(3): >>wwwest+sg >>ilammy+cL >>noober+5f1
4. archon+P5[view] [source] [discussion] 2022-12-15 18:59:34
>>api+j5
Consent isn't required if they're making their work available for public viewing.
replies(1): >>gransh+1q
5. myrryr+Z5[view] [source] 2022-12-15 19:00:09
>>wwwest+(OP)
That is a hard fight to have, since the same is true of people. An artist will have watched some Disney movie, and that could influence their art in some small way. Does Disney have a right to take a small cut of every piece of art they produce from then on? Obviously not.

The real answer is that AIs are not people, and it is OK to have different rules for them; that is where the fight would need to be.

6. wwwest+sg[view] [source] [discussion] 2022-12-15 19:48:37
>>archon+G5
Let's shift your risk of immediate assault and death up by a few orders of magnitude. I'm sure you'll see that as "just" something that was always a risk, pretty much status quo, right?

Oh, life & death is different? Don't be so sure; there are good reasons to believe that livelihood (not to mention social credit) and life are closely related -- and besides, the fundamental point doesn't depend on the specific example: you can't point to an orders-of-magnitude change and then claim we're dealing with a situation that's qualitatively like it's "always" been.

"Easier" doesn't begin to honestly represent what's happened here: we've crossed a threshold where we have technology for production by automated imitation at scale. And where that tech works primarily because of imitation, the work of those imitated has been a crucial part of that. Where that work has a reasonable claim of ownership, those who own it deserve to be recognized & compensated.

replies(1): >>archon+Uh
7. archon+Uh[view] [source] [discussion] 2022-12-15 19:54:43
>>wwwest+sg
The 'reasonable claim of ownership' extends to restricting transmission, not use after transmission.

Artists are poets, and they're railing against Trurl's electronic bard.

[https://electricliterature.com/wp-content/uploads/2017/11/Tr...]

replies(1): >>wwwest+Wu
8. Octopu+wn[view] [source] [discussion] 2022-12-15 20:19:02
>>api+j5
It's already explicitly legal to train AI on copyrighted data in many countries. You can ignore opt-outs too, especially if you're training AI for non-commercial purposes. Look up TDM (text and data mining) exceptions.
9. gransh+1q[view] [source] [discussion] 2022-12-15 20:29:43
>>archon+P5
For VIEWING. This is like blatantly taking your GPL-licensed code and using it for commercial purposes.
replies(1): >>archon+RB
10. wwwest+Wu[view] [source] [discussion] 2022-12-15 20:51:36
>>archon+Uh
> The 'reasonable claim of ownership' extends to restricting transmission, not use after transmission.

It's not clear you're correct even on the terms of your own argument. "Transmission" of some sort is certainly occurring when the work is given as input. It's probably even tenable to argue that a copy is created in the model's internal representation.

You probably mean to argue that dissemination by the model is the key threshold at which we'd recognize that something like the current copyright law might fail to apply, the transformative nature of the output being the key distinction. But some people have already shown that some outputs are much less transformative than others -- and even that's not the overall point, which is that this is a qualitative change much like the ones that gave birth to industrial-revolution copyright itself, and it calls for a similar kind of renegotiation to protect the underlying ethics.

People should have a say in how the fruits of their labor are bargained for and used -- including how the machines, and the models that drive them, are used. That's part of intentionally creating a society that's built for humans, including artists and poets.

replies(1): >>archon+mN
11. archon+RB[view] [source] [discussion] 2022-12-15 21:25:52
>>gransh+1q
A thing that can be viewed can be learned from.

I can't copy your GPL code. I might be able to write my own code that does the same thing.

I'm going to defend this statement in advance. A lot of software developers white-knight more than they strictly have to; they claim that learning from GPL code unavoidably results in infringing reproduction of that code.

Courts, however, apply a test [1], in an attempt to determine the degree to which the idea is separable from the expression of that idea. Copyright protects particular expression, not idea, and in the case that the idea cannot be separated from the expression, the expression cannot be copyrighted. So either I'm able to produce a non-infringing expression of the idea, or the expression cannot be copyrighted, and the GPL license is redundant.

[1] https://en.wikipedia.org/wiki/Abstraction-Filtration-Compari...

12. ilammy+cL[view] [source] [discussion] 2022-12-15 22:15:46
>>archon+G5
This is like saying that continuously surveilling people whenever they are outside their private property, and live-reporting it to the internet, is not a huge ethical issue. After all, you are always at risk of being seen when in public, and the rest is merely exercising freedom of speech.

Something being currently legal and possible doesn't make it morally right.

Technology enables things and sometimes the change is qualitatively different.

13. archon+mN[view] [source] [discussion] 2022-12-15 22:27:36
>>wwwest+Wu
I wasn't speaking about dissemination by the model at all. It's possible for an AI to create an infringing work.

It's not possible for training an AI on legally obtained data to be copyright infringement. This is what I was talking about regarding transmission. Copyright gives a rights holder a legal means to limit the creation of copies of their image for transmission to me. If a rights holder has placed their image on the internet for me to view, then copyright does not give them a means to restrict how I choose to consume that image.

The AI may or may not create outputs that can be considered derivative works, or that contain characters protected by copyright.

You seem to be making an argument that we should be changing this somehow. I suppose I'll say "maybe". But it is apparent to me that many people don't know how intellectual property works.

replies(1): >>int_19+Sa1
14. int_19+Sa1[view] [source] [discussion] 2022-12-16 01:07:43
>>archon+mN
There's a separate question of whether the AI model, once trained on a copyrighted input, constitutes a derivative work of that input. In cases where the model can, with the right prompt, produce an image near-identical (as far as humans are concerned) to the input, it's hard to see how it is not just a special case of compression; and, of course, compressed images are still protected by copyright.
replies(1): >>archon+y13
15. noober+5f1[view] [source] [discussion] 2022-12-16 01:36:32
>>archon+G5
Making open-source code public always carries the risk of someone copying it and distributing it in proprietary software. That doesn't make it right or ethical. Stealing an unlocked car is unethical. Raping someone weaker than you is unethical. Just because something isn't difficult doesn't make it ethical.
replies(1): >>archon+0n3
16. archon+y13[view] [source] [discussion] 2022-12-16 15:03:06
>>int_19+Sa1
You mean the AI model itself, the weights?

A derivative work is a creative expression, based on another work, that receives its own copyright protection. It's very unlikely that AI weights would be considered a creative expression, so they would not be considered a derivative work. At this point, you probably can't copyright your AI weights.

An AI might create work that could be considered derivative if it were the creative output of a human, but it's not a human, and thus the outputs are unlikely to be considered derivative works, though they may be infringing.

replies(1): >>int_19+y05
17. archon+0n3[view] [source] [discussion] 2022-12-16 16:37:26
>>noober+5f1
This is kind of silly.

Both personal autonomy and private property are social constructs we agree are valuable. Stealing a car and raping a person are things we've identified as unacceptable and codified into law.

And in stark contrast, intellectual property is something we've decided is valuable enough to extend limited protections to, in order to incentivize creative and technological development. It is not a sacred right; it's a gambit.

It's us saying, "We identify that if we have no IP protection whatsoever, many people will have no incentive to create, and nobody will ever have an incentive to share. Therefore, we will create some protection in these specific ways in order to spur on creativity and development."

There's no (or very little) ethics to it. We've created the system not out of respect for people's connections to their creations, but in order to entice them to create, so we can ultimately expropriate their work for society as a whole. And that system affords protection in particular ways. Any usage the system permits is not only not unethical; it is the system working.

18. int_19+y05[view] [source] [discussion] 2022-12-17 00:55:38
>>archon+y13
Yes, I mean the weights.

If the original is a creative expression, then recording it using some different tech is still a creative expression. I don't see the qualitative difference between a bunch of numbers that constitute weights in a neural net and a bunch of numbers that constitute bytes in a compressed image file, if both can be used to recreate the original with minor deviations (like compression artifacts in the latter case).
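
To make that concrete, here's a minimal sketch of the idea (assuming PyTorch and Pillow are available; "input.png" is a placeholder for some small image, and the tiny network architecture is my own arbitrary choice): deliberately overfit a coordinate-to-color network on a single image, then reconstruct the image from nothing but the weights. The reconstruction is approximate, much like an aggressively compressed JPEG.

    # Overfit a tiny net to ONE image, then reconstruct it from the weights.
    # The weights end up acting as a lossy encoding of that single image.
    import numpy as np
    import torch
    import torch.nn as nn
    from PIL import Image

    img = np.asarray(Image.open("input.png").convert("RGB"), dtype=np.float32) / 255.0
    h, w, _ = img.shape

    # Inputs: normalized (row, col) pixel coordinates. Targets: RGB values.
    ys, xs = np.mgrid[0:h, 0:w]
    coords = torch.tensor(np.stack([ys / h, xs / w], -1).reshape(-1, 2), dtype=torch.float32)
    colors = torch.tensor(img.reshape(-1, 3))

    net = nn.Sequential(
        nn.Linear(2, 256), nn.ReLU(),
        nn.Linear(256, 256), nn.ReLU(),
        nn.Linear(256, 3), nn.Sigmoid(),
    )
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)

    for _ in range(2000):  # deliberately memorize this one image
        opt.zero_grad()
        loss = nn.functional.mse_loss(net(coords), colors)
        loss.backward()
        opt.step()

    # "Decompress": query every pixel coordinate. The output approximates the
    # original, artifacts and all -- no copy of input.png is consulted here.
    with torch.no_grad():
        recon = (net(coords).reshape(h, w, 3).numpy() * 255).astype(np.uint8)
    Image.fromarray(recon).save("recon.png")

A real image generator is vastly more complicated than this toy, but the point stands: "a bunch of weights" can straightforwardly encode a recoverable image.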
