zlacker

[parent] [thread] 10 comments
1. archon+(OP)[view] [source] 2022-12-15 18:58:54
It is not a huge ethical issue. The artists have always been at risk of someone learning their style if they make their work available for public viewing.

We've just made "learning style" easier, so a thing that was always a risk is now happening.

replies(3): >>wwwest+Ma >>ilammy+wF >>noober+p91
2. wwwest+Ma[view] [source] 2022-12-15 19:48:37
>>archon+(OP)
Let's shift your risk of immediate assault and death up by a few orders of magnitude. I'm sure you'll see that as "just" something that was always a risk, pretty much status quo, right?

Oh, life & death is different? Don't be so sure; there's good reasons to believe that livelihood (not to mention social credit) and life are closely related -- and also, the fundamental point doesn't depend on the specific example: you can't point to an orders-of-magnitude change and then claim we're dealing with a situation that's qualitatively like it's "always" been.

"Easier" doesn't begin to honestly represent what's happened here: we've crossed a threshold where we have technology for production by automated imitation at scale. And where that tech works primarily because of imitation, the work of those imitated has been a crucial part of that. Where that work has a reasonable claim of ownership, those who own it deserve to be recognized & compensated.

replies(1): >>archon+ec
3. archon+ec[view] [source] [discussion] 2022-12-15 19:54:43
>>wwwest+Ma
The 'reasonable claim of ownership' extends to restricting transmission, not use after transmission.

Artists are poets, and they're railing against Trurl's electronic bard.

[https://electricliterature.com/wp-content/uploads/2017/11/Tr...]

replies(1): >>wwwest+gp
4. wwwest+gp[view] [source] [discussion] 2022-12-15 20:51:36
>>archon+ec
> The 'reasonable claim of ownership' extends to restricting transmission, not use after transmission.

It's not clear you're correct even on the limited terms of your own argument. "Transmission" of some sort is certainly occurring when the work is given as input. It's probably even tenable to argue that a copy is created in the representation of the model.

You probably mean to argue something to the effect that dissemination by the model is the key threshold by which we'd recognize something like the current copyright law might fail to apply, the transformative nature of output being a key distinction. But some people have already shown that some outputs are much less transformative than others -- and even that's not the overall point, which is that this is a qualitative change much like those that gave birth to industrial-revolution copyright itself, and calls for a similar kind of renegotiation to protect the underlying ethics.

People should have a say in how the fruits of their labor are bargained for and used. Including into how machines and models that drive them are used. That's part of intentionally creating a society that's built for humans, including artists and poets.

replies(1): >>archon+GH
5. ilammy+wF[view] [source] 2022-12-15 22:15:46
>>archon+(OP)
This is like saying that continuously surveilling people when they are outside of their private property and live-reporting it to the internet is not a huge ethical issue. For you are always at risk of being seen when in public and the rest is merely exercising freedom of speech.

Something being currently legal and possible doesn't mean it is morally right.

Technology enables things and sometimes the change is qualitatively different.

6. archon+GH[view] [source] [discussion] 2022-12-15 22:27:36
>>wwwest+gp
I wasn't speaking about dissemination by the model at all. It's possible for an AI to create an infringing work.

Training an AI on data that was obtained legally cannot itself be copyright infringement. This is what I was talking about regarding transmission. Copyright provides a legal means for a rights holder to limit the creation of a copy of their image in order for it to be transmitted to me. If a rights holder has placed their image on the internet for me to view, then copyright does not provide them a means to restrict how I choose to consume that image.

The AI may or may not create outputs that can be considered derivative works, or contain characters protected by copyright.

You seem to be making an argument that we should be changing this somehow. I suppose I'll say "maybe". But it is apparent to me that many people don't know how intellectual property works.

replies(1): >>int_19+c51
7. int_19+c51[view] [source] [discussion] 2022-12-16 01:07:43
>>archon+GH
There's a separate question of whether the AI model, once trained on a copyrighted input, constitutes a derived work of that input. In cases where the model can, with the right prompt, produce a near-identical (as far as humans are concerned) image to the input, it's hard to see how it is not just a special case of compression; and, of course, compressed images are still protected by copyright.
replies(1): >>archon+SV2
8. noober+p91[view] [source] 2022-12-16 01:36:32
>>archon+(OP)
Making code open source always carries the risk of someone copying it and distributing it in proprietary code. That doesn't make it right or ethical. Stealing an unlocked car is unethical. Raping someone who is weaker than you is unethical. Just because something isn't difficult doesn't make it ethical.
replies(1): >>archon+kh3
9. archon+SV2[view] [source] [discussion] 2022-12-16 15:03:06
>>int_19+c51
You mean the AI model itself, the weights?

A derivative work is a creative expression based on another work that receives its own copyright protection. It's very unlikely that AI weights would be considered a creative expression, so they would not be considered a derivative work. At this point, you probably can't copyright your AI weights.

An AI might create work that could be considered derivative if it were the creative output of a human, but it's not a human, and thus the outputs are unlikely to be considered derivative works, though they may be infringing.

replies(1): >>int_19+SU4
10. archon+kh3[view] [source] [discussion] 2022-12-16 16:37:26
>>noober+p91
This is kind of silly.

Both personal autonomy and private property are social constructs we agree are valuable. Stealing a car and raping a person are things we've identified as unacceptable and codified into law.

And in stark contrast, intellectual property is something we've identified as being valuable to extend limited protections to in order to incentivize creative and technological development. It is not a sacred right, it's a gambit.

It's us saying, "We identify that if we have no IP protection whatsoever, many people will have no incentive to create, and nobody will ever have an incentive to share. Therefore, we will create some protection in these specific ways in order to spur on creativity and development."

There's no (or very little) ethics to it. We've created a system not out of respect for people's connections to their creations, but in order to entice them to create so we can ultimately expropriate it for society as a whole. And that system affords protection in particular ways. Any usage that is permitted by the system is not only not unethical, it is the system working.

11. int_19+SU4[view] [source] [discussion] 2022-12-17 00:55:38
>>archon+SV2
Yes, I mean the weights.

If the original is a creative expression, then recording it using some different tech is still a creative expression. I don't see the qualitative difference between a bunch of numbers that constitutes weights in a neural net, and a bunch of numbers that constitute bytes in a compressed image file, if both can be used to recreate the original with minor deviations (like compression artifacts in the latter case).
