zlacker

[parent] [thread] 3 comments
1. nieman+(OP)[view] [source] 2023-07-15 17:00:52
My reading of the relevant laws would actually lead me to believe that this is not a problem, as long as those reproductions are not returned and the rights holder did not opt out. But courts might decide differently.

Regarding the copyright of returned material here is a good discussion:

https://copyrightblog.kluweriplaw.com/2023/05/09/generative-...

replies(2): >>JumpCr+s5 >>pedroc+gn
2. JumpCr+s5[view] [source] 2023-07-15 17:31:25
>>nieman+(OP)
> as long as those reproductions are not returned

That’s the author’s entire gripe. Brave reproduced a Wikipedia entry without attribution and then slapped a copyright on it to boot.

3. pedroc+gn[view] [source] 2023-07-15 19:40:32
>>nieman+(OP)
That's clearly not enough. There's a continuum between producing an exact copy of the input and showing genuine creativity because the model actually learned something. A model that just reformats code and changes all the variable names would pass your test and yet clearly be a copyright violation. The whole argument requires that the neural network weights do something creative because they learned from the code, rather than merely transforming it. We're even careful about this with humans, using things like clean-room reimplementations to make sure.
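
To make the "reformats and renames" point concrete: a purely mechanical rename is a few lines of Python with the standard ast module. This is just a hypothetical sketch of mine, not anything from the article, but if this transformation were enough to launder copyright, the test would be meaningless:

```python
import ast

class Renamer(ast.NodeTransformer):
    """Rewrite every identifier to an opaque placeholder."""
    def __init__(self):
        self.mapping = {}

    def visit_Name(self, node):
        # Give each distinct identifier a fresh, meaningless name.
        new_id = self.mapping.setdefault(node.id, f"v{len(self.mapping)}")
        return ast.copy_location(ast.Name(id=new_id, ctx=node.ctx), node)

source = "total = 0\nfor item in values:\n    total = total + item\n"
renamed = ast.unparse(Renamer().visit(ast.parse(source)))
print(renamed)  # same program, new names -- still a copy of the original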
replies(1): >>hakfoo+tZ
4. hakfoo+tZ[view] [source] [discussion] 2023-07-16 01:27:36
>>pedroc+gn
The window for genuinely creative output may be narrow, and it varies with the task.

There are plenty of fairly complex prompts where, if you asked a group of reasonably skilled programmers to implement them, they'd each produce code that was "reformatted with changed variable names" but otherwise identical. Many of us learned from the same foundational materials, and there are only a handful of non-pathological ways to implement, say, a linked list of integers.
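
A sketch of what I mean (names like Node and push are just the obvious choices, not taken from any particular source): ask a dozen programmers for a singly linked list of integers and you'll get essentially this, modulo formatting and identifiers.

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def push(head, value):
    """Prepend a value; returns the new head."""
    return Node(value, head)

def to_list(head):
    """Walk the chain and collect the values."""
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out

head = None
for n in (3, 2, 1):
    head = push(head, n)
print(to_list(head))  # [1, 2, 3]
```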

With code this may be more obvious, since you can't as easily obfuscate things with synonyms and changes in sentence structure. But even with prose, there's going to be a tendency toward "conventional" language choices, driving you back toward a familiar-looking mean.
