zlacker

[parent] [thread] 17 comments
1. gus_ma+(OP)[view] [source] 2022-12-15 13:16:38
I think the correct way to get empathy is to use an equivalent that technical people understand, like Copilot:

* Can a Copilot-like generator be trained with the GPL code of RMS? What is the license of the output?

* Can a Copilot-like generator be trained with the leaked source code of MS Windows? What is the license of the output?

replies(3): >>imgabe+11 >>Terret+g3 >>Peteri+c9
2. imgabe+11[view] [source] 2022-12-15 13:21:12
>>gus_ma+(OP)
If a human learns to program by reading GPL code, what is the license of future code they write?
replies(4): >>zorked+F1 >>gus_ma+j7 >>Alexan+ve >>6gvONx+8T
3. zorked+F1[view] [source] [discussion] 2022-12-15 13:24:48
>>imgabe+11
A language model is not a human. You at least have the possibility that the human learned something. The language model is a parrot with a large memory.

That said, Microsoft didn't allow their kernel developers to look at Linux code for a reason.

replies(1): >>ben_w+Z2
4. ben_w+Z2[view] [source] [discussion] 2022-12-15 13:31:17
>>zorked+F1
What definition of learning are you using that makes humans not parrots and a deep learning system not learning?

I know current AI is very different from an organic brain at many levels, but I don't know if any of those differences really matter.

replies(2): >>zorked+Cl >>NateEa+3w
5. Terret+g3[view] [source] 2022-12-15 13:32:38
>>gus_ma+(OP)
Your example is like saying we should have empathy for people who can whittle when a 3D printer can now extrude the same design in bulk. Or like empathy for London cabbies having to learn roads when "anyone" can A-to-B now with a phone.

Code should not need to be done by humans at all. There's no reason coding as it exists today should exist as a job in the future.

Any time a colleague or I am "debugging" something, I'm just sad we are so "dark ages" that the IDE isn't saying "THERE, humans, the bug is THERE!" in flashing red. The IDE has the potential to have perfect information, so "where is the bug" is solvable.

The job of coding today should continue to rise up the stack tomorrow to where modules and libraries and frameworks are just things machines generate in response to a dialog about “the job to be done”.

The primary problem space of software is in the business domain, today requiring people who speak barely abstracted machine language to implement -- still such painfully early days.

We're cavemen chipping at rocks to make fire, still amazed at the trick. No empathy, just self-awareness sufficient to provoke us into researching fusion.

replies(1): >>Kalium+Wh
6. gus_ma+j7[view] [source] [discussion] 2022-12-15 13:54:58
>>imgabe+11
It's more complicated, even if humans are involved. From https://wiki.winehq.org/Developer_FAQ#Copyright_Issues

> Who can't contribute to Wine?

> Some people cannot contribute to Wine because of potential copyright violation. This would be anyone who has seen Microsoft Windows source code (stolen, under an NDA, disassembled, or otherwise). There are some exceptions for the source code of add-on components (ATL, MFC, msvcrt); see the next question.

I've seen a few MIT/BSD projects that ask people not to contribute if they have seen the equivalent GPL project. It's a problem because Copilot has seen "all" GPL projects.

7. Peteri+c9[view] [source] 2022-12-15 14:03:29
>>gus_ma+(OP)
I don't think that's a road to empathy. If we're talking about the matter of empathy, i.e. "emotional shoulds" rather than the nuances of current legal policy, then I'd expect a nontrivial share of technical people to say that a morally reasonable answer to both scenarios could (or should) be "Yes, and whatever you want - not treated as a derivative work bound by the license of the training data", which is probably the opposite of what artists would want.

While technically both artists and developers make their living by producing copyrighted works, our relationship to copyright is very different; while artists rely on copyright and overwhelmingly support its enforcement as-is, many developers (including myself) would argue for a significant reduction of its length or scale.

For tech workers (tech company owners could have a different perspective), copyright is just an accidental fact of life. Since most paid development work is done as work-for-hire for custom stuff needed by one company, that model would work just as well even if copyright didn't exist or didn't extend to software. While copyright benefits our profession in many cases, in many others it harms it, and while things like the GPL rely on copyright, they are also in large part a reaction to copyright that wouldn't be needed if copyright for code didn't exist or was significantly restricted.

replies(1): >>gus_ma+RI
8. Alexan+ve[view] [source] [discussion] 2022-12-15 14:27:11
>>imgabe+11
Humans have rights, machines don't. Copyright is a system for protecting human intellectual property rights. You can't copyright things created by a monkey[1] for example. Thus it's not a contradiction to say that an action performed by a human is "transformative" while the same action performed by a machine is not.

But that is giving AI too much credit. As advanced as modern AI models are, they are not AGIs comparable to human cognition. I don't get the impulse to elevate/equate the output of trained AI models to that of human beings.

[1] https://thecopyrightdetective.com/animal-copyrights/

replies(2): >>imgabe+kh >>amanap+Rx
9. imgabe+kh[view] [source] [discussion] 2022-12-15 14:37:50
>>Alexan+ve
The AI did not create anything. It responded to a prompt given by a human to generate an output, just like Photoshop responds to someone moving the mouse and clicking, or a paintbrush responds to being dragged across a canvas.

So any transformativity of the action should be attributed to the human and the same copyright laws would apply.

replies(1): >>Alexan+P61
10. Kalium+Wh[view] [source] [discussion] 2022-12-15 14:40:10
>>Terret+g3
We can and should have empathy for all those people.

The question is perhaps not if we should have empathy for them. The question is what we should do with it once we have it. I have empathy for the cabbies with the Knowledge of London, but I don't think making any policy based on or around that empathy is wise.

This is tricky in practice. A surprising number of people equate making policy that centers the internal emotional experience of empathy with actually having empathy.

replies(1): >>Terret+er8
11. zorked+Cl[view] [source] [discussion] 2022-12-15 14:52:45
>>ben_w+Z2
Go to a judge in a copyright case and argue that humans are parrots. Then tell me how it went.
12. NateEa+3w[view] [source] [discussion] 2022-12-15 15:30:17
>>ben_w+Z2
And since you don't know if they matter, you should not presume that they don't.
13. amanap+Rx[view] [source] [discussion] 2022-12-15 15:36:20
>>Alexan+ve
I believe that you can copyright the image; it's the monkey that can't copyright it.
14. gus_ma+RI[view] [source] [discussion] 2022-12-15 16:23:17
>>Peteri+c9
It depends a lot on the type of software you are making. If it's custom software for a single client, then copyright is probably not important. (Anyway, I think a lot of custom software is shipped without the source code or with obfuscated code, so the client has to hire the developer again.)

Part of my job is something like that. I make custom programs for my department at the university. I don't care how long the copyright lasts. Anyway, I like to milk the work for a few years. There are some programs I made 5 or 10 years ago that we are still using and that save my coworkers time, and I like to use that leverage to get more freedom with my time. (How many 20% projects can I have?) Anyway, most of them need some updating because the requirements change or the environment changes, so it's not zero work on them.

There are very few projects that have long-term value. Games sell a lot of copies in a short time. MS Office gets an update every other year (Hello Clippy! Bye Clippy!), and the online version is eating it. I think it's very hard to think of programs that will have a lot of value in 50 years, but I'm still running some code in Classic VB6.

15. 6gvONx+8T[view] [source] [discussion] 2022-12-15 17:03:05
>>imgabe+11
Why does this matter? Corporations aren’t people.
16. Alexan+P61[view] [source] [discussion] 2022-12-15 18:07:02
>>imgabe+kh
But under this model, the comparisons to human learning don't apply either. What matters is whether the output is transformative - so it's fair to compare the outputs of AI systems to one of the many inputs and say "these are too similar, therefore infringement occurred". It doesn't matter what kind of mixing happened between inputs and outputs, just like it doesn't matter how many Photoshop filters I apply to an image if the result resembles what I started with "too much".
replies(1): >>imgabe+Ka2
17. imgabe+Ka2[view] [source] [discussion] 2022-12-15 23:36:29
>>Alexan+P61
Sure, just like a human can manually draw something that infringes copyright, they can use the AI to draw something that infringes copyright. It's the human infringing the copyright, not the AI.

But the fact that the human looked at a bunch of Mickey Mouse pictures and gained the ability to draw Mickey Mouse does not infringe copyright because that's just potential inside their brain.

I don't think the potential inside a learning model should infringe copyright either. It's a matter of how it's used.

18. Terret+er8[view] [source] [discussion] 2022-12-17 19:51:13
>>Kalium+Wh
Agree with this 100%. Feel for the outdated, sucks to be outmoded, but artificially prolonging the agony is not the way.