zlacker

[parent] [thread] 27 comments
1. shadow+(OP)[view] [source] 2022-10-16 21:55:17
Searching for the function names in his libraries, I'm seeing some 32,000 hits.

I suspect he has a different problem which (thanks to Microsoft) is now a problem he has to care about: his code probably shows up in one or more repos copy-pasted with improper LGPL attribution. There'd be no way for Copilot to know that had happened, and it would have mixed in the code.

(As a side note: understanding why an ML engine outputs a particular result is still an open area of research AFAIK.)

replies(11): >>ianbut+01 >>enrage+N2 >>chiefa+05 >>armcha+Zt >>andrea+aF >>mattig+ZQ >>vinter+oR >>kitsun+HV >>cerved+w01 >>manhol+Uf1 >>neop1x+8p1
2. ianbut+01[view] [source] 2022-10-16 22:05:53
>>shadow+(OP)
Yeah that's a mess, but that's way too much legal baggage for me, an otherwise innocent end user, to want to take on. Especially when I personally tend to try and monetize a lot of my work.

I understand there's no way for the model to know, but then it's really on Microsoft to ensure no private, poorly licensed, or proprietary code is included in the training set. That sounds like a very tall order, but I think they're going to have to, because otherwise they'll eventually run into legal problems with someone who has enough money to make it hurt for them.

replies(3): >>shadow+X1 >>twaw+uM >>kwhite+VR
◧◩
3. shadow+X1[view] [source] [discussion] 2022-10-16 22:12:55
>>ianbut+01
Agreed. Silver lining: MS is now heavily incentivized to invest in solutions for an open research problem.
4. enrage+N2[view] [source] 2022-10-16 22:20:21
>>shadow+(OP)
Expanding on that, even if Microsoft sees the error of their ways and retrains copilot against permissively licensed source or with explicit opt-in, it may get trained on proprietary code the old version of copilot inserted into a permissively licensed project.

You would have to just hope that you can take down every instance of your code and keep it down, all while copilot keeps making more instances for the next version to train on and plagiarize.

replies(1): >>scrame+UF
5. chiefa+05[view] [source] 2022-10-16 22:39:18
>>shadow+(OP)
I thought the same thing. But then shouldn't Copilot look at the things it's not supposed to use and check whether that's happened? How is that any different from you committing your API key to Platform X and shortly thereafter Platform X reaching out to you... because GH let them know?
6. armcha+Zt[view] [source] 2022-10-17 02:43:02
>>shadow+(OP)
This is exactly what I was thinking. It's still a legal headache for Microsoft, but it's not like they're just blatantly ignoring the license.
7. andrea+aF[view] [source] 2022-10-17 05:29:37
>>shadow+(OP)
"It's too hard" isn't a valid reason for me to not follow laws and/or social norms. This is a predictable result and was predicted by many people; "oops we didn't know" is neither credible nor acceptable.
replies(1): >>Spivak+DG
◧◩
8. scrame+UF[view] [source] [discussion] 2022-10-17 05:41:58
>>enrage+N2
> Microsoft

> sees the error of their ways

You must be new here.

◧◩
9. Spivak+DG[view] [source] [discussion] 2022-10-17 05:50:58
>>andrea+aF
It’s not “oops we didn’t know” it’s, “someone published a project under a permissive license which included this code.”

If your standard is “Github should have an oracle to the US court system and predict what the outcome of a lawsuit alleging copyright infringement for a given snippet of code would be” then it is literally impossible for anyone to use any open source code ever because it might contain infringing code.

There is no chain of custody for this kind of thing which is what it would require.

replies(4): >>polyma+JK >>vincne+gO >>schwar+Z91 >>cowtoo+GH1
◧◩◪
10. polyma+JK[view] [source] [discussion] 2022-10-17 06:36:23
>>Spivak+DG
Exactly. A chain of custody is absolutely required for this to be legal, because no oracle can exist. It must be able to attribute exactly who contributed the suspect code, and it must be able to handle the edge case where some humans publish code without permission.

Either that, or we effectively get rid of software copyright, because Copilot can be used (or even just claimed to have been used) to launder code of its license restrictions. E.g. "No, I didn't copy your code; I used Copilot and it copied your code, so I did nothing wrong."

replies(2): >>roer+nW >>concor+Kh1
◧◩
11. twaw+uM[view] [source] [discussion] 2022-10-17 06:57:03
>>ianbut+01
Open source code is only open source when its license is obeyed. When the license is not obeyed, e.g. the copyright notice is not reproduced, it should be treated as private code (except for dual-licensed code).
◧◩◪
12. vincne+gO[view] [source] [discussion] 2022-10-17 07:15:35
>>Spivak+DG
This reminds me of my 4-year-old daughter. She often comes home from kindergarten with new toys. When I ask her where she got them, she tells me a friend gave them to her as a gift. When I dig deeper and ask around, it turns out that the friends gifting her things were not the real owners. I can see why it can be difficult for children to understand the concept of ownership, and that you should not gift things to others that are not your own.

So in this case Copilot just looks at the situation as "someone gifted me this", and does not question whether the person doing the gifting was the real owner of the gift.

replies(1): >>Spivak+Vc2
13. mattig+ZQ[view] [source] 2022-10-17 07:46:08
>>shadow+(OP)
It doesn't matter that there is no way for Copilot to know what happened; doing something illegal because hundreds of people did it before is never a valid excuse under the rule of law, nor is "I didn't know it was illegal". That goes for copying code without permission just as much as for jaywalking.
14. vinter+oR[view] [source] 2022-10-17 07:50:03
>>shadow+(OP)
Well yes, there'd be no way for the copilot model, as currently specified and trained, to know.

But it IS possible to train a model for that. In fact, I believe ML models can be fantastic "code archaeologists", giving us insights into not just direct copying, but inspiration and idioms as well. They don't just have the code, they have commit histories with timestamps.

A causal fact which these models could incorporate is that we know data from the past wasn't influenced by data from the future. I believe that is a lever to pry open a lot of wondrous discoveries, and I can't wait until a model with this causal assumption is let loose on Spotify's catalog, and we get a computer's perspective on who influenced whom.

But in the meantime, discovering where copy-pasted code originated should be a lot easier.

replies(1): >>pca006+v81
◧◩
15. kwhite+VR[view] [source] [discussion] 2022-10-17 07:55:46
>>ianbut+01
Of course the model can know if the code is repeated in multiple repositories with different licences. The people who maintain copilot simply don't care to make it do so.
16. kitsun+HV[view] [source] 2022-10-17 08:35:17
>>shadow+(OP)
There'd be no way for Copilot to know that had happened? What? YT uses Content ID. GH could set up a similar program for OSS.
replies(1): >>kbelde+143
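A Content ID analogue for code is not far-fetched; plagiarism detectors have used winnowing fingerprints (the MOSS approach) for decades. A minimal sketch of that idea, with all names illustrative and no connection to any real GitHub API:

```python
# Winnowing-style fingerprinter sketch: hash every k-gram of the
# normalized text, keep only the minimum hash per sliding window,
# and compare fingerprints by Jaccard overlap.

import hashlib

def kgram_hashes(text: str, k: int = 5):
    """Hash every k-character window of whitespace-stripped, lowercased text."""
    norm = "".join(text.split()).lower()
    return [int(hashlib.sha1(norm[i:i + k].encode()).hexdigest(), 16)
            for i in range(len(norm) - k + 1)]

def winnow(hashes, window: int = 4):
    """Keep the minimum hash in each sliding window -> the fingerprint set."""
    return {min(hashes[i:i + window]) for i in range(len(hashes) - window + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of the two fingerprints (1.0 = identical)."""
    fa, fb = winnow(kgram_hashes(a)), winnow(kgram_hashes(b))
    return len(fa & fb) / len(fa | fb) if fa | fb else 0.0
```

Because normalization strips whitespace, reformatting a snippet does not change its fingerprint; real systems also normalize identifiers so renamed variables still match, which this sketch does not attempt.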
◧◩◪◨
17. roer+nW[view] [source] [discussion] 2022-10-17 08:42:26
>>polyma+JK
Right, so we need a system for when a dev goes and grabs code-snippets from blogs and open-source freely licensed projects on e.g. github in which they can say that the code is from so-and-so source? So like a way to distribute and inherit git blame?
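Something like this already half-exists: the `SPDX-License-Identifier` comment header is a real, widely used convention. The snippet-provenance tags below are hypothetical, sketched only to show what distributable, inheritable blame for a vendored snippet might look like:

```python
# SPDX-License-Identifier is a real SPDX short-form convention; the
# Snippet-Source and Snippet-Commit tags are hypothetical placeholders
# for the kind of inheritable provenance described above.

# SPDX-License-Identifier: LGPL-2.1-only
# Snippet-Source: https://example.com/upstream/repo  (hypothetical URL)
# Snippet-Commit: <upstream commit hash>
def clamp(x, lo, hi):
    """The vendored snippet itself, shown only as a carrier for the header."""
    return max(lo, min(x, hi))
```

The point is that the header travels with the snippet when it is copied, so the attribution survives the paste.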
18. cerved+w01[view] [source] 2022-10-17 09:25:20
>>shadow+(OP)
> There'd be no way for Copilot to know that had happened

All lines are associated with a commit, which has an author/commit date. A reasonable guess can be made as to which snippet came first.

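Given that, a first-pass origin guesser needs nothing smarter than a minimum over introduction dates. A sketch, where the sightings data is made up and would in practice come from something like `git log --diff-filter=A --format=%aI -- <file>`:

```python
# If the same snippet shows up in several repos, the repo whose commit
# introduced it earliest is the likely origin. The sightings mapping here
# is hypothetical example data, not output of any real scanner.

from datetime import datetime

def likely_origin(sightings: dict[str, str]) -> str:
    """Given repo -> ISO-8601 date the snippet first appeared there,
    return the repo with the earliest introduction date."""
    return min(sightings, key=lambda repo: datetime.fromisoformat(sightings[repo]))

sightings = {
    "github.com/original/lib": "2014-03-02T11:04:00",
    "github.com/fork/app":     "2019-07-21T09:30:00",
    "github.com/copy/paste":   "2021-01-05T16:12:00",
}
```

This is only a heuristic: rebased or force-pushed history can falsify dates, so it yields a "reasonable guess", not proof of authorship.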
◧◩
19. pca006+v81[view] [source] [discussion] 2022-10-17 10:54:26
>>vinter+oR
Ah, a plagiarism checker that can understand simple code transformations and find the original source? Sounds like a good idea for patent trolls, and I have no idea how/if copyright law applies in this case. Does copying the idea, but not the code verbatim, constitute copyright violation?
replies(1): >>vinter+qj1
◧◩◪
20. schwar+Z91[view] [source] [discussion] 2022-10-17 11:08:44
>>Spivak+DG
If someone created an AI for making movies and it started spitting out Star Wars and Marvel stuff, you can bet that them saying "we trained it on other materials that violate copyright" wouldn't be enough. They are banking on most devs not knowing, not caring, or not having the ability to follow through on this.
21. manhol+Uf1[view] [source] 2022-10-17 11:55:11
>>shadow+(OP)
> his code probably shows up in one or more repos copy-pasted with improper LGPL attribution.

Can Copilot prove that and link to the source LGPL code whenever it reproduces more than half a line of code from such a source?

Because without that clear attribution trail, nobody in their right mind would contaminate their codebase with possibly stolen code. Hell, some bad actor might purposefully publish a proprietary base full of stolen LGPL code, and run scanners on other products until they get a Copilot "bite". When that happens and you get sued, good luck finding the original open source code both you and your aggressor derive from.

◧◩◪◨
22. concor+Kh1[view] [source] [discussion] 2022-10-17 12:10:30
>>polyma+JK
This takes place with or without copilot. The problem would be people copying code and releasing it under a different license.
◧◩◪
23. vinter+qj1[view] [source] [discussion] 2022-10-17 12:23:41
>>pca006+v81
The patent troll version of the algorithm needs the victim's bank balance as input too. In fact that's probably all it needs.

It would be much more valuable for people who care about the truth.

24. neop1x+8p1[view] [source] 2022-10-17 13:05:36
>>shadow+(OP)
>> his code probably shows up in one or more repos copy-pasted with improper LGPL attribution

That is why Copilot should have always been opt-in (explicitly ask original authors to provide their code to copilot training). Instead, they are simply stealing the code of others.

◧◩◪
25. cowtoo+GH1[view] [source] [discussion] 2022-10-17 14:35:04
>>Spivak+DG
I am going to make a robot that burns your house down. You might think this is unethical, but what do you expect me to do? Implement an oracle to the US court system?

You might think it's unreasonable to build such a house-burning robot, but you have to realize that I actually designed it as a lawn-mowing robot. The robot will simply not value your life or property, because its utility function is descended from my own, so it may burn your house down in the regular course of its duties (if it decides to use a powerful laser to trim the grass to the exact nanometer). Sorry, neighbor.

What do you expect me to do? NOT build this robot? How dare you stand in the way of progress!

◧◩◪◨
26. Spivak+Vc2[view] [source] [discussion] 2022-10-17 16:35:24
>>vincne+gO
> and does not question if the person gifting was the real owner of the gift

Unless you can figure out a method of determining whether someone owns the code that doesn't amount to "try suing in court for copyright infringement and see if it sticks", we're kinda stuck. Just because a codebase contains an exact or similar snippet from another codebase doesn't mean that snippet reaches the threshold of copyrightable work. And the reverse: just because two code snippets look wildly different doesn't mean it's not infringement, and detecting that automatically is tantamount to solving the halting problem.

The thing you actually want software to have to solve this is a chain of custody, which we don't have. If you require everyone to assume that everyone else could be lying or mistaken about infringement, then using any open source project for anything becomes legal hot water.

In fact, when you upload code to Github you grant them a license to do things like "display it", which you can't grant if you don't actually own the copyright or have a license. So even before the code is ever slurped into Copilot, the exact same legal situation arises as to whether Github is allowed to host the code at all. Can you imagine if, when you uploaded code to Github, you had to sign a document saying you owned the code and indemnifying Microsoft against any lawsuit alleging infringement? Oh boy, people would not enjoy that.

replies(1): >>vincne+xb4
◧◩
27. kbelde+143[view] [source] [discussion] 2022-10-17 20:44:19
>>kitsun+HV
>GH could set up a similar program for OSS.

What a nightmare.

I'd say that constant code copying is massively pervasive, with no regard to licensing, and always has been. That's not really a bad thing, and attempts to stop it are going to be far more harmful than helpful.

◧◩◪◨⬒
28. vincne+xb4[view] [source] [discussion] 2022-10-18 06:18:41
>>Spivak+Vc2
I'll flip it around. If you can't figure out if the code is properly copyrighted, and can't afford to face consequences, don't use it.