zlacker

[return to "GitHub Copilot, with “public code” blocked, emits my copyrighted code"]
1. ianbut+ce[view] [source] 2022-10-16 21:38:47
>>davidg+(OP)
I just tested it myself on a random C file I created in the middle of a Rust project I'm working on, and it reproduced his full code verbatim from just the function header. So it clearly does regurgitate proprietary code, contrary to what some people have said. I don't have his source locally, so Copilot isn't just using existing context.
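To be concrete about what "just the function header" means, the setup was roughly the sketch below; the file and function signature here are made up for illustration, not his actual code:

    /* scratch .c file dropped into an otherwise unrelated Rust project */

    /* hypothetical signature, standing in for a function header from
       his LGPL'd library; not his real function name */
    void lib_decompose(const double *A, int n, double *L, double *U)
    {
        /* typing just the line above was all it took for Copilot to
           offer what looked like the full original body, verbatim */
        (void)A; (void)n; (void)L; (void)U;
    }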

I've been finding Copilot really useful, but I'll be pausing it for now, and I'm glad I've only been using it on personal projects and not anything for work. This crosses the line in my head from legal ambiguity to legal "yeah, that's gonna have to stop".

2. shadow+Wf[view] [source] 2022-10-16 21:55:17
>>ianbut+ce
Searching for the function names in his libraries, I'm seeing some 32,000 hits.

I suspect he has a different problem which (thanks to Microsoft) is now a problem he has to care about: his code probably shows up in one or more repos copy-pasted with improper LGPL attribution. There'd be no way for Copilot to know that had happened, and it would have mixed in the code.

(As a side note: understanding why an ML engine outputs a particular result is still an open area of research AFAIK.)

3. andrea+6V[view] [source] 2022-10-17 05:29:37
>>shadow+Wf
"It's too hard" isn't a valid reason for me to not follow laws and/or social norms. This is a predictable result and was predicted by many people; "oops we didn't know" is neither credible nor acceptable.
4. Spivak+zW[view] [source] 2022-10-17 05:50:58
>>andrea+6V
It’s not “oops we didn’t know”; it’s “someone published a project under a permissive license which included this code.”

If your standard is “GitHub should have an oracle into the US court system and predict the outcome of a copyright-infringement lawsuit over any given snippet of code”, then it is literally impossible for anyone to ever use any open source code, because it might contain infringing code.

There is no chain of custody for this kind of thing, which is what that standard would require.

5. schwar+Vp1[view] [source] 2022-10-17 11:08:44
>>Spivak+zW
If someone created an AI for making movies and it started spitting out Star Wars and Marvel stuff, you can bet that saying "we trained it on other materials that violate copyright" wouldn't be enough. They are banking on most devs not knowing, not caring, or not having the ability to follow through on this.