zlacker

[parent] [thread] 3 comments
1. guitar+(OP)[view] [source] 2022-10-17 05:50:59
The chilling effect of this decision is something everybody who uses open source software should be worried about.
replies(1): >>Feepin+vy
2. Feepin+vy[view] [source] 2022-10-17 11:49:49
>>guitar+(OP)
I'm worried that this will harm open source, but in a different way: lots of people switching to unfree "no commercial use at all" licenses, special exemptions in licenses, and so on. I'm also worried that it'll harm scientific progress by criminalizing a deeply harmless and commonplace activity, "learning from open code," when it's AIs that do it. And of course it would retard the progress of AI code assistance, a vital component of scaling up programmer productivity.

From an AI safety perspective, I'm also worried it will accelerate the transition to self-learning code, i.e. the model both generating and learning from source code, which is a crucial step on the way to artificial general intelligence that we are not ready for.

replies(1): >>carom+IO2
3. carom+IO2[view] [source] [discussion] 2022-10-17 23:33:23
>>Feepin+vy
Horrible framing. AI is not learning from code. The model is a function. The AI is a derivative work of its training material. They built a program based on open source code and failed to open source it.

They also built a program that outputs open source code without tracking the license.

This isn't a human who read something and distilled a general concept. This is a program that spits out a chain of tokens. It is more akin to a human copying copyrighted material verbatim.

replies(1): >>Feepin+t33
4. Feepin+t33[view] [source] [discussion] 2022-10-18 01:26:19
>>carom+IO2
The brain is a function. You're positing a distinction without a difference.