zlacker

[parent] [thread] 2 comments
1. Feepin+(OP)[view] [source] 2022-10-17 11:49:49
I'm worried that this will harm open source, but in a different way: lots of people switching to unfree "no commercial use at all" licenses, special exemptions in licenses, and so on. I'm also worried that it'll harm scientific progress by criminalizing a deeply harmless and commonplace activity such as "learning from open code" when it's AIs that do it. And of course it would retard the progress of AI code assistance, a vital component of scaling up programmer productivity.

From an AI safety perspective, I'm also worried it will accelerate the transition to self-learning code, i.e. the model both generating and learning from source code, which is a crucial step on the way to artificial general intelligence that we are not ready for.

replies(1): >>carom+dg2
2. carom+dg2[view] [source] 2022-10-17 23:33:23
>>Feepin+(OP)
Horrible framing. AI is not learning from code. The model is a function. The AI is a derivative work of its training material. They built a program based on open source code and failed to open source it.

They also built a program that outputs open source code without tracking the license.

This isn't a human who read something and distilled a general concept. This is a program that spits out a chain of tokens. It is more akin to a human who copied some copyrighted material verbatim.

replies(1): >>Feepin+Yu2
3. Feepin+Yu2[view] [source] [discussion] 2022-10-18 01:26:19
>>carom+dg2
The brain is a function. You're positing a distinction without a difference.