zlacker

[return to "Elon Musk sues Sam Altman, Greg Brockman, and OpenAI [pdf]"]
1. HarHar+vu1 2024-03-01 19:23:01
>>modele+(OP)
Any competent lawyer is going to get Musk on the stand reiterating his opinions about the danger of AI. If the tech really is dangerous, then being more closed is arguably in the public's best interest, and this is certainly the reason OpenAI have previously given.

I'm not saying I agree that being closed source serves the public good, although one could certainly argue that accelerating bad actors' efforts to catch up would not be a positive.

2. nicce+1w1 2024-03-01 19:30:18
>>HarHar+vu1
> If the tech really is dangerous, then being more closed is arguably in the public's best interest, and this is certainly the reason OpenAI have previously given.

Not really. It only slows things down, like security through obscurity. It needs to be open so that we know the real risks and have the best information to combat them. Otherwise, someone doing the same thing in a closed manner has a better chance of gaining an advantage when misusing it.

3. tw04+wy1 2024-03-01 19:43:25
>>nicce+1w1
Just like nuclear weapons?

The whole “security through obscurity doesn’t work” line is absolute nonsense. It absolutely works, and there are countless real-world examples. What doesn’t work is relying on it as your ONLY security.

4. gary_0+0C1 2024-03-01 20:04:05
>>tw04+wy1
I'm not sure nuclear weapons are a good example. In the 1940s, most non-weapons-related nuclear research was public (and that did make certain agencies nervous). That's just how scientists tend to do things.

While the US briefly had unique knowledge about the manufacture of nuclear weapons, the basics could be easily worked out from first principles, especially once schoolchildren could pick up an up-to-date book on atomic physics. The engineering and testing part is difficult, of course, but for a large nation-state stealing the plans is only a shortcut. The on-paper part of the engineering is doable by any team with the right skills. So the main blocker with nuclear weapons isn't the knowledge, it's acquiring the raw fissile material and establishing the industrial base required to refine it.

This makes nuclear weapons a poor analogy for AI, because all you need to develop an LLM is a big pile of commodity GPUs, the publicly available training data, some decent software engineers, and time.

So in both cases all security-through-obscurity will buy you is a delay, and when it comes to AI probably not a very long one (except maybe if you can restrict the supply of GPUs, but the effectiveness of that strategy against China et al remains to be seen).

5. tw04+lZ1 2024-03-01 22:35:55
>>gary_0+0C1
>This makes nuclear weapons a poor analogy for AI, because all you need to develop an LLM is a big pile of commodity GPUs, the publicly available training data, some decent software engineers, and time.

Except the GPUs are under export control, and keeping up with the arms race requires a bunch of data you don't have access to (Nvidia's IP), or direct access to the source.

Just like building a nuclear weapon requires access either to already-refined fissile material or to the IP and skills to build your own refining facilities (IP most countries don't have). Literally everyone has access to uranium; being able to do something useful with it is another story.

Kind of like... AI.

6. a_wild+052 2024-03-01 23:15:31
>>tw04+lZ1
After the export ban, China demonstrated a process-node advancement that shocked the world. So the GPU story doesn't support your position particularly well.

Every wealthy nation & individual on Earth has abundant access to AI's "ingredients" -- compute, data, and algorithms from the '80s. The resource controls aren't really comparable to nuclear weapons. Moreover, banning nukes doesn't also risk delaying cures for disease, fusion power, materials-science innovation thrown into overdrive, and other incredible developments. That's because you're comparing a general-purpose tool to one proliferated exclusively for mass slaughter. It's just... not a remotely appropriate comparison.
