1. tw04+(OP)[view] [source] 2024-03-01 19:43:25
Just like nuclear weapons?

The whole “security through obscurity doesn’t work” claim is absolute nonsense. It absolutely works and there are countless real-world examples. What doesn’t work is relying on it as your ONLY security.

replies(6): >>matthe+r1 >>gary_0+u3 >>sobell+D3 >>whelp_+Z6 >>serf+ih >>llm_tr+cm
2. matthe+r1[view] [source] 2024-03-01 19:52:54
>>tw04+(OP)
This is a broken comparison IMO because you can’t instantly and freely duplicate nuclear weapons across the planet and then offer them up to everyone for low marginal cost and effort.

The tech exists, and will rapidly become easy to access. There is approximately zero chance of it remaining behind lock and key.

3. gary_0+u3[view] [source] 2024-03-01 20:04:05
>>tw04+(OP)
I'm not sure nuclear weapons are a good example. In the 1940s most of the non-weapons-related nuclear research was public (and that did make certain agencies nervous). That's just how scientists tend to do things.

While the US briefly had unique knowledge about the manufacture of nuclear weapons, the basics could be easily worked out from first principles, especially once schoolchildren could pick up an up-to-date book on atomic physics. The engineering and testing part is difficult, of course, but for a large nation-state stealing the plans is only a shortcut. The on-paper part of the engineering is doable by any team with the right skills. So the main blocker with nuclear weapons isn't the knowledge, it's acquiring the raw fissile material and establishing the industrial base required to refine it.

This makes nuclear weapons a poor analogy for AI, because all you need to develop an LLM is a big pile of commodity GPUs, the publicly available training data, some decent software engineers, and time.

So in both cases all security-through-obscurity will buy you is a delay, and when it comes to AI probably not a very long one (except maybe if you can restrict the supply of GPUs, but the effectiveness of that strategy against China et al remains to be seen).

replies(1): >>tw04+Pq
4. sobell+D3[view] [source] 2024-03-01 20:05:04
>>tw04+(OP)
As I understand it, the principles behind nuclear weapons are well known and the chief difficulty is obtaining enough highly enriched material.
5. whelp_+Z6[view] [source] 2024-03-01 20:25:05
>>tw04+(OP)
Nuclear weapons can definitely be replicated. The U.S. and its allies aggressively control the hard-to-get materials and actively sabotage programs that pursue them.

And the countries that want nukes have some anyway, even if they are not as good.

6. serf+ih[view] [source] 2024-03-01 21:29:08
>>tw04+(OP)
Security through obscurity isn't what is at play with nuclear weapons. It's a fabrication and chemistry nightmare at every single level; the effort and materials are what prevent these kinds of things from happening. The knowledge and research needed have been essentially available since the 50s-60s, as others have said.

It's more like 'security through scarcity and trade control.'

7. llm_tr+cm[view] [source] 2024-03-01 22:02:50
>>tw04+(OP)
The knowledge of how to build a nuclear weapon, and the tool chain needed to make one, is something that every undergraduate in physics can work out from first principles.

This has been the case since 1960: https://www.theguardian.com/world/2003/jun/24/usa.science

8. tw04+Pq[view] [source] [discussion] 2024-03-01 22:35:55
>>gary_0+u3
>This makes nuclear weapons a poor analogy for AI, because all you need to develop an LLM is a big pile of commodity GPUs, the publicly available training data, some decent software engineers, and time.

Except the GPUs are under export control, and keeping up with the arms race requires a bunch of data you don't have access to (NVidia's IP), or direct access to the source.

Just like building a nuclear weapon requires access either to already-refined fissile material or to the IP and skills to build your own refining facilities (IP most countries don't have). Literally everyone has access to uranium; being able to do something useful with it is another story.

Kind of like... AI.

replies(1): >>a_wild+uw
9. a_wild+uw[view] [source] [discussion] 2024-03-01 23:15:31
>>tw04+Pq
After the export ban, China demonstrated a process node advancement that shocked the world. So the GPU story doesn't support your position particularly well.

Every wealthy nation & individual on Earth has abundant access to AI's "ingredients" -- compute, data, and algorithms from the '80s. The resource controls aren't really comparable to nuclear weapons. Moreover, banning nukes doesn't also risk delaying cures for disease, a breakthrough in fusion, materials science thrown into overdrive, and other incredible developments. That's because you're comparing a general tool to one proliferated exclusively for mass slaughter. It's just...not a remotely appropriate comparison.

replies(1): >>tw04+ME
10. tw04+ME[view] [source] [discussion] 2024-03-02 00:14:16
>>a_wild+uw
>After the export ban, China demonstrated a process node advancement that shocked the world. So the GPU story doesn't support your position particularly well.

I'm not sure why you're conflating process technology with GPUs, but if you want to go there, sure. If anyone was surprised by China announcing they had the understanding of how to do 7nm, they haven't been paying attention. China has been openly and actively poaching TSMC engineers for nearly a decade now.

Announcing you can create a 7nm chip is a VERY, VERY different thing from producing those chips at scale. The most ambitious estimates put it at a 50% yield, and the reality is that, given China's disinformation engine, it's probably closer to 20%. They will not be catching up in process technology anytime soon.

>Every wealthy nation & individual on Earth has abundant access to AI's "ingredients" -- compute, data, and algorithms from the '80s. The resource controls aren't really comparable to nuclear weapons. Moreover, banning nukes doesn't also risk delaying cures for disease, a breakthrough in fusion, materials science thrown into overdrive, and other incredible developments. That's because you're comparing a general tool to one proliferated exclusively for mass slaughter. It's just...not a remotely appropriate comparison.

Except they don't? Not every nation on Earth has access to the technology to scale compute to the levels needed to make meaningful advances in AI. To say otherwise shows an ignorance of the market. There are a handful of capable nations, at best. Just like there are only a handful of nations with any hope of producing a nuclear weapon.
