zlacker

[parent] [thread] 9 comments
1. genera+(OP)[view] [source] 2023-05-16 14:43:45
This is going to be RSA export restrictions all over again. I wish the regulators the best of luck in actually enforcing this. I'm tempted to think that whatever regulations they put in place won't really matter that much, and progress will march on regardless.

Give it a year and a 10x more efficient algorithm, and we'll have GPT-4 on our personal devices, and there's nothing any government regulator will be able to do to stop that.

replies(3): >>antilo+l2 >>whazor+lg >>drvdev+QG1
2. antilo+l2[view] [source] 2023-05-16 14:54:33
>>genera+(OP)
Enforcing this is easy. The top high-performance GPU manufacturers (Nvidia, AMD, Intel) are all incorporated in the U.S.
replies(2): >>slowmo+LE >>jart+CX1
3. whazor+lg[view] [source] 2023-05-16 15:54:46
>>genera+(OP)
Agreed. Places like huggingface, or even torrents, are enabling an unstoppable decentralised AI race. This is like fighting media piracy. Plus, other countries might now outcompete you on AI.
4. slowmo+LE[view] [source] [discussion] 2023-05-16 17:33:40
>>antilo+l2
Meaning we won't be able to buy an a100 without a license... Wait, I can't afford an a100 anyway.
replies(1): >>shagie+j01
5. shagie+j01[view] [source] [discussion] 2023-05-16 19:21:56
>>slowmo+LE
As a point of trivia, at one time "a" Mac was one of the fastest computers in the world.

https://www.top500.org/lists/top500/2004/11/ and https://www.top500.org/system/173736/

And while 1100 Macs wouldn't exactly be affordable, the idea of trying to limit commercial data centers gets amusing.

That system was "only" 12,250 GFlop/s - I could do that now with a small rack of M1 Mac minis for less than $10k, and with fewer computers than are in the local grade school computer room.

(And I'm being a bit facetious here.) Local authorities looking at power usage and heat dissipation to find marijuana growing operations might instead find underground AI training centers.
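
For what it's worth, a rough back-of-envelope check on the M1 claim above, assuming an M1 Mac mini's GPU peaks around 2.6 TFLOPS FP32 and a street price of roughly $800 per mini (both figures are my assumptions, not measurements):

    # Back-of-envelope: how many M1 minis match System X's ~12,250 GFlop/s?
    target_gflops = 12_250        # System X Rmax from the 2004 TOP500 listing
    m1_gflops = 2_600             # assumed peak FP32 throughput of one M1 mini's GPU
    price_usd = 800               # assumed street price per mini

    minis = -(-target_gflops // m1_gflops)                    # ceiling division -> 5
    print(minis, "minis, roughly", minis * price_usd, "USD")  # ~ $4,000

So on paper about five minis clear that bar, comfortably inside the $10k figure above.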

replies(1): >>nerpde+r51
6. nerpde+r51[view] [source] [discussion] 2023-05-16 19:42:47
>>shagie+j01
All the crypto mining hardware flooding the market right now is being bought up by hobbyists training and fine-tuning their own models.
replies(2): >>shagie+Lc1 >>slt202+dd1
7. shagie+Lc1[view] [source] [discussion] 2023-05-16 20:19:21
>>nerpde+r51
My "in my copious free time" ML project is a classifier for cat pictures to reddit cat subs.

For example: https://commons.wikimedia.org/wiki/File:Cat_August_2010-4.jp... would get classified as /r/standardissuecat

https://stock.adobe.com/fi/images/angry-black-cat/158440149 would get classified as /r/blackcats and /r/stealthbombers

Anyways... that's my hobbyist ML project.
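
A minimal sketch of how a classifier like that might be wired up, assuming PyTorch/torchvision, a pretrained ResNet-18, and a hypothetical data/<subreddit>/<image>.jpg folder layout; this is an illustration, not the commenter's actual code, and it simplifies to one label per image (the /r/blackcats + /r/stealthbombers example above is really multi-label):

    # Hypothetical cat-sub classifier sketch (single-label, for illustration only).
    import torch
    import torch.nn as nn
    from torchvision import datasets, models, transforms

    tfm = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
    ])

    # Expects folders like data/standardissuecat/, data/blackcats/, ...
    dataset = datasets.ImageFolder("data", transform=tfm)
    loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

    # Pretrained backbone, new head sized to the number of subreddit classes.
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, len(dataset.classes))

    opt = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    model.train()
    for images, labels in loader:     # a single pass, just to show the loop
        opt.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        opt.step()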

8. slt202+dd1[view] [source] [discussion] 2023-05-16 20:21:34
>>nerpde+r51
We needed crypto to crash so that gamers and AI enthusiasts could get GPUs.
9. drvdev+QG1[view] [source] 2023-05-16 23:17:18
>>genera+(OP)
Yeah, and one of the fundamental assumptions behind statements like “let’s make AI a licensed regime” is the idea that we know what AI even is. That assumption banks on current technology being the best algorithm or method for producing “AI”, and the whole lesson from the “we have no moat” crowd is that this is actually quite uncertain. Even if they succeed in getting some class of model like LLMs under regulatory capture, the technology they are working with today is still likely to be undermined by something cheaper that runs on weaker hardware with smaller datasets, and that will probably happen faster if they pursue this kind of market capture.

So yes, it is quite comparable to the export restrictions of the 90s.

But since Microsoft is involved and we are all of course thinking about Windows vs Linux, I think another good comparison is the worst assumption Microsoft made in the 90s: “we know what an operating system is and what it is for.”

10. jart+CX1[view] [source] [discussion] 2023-05-17 01:13:58
>>antilo+l2
Back when crypto was a munition, common people didn't care about being able to invent their own crypto algorithms. They just wanted to use the existing ones, and all they needed for that was some C code that Phil Zimmermann famously published in a book to get around export controls. Controlling GPUs won't control the use of AI, because GPUs are only needed to train new models. If you just want to use a large language model, a CPU works great, and you just need one guy to publish the weights.

That happened a few months ago with LLaMA. Since then, the open source community has exceeded all expectations in democratizing the hardware angle. AI regulators are already checkmated, and they don't know it yet. If their goal is to control the use of AI (rather than simply controlling the people building it), they'd need to go all the way into tyranny to accomplish it: Intel would need to execute Order 66 with its Management Engine, and operating systems would need to modify their virus scanners to monitor and report the use of linear algebra. It'd be outrageous.
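
To make the CPU point concrete, here is a minimal sketch of local, CPU-only inference, assuming the llama-cpp-python bindings and an already-downloaded quantized weights file (the model path is hypothetical, and this is just an illustration of the idea):

    # CPU-only inference sketch using the llama-cpp-python bindings.
    # The weights path below is hypothetical; any GGML-quantized LLaMA-family
    # model file would do.
    from llama_cpp import Llama

    llm = Llama(model_path="./models/7B/ggml-model-q4_0.bin", n_threads=8)
    out = llm("Q: Can export controls on GPUs stop people from running this? A:",
              max_tokens=64)
    print(out["choices"][0]["text"])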

[go to top]