zlacker

[parent] [thread] 25 comments
1. nico+(OP)[view] [source] 2023-05-16 14:48:49
This is quite incredible

Could you imagine if MS had convinced the govt back in the day to require a special license to build an operating system (thus blocking Linux and everything open)?

It’s essentially what’s happening now,

Except it is OpenAI instead of MS, and it is AI instead of Linux

AI is the new Linux, they know it, and are trying desperately to stop it from happening

replies(10): >>pwdiss+T2 >>xiphia+mc >>sangno+Vd >>brkebd+fx >>0xDEF+Ye1 >>johnal+xy1 >>ploppy+tA1 >>Mistle+yG1 >>shswkn+yf2 >>kraf+Rm2
2. pwdiss+T2[view] [source] 2023-05-16 15:02:53
>>nico+(OP)
> OpenAI instead of MS

In other words, MS with extra steps.

3. xiphia+mc[view] [source] 2023-05-16 15:43:08
>>nico+(OP)
I bet OpenAI is using MS connections and money for lobbying, so it's basically MS again.
replies(2): >>modsha+yZ >>rvz+PB1
4. sangno+Vd[view] [source] 2023-05-16 15:48:56
>>nico+(OP)
I guess @sama took that leaked Google memo to heart ("We have no moat... and neither does OpenAI"). Requiring a license would take out the biggest competitive threat identified in that same memo: open-source projects, which can result in self-hosted models, which I suppose Altman sees as an existential threat to OpenAI.
replies(1): >>helloj+3A
5. brkebd+fx[view] [source] 2023-05-16 17:05:11
>>nico+(OP)
Microsoft did, though. Not directly like that, because up to the '90s we still had the pretense of being free.

Microsoft did influence government spending in ways that required Windows on every govt-owned computer, and in schools.

◧◩
6. helloj+3A[view] [source] [discussion] 2023-05-16 17:17:47
>>sangno+Vd
There is no way to stop self-hosted models. The best they could do is send the government after data centers, but what if those centers are outside US jurisdiction? Too funny to watch the gov play these losing games.
replies(3): >>sangno+wB >>Throwa+Qc1 >>antice+He1
◧◩◪
7. sangno+wB[view] [source] [discussion] 2023-05-16 17:23:03
>>helloj+3A
> There is no way to stop self hosted models.

edit: Current models, sure, but they will soon be outdated. I think the idea is to strangle the development of comparable SoTA models in the future that individuals can self-host; OpenAI certainly won't release their weights, and they'd want the act of releasing weights without a license to be criminalized. If such a law is signed, it would remove the threat of smaller AI companies disintermediating OpenAI, and of individuals collaborating on any activity that results in publicly available model weights (it could even make the recipe itself illegal to distribute).

replies(3): >>helloj+xN >>10000t+Mc1 >>pentag+3y1
◧◩◪◨
8. helloj+xN[view] [source] [discussion] 2023-05-16 18:25:48
>>sangno+wB
I thought we got away from knowledge-distribution embargoes via 1A during the encryption era.

Even if it passed, I find it hard to believe a bunch of individuals couldn't collaborate via distributed training, which would be almost impossible to prohibit. Anyone could mask their traffic or connect to an anonymous US VPN to circumvent it. There will be enough demand to outweigh the risk.

replies(1): >>NavinF+4B1
◧◩
9. modsha+yZ[view] [source] [discussion] 2023-05-16 19:22:56
>>xiphia+mc
just a billion dollar coincidence
◧◩◪◨
10. 10000t+Mc1[view] [source] [discussion] 2023-05-16 20:24:10
>>sangno+wB
You can't strangle the development of such models because the data comes from anywhere and everywhere. Short of shutting off the entire Internet, there's nothing a government can do to prevent some guy on the opposite side of the world from hoovering up publicly accessible human text into a corpus befitting an LLM training set.
replies(1): >>bootsm+jl1
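For what it's worth, the "hoovering" step really is that simple at toy scale. A minimal sketch, assuming the requests and BeautifulSoup libraries; the URLs are placeholders, and a real crawl would also need robots.txt handling, deduplication, and quality filtering:

    # Toy sketch: collect publicly accessible text into a plain-text corpus.
    # Placeholder URLs; purely illustrative, not a real crawling setup.
    import requests
    from bs4 import BeautifulSoup

    seed_urls = [
        "https://example.com/page1",
        "https://example.com/page2",
    ]

    with open("corpus.txt", "a", encoding="utf-8") as corpus:
        for url in seed_urls:
            html = requests.get(url, timeout=10).text
            text = BeautifulSoup(html, "html.parser").get_text(separator="\n")
            corpus.write(text.strip() + "\n\n")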
◧◩◪
11. Throwa+Qc1[view] [source] [discussion] 2023-05-16 20:24:46
>>helloj+3A
Sure, but they can be made illegal and difficult to share on the clear web.
◧◩◪
12. antice+He1[view] [source] [discussion] 2023-05-16 20:35:55
>>helloj+3A
Access blocks to those in the US?
13. 0xDEF+Ye1[view] [source] 2023-05-16 20:37:19
>>nico+(OP)
Microsoft owns 49% of OpenAI and is its primary partner and customer.

OpenAI is Microsoft.

◧◩◪◨⬒
14. bootsm+jl1[view] [source] [discussion] 2023-05-16 21:11:23
>>10000t+Mc1
It costs a lot of money to train foundation models; that is a big hurdle for open-source models and can strangle further development.

Open-source AI needs players with low stakes (Meta AI) who continue to open-source foundation models for the community to tinker with.
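For a sense of what "a lot of money" means, a rough back-of-envelope sketch; every number in it (model size, token count, utilisation, GPU price) is an assumption chosen only to show the order of magnitude:

    # Rough training-cost estimate using the common 6*N*D FLOP approximation.
    # All figures are assumptions, not quotes for any particular model.
    params = 7e9              # 7B-parameter model (assumption)
    tokens = 2e12             # ~2T training tokens (assumption)
    flops = 6 * params * tokens

    gpu_peak = 312e12         # A100 bf16 peak, FLOP/s
    utilisation = 0.4         # assumed fraction of peak actually achieved
    gpu_hours = flops / (gpu_peak * utilisation) / 3600

    price = 2.0               # assumed USD per GPU-hour
    print(f"GPU-hours: {gpu_hours:,.0f}")            # ~190,000
    print(f"Rough cost: ${gpu_hours * price:,.0f}")  # ~$370,000 of compute alone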

◧◩◪◨
15. pentag+3y1[view] [source] [discussion] 2023-05-16 22:25:40
>>sangno+wB
I have a question: AI isn't exclusively for use with data from the internet, right? E.g. you can throw a bunch of text at it and ask it to arrange that text in a table with x columns; does that need data from the internet? I guess not, you can self-host and use it exclusively with your own data.
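Right, that is the self-hosted case: once the weights are on disk, inference needs no internet at all. A minimal sketch, assuming the llama-cpp-python bindings and an already-downloaded GGUF model file; the model path, prompt, and sample text are placeholders:

    # Local-only sketch: a self-hosted model reformatting your own text.
    # Assumes llama-cpp-python and a GGUF file already on disk (placeholder path).
    from llama_cpp import Llama

    llm = Llama(model_path="./models/local-model.gguf", n_ctx=2048)

    notes = "alice 34 berlin\nbob 29 lisbon\ncarol 41 oslo"
    prompt = (
        "Arrange the following text as a table with columns name, age, city:\n"
        + notes
        + "\nTable:"
    )

    out = llm.create_completion(prompt, max_tokens=256, temperature=0.2)
    print(out["choices"][0]["text"])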
16. johnal+xy1[view] [source] 2023-05-16 22:28:03
>>nico+(OP)
I'm no expert, but I'm old and I think that Unix is actually the model that won. Linux won because of Unix IMO, and I think it's too late for the regulators. Not that I understand the stuff, but like Unix, the code and the ideas are out there in universities, and even if OpenAI gets their licensing, there will be really open stuff also. So, no worries. Except for the fact that AI itself - well, are we mature enough to handle it without supervision? Dunno.
17. ploppy+tA1[view] [source] 2023-05-16 22:42:38
>>nico+(OP)
Is this the old MS tactic of Embrace, Extend, Extinguish? Albeit through the mask of OpenAI / Altman?
◧◩◪◨⬒
18. NavinF+4B1[view] [source] [discussion] 2023-05-16 22:47:06
>>helloj+xN
> distributed training

Unfortunately this isn't a thing. E.g. too much batch-norm sync latency leaves your GPUs idle. Unless all your hardware is in the same building, training a single model would be so inefficient that it's not worth it.
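A rough back-of-envelope sketch of that bottleneck, modelled here as shipping one full set of fp16 gradients per step over a home uplink; all numbers are assumptions chosen only to show the order of magnitude:

    # Why syncing a model over consumer internet stalls GPUs (illustrative only).
    params = 7e9                     # 7B-parameter model (assumption)
    grad_bytes = params * 2          # fp16 gradients: ~14 GB exchanged per step

    uplink_bytes_per_s = 100e6 / 8   # assumed 100 Mbit/s home uplink
    sync_s = grad_bytes / uplink_bytes_per_s

    compute_s = 1.0                  # assumed time for one local training step

    print(f"Gradient payload: {grad_bytes / 1e9:.1f} GB")
    print(f"Sync time: {sync_s / 60:.1f} minutes per step")            # ~19 minutes
    print(f"GPU busy: {100 * compute_s / (compute_s + sync_s):.2f}%")  # well under 1%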

◧◩
19. rvz+PB1[view] [source] [discussion] 2023-05-16 22:52:48
>>xiphia+mc
Exactly, say it with me:

Embrace, Extend...

What comes after Extend?

replies(3): >>mindsl+e92 >>gymbea+h92 >>xiphia+Lp2
20. Mistle+yG1[view] [source] 2023-05-16 23:22:47
>>nico+(OP)
Will 2023 be the year of desktop AI?
◧◩◪
21. mindsl+e92[view] [source] [discussion] 2023-05-17 03:24:44
>>rvz+PB1
Enjoy! Yaay!
◧◩◪
22. gymbea+h92[view] [source] [discussion] 2023-05-17 03:25:37
>>rvz+PB1
Extinguish!
23. shswkn+yf2[view] [source] 2023-05-17 04:37:06
>>nico+(OP)
What's incredible to me is how humans [in this case Sam Altman] have that tendency to fulfil their given role [CEO of OpenAI] with such a single-minded purpose that they are able to block out the bigger picture entirely and rationalise away the wider consequences of their actions.

It is as if most humans lack an inner compass, a set of principles, a solid value system.

It is as if humans are robots that fulfil their given role as if they were programmed to do so.

Our institutions, companies, political parties, etc. are functions that humans slip into and, in robot-like fashion, execute their expected role like clockwork.

The free thinkers / big-picture thinkers, humanity's heroes, seem to be so rare.

replies(1): >>ChatGT+dy2
24. kraf+Rm2[view] [source] 2023-05-17 06:00:15
>>nico+(OP)
Do you think AIs are safe? I'd bet that if you had a convincing argument that they are, then there wouldn't be a need for regulations. If you just assume that it can't possibly be that bad, you should really read what the critics have to say. I don't see a way around regulations, and I'm hoping they'll get them right, because a mistake here will likely cost us everything.
◧◩◪
25. xiphia+Lp2[view] [source] [discussion] 2023-05-17 06:32:26
>>rvz+PB1
Exterminate. But only after AGI is ready.
◧◩
26. ChatGT+dy2[view] [source] [discussion] 2023-05-17 08:00:36
>>shswkn+yf2
I honestly just see the whole company as a lightning rod for crazy-ass problems. Anywhere from copyright issues to state-sponsored hacking.

I get that they're excited they can make money off it, but wow, what a nightmare. I feel like if they had just stuck to their principles and kept it "Open" they'd still be better off in general.
