zlacker

[parent] [thread] 10 comments
1. vikram+(OP)[view] [source] 2023-11-20 17:54:49
Theoretically their concern is around AI safety. Whatever it is in practice, doing something like that would instantly signal to everyone that they are the bad guys and confirm everyone's belief that this was just a power grab.

Edit: since it's being brought up in the thread: they claimed they closed-sourced it because of safety. It was a big, controversial decision and they stood by it, so it's not exactly easy to backtrack.

replies(2): >>whatwh+Kb >>mcv+Oi
2. whatwh+Kb[view] [source] 2023-11-20 18:36:14
>>vikram+(OP)
A power grab by open-sourcing something that fits their initial mission? Interesting analysis.
replies(2): >>nvm0n2+ie >>vikram+Ji
◧◩
3. nvm0n2+ie[view] [source] [discussion] 2023-11-20 18:44:19
>>whatwh+Kb
No, that's backwards. Remember that these guys are all convinced that AI is too dangerous to be made public at all. The whole beef that led to them blowing up the company was the feeling that OpenAI was productizing it and making it available too fast. If that's your concern, then you neither open source your work nor make it available via an API; you just sit on it and release papers.

Not coincidentally, that's exactly what Google Brain, DeepMind, FAIR, etc. were doing up until OpenAI decided to ignore that trust-like agreement and let people use it.

◧◩
4. vikram+Ji[view] [source] [discussion] 2023-11-20 19:00:17
>>whatwh+Kb
They claimed they closed-sourced it because of safety. If they go back on that, they'd have to explain why the board went along with a lie of that scale, justify why all the concerns they raised about the tech falling into the wrong hands were actually fake, and explain why it was OK that the board signed off on that for so long.
5. mcv+Oi[view] [source] 2023-11-20 19:00:28
>>vikram+(OP)
Not sure how that would make them the bad guys. Doesn't their original mission say it's meant to benefit everybody? Open sourcing it fits that a lot better than handing it all to Microsoft.
replies(1): >>arrowl+Om
◧◩
6. arrowl+Om[view] [source] [discussion] 2023-11-20 19:17:20
>>mcv+Oi
All of their messaging, Ilya's especially, has always been that the forefront of AI development needs to be done by a company in order to benefit humanity. He's been very vocal about how important the gap between open source and OpenAI's abilities is, so that OpenAI can continue to align the AI with 'love for humanity'.
replies(2): >>mcv+Nr >>octaca+8Z
◧◩◪
7. mcv+Nr[view] [source] [discussion] 2023-11-20 19:34:49
>>arrowl+Om
I can read the words, but I have no idea what you mean by them. Do you mean he says that in order to benefit humanity, AI research needs to be done by a private (and therefore monopolising) company? That seems like a really weird thing to say, except maybe to people who believe all private profit-driven capitalism is inherently good for everybody (which is probably a common view in SV).
replies(2): >>octaca+rZ >>colins+z11
◧◩◪
8. octaca+8Z[view] [source] [discussion] 2023-11-20 21:46:19
>>arrowl+Om
It benefits humanity, where "humanity" is a very select subset of OpenAI's investors. But yeah, declaring yourselves a non-profit and then closed-sourcing everything for "safety" reasons is smart. I wonder how it can even be legal. Ah, these "non-profits".
◧◩◪◨
9. octaca+rZ[view] [source] [discussion] 2023-11-20 21:47:25
>>mcv+Nr
Private and monopolising, but not paying taxes, because it "benefits humanity".

Ah, OpenAI and its closed-source stuff. A non-profit, but "we will sell the company later". Just let us collect the data, analyse it first, build a product.

War is peace, freedom is slavery.

◧◩◪◨
10. colins+z11[view] [source] [discussion] 2023-11-20 21:57:45
>>mcv+Nr
the view -- as presented to me by friends in the space but not at OpenAI itself -- is something like "AGI is dangerous, but inevitable. we, the passionate idealists, can organize to make sure it develops with minimal risk."

at first that meant the opposite of monopolization: flood the world with limited AIs (GPT 1/2) so that society has time to adapt (and so that no one entity develops asymmetric capabilities they can wield against other humans). with GPT-3 the implementation of that mission began shifting toward worry about AI itself, or about how unrestricted access to it would allow smaller bad actors (terrorists, or even just some teenager going through a depressive episode) to be an existential threat to humanity. if that's your view, then open models are incompatible.

whether you buy that view or not, it kinda seems like the people in that camp just got outmaneuvered. as a passionate idealist in other areas of tech, the way this is happening is not good. OpenAI had a mission statement. M$ maneuvered to co-opt that mission, the CEO may or may not have understood as much while steering the company, and now a mass of employees wants to leave when the board steps in to re-align the company with its stated mission. whether or not you agree with the mission: how can i ever join an organization with a for-the-public-good type of mission i do agree with, without worrying that it will be co-opted by the familiar power structures?

the closest (still distant) parallel i can find: the Raspberry Pi Foundation took funding from ARM. is the clock ticking until RPi loses its mission in a similar manner? or does something else prevent that? (maybe it's possible to have a mission-driven tech organization so long as the space is uncompetitive?)

replies(1): >>mcv+df1
◧◩◪◨⬒
11. mcv+df1[view] [source] [discussion] 2023-11-20 23:14:22
>>colins+z11
Exactly. It seems to me that a company is precisely the wrong vehicle for this, because a company will be drawn to profit and look for a way to make money off it, rather than developing and managing it according to this ideology. Companies are rarely ideological; they're usually just amoral profit-seekers.

But they probably allowed this to get derailed far too long ago to do anything about it now.

Sounds like their only options are:

a) Structure in a way Microsoft likes and give them the tech

b) Give Microsoft the tech in a different way

c) Disband the company, throw away the tech, and let Microsoft hire everybody who created the tech so they can recreate it.

[go to top]