zlacker

[parent] [thread] 6 comments
1. mcv+(OP)[view] [source] 2023-11-20 19:00:28
Not sure how that would make them the bad guys. Doesn't their original mission say it's meant to benefit everybody? Open sourcing it fits that a lot better than handing it all to Microsoft.
replies(1): >>arrowl+04
2. arrowl+04[view] [source] 2023-11-20 19:17:20
>>mcv+(OP)
All of their messaging, Ilya's especially, has always been that the forefront of AI development needs to be done by a company in order to benefit humanity. He's been very vocal about how important the gap between open source and OpenAI's abilities is, so that OpenAI can continue to align the AI with 'love for humanity'.
replies(2): >>mcv+Z8 >>octaca+kG
3. mcv+Z8[view] [source] [discussion] 2023-11-20 19:34:49
>>arrowl+04
I can read the words, but I have no idea what you mean by them. Do you mean he says that, in order to benefit humanity, AI research needs to be done by a private (and therefore monopolising) company? That seems like a really weird thing to say. Except maybe to people who believe all private profit-driven capitalism is inherently good for everybody (which is probably a common view in SV).
replies(2): >>octaca+DG >>colins+LI
4. octaca+kG[view] [source] [discussion] 2023-11-20 21:46:19
>>arrowl+04
It benefits humanity. Where "humanity" is a very select group of OpenAI investors. But yeah, declaring yourselves a non-profit and then closing the source for "safety" reasons is smart. Wondering how that can even be legal. Ah, these "non-profits".
5. octaca+DG[view] [source] [discussion] 2023-11-20 21:47:25
>>mcv+Z8
Private, monopolising. But not paying taxes, because "benefits for humanity".

Ah, and OpenAI's stuff is closed source. Non-profit, but "we will sell the company later". Just let us collect the data, analyse it, and build a product first.

War is peace, freedom is slavery.

6. colins+LI[view] [source] [discussion] 2023-11-20 21:57:45
>>mcv+Z8
the view -- as presented to me by friends in the space but not at OpenAI itself -- is something like "AGI is dangerous, but inevitable. we, the passionate idealists, can organize to make sure it develops with minimal risk."

at first that meant the opposite of monopolization: flood the world with limited AIs (GPT 1/2) so that society has time to adapt (and so that no one entity develops asymmetric capabilities they can wield against other humans). with GPT-3 the implementation of that mission began shifting toward worry about AI itself, or about how unrestricted access to it would allow smaller bad actors (terrorists, or even just some teenager going through a depressive episode) to be an existential threat to humanity. if that's your view, then open models are incompatible.

whether you buy that view or not, it kinda seems like the people in that camp just got outmaneuvered. as a passionate idealist in other areas of tech, the way this is happening is not good. OpenAI had a mission statement. M$ maneuvered to co-opt that mission, the CEO may or may not have understood as much while steering the company, and now a mass of employees wants to leave when the board steps in to re-align the company with its stated mission. whether or not you agree with the mission: how can i ever join an organization with a for-the-public-good type of mission i do agree with, without worrying that it will be co-opted by the familiar power structures?

the closest (still distant) parallel i can find: Raspberry Pi Foundation took funding from ARM. is the clock ticking until RPi loses its mission in a similar manner? or does something else prevent that (maybe it's possible to have a mission-driven tech organization so long as the space is uncompetitive)?

replies(1): >>mcv+pW
7. mcv+pW[view] [source] [discussion] 2023-11-20 23:14:22
>>colins+LI
Exactly. It seems to me that a company is exactly the wrong vehicle for this, because a company will be drawn to profit and look for a way to make money off it, rather than developing and managing it according to this ideology. Companies are rarely ideological, and are usually simply amoral profit-seekers.

But they probably let this get derailed far too long ago to do anything about it now.

Sounds like their only options are:

a) Structure in a way Microsoft likes and give them the tech

b) Give Microsoft the tech in a different way

c) Disband the company, throw away the tech, and let Microsoft hire everybody who created the tech so they can recreate it.
