zlacker

[parent] [thread] 17 comments
1. roflc0+(OP)[view] [source] 2023-11-22 13:35:41
The simple answer here is that the board's actions stood to incinerate millions of dollars of wealth for most of these employees, and they were up in arms.

They’re all acting out the intended incentives of giving people stake in a company: please don’t destroy it.

replies(2): >>whywhy+K1 >>citygu+1i
2. whywhy+K1[view] [source] 2023-11-22 13:45:17
>>roflc0+(OP)
Wild that the employees will go back under a new board and the same structure. The first priority should be removing the structure that allowed a small group of people to destroy things over what may have been very petty reasons.
replies(1): >>CydeWe+K4
3. CydeWe+K4[view] [source] [discussion] 2023-11-22 13:59:08
>>whywhy+K1
Well, it's a different group of people, and that group will now know the consequences of attempting to remove Sam Altman. I don't see this happening again.
replies(1): >>youcan+1d
4. youcan+1d[view] [source] [discussion] 2023-11-22 14:34:19
>>CydeWe+K4
Most likely, but it is cute how confident you are that humanity has learned its lesson.
replies(1): >>tstrim+uQ
5. citygu+1i[view] [source] 2023-11-22 14:55:08
>>roflc0+(OP)
I don’t understand how the fact that they went from a nonprofit to a for-profit subsidiary of one of the most closed-off, anticompetitive megacorps in tech is so readily glossed over. I get it, we all love money and Sam’s great at generating it, but anyone who works at OpenAI besides the board seems to be morally bankrupt.
replies(5): >>gdhkgd+wn >>Zpalmt+us >>endtim+AM >>rozap+5W >>cma+I91
6. gdhkgd+wn[view] [source] [discussion] 2023-11-22 15:17:08
>>citygu+1i
Pretty easy to complain about lack of morals when it’s someone else’s millions of dollars of potential compensation that will be incinerated.

Also, working for a subsidiary (which was likely going to be given much more self-governance than working directly at the megacorp) doesn’t necessarily mean “evil”. That’s a very one-dimensional way to think about things.

Self-disclosure: I work for a megacorp.

replies(5): >>yoyohe+pq >>Beetle+9w >>yterdy+0G >>slg+WU >>citygu+Of2
7. yoyohe+pq[view] [source] [discussion] 2023-11-22 15:30:24
>>gdhkgd+wn
We can acknowledge that it's morally bankrupt, while also not blaming them. Hell, I'd probably do the same thing in their shoes. That doesn't make it right.
8. Zpalmt+us[view] [source] [discussion] 2023-11-22 15:39:06
>>citygu+1i
Why would they be morally bankrupt? Do the employees have to care if it's a non profit or a for profit?

And if they do prefer it as a for profit company, why would that make them morally bankrupt?

9. Beetle+9w[view] [source] [discussion] 2023-11-22 15:56:22
>>gdhkgd+wn
> Pretty easy to complain about lack of morals when it’s someone else’s millions of dollars of potential compensation that will be incinerated.

And while also working for a for-profit company.

10. yterdy+0G[view] [source] [discussion] 2023-11-22 16:40:31
>>gdhkgd+wn
If some of the smartest people on the planet are willing to sell the rest of us out for Comfy Lifestyle Money (not even Influence State Politics Money), then we are well and truly Capital-F Fucked.
replies(1): >>deckar+RS
11. endtim+AM[view] [source] [discussion] 2023-11-22 17:11:34
>>citygu+1i
> anyone who works at OpenAI besides the board seems to be morally bankrupt.

People concerned about AI safety were probably not going to join in the first place...

12. tstrim+uQ[view] [source] [discussion] 2023-11-22 17:29:04
>>youcan+1d
Humanity, no. But it's not humanity on the OpenAI board; it's 9 individuals, and individuals have an amazing capacity for learning and improvement.
13. deckar+RS[view] [source] [discussion] 2023-11-22 17:39:12
>>yterdy+0G
We already know some of the smartest people are willing to sell us out, because they work in FAANG ad tech, spending their days figuring out how to maximize the eyeballs they reach while sucking up all your privacy.

It's a post-"Don't be evil" world today.

replies(1): >>jacque+yg1
14. slg+WU[view] [source] [discussion] 2023-11-22 17:47:21
>>gdhkgd+wn
> Pretty easy to complain about lack of morals when it’s someone else’s millions of dollars of potential compensation that will be incinerated.

That is part of the reason organizations choose to set themselves up as non-profits: to codify those morals into the legal status of the organization and ensure that the ingrained selfishness that exists in all of us doesn’t overtake the mission. That is the heart of this whole controversy. If OpenAI had never been a non-profit, there wouldn’t be any issue here, because they wouldn’t even be having this legal and ethical fight. They would just be pursuing the selfish path like every other for-profit business, and there would be no room for the board to fire or even really criticize Sam.

15. rozap+5W[view] [source] [discussion] 2023-11-22 17:52:13
>>citygu+1i
Easy to see how humans would join a nonprofit for the vibes and then, when they create one of the most compelling products of the last decade, worth billions of dollars, quickly change their thinking to "wait, I should get rewarded for this".
16. cma+I91[view] [source] [discussion] 2023-11-22 18:47:11
>>citygu+1i
Supposedly they had about 50% of employees leave in the year of the conversion to for-profit.
17. jacque+yg1[view] [source] [discussion] 2023-11-22 19:18:16
>>deckar+RS
If half of the brainpower invested in advertising food went toward solving world hunger, we'd have too much food.
18. citygu+Of2[view] [source] [discussion] 2023-11-23 00:43:26
>>gdhkgd+wn
I guess my qualm is that this is the cost of doing business, yet people are outraged at the board because they’re not going to make truckloads of money in equity grants. That’s the morally bankrupt part, in my opinion.

If you throw your hands up and say, “well, kudos to them, they’re actually fulfilling their goal of being a non-profit; I’m going to find a new job,” that’s fine by me. But if you get morally outraged at the board over this because you expected the payday of a lifetime, that’s on you.
