zlacker

[parent] [thread] 13 comments
1. jprd+(OP)[view] [source] 2024-05-14 23:25:00
Yes. They joined OpenAI with the understanding that it was a non-profit with a mission to benefit humanity.
replies(2): >>deadba+i >>gfourf+d4
2. deadba+i[view] [source] 2024-05-14 23:26:55
>>jprd+(OP)
Did you not read what I said? They joined a non-profit and eventually realized the mission is futile.
replies(2): >>llamai+K >>jprd+nR1
◧◩
3. llamai+K[view] [source] [discussion] 2024-05-14 23:31:40
>>deadba+i
So dissolve it, return the money, go start a commercial enterprise, then raise some money.
replies(1): >>deadba+a1
◧◩◪
4. deadba+a1[view] [source] [discussion] 2024-05-14 23:36:23
>>llamai+K
I’m sure that’s what each and every member of Hackernews would have done in the same position.
replies(3): >>ok_dad+P1 >>llamai+b2 >>dboreh+n8
◧◩◪◨
5. ok_dad+P1[view] [source] [discussion] 2024-05-14 23:41:22
>>deadba+a1
Not all of us are as morally bankrupt as that. I personally think I could make tons of money with a dumb AI product in my specific area of expertise, but I don’t see how any of today’s tech would improve outcomes over the non-AI state of the art; it would only add cost and complexity. I would personally be annoyed if a company I worked at changed its goals to making money rather than something more noble. It’s happened to me a few times, unfortunately.
replies(1): >>deadba+5h
◧◩◪◨
6. llamai+b2[view] [source] [discussion] 2024-05-14 23:45:44
>>deadba+a1
Hmm no, I don't think that's the case, but what exactly is the legal or ethical relevance of it?

You don't generally get to excuse bad behavior because you can make up a hypothetical different person doing the same bad thing in that situation.

7. gfourf+d4[view] [source] 2024-05-15 00:06:49
>>jprd+(OP)
This entire saga is really an example of the absurdity of non-profits and philanthropy in general.

The only difference between nonprofit and for-profit entities is that nonprofits divert their profits to a nebulous “cause”, with the investors receiving nothing, while for-profits can distribute profits to their funders.

Other than that, they are free to operate identically.

Generally, entities subject to competitive pressures and with incentives for performance are much better at “benefitting humanity.” Non-profit status therefore really only makes sense when (1) a profitable enterprise oriented around the intended result isn’t viable (e.g., conservation), or (2) there’s a stakeholder we’ve decided ought to be sheltered from the dynamics of private enterprise, e.g., university students or neutral public broadcasters.

But even in these cases, the non-profit entities basically behave like profit-oriented companies, because their goal is still profitability, just without a return to investors.

OpenAI as a nonprofit would behave the exact same way. There’s no law requiring that the models be open. They’d still be making closed models, charging users, and paying massive salaries. Literally the only difference is that they couldn’t return money to their investors, and would therefore have a much harder time attracting investors, and therefore be less equipped to accomplish their goal of developing powerful AI.

The irony is that nonprofits are usually only good for things that make for shitty businesses, and things that make shitty businesses usually aren’t that beneficial to humanity. As soon as something becomes really good at what it does, for-profit status makes sense.

What this means, imo, is that most philanthropy dollars are wasted and we would be much better off if they were invested instead. Perversely, that is the point of much philanthropic giving - it ends up being a game of how much money you can burn on nothing, a crass status symbol.

replies(1): >>lokar+ea
◧◩◪◨
8. dboreh+n8[view] [source] [discussion] 2024-05-15 00:48:06
>>deadba+a1
Sure, after fees and expenses.
◧◩
9. lokar+ea[view] [source] [discussion] 2024-05-15 01:00:42
>>gfourf+d4
Matt Levine likes to say that the big Wall Street banks are socialist paradises that funnel almost all of the returns to the workers.

It happens everywhere

◧◩◪◨⬒
10. deadba+5h[view] [source] [discussion] 2024-05-15 02:11:12
>>ok_dad+P1
Not all, but also not enough.
replies(1): >>ok_dad+Zi
◧◩◪◨⬒⬓
11. ok_dad+Zi[view] [source] [discussion] 2024-05-15 02:30:46
>>deadba+5h
It is career limiting to have morals.
replies(1): >>holons+2y
◧◩◪◨⬒⬓⬔
12. holons+2y[view] [source] [discussion] 2024-05-15 05:34:24
>>ok_dad+Zi
It's much harder living without strong moral virtue.
replies(1): >>deadba+sF1
◧◩◪◨⬒⬓⬔⧯
13. deadba+sF1[view] [source] [discussion] 2024-05-15 14:51:21
>>holons+2y
No matter what path you choose, you still die.
◧◩
14. jprd+nR1[view] [source] [discussion] 2024-05-15 15:45:35
>>deadba+i
I read what you said, and I apologize for not being a bit more clear.

I completely understand your perspective, and I hope I'm always strong enough to listen to my conscience and obey my morals.

One of the first interviews I was ever offered in a technical role was with Bechtel, in 2004. I was desperate to break into a career, so I accepted the interview. I was in the car driving to the location when I realized I couldn't do it. I couldn't set aside my morals to work for such a clear and direct war profiteer, one that, as a private company, had no oversight.

If I join a non-profit that has a humanitarian mission, I do so because I'm into the mission and feel fulfilled by that more than my comp. I can't imagine trading that in just because @sama got thirsty.

The mission isn't futile; the mission at this organization has been compromised and corrupted. Resign and continue your mission elsewhere.

[go to top]