zlacker

[parent] [thread] 48 comments
1. shubha+(OP)[view] [source] 2023-11-20 09:02:39
I am not claiming the final outcome is right or wrong, but owning the technology with a clear "for-profit" objective is definitely a better structure for Microsoft, and for Sam Altman as well (considering his plans for the future). I have no opinion on AI risk. I just think that a super-valuable technology under a non-profit objective was simply an untenable structure, regardless of potential threats.
replies(5): >>9dev+K >>slg+C1 >>calf+43 >>croes+t7 >>bookaw+Ia
2. 9dev+K[view] [source] 2023-11-20 09:07:37
>>shubha+(OP)
This is precisely the problem OpenAI aimed to solve: This technology cannot be treated independently of the potential risks involved.

I agree that this solution seems beneficial for both Microsoft and Sam Altman, but it reflects poorly on society if we simply accept this version of the story without criticism.

replies(2): >>disgru+y8 >>avidph+Jj
3. slg+C1[view] [source] 2023-11-20 09:14:20
>>shubha+(OP)
It isn't fear of a sentient AI that enslaves humanity that makes me disappointed with for-profit companies getting a stronger grip on this tech. It is the fear that a greater portion of the value of this technology will go to the stockholders of said companies rather than potentially be shared among a larger percentage of society. Not that I had that much faith in OpenAI, but in general the shift from non-profit to for-profit is a win for the few over the many.
replies(3): >>xapata+92 >>two_in+j9 >>RcouF1+Rn
◧◩
4. xapata+92[view] [source] [discussion] 2023-11-20 09:17:32
>>slg+C1
I'm a Microsoft shareholder. So is basically everyone else who invests in broad index funds, even if indirectly, through a pension fund. That's "many" enough for me.
replies(5): >>belter+A3 >>ssnist+R4 >>pydry+X4 >>slg+l5 >>bergen+C7
5. calf+43[view] [source] 2023-11-20 09:21:54
>>shubha+(OP)
This super-valuable technology exists precisely because of this unstable (metastable) structure. Microsoft or Google did not create ChatGPT because internally there would have been too many rules, too many cooks, too much red tape, etc., to do something as bold--and incautious--as using the entirety of the Internet as the training set, copyright law be damned and all. The crazy structure is what allowed a machine of unprecedented scale to be created, and now the structure has to implode.
replies(1): >>cutemo+kc
◧◩◪
6. belter+A3[view] [source] [discussion] 2023-11-20 09:25:25
>>xapata+92
Can I direct my fury at you, for having to pay extra for my hardware when buying a PC to install Linux? - https://en.wikipedia.org/wiki/Bundling_of_Microsoft_Windows

Or being forced to use Teams and Azure, due to my company CEO getting the licenses for free out of his Excel spend? :-))

replies(2): >>xapata+t4 >>ikt+Lu
◧◩◪◨
7. xapata+t4[view] [source] [discussion] 2023-11-20 09:30:59
>>belter+A3
Feel free. I can be your pseudonymous villain.
replies(1): >>belter+z7
◧◩◪
8. ssnist+R4[view] [source] [discussion] 2023-11-20 09:32:43
>>xapata+92
Most Microsoft products have miserable UX because of this enabling mentality. Someone has to come out and say "enough is enough".

A broad index fund sans Microsoft will do just fine. That's the whole point of a broad index fund.

◧◩◪
9. pydry+X4[view] [source] [discussion] 2023-11-20 09:33:07
>>xapata+92
https://www.cnbc.com/2021/10/18/the-wealthiest-10percent-of-...
◧◩◪
10. slg+l5[view] [source] [discussion] 2023-11-20 09:35:37
>>xapata+92
The top 1% own over half of all stocks and the top 10% own nearly 90%, so it really isn't that "many". And you know what other companies are in those index funds you own? Microsoft's competitors and customers, both of which would be squeezed if Microsoft gained a monopoly on some hypothetical super-valuable AI tech. If Microsoft suddenly doubled in value, you would barely notice it in your 401k.
11. croes+t7[view] [source] 2023-11-20 09:47:19
>>shubha+(OP)
Better for MS and Altman, that's exactly the point.

AI should benefit mankind, not corporate profit.

replies(3): >>donny2+yc >>arthur+Kc >>j2bax+kg
◧◩◪◨⬒
12. belter+z7[view] [source] [discussion] 2023-11-20 09:47:41
>>xapata+t4
Much appreciated. I will conserve energy, and reserve my next outburst until a future Windows Update.
replies(1): >>HenryB+Uf
◧◩◪
13. bergen+C7[view] [source] [discussion] 2023-11-20 09:48:10
>>xapata+92
"Why don't they just buy stock"? Marie Antoinette or something
◧◩
14. disgru+y8[view] [source] [discussion] 2023-11-20 09:54:34
>>9dev+K
Yeah but this was caused by the OpenAI board when they fired him. I mean, what did they think was going to happen?

Seems like a textbook case of letting the perfect be the enemy of the good.

replies(1): >>TheOth+yb
◧◩
15. two_in+j9[view] [source] [discussion] 2023-11-20 09:59:56
>>slg+C1
Even if it goes to stockholders, it's not lost forever. That's how we got Starship. The question is what they do with it. As for 'sharing', we've seen that: in the USSR it ended up with Putin, Lukashenko, Turkmenbashi, and so on. Elsewhere it's not much better. Europe is slowly falling behind. There should be some balance and culture.
replies(3): >>slg+Vc >>guappa+vp >>gremli+xt
16. bookaw+Ia[view] [source] 2023-11-20 10:07:59
>>shubha+(OP)
This was essentially already in the cards as a possible outcome when Microsoft made its big investment in OpenAI, so in my view it was a reasonable outcome at this juncture as well. For Microsoft, it's just Nokia in reverse.

If you look at sama's actions and not his words, he seems intent on maximizing his power, control, and prestige (the New Yorker profile, press blitzes, a constant effort to rub shoulders with politicians/power players, Worldcoin, etc.). I think getting in bed with Microsoft with the early investment would have allowed sama to entertain the possibility that he could succeed Satya at Microsoft some time in the distant future; that is, in the event that OpenAI never became as big as or bigger than Microsoft (his preferred goal, presumably) -- and everything else went mostly right for him. After all, he's always going on about how much money is needed for AGI. He wanted more direct access to the money. Now he has it.

Ultimately, this shows how little sama cared for the OpenAI charter to begin with, specifically the part about benefiting all humanity and preventing an undue concentration of power. He didn't start his own separate company because the talent was at OpenAI. He wanted to poach the talent, not obey the charter.

Peter Hintjens (ZeroMQ, RIP) wrote a book called "The Psychopath Code", in which he posits that psychopaths are attracted to jobs with access to vulnerable people [0]. Selfless, talented idealists who do not chase status and prestige can be vulnerable to manipulation. Perhaps that's why Musk pulled out of OpenAI: he and sama were able to recognize the narcissist in each other and put their guard up accordingly. As Altman says, "Elon desperately wants the world to be saved. But only if he can be the one to save it."[1] Perhaps this applies to him as well.

Amusingly, someone recently posted an old tweet by pg: "The most surprising thing I've learned from being involved with nonprofits is that they are a magnet for sociopaths."[2] As others in that thread noted, if true, it's up for debate whether this applies more to sama or to Ilya. Time will tell, I guess.

It'll also be interesting to see what assurances were given to sama et al about being exempt from Microsoft's internal red tape. Prior to this, Microsoft had at least a little plausible deniability if OpenAI was ever embroiled in controversy regarding its products. They won't have that luxury with sama's team in-house anymore.

[0] https://hintjens.gitbooks.io/psychopathcode/content/chapter8...

[1] https://archive.is/uUG7H#selection-2071.78-2071.166

[2] >>38339379

◧◩◪
17. TheOth+yb[view] [source] [discussion] 2023-11-20 10:13:32
>>disgru+y8
Perhaps this is why they fired him.

Although IMO MS has consistently been a technological tarpit. Whatever AI comes out of this arrangement will be a thin shadow of what it might have been.

replies(2): >>noprom+oe >>cyanyd+Nl
◧◩
18. cutemo+kc[view] [source] [discussion] 2023-11-20 10:19:00
>>calf+43
That doesn't seem to require a non-profit owning a for-profit, though.

Just a "normal" startup could have worked too (but apparently not a big corp).

Edit: Hmm, the sibling comment says something else; I wonder if that makes sense.

replies(1): >>calf+vq
◧◩
19. donny2+yc[view] [source] [discussion] 2023-11-20 10:20:56
>>croes+t7
Then “mankind” should be paying for research and servers, shouldn’t it?
replies(4): >>layer8+Sd >>elzbar+Zd >>croes+bi >>cyanyd+8m
◧◩
20. arthur+Kc[view] [source] [discussion] 2023-11-20 10:21:59
>>croes+t7
Unless "humanity" funds this effort, corporate profits will be the main driving force.
replies(2): >>mrangl+Wz >>Zpalmt+FN
◧◩◪
21. slg+Vc[view] [source] [discussion] 2023-11-20 10:23:17
>>two_in+j9
>As for 'sharing', we've seen that. In USSR...

HN isn't the place to have the political debate you seem to want to have, so I will simply say that it is really sad that you equate "sharing" with USSR-style communism. There is a huge middle ground between that and the trickle-down Reaganomics you seem to be advocating. We should have let that type of binary thinking die with the end of the Cold War.

replies(1): >>two_in+Ni
◧◩◪
22. layer8+Sd[view] [source] [discussion] 2023-11-20 10:30:42
>>donny2+yc
Indeed it should.
◧◩◪
23. elzbar+Zd[view] [source] [discussion] 2023-11-20 10:32:26
>>donny2+yc
We are. QE and Covid funny money devalued the dollar in exact proportion: it handed out so much money that even stock buy-backs got old, and they started investing in ways to get rid of those pesky humans and their insolent demands for salaries.
◧◩◪◨
24. noprom+oe[view] [source] [discussion] 2023-11-20 10:36:04
>>TheOth+yb
MSFT is a technological tarpit?

Mate... Just because you don't bat a thousand doesn't make you a tarpit.

MSFT is a technological powerhouse. They have absolutely killed it since they were founded. They have defined personal computing for multiple generations and more or less made the word 'software' something spoken at kitchen tables instead of being met with 'soft-what?'

Definitely not a tarpit. You are throwing out whole villages of babies because of some nasty bathwater over the years.

The picture is bigger. So much crucial tech has come from MSFT. That remains true today.

replies(1): >>gremli+4u
◧◩◪◨⬒⬓
25. HenryB+Uf[view] [source] [discussion] 2023-11-20 10:47:09
>>belter+z7
> ..reserve my next outburst until a..

You'll just waste your time :)

Look, it's Microsoft's right to put any/all effort to making more money with their various practices.

It is our right to buy a Win10 Pro license for X amount of USD, then bolt down the ** out of it with the myriad of privacy tools to protect ourselves and have a "better Win7 Pro OS".

MS has always tried, and will always try, to play the game of getting more control, making more money, collecting more telemetry, and doing clean and dirty things until they get caught. Welcome to the human condition. MS employees are humans. MS shareholders are also humans.

As for Windows Update, I don't think I've updated the core version at all since I installed it, and I am using WuMgr and WAU Manager (both portables) for very selective security updates.

It's a game. If you are a former sys-admin or a technical person, then you avoid their traps. If you are not, then the machine will chew your data, just like Google Analytics, AdMob, and so many others do.

Side-note: never update apps when they work 'alright', chances are you will regret it.

replies(2): >>cyanyd+bl >>TeMPOr+lr
◧◩
26. j2bax+kg[view] [source] [discussion] 2023-11-20 10:50:05
>>croes+t7
That's a nice thought, but why would this technology be any different from any other? Perhaps OpenAI and Microsoft now compete with each other. Surely they won't be the only players in the game... Apple and Google won't just rest on their laurels. Perhaps they will make a better offer at some point to some great minds in AI.
◧◩◪
27. croes+bi[view] [source] [discussion] 2023-11-20 11:03:20
>>donny2+yc
Mankind already pays for education and infrastructure.

Did OpenAI and others pay for the training data from Stack Overflow, Twitter, Reddit, GitHub, etc., or any other source produced by mankind?

◧◩◪◨
28. two_in+Ni[view] [source] [discussion] 2023-11-20 11:07:29
>>slg+Vc
>> There should be some balance

is all I'm saying. And I'm not interested in political debates. Neither the right nor the left is good in the long run. We have examples. Moreover, we can predict what happens if...

replies(1): >>albume+Qv
◧◩
29. avidph+Jj[view] [source] [discussion] 2023-11-20 11:13:02
>>9dev+K
> This is precisely the problem OpenAI aimed to solve: This technology cannot be treated independently of the potential risks involved.

I’ve always thought that what OpenAI was purporting to do—-“protect” humanity from bad things that AI could do to it—-was a fool’s errand under a Capitalist system, what with the coercive law of competition and all.

◧◩◪◨⬒⬓⬔
30. cyanyd+bl[view] [source] [discussion] 2023-11-20 11:23:20
>>HenryB+Uf
it'd be nice if we could enforce monopoly regulations too.
replies(1): >>HenryB+uA
◧◩◪◨
31. cyanyd+Nl[view] [source] [discussion] 2023-11-20 11:27:08
>>TheOth+yb
ClippyAI, coming 2025: I see you're trying to invade a third-world nation. Can I help you with that?
◧◩◪
32. cyanyd+8m[view] [source] [discussion] 2023-11-20 11:28:29
>>donny2+yc
... that's how government works.

name a utopian fiction that has corporations as benefactors to humanity

replies(1): >>donny2+kp
◧◩
33. RcouF1+Rn[view] [source] [discussion] 2023-11-20 11:39:32
>>slg+C1
> It is the fear that a greater portion of the value of this technology will go to the stockholders of said companies rather than potentially be shared among a larger percentage of society. Not that I had that much faith in OpenAI, but in general the shift from non-profit to for-profit is a win for the few over the many.

You know what is an even bigger temptation to people than money - power. And being a high priest for some “god” controlling access from the unwashed masses who might use it for “bad” is a really heady dose of power.

This safety argument was used to justify monarchy, illiteracy, religious coercion.

There is a much greater chance of AI getting locked away from normal people by a non-profit on a power trip, rather than by a corporation looking to maximize profit.

replies(2): >>bnralt+6u >>slg+aj1
◧◩◪◨
34. donny2+kp[view] [source] [discussion] 2023-11-20 11:49:12
>>cyanyd+8m
I mean, we already benefit plenty in various ways from corporations like Google.

AI is just another product by another corporation. If I get to benefit from the technology while the company that offers it also makes profit, that’s fine, I think? There wasn’t publicly available AI until someone decided to sell it.

replies(1): >>croes+nu
◧◩◪
35. guappa+vp[view] [source] [discussion] 2023-11-20 11:50:05
>>two_in+j9
> That's how we got Starship

You're forgetting the massive public investment?

◧◩◪
36. calf+vq[view] [source] [discussion] 2023-11-20 11:56:48
>>cutemo+kc
A normal startup may not appeal to academics who aren't in it for the money but who want to pioneer AGI research.
◧◩◪◨⬒⬓⬔
37. TeMPOr+lr[view] [source] [discussion] 2023-11-20 12:04:12
>>HenryB+Uf
It's a game of the kind where the winning move is not to play. Except we're being forced to play. The human condition is in many ways fucked.
◧◩◪
38. gremli+xt[view] [source] [discussion] 2023-11-20 12:20:52
>>two_in+j9
Except the USSR 'ended up' with those people because it went towards Western-style capitalism. These weren't Soviet nomenklatura who stole power by abusing Soviet bureaucracy; these were post-Soviet, American-style "democratic" leaders.
replies(1): >>two_in+bq1
◧◩◪◨⬒
39. gremli+4u[view] [source] [discussion] 2023-11-20 12:24:22
>>noprom+oe
"Innovation" through anti-trust isn't "killing it".
replies(1): >>noprom+gv
◧◩◪
40. bnralt+6u[view] [source] [discussion] 2023-11-20 12:24:34
>>RcouF1+Rn
Right. Greenpeace also protects the world against technological threats only they can see, and in that capacity has worked to stop nuclear power and GMO use. Acting as if all concern about technology is noble is extremely misguided. There's a lot of excessive concern about technology that holds society back.

By the standard of the alignment folks, the technology today doesn't even have to be the danger; an imaginary technology that could someday be built might be the danger. And we don't even have to be able to articulate clearly how it's a danger, we can just postulate the possibility. Then all technology becomes suspect and needs a priest class to decide what access the population can have, for fear of risking doomsday.

◧◩◪◨⬒
41. croes+nu[view] [source] [discussion] 2023-11-20 12:26:29
>>donny2+kp
And corporations already benefited plenty from infrastructure, education and stability provided by governments.

>If I get to benefit from the technology while the company that offers it also makes profit, that's fine.

What if you don't benefit because you lose your job to AI or have to deal with the mess created by real looking disinformation created by AI?

It was already bad with fake images out of ARMA, but with AI we get a whole new level of fakes.

◧◩◪◨
42. ikt+Lu[view] [source] [discussion] 2023-11-20 12:29:02
>>belter+A3
> Or being forced to use Teams and Azure, due to my company CEO getting the licenses for free out of his Excel spend? :-))

The pain is real :(

"You use Windows because it is the only OS you know. I use Windows because it is the only OS you know."

◧◩◪◨⬒⬓
43. noprom+gv[view] [source] [discussion] 2023-11-20 12:32:46
>>gremli+4u
Uhhh, they won that appeal, BTW, if you are referring to the trouble with Janet Reno.

Gates keeps repeating it. No one hears it.

◧◩◪◨⬒
44. albume+Qv[view] [source] [discussion] 2023-11-20 12:36:27
>>two_in+Ni
Not interested in political debates, but you make political statements drawn from the extremes to support your arguments. Gotcha.

"Europe is falling behind" very much depends on your metrics. I guess on HN it's technological innovation, but for most people the metric would be quality of life, happiness, liveability etc. and Europe's left-leaning approach is doing very nicely in that regard; better than the US.

◧◩◪
45. mrangl+Wz[view] [source] [discussion] 2023-11-20 13:00:42
>>arthur+Kc
Corporate profits should be the driving force, because then at least we know what (and who) the controlling source is, and where it sits. "Humanity", by contrast, is a PR word for far fuzzier, darker sources rooted in the political machine and its extensions, functionally speaking. The former is far more able to be influenced by actual humanity, ironically. Laws can be created and monitored that directly apply to said corporate force, if need be. Not so much for the political machine.
◧◩◪◨⬒⬓⬔⧯
46. HenryB+uA[view] [source] [discussion] 2023-11-20 13:03:30
>>cyanyd+bl
We do, but it takes a long time, and by the time we get to enforce the thing, the party is half-over. How many years was Microsoft playing around with IE as default browser? And they are still playing dirty games with Edge. It's not that they don't learn. It's that they will play the game until someone stops them, and then they will begin playing a different game.

Some people downvote (it's not about the points) but I merely state the reality and not my opinions.

I've made my living as a sys-admin early in my career using MS products, so thank you MS for putting food on my table. But this doesn't negate the dirty games/dark patterns/etc.

◧◩◪
47. Zpalmt+FN[view] [source] [discussion] 2023-11-20 14:01:13
>>arthur+Kc
And that's a good thing
◧◩◪
48. slg+aj1[view] [source] [discussion] 2023-11-20 16:32:38
>>RcouF1+Rn
>You know what is an even bigger temptation to people than money - power.

Do you think profit minded people and organizations aren't motivated by a desire for power? Removing one path to corruption doesn't mean I think it is impossible for a non-profit to become corrupted, but it is one less thing pulling them in that direction.

◧◩◪◨
49. two_in+bq1[view] [source] [discussion] 2023-11-20 17:03:42
>>gremli+xt
> these were post-Soviet, American-style "democratic" leaders

Before that, the USSR collapsed under Gorbachev. Why? They simply lost with their planned economy, where nobody wanted to take a risk, because (1) it wasn't rewarding, (2) no individual had enough resources, and (3) to get things moving you had to convince a lot of bureaucrats who didn't want to take a risk. They moved forward thanks to a few exceptional people, but there weren't as many people willing to take risks as in 'rotting' capitalism. I don't know why, but the leaders didn't see the Chinese way. Probably they were busy with internal rat fights and didn't see what was in it for them.

My idea is that there are two extremes. On the left side, people can be happy like yogis, but they don't produce anything or move forward. On the right side is pure capitalism, which is inhuman. The optimum is somewhere in between, with good quality of life and fast progress. What happens when resources are shared too much and life is good? You can see it in Germany today: 80% of Ukrainian refugees don't work and don't want to.
