zlacker

[return to "Stargate Project: SoftBank, OpenAI, Oracle, MGX to build data centers"]
1. serjes+ja[view] [source] 2025-01-21 23:24:37
>>tedsan+(OP)
You have to keep in mind Microsoft is planning on spending almost $100B in datacenter capex this year [1], and they're not alone. This is basically OpenAI matching the major cloud providers' spending.

This could also be (at least partly) a reaction to Microsoft threatening to pull OpenAI's cloud credits last year. OpenAI wants to maintain independence and with compute accounting for 25–50% of their expenses (currently) [2], this strategy may actually be prudent.

[1] https://www.cnbc.com/2025/01/03/microsoft-expects-to-spend-8...

[2] https://youtu.be/7EH0VjM3dTk?si=hZe0Og6BjqLxbVav&t=1077

◧◩
2. throit+Oa[view] [source] 2025-01-21 23:28:37
>>serjes+ja
Microsoft has lots of revenue streams tied to that capex outlay. Does OpenAI have similar revenue numbers to Microsoft?
◧◩◪
3. tuvang+Uc[view] [source] 2025-01-21 23:42:24
>>throit+Oa
OpenAI has a very healthy revenue stream in the form of other companies throwing money at them.

But to answer your question: no, they aren’t even profitable by themselves.

◧◩◪◨
4. manque+7f[view] [source] 2025-01-21 23:55:36
>>tuvang+Uc
> they aren’t even profitable

Depends on your definition of profitability. They are not recovering R&D and training costs, but they (and MS) are recouping inference costs from user subscription and API revenue with a healthy operating margin.

Today they would not survive if they stopped investing in R&D, but they do have to slow down at some point. It looks like they and the other big players are betting on a moat they hope to build with $100B DCs and ASICs, one that open-weight models or other entrants cannot compete with.

This moat could come from two directions: training will be too expensive (few entities have both the budget for $10B+ of training and no need to monetize it), and even where such models are available, it may be impossible to run inference on them with off-the-shelf GPUs, i.e. these models can only run on ASICs, which only the large players will have access to [1].

In this scenario, corporations will have to pay them for the best models; when that happens, OpenAI can slow down R&D and become profitable even with capex considered.

[1] This is a natural progression in a compute-bottlenecked sector; we saw a similar evolution from CPUs to GPUs to ASICs in crypto a few years ago. The comparison is slightly distorted by the switch from PoW to PoS and by some coins being intentionally designed to favor GPUs, but even then you needed DC-scale operations in a cheap-power location to be profitable.

◧◩◪◨⬒
5. enrage+tz2[view] [source] 2025-01-22 18:11:05
>>manque+7f
> but they (and MS) are recouping inference costs from user subscription and API revenue with a healthy operating margin.

As far as I am aware, the only information from within OpenAI one way or the other comes from their financial documents circulated to investors:

> The fund-raising material also signaled that OpenAI would need to continue raising money over the next year because its expenses grew in tandem with the number of people using its products.

Subscriptions are the lion's share of their revenue (73%). It's possible they are making money on the average Plus or Enterprise subscription, but given the above claim they definitely aren't making enough to cover the cost of inference for free users.

https://www.nytimes.com/2024/09/27/technology/openai-chatgpt...

[go to top]