zlacker

[parent] [thread] 6 comments
1. hannas+(OP)[view] [source] 2026-02-04 04:38:55
People keep saying this, but it's simply untrue. AI inference is profitable. OpenAI and Anthropic have 40-60% gross margins. If they stopped training and building out future capacity, they would already be raking in cash.

They're losing money now because they're making massive bets on future capacity needs. If those bets are wrong, they're going to be in very big trouble when demand levels off lower than expected. But that's not the same as demand being zero.

replies(2): >>adgjls+D6 >>mbesto+CI
2. adgjls+D6[view] [source] 2026-02-04 05:46:58
>>hannas+(OP)
Those gross profit margins aren't that useful, since training a model of a given capability keeps getting cheaper, so there's a treadmill effect: staying in business requires constantly training new models to avoid falling behind. If the big companies stop training models, they only have a year before someone else catches up with far less debt and puts them out of business.
replies(1): >>HDThor+il
3. HDThor+il[view] [source] [discussion] 2026-02-04 07:58:37
>>adgjls+D6
Only if training new models leads to better models. If the newly trained models are just a bit cheaper but not better, most users won't switch. Then the entrenched labs can stop training so much and focus on profitable inference.
replies(1): >>kuschk+Un
4. kuschk+Un[view] [source] [discussion] 2026-02-04 08:18:47
>>HDThor+il
If they really have 40-60% gross margins, then as training costs go down, a newly trained model could offer the same product at roughly half the price (rough arithmetic below).
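
A quick back-of-the-envelope sketch of that pricing claim (purely illustrative; the 50% incumbent margin and the challenger's 10% margin are assumed numbers, not figures from anyone's financials):

    # Illustrative arithmetic only; not actual OpenAI/Anthropic figures.
    incumbent_price = 1.00        # normalize the incumbent's price per unit of inference
    gross_margin = 0.50           # assumed midpoint of the claimed 40-60% range
    incumbent_cost = incumbent_price * (1 - gross_margin)   # cost of goods = 0.50

    # Assume a challenger reaches comparable inference costs and accepts a thin 10% margin.
    challenger_margin = 0.10
    challenger_price = incumbent_cost / (1 - challenger_margin)

    print(f"incumbent cost per unit: {incumbent_cost:.2f}")    # 0.50
    print(f"challenger price:        {challenger_price:.2f}")  # 0.56
    print(f"discount vs incumbent:   {1 - challenger_price / incumbent_price:.0%}")  # 44%
    # i.e. roughly "the same product at half the price" once the margin is competed away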
replies(1): >>HDThor+vK2
5. mbesto+CI[view] [source] 2026-02-04 10:58:37
>>hannas+(OP)
> OpenAI and Anthropic have 40-60% gross margins.

Stop this trope, please. We (1) don't really know what their margins actually are, and (2) because of the hard tie-in to GPU costs and maintenance, we don't yet know the useful life of the GPUs (and therefore the associated OPEX).

> If they stopped training and building out future capacity they would already be raking in cash.

That's like saying "if car companies stopped researching how to make their cars more efficient, safer, and more reliable, they'd be more profitable."

6. HDThor+vK2[view] [source] [discussion] 2026-02-04 21:31:30
>>kuschk+Un
Well, that's why the labs are building app-level products like Claude Code/Codex to lock their users in. Most of the money here is in business subscriptions, I think; how much savings would be required for businesses to switch to products that aren't better, just cheaper?
replies(1): >>kuschk+ON2
7. kuschk+ON2[view] [source] [discussion] 2026-02-04 21:49:04
>>HDThor+vK2
I think the real lock-in is in "CLAUDE.md" and similar rulesets, which are heavily AI-specific.