zlacker

4 comments
1. adgjls+(OP) 2026-02-04 05:46:58
Those gross profit margins aren't that informative, since training a model at a fixed capability level keeps getting cheaper. That creates a treadmill effect: staying in business requires constantly training new models just to avoid falling behind. If the big companies stop training, they have only about a year before someone else catches up with far less debt and puts them out of business.
replies(1): >>HDThor+Fe
2. HDThor+Fe 2026-02-04 07:58:37
>>adgjls+(OP)
Only if training new models leads to better models. If the newly trained models are just a bit cheaper but not better, most users won't switch. Then the entrenched labs can stop training so much and focus on profitable inference.
replies(1): >>kuschk+hh
3. kuschk+hh 2026-02-04 08:18:47
>>HDThor+Fe
If they really have 40-60% gross margins, then as training costs fall, a newly trained model could offer the same product at half the price.
replies(1): >>HDThor+SD2
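[ed.: the margin arithmetic above, as a back-of-envelope sketch — all numbers are hypothetical, assuming the incumbent's 40-60% gross margin and an entrant who matches the incumbent's unit cost but accepts a thin margin:]

```python
# Hypothetical illustration of the "half the price" claim.
incumbent_price = 1.00   # price per unit of inference (e.g. per 1M tokens)
gross_margin = 0.50      # assumed incumbent gross margin (middle of 40-60%)
incumbent_cost = incumbent_price * (1 - gross_margin)  # cost per unit = 0.50

# A later entrant matching that unit cost but taking only a 10% margin:
entrant_margin = 0.10
entrant_price = incumbent_cost / (1 - entrant_margin)

print(f"incumbent: {incumbent_price:.2f}, entrant: {entrant_price:.2f}")
# The entrant undercuts the incumbent by roughly half while still profitable.
```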
4. HDThor+SD2 2026-02-04 21:31:30
>>kuschk+hh
Well, that's why the labs are building these app-level products like Claude Code/Codex to lock their users in. Most of the money here is in business subscriptions, I think. How much savings would be required for businesses to switch to products that aren't better, just cheaper?
replies(1): >>kuschk+bH2
5. kuschk+bH2 2026-02-04 21:49:04
>>HDThor+SD2
I think the real lock-in is in "CLAUDE.md" and similar rulesets, which are heavily AI-specific.