zlacker

1. HDThor+(OP) 2026-02-04 07:58:37
Only if training new models leads to better models. If the newly trained models are just a bit cheaper but not better, most users won't switch. Then the entrenched labs can stop training so much and focus on profitable inference.
replies(1): >>kuschk+C2
2. kuschk+C2 2026-02-04 08:18:47
>>HDThor+(OP)
If they really have 40-60% gross margins, then as training costs come down, newly trained models could offer the same product at roughly half the price and still cover the cost of serving it.
replies(1): >>HDThor+dp2
3. HDThor+dp2 2026-02-04 21:31:30
>>kuschk+C2
Well, that's why the labs are building app-level products like Claude Code and Codex to lock their users in. Most of the money here is in business subscriptions, I think; how much in savings would it take for businesses to switch to products that aren't better, just cheaper?
replies(1): >>kuschk+ws2
4. kuschk+ws2 2026-02-04 21:49:04
>>HDThor+dp2
I think the real lock-in is in "CLAUDE.md" and similar rulesets, which are heavily AI-specific.
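
For context: a CLAUDE.md is a plain Markdown file of project-specific instructions that Claude Code reads automatically. The snippet below is a hypothetical sketch of what such a ruleset tends to look like (invented project names and scripts, not taken from any real repo):

    # CLAUDE.md (hypothetical example)
    ## Build & test
    - Run `npm run lint && npm test` before proposing any commit.
    ## Conventions
    - Never edit files under vendor/; regenerate them with `npm run codegen`.
    - Reuse the error types in src/errors.ts instead of throwing raw strings.

Rules like these encode team-specific workflow knowledge in a form tuned to one assistant's behavior, which is where the switching cost comes from.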