zlacker

[parent] [thread] 1 comments
1. mike_h+(OP)[view] [source] 2026-02-04 08:57:06
I think most providers give significant discounts on high-latency batch APIs. A lot of AI workloads feel batch-oriented to me, or could become so once they move beyond the prototyping and testing phases. Chat will end up being a small fraction of load in the long term.
replies(1): >>KoolKa+bC2
2. KoolKa+bC2[view] [source] 2026-02-04 23:22:32
>>mike_h+(OP)
That would imply there's still capacity here on earth for this type of traffic.