zlacker

[parent] [thread] 2 comments
1. farees+(OP)[view] [source] 2023-02-09 01:25:56
Google has figured out how to scale search to the current number of requests per second

Scaling a ChatGPT-like product to that level would be challenging, I assume - and extremely expensive. Is that correct?

replies(1): >>freedi+Sb
2. freedi+Sb[view] [source] 2023-02-09 02:55:25
>>farees+(OP)
Going rate is about 3 cents per LLM query, so at Google's scale of 100k qps, it would be ~$300MM/day. However, you probably only need to send about 1/10 of current queries to the LLM (most are still not suitable for an LLM to answer), so that would cost only ~$30MM/day, or about $9B/year, which is peanuts for Google.
replies(1): >>farees+dxb
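The back-of-envelope math above can be checked with a few lines of Python. The $0.03/query, 100k qps, and 1-in-10 routing figures are the commenter's assumptions, not verified numbers:

```python
# Sanity check of the thread's cost estimate, using the commenter's
# assumptions: $0.03 per LLM query, 100k queries/sec, 1/10 routed to LLM.
COST_PER_QUERY = 0.03      # dollars per LLM query (commenter's "going rate")
QPS = 100_000              # assumed Google-scale queries per second
SECONDS_PER_DAY = 86_400
LLM_FRACTION = 0.1         # share of queries actually sent to the LLM

daily_all = COST_PER_QUERY * QPS * SECONDS_PER_DAY   # every query via LLM
daily_llm = daily_all * LLM_FRACTION                 # only 1/10 via LLM
yearly_llm = daily_llm * 365

print(f"All queries via LLM: ${daily_all / 1e6:,.0f}MM/day")
print(f"1/10 routed to LLM:  ${daily_llm / 1e6:,.0f}MM/day")
print(f"Annualized:          ${yearly_llm / 1e9:,.1f}B/year")
```

Run as written, this gives ~$259MM/day for all queries and ~$9.5B/year for the 1/10 case, so the comment's ~$300MM/day and ~$9B/year are slightly rounded but in the right ballpark.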
3. farees+dxb[view] [source] [discussion] 2023-02-12 13:50:45
>>freedi+Sb
From what I have read, some people are of the view that it will affect Google's margins and its ability to compete in other businesses. This is based on something Satya Nadella mentioned in an interview.