zlacker

[return to "Amazon's Secret Weapon in Chip Design Is Amazon"]
1. jerf+ij2[view] [source] 2024-09-16 14:31:27
>>mdp202+(OP)
Amazon's secret weapon is Amazon's own use of chips, if your idea of "secret" is that you wear a blindfold and anything you can't see is therefore "secret".

More like "Amazon's Blindingly Obvious Weapon In Chip Design Is AWS".

◧◩
2. alephn+Ok2[view] [source] 2024-09-16 14:41:11
>>jerf+ij2
All Cloud Providers are working on vertically integrating compute - for example, look at GCP's bet on TPUs and Azure's investment into Cobalt and Maia.

The difference is that Amazon's bet on commoditized ML compute infra largely paid off, thanks to a mix of developer advocacy and a large existing market of users already somewhat adept with AWS.

In fact, it could be a case study of how an incumbent can drop the ball - back in the 2014-18 period TensorFlow was THE framework, and Google absolutely could have used it as a killer app to market the then-new GCP (and they did try), but Amazon was able to outcompete GCP on both containerization and cloud ML compute because of their strong developer advocacy and training programs.

Don't be dismissive of Amazon's historically strong developer advocacy motion. Peak Microsoft, Intel, Cisco, VMware, etc. all placed similar bets, and Nvidia has done something similar since the mid-2010s in the ML space. At the end of the day, buyers are somewhat technical.

GTM strategy is just as important as technical and product strategy.

◧◩◪
3. jerf+aq2[view] [source] 2024-09-16 15:17:26
>>alephn+Ok2
I am at a complete loss as to how you get from "Of course Amazon's chip advantage is their own chip consumption and scale in AWS" to "being dismissive of Amazon's developer advocacy".

If they're doing better than the others, good for them. It's still blindingly obvious that having the biggest cloud is a huge advantage both for chip design and for successfully exploiting savings at that scale, not some sort of super amazing secret just now being revealed by IEEE.

◧◩◪◨
4. alephn+bO2[view] [source] 2024-09-16 17:31:15
>>jerf+aq2
Both Google and MS had advantages that Amazon did not have in the mid-2010s in the ML space.

Google had the advantage of owning the entire ML and Infra stack (TensorFlow, K8s, BERT, CNCF) and Microsoft had an inbuilt advantage in research communities thanks to MS Research's outsized impact in fundamental ML research.

At the time, the Annapurna Labs acquisition was seen as a massive coin-toss because IBM went down a similar path a decade before and failed.

◧◩◪◨⬒
5. dh2022+Ct3[view] [source] 2024-09-16 21:13:12
>>alephn+bO2
"Microsoft had an inbuilt advantage in research communities thanks to MS Research's outsized impact in fundamental ML research"

I thought for a few minutes and I could not come up with an example of an ML technology that originated at MS Research and then spread outside MSFT. Care to give some examples? Thanks!

◧◩◪◨⬒⬓
6. alephn+9b4[view] [source] 2024-09-17 03:07:30
>>dh2022+Ct3
> Care to give some examples

In the 2010s they were the leader in NLP and in the precursors to LLMs like GPT-3/3.5/4/4o.

Machine Translation with Human Parity (2018) - https://arxiv.org/abs/1803.05567

MT-DNN (2019) - https://arxiv.org/abs/1901.11504

MASS (2019) - https://arxiv.org/abs/1905.02450

VALL-E (2023) - https://arxiv.org/abs/2301.02111

VALL-E 2 (2024) - https://arxiv.org/abs/2406.05370

While OpenAI was the first to monetize an LLM at scale via ChatGPT, the field is still in its early stages, and there is a lot of innovation that can still be leveraged, especially in non-English language modeling, machine translation, text-to-speech, etc.

It's in this segment that Microsoft Research shines, more so than even Google Research, let alone other organizations, because of their strong NLP background in Chinese (Microsoft Research Asia), South Asian languages (Microsoft Research India), Arabic (Microsoft Research's older work during the Iraq War), etc.

[go to top]