> My estimate is that within 2 to 3 years, the lowest cost way to generate AI compute will be in space.
This is so obviously false. For one thing, in what fantasy world would the ongoing operational and maintenance needs be 0?
Zero operational needs is obviously an oversimplification. You still need to manage downlink capacity, station-keeping, collision avoidance, etc. But for a large constellation the per-satellite cost of that would be pretty small.
Letting them burn up in the atmosphere every time there's an issue does not sound sustainable.
In the back of my head this all seemed astronomically far-fetched, but $5.5 million to get 8 GPUs in space... wild. That isn't even a single TB of VRAM.
Are you maybe factoring the cost of powering them in space into that $5.5 million?
Maybe with Starship the premium is less extreme? $10 million for 350 Nvidia systems already seems within margins, and $1 million would definitely put it in rounding-error territory.
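Quick back-of-envelope on the per-unit launch premium implied by the numbers in this thread (the figures are the ones quoted above, not verified; "system" here means whatever Nvidia box the $10M Starship estimate assumes):

```python
# Amortize a quoted launch cost over the compute hardware it carries.
# All inputs are the unverified figures from this discussion.

def launch_premium(launch_cost_usd: float, units: int) -> float:
    """Launch cost per unit of hardware flown."""
    return launch_cost_usd / units

today = launch_premium(5_500_000, 8)         # current quote: $5.5M for 8 GPUs
starship = launch_premium(10_000_000, 350)   # hoped-for: $10M per 350 systems
optimistic = launch_premium(1_000_000, 350)  # the $1M "rounding error" case

print(f"today:      ~${today:,.0f} per GPU")        # ~$687,500
print(f"starship:   ~${starship:,.0f} per system")  # ~$28,571
print(f"optimistic: ~${optimistic:,.0f} per system")  # ~$2,857
```

So the claim only works if the premium drops by roughly two orders of magnitude from today's quote, which is exactly the Starship bet.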
But that's only the Elon-style "first principles" calculation. When reality hits, it's going to be an engineering nightmare on the scale of nuclear power plants. I wouldn't be surprised if they spent a billion just figuring out how to get a datacenter operational in space, and you can build a lot of datacenters on Earth for a billion.
If you ask me, this is Elon scamming investors to fund his own personal goal, which is simply getting AI into space. Once AI is in space, there's a chance human-derived intelligence survives an extinction event on Earth. That's one of Elon's core motivations.