zlacker

[return to "xAI joins SpaceX"]
1. rybosw+u5[view] [source] 2026-02-02 22:10:52
>>g-mork+(OP)
> The basic math is that launching a million tons per year of satellites generating 100 kW of compute power per ton would add 100 gigawatts of AI compute capacity annually, with no ongoing operational or maintenance needs. Ultimately, there is a path to launching 1 TW/year from Earth.

> My estimate is that within 2 to 3 years, the lowest cost way to generate AI compute will be in space.

This is so obviously false. For one thing, in what fantasy world would the ongoing operational and maintenance needs be 0?

◧◩
2. wongar+z8[view] [source] 2026-02-02 22:21:58
>>rybosw+u5
You operate them like Microsoft's submerged data center project: you don't do maintenance, whatever fails, fails. You start with enough redundancy in critical components like power and networking and accept that compute capacity will slowly decrease as nodes fail.

"No operational needs" is obviously... an oversimplification. You still need to manage downlink capacity, station keeping, collision avoidance, etc. But for a large constellation, the per-satellite cost of that would be pretty small.

◧◩◪
3. rybosw+Xa[view] [source] 2026-02-02 22:30:11
>>wongar+z8
An 8-GPU B200 cluster goes for about $500k right now. You'd need to put thousands of those into space to mimic a ground-based data center. And the launch costs are, best case, around 10x the cost of the cluster itself.

Letting them burn up in the atmosphere every time there's an issue does not sound sustainable.

◧◩◪◨
4. sogane+Ng[view] [source] 2026-02-02 22:50:49
>>rybosw+Xa
Are launch costs really 10x!? Could I get a source for that?

In the back of my head this all seemed astronomically far-fetched, but $5.5 million to get 8 GPUs in space... wild. That isn't even a single TB of VRAM.

Are you maybe factoring the cost of powering them in space into that $5 million?

◧◩◪◨⬒
5. tinco+hF1[view] [source] 2026-02-03 09:17:59
>>sogane+Ng
The Falcon Heavy is $97 million per launch for 64,000 kg to LEO, about $1,500 per kg. Starship is gonna be a factor of 10 cheaper, or a factor of 100 if you believe Elon. A single NVidia system is ~140 kg, so a single flight can carry 350 of them plus 14,000 kg for the hardware to power them. At those numbers, $97 million to launch roughly $175 million worth of systems (350 × $500k) is nowhere near a 10x premium.
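Spelling the back-of-envelope out (a rough sketch, using only the numbers from this thread; the variable names and the floor-division split are mine):

    # Rough launch math with the numbers quoted above (not official figures)
    FALCON_HEAVY_PRICE = 97_000_000   # $ per launch
    FALCON_HEAVY_PAYLOAD = 64_000     # kg to LEO
    SYSTEM_MASS = 140                 # kg per 8-GPU NVidia system
    SYSTEM_PRICE = 500_000            # $ per system, figure from upthread
    POWER_BUDGET = 14_000             # kg reserved for power hardware

    cost_per_kg = FALCON_HEAVY_PRICE / FALCON_HEAVY_PAYLOAD           # ~$1,515/kg
    systems = (FALCON_HEAVY_PAYLOAD - POWER_BUDGET) // SYSTEM_MASS    # ~357 systems
    hardware_value = systems * SYSTEM_PRICE                           # ~$178M
    premium = FALCON_HEAVY_PRICE / hardware_value                     # ~0.54, not 10x
    print(f"Falcon Heavy: launch adds {premium:.0%} on top of hardware cost")

    # Starship scenarios: a factor of 10 or 100 cheaper per launch
    for factor in (10, 100):
        print(f"Starship at 1/{factor} the price: "
              f"{FALCON_HEAVY_PRICE / factor / hardware_value:.1%}")  # ~5.4% and ~0.5%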

Maybe with Starship the premium is even less of an issue? $10 million per 350 NVidia systems already seems within margins, and $1M would definitely put it in rounding-error territory.

But that's only the Elon-style "first principles" calculation. When reality hits, it's going to be an engineering nightmare on the scale of nuclear power plants. I wouldn't be surprised if they spent a billion just figuring out how to get a datacenter operational in space. And you can build a lot of datacenters on Earth for a billion.

If you ask me, this is Elon scamming investors for his own personal goals, which in this case boil down to getting AI into space. When AI is in space, there's a chance human-derived intelligence will survive an extinction event on Earth. That's one of Elon's core motivations.

[go to top]