My guess is that they "did the math" and had an engineering study that convinced them getting AI datacenters into space will make sense.
It's also not hard to imagine why: the process alone, once perfected, could be reused for asteroid mining, then microgravity manufacturing, either of which would be an enormously capital-intensive project on its own. Even if AI datacenters in space are only break-even, it would be a massive win for SpaceX and leave their competition far behind.
There is no benefit to putting data centers in space that outweighs the giant cost you would incur by doing so.
Can people please try and use their fucking brains for a second?
Have you considered that people smarter than you think it is plausible?
I know many people smarter than me, plenty of them who have spent careers building data centers, and not one of them thinks this is plausible.
You should consider whether people smarter than the average investor are pulling a fast one.
I don't doubt spacex can fail at this.
I also don't doubt we are fairly close to making this plausible.
> plenty of them who have spent careers building data centers
Famously, plenty of people who have spent careers building rockets would swear that reusable rockets would absolutely never work.
Maybe you should doubt that. There's literally no reason to think this is plausible besides some hype merchants' say-so.
Excluding SpaceX:
Nvidia, Google, China, European Commission, Blue Origin
And this being HN, a YC-funded company has put a single GPU rack in space and demonstrated training a reasonably sized model on it.
But yeah, it's all hype, sure.
It's trivial to understand why this is all hype if you pay attention to physics, as another commenter suggested earlier.
https://en.wikipedia.org/wiki/Stefan%E2%80%93Boltzmann_law
Assume you're radiating away the heat from a single B200 (~1 kW) with a max radiator temperature of 100 °C, and you find A ≈ 3 m².
So that's 3 square meters per GPU. Now if you take into account that the largest planar structure deployed into space is ~3k m^2 (https://investors.lockheedmartin.com/news-releases/news-rele...), you're looking at 1000 GPUs.
That's a single aisle in a terrestrial data center.
Cost to deploy on earth vs satellite is left as an exercise to the reader.
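The Stefan–Boltzmann arithmetic above can be sketched out. The emissivity and effective sink temperature below are illustrative assumptions, not measured values; the ideal blackbody figure is a lower bound, and real-world deratings push toward the ~3 m² used above:

```python
# Radiator area needed to reject a given heat load, via Stefan-Boltzmann.
# Sketch only: emissivity and sink temperature are assumed values.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area(power_w, t_radiator_k, emissivity=0.9, t_sink_k=0.0):
    """One-sided area (m^2) to radiate power_w at t_radiator_k against
    an effective sink at t_sink_k (0 K = ideal deep space)."""
    net_flux = emissivity * SIGMA * (t_radiator_k**4 - t_sink_k**4)
    return power_w / net_flux

T_HOT = 373.15  # 100 C, the max radiator temp assumed above

# Ideal blackbody facing deep space: roughly 0.9 m^2 per kW.
print(radiator_area(1000, T_HOT, emissivity=1.0))

# Derated for imperfect emissivity and a warm effective sink
# (sunlight / Earth IR); real designs with plumbing, temperature
# drops, and pointing constraints derate further still, which is
# roughly where figures like ~3 m^2 per kW come from.
print(radiator_area(1000, T_HOT, emissivity=0.85, t_sink_k=255.0))
```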
You do not have to radiate all the heat away from a GPU at 100 °C; a modern GPU can run pretty hot. Also look up how this is getting better for the next generation of GPUs.
Maybe repeat your calculation with updated assumptions?
But even if you were completely right, your argument amounts to "we can't do this tomorrow", and yes, I agree. Typical technology development cycles are about 5–10 years.
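One way to repeat the calculation with updated assumptions: since radiated power scales with T⁴, letting the radiator run hotter shrinks the required area fast. The temperatures below are hypothetical operating points, not vendor specs:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def area_per_kw(t_radiator_c, emissivity=0.9):
    """One-sided radiator area (m^2) per kW rejected to deep space."""
    t_k = t_radiator_c + 273.15
    return 1000.0 / (emissivity * SIGMA * t_k**4)

# Hypothetical radiator temperatures: area shrinks roughly as 1/T^4.
for t_c in (60, 100, 150, 200):
    print(f"{t_c:>3} C -> {area_per_kw(t_c):.2f} m^2 per kW")
```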
LOL. If you don't radiate the heat away, the spacecraft just gets hotter until it's hot enough to radiate the full load anyway (glowing, if need be). It's space; there's no fluid to provide convection.