> My estimate is that within 2 to 3 years, the lowest cost way to generate AI compute will be in space.
This is so obviously false. For one thing, in what fantasy world would the ongoing operational and maintenance needs be 0?
Well, if you can't get there, you can't do maintenance, so there is zero maintenance :)
For AI a random bit flip doesn't matter much.
No operational needs is obviously ... simplified. You still need to manage downlink capacity, station keeping, collision avoidance, etc. But for a large constellation the per-satellite cost of that would be pretty small.
The thing being called obvious here is that the maintenance you have to do on earth is vastly cheaper than the overspeccing you need to do in space (otherwise we would overspec on earth). That's before even considering the harsh radiation environment and the incredible cost to put even a single pound into low earth orbit.
Do you not understand how satellites work? They don't send repair people into space.
This has been a solved problem for decades, long before the AI gold rush crowd assumed they had some new otherworldly knowledge to teach the rest of the world.
Try doing fault injection on a chip some time. You'll see it's significantly easier to cause a crash / reset / hang than to just flip data bits.
'rad-triggered bit flips don't matter with AI' is a lie spoken by people who have obviously never done any digital design in their life.
Anyone who thinks it makes sense to blast data centers into space has never seen how big and heavy they are, or thought about their immense power consumption, much less the challenge of radiating away that much waste heat into space.
Letting them burn up in the atmosphere every time there's an issue does not sound sustainable.
I think passive cooling (running hot) reduced some of the advantages of undersea compute.
The whole thing makes no sense. What's the advantage of putting AI compute in space? What's even one advantage? There are none. Cooling is harder. Power is harder. Radiation is worse. Maintenance is impossible.
The only reason you'd ever put anything in orbit, aside from rare cases where you need zero-gee, is because you need it to be high up for some reason. Maybe you need it to be above the atmosphere (telescopes), or maybe you need a wide view of the earth (communications satellites), but it's all about the position, and you put up with a lot of downsides for it.
I feel like either I'm taking crazy pills, or all these people talking about AI in space are taking crazy pills. And I don't think it's me.
Space is pretty ridiculous, but underwater might genuinely be a good fit in certain areas.
I would say they'd probably use something a little beefier than consumer hardware and just deal with lots of failures and bit flips.
But cooling is a bigger issue probably?
A million tons will cost $1,500 x 1,000 x 1,000,000 = $1,500,000,000,000. That is one and a half TRILLION dollars per year. That is only the lift costs, it does not take into account the cost of manufacturing the actual space data centers. Who is going to pay this?
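Spelled out so the zeroes are easy to check, a quick sketch using only the assumptions above ($1,500/kg launch price, a million tons per year):

    # lift cost alone, per year, at the assumed launch price
    cost_per_kg = 1_500            # USD/kg, assumed
    tons_per_year = 1_000_000
    annual_lift_cost = cost_per_kg * 1_000 * tons_per_year
    print(f"${annual_lift_cost:,}")   # $1,500,000,000,000 -> $1.5 trillion per year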
The craziest part of those statements is "100 kW per ton." IDK what math he is doing there or future assumptions, but today we can't even sniff at 10 kW per ton. iROSA [1] on the ISS is about 0.150 kW per ton.
[1] https://en.wikipedia.org/wiki/Roll_Out_Solar_Array
edit: iROSA = 33 kW per ton, thanks friends
So, let's accept that Musk's concern of evil runaway AI is a real problem. In that case, is there anything more concerning than a distributed solar powered orbital platform for AI inference?
Elon Musk appears to be his own nemesis.
Not trying to be rude - but it's you who doesn't understand how satellites work.
The U.S. has 31 GPS satellites in orbit right now. The operational cost of running those is $2 million/day.
Not to mention the scale of these satellites would be on the order of 10x-100x the size of the ISS, to which we do send people to perform maintenance.
One of the biggest but most pointless questions I have about our current moment in history is whether the people in power actually believe the stuff they say or are lying. Ultimately I don't think the answer really matters, their actions are their actions, but there is just so much that is said by people like Musk that strains credulity to the point that it indicates either they're total idiots or they think the rest of us are total idiots and I'm genuinely curious which of those is more true.
Land and permitting. I’m not saying the math works. Just that there are envelopes for it to.
What if you could keep them in space long enough that by the time they burn up in the atmosphere, there are newer and better GPUs anyway?
Still doesn't seem sustainable to me given launch costs and stuff (hence devil's advocate), but I can sort of see the case if I squint?
It might be possible to scam the Pentagon with some talk about AI and killer satellites that take down ICBMs.
https://www.planetary.org/articles/20170929-spacex-updated-c...
If the cost per pound, power, regulatory burden, networking, and radiation shielding can be gamed out, as well as the thousand other technically difficult and probably expensive problems that can crop up, they have to sum to less than the effective cost of running that same datacenter here on earth. It's interesting that it doesn't play into Jevons paradox the way it might otherwise - there's a reduction in power consumption planetside, if compute gets moved to space, but no equivalent expansion since the resource isn't transferable.
I think some sort of space junk recycling would be necessary, especially at the terawatt scale being proposed - at some point vaporizing a bunch of arbitrary high temperature chemistry in the upper atmosphere isn't likely to be conducive to human well-being. Copper and aluminum and gold and so on are also probably worth recovering over allowing to be vaporized. With that much infrastructure in space, you start looking at recycling, manufacturing, collection in order to do cost reductions, so maybe part of the intent is to push into off-planet manufacturing and resource logistics?
The whole thing's fascinating - if it works, that's a lot of compute. If it doesn't work, that's a lot of very expensive compute and shooting stars.
We're getting close to having the time for Starship's delays to be the same as the actual time for the Saturn 5 to go from plans to manned launches (Jan 1962-Dec 1968).
In the back of my head this all seemed astronomically far-fetched, but 5.5 million to get 8 GPUs in space... wild. That isn't even a single TB of VRAM.
Are you maybe factoring in the cost of powering them in space in that 5 million?
People are going to Tory Bruno the space datacenters until one day their Claude agent swarm's gonna run in space and they'll be wondering "how did we get here"?
You’ve spent too much life force trying to even understand the liar’s fake logic.
Let’s start right here: there is no such thing as becoming power/grid constrained on earth. If you replaced just the cornfields that the United States uses just to grow corn for ethanol in gasoline just in the corn belt, you could power the entire country with solar+batteries+wind. Easily, and cheaply.
If you don’t even believe that solar+batteries are cheap (they are), fine, choose your choice of power plant. Nuclear works fine.
The truth is, xAI combining with SpaceX is almost certainly corrupt financial engineering. SpaceX as a government contractor and that means Elon’s pal Trump can now siphon money into xAI via the federal government.
Nothing in there is a lie, but any substance is at best implied. Yes, 1,000,000 tons/year * 100 kW/ton is 100 GW. Yes, there would be no maintenance and negligible operational cost. Yes, there is some path to launching 1 TW/year (whether that path is realistic isn't mentioned, nor what a realistic timeline would be). And then without providing any rationale Elon states his estimate that the cheapest way to do AI compute will be in space in a couple years. Elon is famously bad at estimating, so we can also assume that this is his honest belief. That makes a chain of obviously true statements (or close to true, in the case of operating costs), but none of them actually tell us that this will be cheap or economically attractive. And all of them are complete non-sequiturs.
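For what it's worth, the capacity arithmetic does check out on its own terms; a sketch using only Musk's stated figures:

    # tons launched per year times kW of compute per ton, converted to GW
    tons_per_year = 1_000_000
    kw_per_ton = 100
    added_gw = tons_per_year * kw_per_ton / 1e6   # kW -> GW
    print(added_gw)   # 100.0 GW/year, as claimed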
The US mandates by law that we grow a fuck ton of corn to mix 10% ethanol into gasoline.
If you replaced just those cornfields with solar/wind, they would power the entire USA and a 100% electric vehicle fleet. That includes the fact that they are in the corn belt with less than ideal sun conditions.
We aren’t even talking about any farmland that produces actual food or necessary goods, just ethanol as a farm subsidy program.
The US is already horrendously bad at land use. There’s plenty of land. There’s plenty of ability to build more grid capacity.
Let's say given component failure rates, you can expect for 20% of the GPUs to fail in that time. I'd say that's acceptable.
It is already more expensive to perform maintenance on SoCs than it is to replace them. Remember, these machines are not for serving a database, there are practically no storage needs (and storage is the component that fails most often).
Given that, the main challenge is cooling; I assume that will be figured out before yeeting $100 billion of computers into space. Plenty of smart people work at these companies.
One is obviously true, and the other is very likely false.
1. solar is very efficient at generating energy, no moving parts, simple physics etc.
2. in space you don't deal with weather or daylight cycle, you can just point your panels at the sun and generate very stable energy, no batteries required
3. environmental factors are simpler, no earthquakes, security, weather. Main problem here is radiation
In theory it's a very elegant way to convert energy to compute.
The basic idea of putting compute in space to avoid inefficient power beaming goes back to NASA in the 60s, but the problem was always the high cost to orbit. Clearly Musk expects Starship will change that.
Agreed, when I wrote "just unplug it," this counterargument was present in my mind, but nobody likes a wall of text.
However, my original point was that a distributed solar powered orbital inference platform is even worse! Think about how hard it would be to practically take out Starlink... it's really hard.
Now.. >1M nodes of a neural net in the sky? Why would someone who lives as a god, the richest man in the world, the only person capable of doing this thanks to his control of SpaceX... do the literal worst thing possible?
> The basic math is that launching a 1,000,000 tons per year of satellites generating 100 kW of compute power per ton would add 100 gigawatts of AI compute capacity annually, with no ongoing operational or maintenance needs. Ultimately, there is a path to launching 1 TW/year from Earth.
> My estimate is that within 2 to 3 years, the lowest cost way to generate AI compute will be in space.
It’s completely delusional to think you could operate a data centre in a void with nowhere to put the heat.
Did the Cybertruck "never work"? Obviously not, they're on the streets. Was it a <$40k truck with >250mi range? No.
Did FSD "never work"? Obviously not, tons of people drive many, many miles without touching the wheel. Does Tesla feel confident in it enough to not require safety operators to follow it on robotaxi trips? No. Does Tesla trust it enough to operate in the Las Vegas Loop? No. Has Tesla managed to get any state to allow it to operate truly autonomously? No.
Look, I hope Starship does work as advertised. It's cool stuff. But I don't see it as a given that it will. And given the track record of the guy who promised it, it gives even less confidence. I'm sad there's less competition in this space. We have so many billionaires out there and yet so few actually willing to push envelopes.
- launch costs are so high that doing exotic bespoke engineering might be worth it if it can shave off a few pounds
- once again because launches are expensive and rare, you cannot afford to make mistakes, so everything has to work perfectly
If you are willing to launch to lower orbits, your launch vehicle is cheap, and you are building in bulk, then you can compromise on engineering and accept a few broken sats
Undergrads, and afaik even high schoolers, have built cubesats out of aluminum extrusions, hobbyist solar panels, and a tape measure as an antenna. These things probably don't do that much, but they are up there and they do work.
They'd need incredible leaps in efficiency for an orbiting ton to collect and perform 100 kW of compute.
[0] https://en.wikipedia.org/wiki/Electrical_system_of_the_Inter...
The planned lifespan of Starlink satellites is 5 years.
Will that come to be? I'm skeptical, especially within the next several years. Starship would have to perform perfectly, and a lot of other assumptions hold, to make this make sense. But that's the idea.
Satellites are heavily reliant on either batteries or being robust to reboots, because they actually do not get stable power - it's much more dynamic (just more predictable too since no weather).
NVIDIA H200 is 0.7 kW per chip.
To have 100K GPUs you need about 500 ISSs.
ISS cooling is 16 kW of dissipation, so roughly 20 H200s' worth. Now imagine you want to cool 100k instead of 20.
And all this before we talk about radiation, connectivity (good luck with the 100 Gbps rack-to-rack we have on earth), and what have you.
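Rough sketch of those ratios, using only the numbers quoted in this comment (0.7 kW per H200, 16 kW of ISS heat rejection; not verified against datasheets):

    h200_kw = 0.7
    gpu_count = 100_000
    iss_cooling_kw = 16
    total_heat_kw = h200_kw * gpu_count       # ~70,000 kW = 70 MW to reject
    per_iss = iss_cooling_kw / h200_kw        # ~23 H200s per ISS-worth of cooling
    print(round(total_heat_kw), round(per_iss))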
—
Sometimes I think all this space datacenters talk is just a PR to hush those sad folks that happen to live near the (future) datacenter: “don’t worry, it’s temporary”
https://www.nytimes.com/2025/10/20/technology/ai-data-center...
I wouldn't exactly call this a success, for that matter.
Let’s say the costs in 5 years do get as low as $15 per kilogram or about 2 orders of magnitude improvement in launch prices. That means a 200-ton payload Starship would cost $3,000 to launch.
Do you honestly believe that? The world’s largest rocket cost a total of $3,000 to launch?
And cooling. There is no cold water or air in space.
I suppose that an orbit-ready server is going to cost more, and weigh less.
The water that serves as the coolant will weigh a lot though, but it can double as a radiation shield, and partly as reaction mass for orbital correction and deorbiting.
Obviously the solar and cooling for the above would both weigh and cost a ton but... it feels surprisingly close to being within an order of magnitude of current costs when you ballpark it?
Like I don't think it's actually viable, it's just a little shocking that the idea isn't as far out of line as I expected.
According to this other source https://www.satellitetoday.com/connectivity/2026/02/02/space...
the filing mentions this
> these satellites would operate between 500 km and 2,000 km altitude and 30 degrees and Sun-Synchronous Orbit inclinations (SSO)
Naysayers probably get fired fast.
Musk has a documented history of failing to deliver on promises, timescale or no. So it’s best to engage in some actual critical thinking about the claims he is making.
This is starting to get really serious.
You have missed three zeroes in this calculation ;)
$15 per kg for a 200-ton payload is about $3 million. That seems achievable, given that propellant costs are about $1-1.5 million.
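Same calculation, written out (assuming the hypothetical $15/kg and the 200-ton payload):

    cost_per_kg = 15                # USD/kg, hypothetical future price
    payload_kg = 200 * 1_000
    print(f"${cost_per_kg * payload_kg:,}")   # $3,000,000 per launch, not $3,000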
Company website:
https://rdw.com/wp-content/uploads/2023/06/redwire-roll-out-...
And their Opal configuration beats the metric: 5.3 kW for 42.7 kg.
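For comparison against the 100 kW/ton target, a quick check using the figures above (array mass only, not a whole satellite):

    opal_kw = 5.3
    opal_kg = 42.7
    print(round(opal_kw / (opal_kg / 1_000)))   # ~124 kW per ton of array mass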
The best case is you meet the unrealistic timeline, the average case outcome is you solve the problem but it is delayed several years. And the worst case is it fails and investors lose some money.
If you try to hire people but your message is: we want to reduce the cost of access to space by 20% in thirty years, you are going to get approximately zero competent engineers, and a whole lot of coasters.
And no investors, so you'll be dependent on the government anyway. Depending on the government is great until people you do not agree with, or who are generally anti-science, are in power. I assume this part should not need an example nowadays?
The panels suffer radiation damage they don't suffer on Earth. If this is e.g. the same altitude orbits as Starlink, then the satellites they're attached to burn up after around a tenth of their ground-rated lifetimes. If they're a little higher, then they're in the Van Allen belts and have a much higher radiation dose. If they're a lot higher, the energy cost to launch is way more.
If you could build any of this on the moon, that would be great; right now, I've heard of no detailed plans to do more with moon rock than use it as aggregate for something else, which means everyone is about as far from making either a PV or compute factory out of moon rock as the residents of North Sentinel Island are.
OK, perhaps that's a little unfair, we do actually know what the moon is made of and they don't, but it's a really big research project just to figure out how to make anything there right now, let alone making a factory that could make them cost-competitive with launching from Earth despite the huge cost of launching from Earth.
In this case, it's all about Starship ramping up to such a scale that the cost per pound to orbit drops sufficiently for everything else to make sense - from the people who think the numbers can work, that means somewhere between $20 and $80 per pound, currently at $1300-1400 per pound with Falcon 9. Starship at scale would have to enable at least 2 full orders of magnitude decrease in price to make space compute viable.
If Starship realistically gets into the $90/lb or lower range, space compute makes sense; things like shielding and the rest become pragmatic engineering problems that can be solved. If the cost goes above $100 or so, it doesn't matter how the rest of the considerations play out, you're launching at a loss. That still might warrant government, military, and research applications for space based datacenters, especially in developing the practical engineering, but Starship needs to work, and there needs to be a ton of them for the datacenter-in-space idea to work out.
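A rough feel for the size of that gap, using only the per-pound figures quoted above (not official pricing):

    current_per_lb = 1_350            # midpoint of the quoted Falcon 9 range
    target_low, target_high = 20, 80  # the range where the numbers are said to work
    print(round(current_per_lb / target_high), round(current_per_lb / target_low))
    # needs roughly a 17x to 68x drop in launch price per pound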
> ROSA is 20 percent lighter (with a mass of 325 kg (717 lb))[3] and one-fourth the volume of rigid panel arrays with the same performance.
And that’s not the current cutting edge in solar panels either. A company can take more risks with technology choices and iterate faster (get current state-of-the-art solar to be usable in space).
The bet they’re making is on their own engineering progress, like they did with rockets, not on sticking together pieces used on the ISS today.
Don't assume this. Why would you assume this?
1. China is very concerned about Starlink-like constellations. They want their own, but mostly they want to be able to destroy competitors. That is really hard.
2. Many countries have single-kill ASAT capabilities, where one projectile can hit one satellite. However, this is basically shooting a bullet with a bullet, on different trajectories.
3. > Sure, it'd take orbital launch capabilities to lift ... how many bags of metal scrap and explosives?
If I understand orbital mechanics... those clouds of chaff would need to oppose the same orbit, otherwise it is a gentle approach. In the non-aligned orbit, it's another bullet-hitting-a-bullet scenario as in 2, but with a birdshot shotgun.
My entire point is that constellations in LEO take hundreds of Falcon 9's worth of mass to orbit and delta-v to destroy them, as in-orbit grenades which approach gently. This IS REALLY HARD, as far as mass to orbit, all at once! If you blow up some group of Starlink, that chaff cloud will just keep in orbit on the same axis. It will not keep blowing up other Starlinks.
The gentle grenade approach was possibly tested by the CCP here:
Note that you would need 500+ square meters just for cooling 200 kW
And, mind you, it won’t be a simple copper radiator
https://www.nasa.gov/wp-content/uploads/2021/02/473486main_i...
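A Stefan-Boltzmann sketch of where a number like that comes from; the emissivity and radiator temperature here are my assumptions, and real designs also have to subtract absorbed solar and Earth IR, which makes the area larger:

    SIGMA = 5.670e-8      # W/(m^2 K^4)
    emissivity = 0.9      # assumed
    temp_k = 300          # assumed radiator temperature
    heat_w = 200_000
    flux = emissivity * SIGMA * temp_k**4   # ~413 W/m^2 per radiating side
    print(round(heat_w / flux))             # ~484 m^2 single-sided, in line with 500+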
It's also a good way to shred morale and investor confidence when you're a decade past your timelines or continue to fail on actually delivering on past promises.
Thanks for the clarification, I guess that explains this (from you):
> Think about how hard it would be to practically take out Starlink.
and this:
> My entire point is that constellations in GEO
which you've now corrected.
Moving on:
> My entire point is that constellations in LEO take hundreds of Falcon 9's worth of mass to orbit and delta-v to destroy them, as in-orbit grenades which approach gently. This IS REALLY HARD
So let's not do that .. how hard is it to render the entire LEO zone a shit show with contra wise clouds of frag that cause cascading failures?
Forget the geopolitics of China et al. .. LEO launch capabilities are spreading about the globe, it's not just major world powers that pose a threat here.
Just to get on the same page here. My argument is that prior to Elon Musk, the only human capable of launching >1M distributed solar powered inference nodes, if one accepts runaway AGI/ASI as a threat... prior to that we had a few hundred terrestrial AI inference mega-data centers. Most of them had easily disrupted power supplies by one dude with a Sawzall.
Now, we are moving to a paradigm where the power supply is the sun, the orbital plane gives the nodes power 24/7, and the dude with the Sawzall needs to buy >10,000x (not sure of the multiple here) the Sawzalls, and also give them escape velocity.
Can we not agree that this is a much more difficult problem to "just unplug it," than it was when the potentially troublesome inference was terrestrial?
It doesn't make sense (neither does Tesla's valuation, for example), but it is what it is.
Both SpaceX and xAI have investors lining up.
I meant it specifically for figuring out cooling computers in space.
I am pretty sure this is going to be a solvable problem if this is the bottleneck to achieve data centers in space, given that newer chips are much more tolerant to high temperatures.
https://www.marketplace.org/story/2026/01/07/new-ai-chips-wi...
Just because an idea has some factors in its favor (Space-based datacenter: 100% uptime solar, no permitting problems [2]) doesn't mean it isn't ridiculous on its face. We're in an AI bubble, with silly money flowing like crazy and looking for something, anything to invest it. That, and circular investments to keep the bubble going. Unfortunately this gives validation to stupid ideas, it's one of the hallmarks of bubbles. We've seen this before.
The only things that space-based anything have advantages on are long-distance communication and observation, neither of which datacenters benefit from.
The simple fact is that anything that can be done in a space-based datacenter can be done cheaper on Earth.
[1] https://en.wikipedia.org/wiki/A_Modest_Proposal for the obtuse
[2] until people start having qualms about the atmospheric impact of all those extra launches and orbital debris
https://www.nvidia.com/en-eu/data-center/dgx-h200/
Power draw is max 10.2 kW but average draw would be 60-70% of that. Let's call it 6 kW.
It is possible to obtain orbits that get 24/7 sunlight - but that is not simple. And my understanding is it's more expensive to maintain those orbits than it would be to have stored battery power for shadow periods.
Average blackout period is 30-45 minutes. So you'd need at least 6 kWh of storage to avoid draining the batteries to 0. But battery degradation is a thing. So 6 kWh is probably the absolute floor. That's in the range of 50-70 kg for off-the-shelf batteries.
You'd need solar panel capacity of at least double the average draw, because solar panels degrade over time and will need to charge the batteries in addition to powering the GPUs. 12 kW of solar panels would be the absolute floor. A panel system of that size is 600-800 kg.
These are conservative estimates I think. And I haven't factored in the weight of radiators, heat and radiation shielding, thermal loops, or anything else that a cluster in space might need. And the weight is already over 785 kg.
Using the $1,500 per kg, we're approaching $1.2 million.
Again, this is a conservative estimate and without accounting for most of the weight (radiators) because I'm too lazy to finish the napkin math.
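Putting the napkin math above in one place; all figures are the ones given in this comment (the battery and panel masses are its own estimates, not datasheet values):

    avg_draw_kw = 6
    eclipse_h = 0.75                       # ~45 min worst case per orbit
    storage_kwh = avg_draw_kw * eclipse_h  # 4.5 kWh minimum, ~6 kWh with margin
    battery_kg = 60                        # middle of the 50-70 kg range
    panel_kg = 725                         # middle of the 600-800 kg range for ~12 kW
    launch_usd_per_kg = 1_500
    mass_kg = battery_kg + panel_kg        # 785 kg before radiators/shielding
    print(storage_kwh, mass_kg, f"${mass_kg * launch_usd_per_kg:,}")  # ~$1.18M lift cost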
This is absolutely not true. I’ve worked on some of this stuff. Permitting costs months, which in dollar terms pays for launch costs ten-fold.
A lot. As someone that has been responsible for trainings with up to 10K GPUs, things fail all the time. By all the time I don't mean every few weeks, I mean daily. From disk failings, to GPU overheating, to infiniband optical connectors not being correctly fastened and disconnecting randomly, we have to send people to manually fix/debug things in the datacenter all the time.
If one GPU fails, you essentially lose the entire node (so 8 GPUs), so if your strategy is to just turn off whatever fails forever and not deal with it, it's gonna get very expensive very fast.
And that's in an environment where temperature is very well controlled and where you don't have to put your entire cluster through 4 Gs and insane vibrations during takeoff.
Honestly that story sounds right up Pete Hegseth's alley.
However, I think it did accomplish my goal. I bet that we could now have a beer/tea, and laugh together.
If you are ever near Wroclaw, Prague, Leipzig/Dresden, or Seattle, please email my username at the big G. I would happily meet you at the nearest lovely hotel bar. HN mini meetup. I can only imagine the stories that we might exchange.
Look, I'm Australian, I enjoy a bit of banter. I stripped the personal info from my comment above; I was happy to share with you, reluctant to leave it as was.
I was a frequent Toronto visitor, for the TSX, back when we ran a minerals intelligence service before passing that on to Standard & Poor's.
You're on the list, however my movements are constrained for now; my father's a ferociously active nonagenarian, which is keeping me with one foot nailed to the ground here for now.
The solar panels used in space are really lightweight, about 2 kg / m² [1], it's like ten times lighter weight than terrestrial panels. Still they need load-bearing scaffolding, and electrical conductors to actually collect the hundreds of kilowatts.
Water can't be made as lightweight though.
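To put a number on "hundreds of kilowatts" at that areal density (the cell efficiency is my assumption; ~30% is typical for modern triple-junction space cells):

    solar_constant = 1361   # W/m^2 in Earth orbit
    efficiency = 0.30       # assumed
    kg_per_m2 = 2           # from the comment above
    target_w = 200_000      # illustrative "hundreds of kW" target
    area = target_w / (solar_constant * efficiency)    # ~490 m^2
    print(round(area), round(area * kg_per_m2))         # ~980 kg of panels, before scaffolding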
I wonder how much faith Musk has that the US will never again have a president and/or Congress willing to torpedo such an incestuous deal.
Also, thank you for the reminder that I need to get my ass back to Seattle to be with remaining parent, while I still can. I have been a jackass about that.
What if you had a fleet of Optimus robots up there who would actually operate a TSMC in space and they would maintain the data centers in space?
Hold on let me enter a K hole…
What if we just did things?
I don't think this is true, Starlink satellites have an orbital lifetime of 5-7 years, and GPUs themselves are much more sensitive than solar panels for rad damage. I'd guess the limiting factor is GPU lifetime, so as long as your energy savings outpace the slightly faster gpu depreciation (maybe from 5 -> 3 years) plus cost of launch, it would be economical.
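The break-even described here, as a sketch. Every number below is a placeholder assumption purely to show the shape of the comparison, except the 5 -> 3 year lifetimes from the comment above:

    gpu_cost = 30_000               # USD per GPU, assumed
    life_ground, life_orbit = 5, 3  # years; 3 is the rad-shortened guess above
    launch_per_gpu = 5_000          # assumed per-GPU share of launch cost, USD
    power_ground = 6_000            # assumed USD/GPU-year for grid power + cooling
    power_orbit = 0                 # the optimistic "free solar" case
    ground = gpu_cost / life_ground + power_ground
    orbit = (gpu_cost + launch_per_gpu) / life_orbit + power_orbit
    print(ground, orbit)   # which side wins depends entirely on these assumptions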
I've said this elsewhere, but based on my envelope math, the cost of launch is the main bottleneck and I think considerably more difficult to solve than any of the other negatives. Even shielding from radiation is a weight issue. Unfortunately all the comments here on HN are focused on the wrong, irrelevant issues like talking about convection in space.
If, like, sea-water entered and corroded the system and it blew up and ate babies, and caused Godzilla, that would be a failure. It just being not quite interesting enough to go after seems... I mean I guess it is, but on a "meh" level.
Maybe with Starship the premium is less extreme? $10 million per 350 NVidia systems seems already within margins, and $1M would definitely put it in the range of being a rounding error.
But that's only the Elon style "first principles" calculation. When reality hits it's going to be an engineering nightmare on the scale of nuclear power plants. I wouldn't be surprised if they'd spend a billion just figuring out how to get a datacenter operational in space. And you can build a lot of datacenters on earth for a billion.
If you ask me, this is Elon scamming investors for his own personal goals, which is just the principle of having AI be in space. When AI is in space, there's a chance human derived intelligence will survive an extinction event on earth. That's one of the core motivations of Elon.
That's better than I thought, but still means their PV is only lasting order-of 20% of their ground lifespans, so the integrated lifetime energy output per unit mass of PV isn't meaningfully improved by locating them in space, even if they were launched by an efficient electromagnetic system rather than by a rocket.
There were five separate flights to service the Hubble telescope. It was designed from the beginning to be repaired and upgraded.
So let's say you expect them to do useful work for you for maybe 2 or 3 years? You have to amortize the launch cost and the build-it-for-space premium in a relatively short time frame. And then what? Reentry? With all the pollution that comes with it?
Also, what orbit do you use? Low-earth orbit is already getting pretty full, with starlink and similar constellations taking up quite some space and increasing collision risk. The higher you go, the more your launch costs go up, and the higher your latency. In higher orbits, atmospheric drag doesn't de-orbit failed satellites quickly, increasing risk of Kessler syndrome.
All in all, I don't buy it.
The average person who has an opinion on musk has roughly the same long term memory consolidation pattern as the average person in general.