zlacker

[parent] [thread] 48 comments
1. kevdev+(OP)[view] [source] 2025-11-29 15:00:24
As someone with a similar background to the writer of this post (I did avionics work for NASA before moving into more “traditional” software engineering), this post does a great job of summing up my thoughts on why space-based data centers won’t work. The SEU issues were my first thought, followed by the thermal concerns, and both are addressed fantastically here.

On the SEU issue I’ll add in that even in LEO you can still get SEUs - the ISS is in LEO and gets them on occasion. There’s also the South Atlantic Anomaly, where spacecraft in LEO see a higher rate of SEUs.

replies(4): >>foobar+yd >>RobotT+Ue1 >>hedora+Lp1 >>inejge+zO1
2. foobar+yd[view] [source] 2025-11-29 16:45:40
>>kevdev+(OP)
The only advantage I can come up with is the background temperature being much colder than Earth surface. If you ignored the capex cost to get this launched and running in orbit, could the cooling cost be smaller? Maybe that's the gimmick being used to sell the idea. "Yes it costs more upfront but then the 40% cooling bill goes away... breakeven in X years"
replies(7): >>dzhiur+tJ >>skywho+U41 >>cmptrn+e51 >>andrew+r91 >>nosela+ka1 >>jcranm+Ya1 >>wat100+Ed1
◧◩
3. dzhiur+tJ[view] [source] [discussion] 2025-11-29 21:16:47
>>foobar+yd
Breakeven in X years probably makes sense for storage (slow depreciation), not GPUs (depreciates in like 4 years)
replies(1): >>foobar+6n1
◧◩
4. skywho+U41[view] [source] [discussion] 2025-11-30 00:19:31
>>foobar+yd
But the cooling cost wouldn’t be smaller. There’s no good way to get rid of the waste heat in space. It’s actually far, far harder to radiate the waste heat away than it would be to get rid of it on Earth.
replies(2): >>buildb+R51 >>foobar+8p1
◧◩
5. cmptrn+e51[view] [source] [discussion] 2025-11-30 00:22:58
>>foobar+yd
Cooling is more difficult in space. Yes, it's colder, but transferring the heat away is much harder.
◧◩◪
6. buildb+R51[view] [source] [discussion] 2025-11-30 00:28:33
>>skywho+U41
Which is why vacuum flasks for hot/cold drinks are a thing/work. Empty space is a pretty good insulator, as it turns out.

It’s a little worrying so many don’t know that.

◧◩
7. andrew+r91[view] [source] [discussion] 2025-11-30 01:01:03
>>foobar+yd
This question is thoroughly covered in the linked article.
replies(1): >>foobar+Mq1
◧◩
8. nosela+ka1[view] [source] [discussion] 2025-11-30 01:11:23
>>foobar+yd
Is it an advantage though? One of the main objections in the article is exactly that it isn't.

There's no atmosphere to help with heat loss through convection, and there's nowhere to shed heat through conduction; all you have is radiation. It is a serious engineering challenge for spacecraft to get rid of even the little heat they generate and to avoid being overheated by the sun.

replies(1): >>foobar+Vn1
◧◩
9. jcranm+Ya1[view] [source] [discussion] 2025-11-30 01:16:27
>>foobar+yd
Strictly speaking, the thermosphere is actually much warmer than the atmosphere we experience--on the order of hundreds or even a thousand degrees Celsius, if you're measuring by temperature (the average kinetic energy of molecules). However, the particle density is so low that the total heat content of the thermosphere is tiny. And because the particle count is so low, conduction and convection are essentially nonexistent, which means cooling has to rely entirely on radiation, which is much less efficient at shedding heat than the other modes.

In other words, a) background temperature (to the extent it's even meaningful) is much warmer than Earth's surface and b) cooling is much, much more difficult than on Earth.

replies(1): >>Madnes+9n1
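
A quick back-of-the-envelope sketch of point (b), using the Stefan-Boltzmann law. The perfect-blackbody and zero-background assumptions here are simplifications for illustration, a best case rather than a real design:

```python
# Rough numbers for radiation-only cooling, per the comment above.
# Assumes an ideal blackbody radiating from one side to empty space
# (no sun, no Earth albedo) -- a best case, not a real design.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiative_flux(temp_c: float) -> float:
    """Ideal one-sided blackbody flux in W/m^2 at the given temperature."""
    t = temp_c + 273.15
    return SIGMA * t ** 4

for temp_c in (30, 60, 90):
    print(f"{temp_c:3d} C -> {radiative_flux(temp_c):5.0f} W/m^2")
# -> ~480, ~700, ~990 W/m^2: modest compared with what forced air or
#    water cooling can move from the same area on Earth.
```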
◧◩
10. wat100+Ed1[view] [source] [discussion] 2025-11-30 01:39:11
>>foobar+yd
Things on earth also have access to that coldness for about half of each day. How many data centers use radiative cooling into the night sky to supplement their regular cooling? The fact that the answer is “zero” should tell you all you need to know about how useful this is.
replies(2): >>foobar+Pm1 >>oceanp+Sq2
11. RobotT+Ue1[view] [source] 2025-11-30 01:52:59
>>kevdev+(OP)
As someone with only a basic knowledge of space technology, my first thought when I read the idea was "how the hell are they going to cool it".
◧◩◪
12. foobar+Pm1[view] [source] [discussion] 2025-11-30 03:17:12
>>wat100+Ed1
The atmosphere is in the way even at night and re-radiates the energy, so the effective background temperature is the temperature of the air, and it would only work at night anyway. I think you'd need something like 50-ish acres of radiators for a 50MW datacenter radiating between 60 and 30C. The area would be a lot smaller in space due to the bigger temperature delta. Either way the opex would be much, much less than an average Earth DC (PUE of almost 1 instead of the run-of-the-mill 1.5, or as low as 1.1 for hyperscalers). But yeah, the upfront cost would be immense.
replies(1): >>tstrim+ep1
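
A rough sanity check of that 50-ish-acres figure. The mean radiator temperature, the 10 C effective sky temperature, and emissivity ~1 are assumptions for illustration:

```python
# Sanity check of the "50-ish acres" estimate above. Assumptions:
# coolant runs the radiator from 60 C down to 30 C (mean ~45 C),
# effective night-sky temperature is 10 C, emissivity ~1, and we
# ignore any convective help from the air entirely.

SIGMA = 5.670e-8     # Stefan-Boltzmann constant, W/(m^2 K^4)
ACRE_M2 = 4046.86    # square meters per acre

heat_w = 50e6                # 50 MW datacenter
t_rad = 45 + 273.15          # mean radiator temperature, K
t_sky = 10 + 273.15          # assumed effective sky temperature, K

net_flux = SIGMA * (t_rad**4 - t_sky**4)   # W/m^2, net of sky re-radiation
area_m2 = heat_w / net_flux
print(f"net flux: {net_flux:.0f} W/m^2, area: {area_m2/ACRE_M2:.0f} acres")
# -> ~216 W/m^2 and ~57 acres: same order of magnitude as the guess.
```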
◧◩◪
13. foobar+6n1[view] [source] [discussion] 2025-11-30 03:20:10
>>dzhiur+tJ
I think by far the most mass in this kind of setup would go into the heat management, which could probably last a long time and could be amortized separately from the electronics.
replies(1): >>tsimio+pf2
◧◩◪
14. Madnes+9n1[view] [source] [discussion] 2025-11-30 03:20:30
>>jcranm+Ya1
Technically, radiative cooling is 100% efficient. And remarkably effective: you can cool an inert object to the temperature of the CMBR (4K) without doing anything at all. However, it is rather slow, and works best if there are no nearby planets or stars.

Fun fact though: make your radiator hotter and you can dump just as much if not more energy than you would typically via convective cooling. At 1400C (just below the melting point of steel) you can shed 450kW of heat per square meter; all you need is a really fancy heat pump!

replies(2): >>wat100+4v1 >>fsh+DG1
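
That 450kW/m^2 number checks out under the blackbody assumption:

```python
# Quick check of the 450 kW/m^2 claim: Stefan-Boltzmann at 1400 C,
# assuming a perfect blackbody radiating from one side to deep space.

SIGMA = 5.670e-8                 # W/(m^2 K^4)
t = 1400 + 273.15                # 1400 C in kelvin
flux = SIGMA * t**4
print(f"{flux/1e3:.0f} kW/m^2")  # -> 444 kW/m^2, matching the ~450 figure
```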
◧◩◪
15. foobar+Vn1[view] [source] [discussion] 2025-11-30 03:28:30
>>nosela+ka1
I think it is an advantage; the question is just how big, assuming we look only at the ongoing operating cost.

- Earth temperatures are variable, and radiation only works at night

- The required radiator area is much smaller for the space installation

- The engineering is simple: CPU -> cooler -> liquid -> pipe -> radiator. We're assuming no constraint on capex so we can omit heat pumps

replies(2): >>Hikiko+ru1 >>nosela+592
◧◩◪
16. foobar+8p1[view] [source] [discussion] 2025-11-30 03:41:24
>>skywho+U41
I don't know about that. Look at where the power goes in a typical data center: for a 10MW DC you might spend 2MW just to blow air around. A radiating cooler in space would almost eliminate that. The problem is that the initial investment is probably impractical.
replies(3): >>Hikiko+Iu1 >>nick23+ez1 >>wat100+dd2
◧◩◪◨
17. tstrim+ep1[view] [source] [discussion] 2025-11-30 03:42:50
>>foobar+Pm1
I think you’re ignoring a huge factor in how radiative cooling actually works. I thought the initial question was fine if you hadn’t read the article, but I understand the downvotes due to the doubling down. Think of it this way: why do thermoses have a vacuum-sealed chamber between two walls to insulate the contents of the bottle? Because a vacuum is a fucking terrible conductor of heat. Putting your data center into space in order to cool it is like putting a computer inside a thermos to cool it. It makes zero fucking sense. There is nowhere for the heat to actually radiate to, so it stays inside.
replies(1): >>foobar+4q1
18. hedora+Lp1[view] [source] 2025-11-30 03:47:11
>>kevdev+(OP)
Single event upsets are already commonplace at sea level well below data center scale.

The section of the article that talks about them isn’t great. At least for FPGAs, the state of the art is to run 2-3 copies of the logic, and detect output discrepancies before they can create side effects.

I guess you could build a GPU that way, but it’d have 1/3 the parallelism of a normal one for the same die size and power budget. The article says it’d be a 2-3 order-of-magnitude loss.

It’s still a terrible idea, of course.

replies(3): >>ACCoun+7O1 >>sdento+3P1 >>jeltz+m92
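
For illustration, a minimal software sketch of the triple-modular-redundancy idea described above. On an FPGA this is replicated logic feeding a hardware voter, not Python, and running the copies sequentially on one CPU obviously doesn't reproduce the hardware fault model; this just shows the majority-vote logic:

```python
# Minimal majority-voting sketch of triple modular redundancy (TMR).
from collections import Counter

def tmr_vote(fn, *args):
    """Run three copies of a computation and majority-vote the result.

    If all three replicas disagree there is no majority: flag it, as a
    real voter would, rather than silently picking one output.
    """
    results = [fn(*args) for _ in range(3)]
    value, count = Counter(results).most_common(1)[0]
    if count < 2:
        raise RuntimeError(f"no majority among replicas: {results}")
    return value

print(tmr_vote(lambda a, b: a + b, 2, 3))  # -> 5
```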
◧◩◪◨⬒
19. foobar+4q1[view] [source] [discussion] 2025-11-30 03:49:54
>>tstrim+ep1
Pardon, but this doesn't make sense to me. A 1 m^2 radiator in space can shed almost a kilowatt of heat.

>vacuum is a fucking terrible heat convector

Yes, we're talking about radiation, not convection.

replies(2): >>wat100+iu1 >>kergon+wC2
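
For reference, inverting the Stefan-Boltzmann law shows what temperature a 1 kW/m^2 radiator implies, assuming a perfect blackbody and ignoring sunlight and any background:

```python
# The sibling asks "at what temperature?" -- invert Stefan-Boltzmann
# to find the blackbody temperature that radiates 1 kW per m^2.

SIGMA = 5.670e-8                           # W/(m^2 K^4)
flux = 1000.0                              # target: 1 kW per square meter
t = (flux / SIGMA) ** 0.25
print(f"{t:.0f} K ({t - 273.15:.0f} C)")   # -> ~364 K, about 91 C
```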
◧◩◪
20. foobar+Mq1[view] [source] [discussion] 2025-11-30 03:59:46
>>andrew+r91
Pardon, but the question of "could the operational cost be smaller in space" is barely touched on in the article. The article mostly argues that designing thermal management systems for space applications is hard, and that the required radiators would be big, which speaks to the upfront investment cost, not the ongoing opex.
replies(1): >>andrew+Yt1
◧◩◪◨
21. andrew+Yt1[view] [source] [discussion] 2025-11-30 04:29:55
>>foobar+Mq1
Ok, sure, technically. To be fair you can't really assess the opex of technology that doesn't exist yet, but I find it hard to believe that operating brand new, huge machines that have to move fluid around (and not nice fluids either) will ever be less than it is on the surface. Better hope you never get a coolant leak. Heck, it might even be that opex=0 still isn't enough to offset the "capex". Space is already hard when you're not trying to launch record-breaking structures.

Even optimistically, capex goes up by a lot to reduce opex, which means you need a really really long breakeven time, which means a long time where nothing breaks. How many months of reduced electricity costs is wiped out if you have to send a tech to orbit?

Oh, and don't forget the radiation slowly destroying all your transistors. Does that count as opex? Can you break even before your customers start complaining about corruption?

replies(1): >>wat100+pv1
◧◩◪◨⬒⬓
22. wat100+iu1[view] [source] [discussion] 2025-11-30 04:34:11
>>foobar+4q1
At what temperature?

And a kilowatt from one square meter is awful. You can do far more than that with access to an atmosphere, never mind water.

◧◩◪◨
23. Hikiko+ru1[view] [source] [discussion] 2025-11-30 04:36:07
>>foobar+Vn1
Radiators on Earth mainly dump heat into the air; there's no air in space.
◧◩◪◨
24. Hikiko+Iu1[view] [source] [discussion] 2025-11-30 04:38:59
>>foobar+8p1
Now scale the radiator size for your 8MW datacenter.
◧◩◪◨
25. wat100+4v1[view] [source] [discussion] 2025-11-30 04:41:59
>>Madnes+9n1
How much power would a square meter at 1400C shed from convection?
replies(3): >>baobri+Nz1 >>fsh+lG1 >>Madnes+pJ1
◧◩◪◨⬒
26. wat100+pv1[view] [source] [discussion] 2025-11-30 04:44:29
>>andrew+Yt1
Maintenance will be impossible or at least prohibitively expensive. Which means your only opex is ground support. But it also means your capex depreciates over whatever lifetime these things will have with zero repairs or preventive maintenance.
replies(1): >>verzal+0U1
◧◩◪◨
27. nick23+ez1[view] [source] [discussion] 2025-11-30 05:25:43
>>foobar+8p1
>99.999% of the power put into compute turns into heat, so you're going to need to reject 8 MW of power into space with pure radiation. The ISS EATCS radiators reject 0.07 MW of power in 85 sq. m, so you're talking about 9700 sq. m of radiators, or bigger than a football field/pitch.
replies(1): >>mercut+A03
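
The scaling arithmetic, reproduced from the quoted ISS EATCS numbers (the 0.07 MW over 85 sq. m figures are taken from the comment; the rest is simple proportion):

```python
# Scale the quoted ISS EATCS radiator performance up to an 8 MW load.
iss_heat_mw, iss_area_m2 = 0.07, 85.0
dc_heat_mw = 8.0

area = dc_heat_mw / iss_heat_mw * iss_area_m2
print(f"{area:,.0f} m^2")   # -> ~9,714 m^2, bigger than a football pitch
```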
◧◩◪◨⬒
28. baobri+Nz1[view] [source] [discussion] 2025-11-30 05:31:36
>>wat100+4v1
Not much in space; There's almost no matter to convect!
◧◩◪◨⬒
29. fsh+lG1[view] [source] [discussion] 2025-11-30 07:06:25
>>wat100+4v1
A sports car radiator is about that size and dumps 1 MW without boiling the coolant.
replies(1): >>alexti+UJ1
◧◩◪◨
30. fsh+DG1[view] [source] [discussion] 2025-11-30 07:12:07
>>Madnes+9n1
Your hypothetical liquid metal heat pump would have a Carnot efficiency of only 25%.
◧◩◪◨⬒
31. Madnes+pJ1[view] [source] [discussion] 2025-11-30 07:48:32
>>wat100+4v1
I don't have firm numbers for you, since it would depend on environmental conditions. As an educated guess though, I would say a fucking shit ton. You wouldn't want to be anywhere near the damn thing.
◧◩◪◨⬒⬓
32. alexti+UJ1[view] [source] [discussion] 2025-11-30 07:54:06
>>fsh+lG1
A car's "radiator" doesn't actually lose heat by radiation though. It conducts heat to the air rushing through it. That's absolutely nothing like a radiator in a vacuum.
replies(2): >>fsh+rO1 >>tsimio+Ae2
◧◩
33. ACCoun+7O1[view] [source] [discussion] 2025-11-30 08:49:54
>>hedora+Lp1
If you're using GPUs, you're running AI workloads. In which case: do you care?

One of the funniest things about modern AI systems is just how many random bitflips they can tank before their performance begins to really suffer.

◧◩◪◨⬒⬓⬔
34. fsh+rO1[view] [source] [discussion] 2025-11-30 08:53:59
>>alexti+UJ1
That's the point. Forced air cooling is way more efficient than radiative cooling.
35. inejge+zO1[view] [source] 2025-11-30 08:55:57
>>kevdev+(OP)
> On the SEU issue I’ll add in that even in LEO you can still get SEUs

As a sibling post noted, SEUs are possible all the way down to sea level. The recent Airbus mass intervention was essentially a fix for a badly handled SEU in a corner case.

◧◩
36. sdento+3P1[view] [source] [discussion] 2025-11-30 09:03:30
>>hedora+Lp1
It strikes me that neural network inference loads are probably pretty resilient to these kinds of problems (as we see the bits per activation steadily decreasing), and where they aren't, you can add such bitflips as augmentations at training time, where they will essentially act as regularization.
◧◩◪◨⬒⬓
37. verzal+0U1[view] [source] [discussion] 2025-11-30 10:01:55
>>wat100+pv1
But ground support will not be cheap. You need to transfer a huge amount of data, which means you need to run and maintain a network of ground stations. And satellite operations are not as cheap as people like to think either.
◧◩◪◨
38. nosela+592[view] [source] [discussion] 2025-11-30 12:59:13
>>foobar+Vn1
A typical CPU heatsink dissipates 10-30% of its heat through radiation and the rest through convection. In space you're in a vacuum, so you can't dissipate heat through convection.

You need to rework your physical equipment quite substantially to make up for the fact that you can't shed 70-90% of the heat the same way you can down here on Earth.
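
A rough check of that 10-30% figure. The 60 C heatsink, 25 C ambient, emissivity, and convection coefficients are all assumed values for illustration:

```python
# Compare radiative flux against typical air convection for a heatsink
# on Earth: 60 C sink, 25 C ambient, emissivity ~0.9 (assumptions).

SIGMA = 5.670e-8                    # W/(m^2 K^4)
t_sink, t_amb = 60 + 273.15, 25 + 273.15

radiative = 0.9 * SIGMA * (t_sink**4 - t_amb**4)   # W/m^2
for h in (10, 50):                  # W/(m^2 K): free air vs forced air
    convective = h * (t_sink - t_amb)
    frac = radiative / (radiative + convective)
    print(f"h={h:2d}: radiation is {frac:.0%} of total")
# -> roughly 10-40% depending on airflow, in line with the comment.
```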

◧◩
39. jeltz+m92[view] [source] [discussion] 2025-11-30 13:02:18
>>hedora+Lp1
Sounds like it would remove a lot of the benefit gained from having more solar power.
◧◩◪◨
40. wat100+dd2[view] [source] [discussion] 2025-11-30 13:39:06
>>foobar+8p1
How do you propose to get 10MW of heat from the computers out to the radiators?
replies(1): >>mr_toa+hf2
◧◩◪◨⬒⬓⬔
41. tsimio+Ae2[view] [source] [discussion] 2025-11-30 13:50:55
>>alexti+UJ1
The question was about comparing the 450kW/m^2 of radiative cooling at 1400C to how much convective cooling you could get from the same radiator on Earth.
◧◩◪◨⬒
42. mr_toa+hf2[view] [source] [discussion] 2025-11-30 13:56:05
>>wat100+dd2
Same way we’ve always done it.

https://en.wikipedia.org/wiki/External_Active_Thermal_Contro...

replies(1): >>wat100+vj2
◧◩◪◨
43. tsimio+pf2[view] [source] [discussion] 2025-11-30 13:57:19
>>foobar+6n1
How would the radiators be useful if the electronics no longer are? Unless you can repurpose the radiators once the electronics are useless (which you can't in space), the radiators' useful lifetime is hard-limited by the electronics' lifetime.
◧◩◪◨⬒⬓
44. wat100+vj2[view] [source] [discussion] 2025-11-30 14:31:24
>>mr_toa+hf2
I.e. pumps, just like on the ground.
◧◩◪
45. oceanp+Sq2[view] [source] [discussion] 2025-11-30 15:27:35
>>wat100+Ed1
Look up the Tech Ingredients episode on radiative paint.

The fact that people aren’t using something isn’t evidence that it’s not possible, or even that it’s not a great idea; it could be that a practical application didn’t exist before, or that someone enterprising enough hasn’t come along yet.

replies(1): >>wat100+SC2
◧◩◪◨⬒⬓
46. kergon+wC2[view] [source] [discussion] 2025-11-30 16:39:59
>>foobar+4q1
> A 1 m^2 radiator in space can eliminate almost a kilowatt of heat.

Assuming that this is the right order of magnitude, an 8MW datacenter as discussed upthread would require ~8000 m^2 of radiators, plus a fancy way of getting the heat there.

A kilowatt is nothing. The workstation on my desk can sustain 1 kW.

replies(1): >>mercut+113
◧◩◪◨
47. wat100+SC2[view] [source] [discussion] 2025-11-30 16:42:49
>>oceanp+Sq2
When something has been known for millennia and hasn’t been put to a particular use even after decades where it could have been used, that is pretty good evidence that this use isn’t a good idea. Especially when it’s something really simple.

Radiative cooling is great for achieving temperatures a bit below ambient at night when you don’t have any modern refrigeration equipment. That’s about all. It’s used in space applications because it’s literally the only option.

◧◩◪◨⬒
48. mercut+A03[view] [source] [discussion] 2025-11-30 19:29:17
>>nick23+ez1
Yes, so?

Everyone keeps talking past each other on this, it seems.

“Generating power in space is easy, but ejecting heat is hard!”

Yes.

“That means you’d need huge radiators!”

Yes.

OK, we’re back to “how expensive/reliable is your giant radiator with a data center attached?”

We don’t know yet, but with low launch costs, it isn’t obviously crazy.

◧◩◪◨⬒⬓⬔
49. mercut+113[view] [source] [discussion] 2025-11-30 19:32:42
>>kergon+wC2
Why are you assuming active heat transfer? Passive is the way to go.