It’s a little worrying that so many people don’t know that.
There's no atmosphere to carry heat away through convection, and there's nothing to shed heat into through conduction; all you have is radiation. It's a serious engineering challenge for spacecraft to get rid of even the little heat they generate while avoiding being overheated by the sun.
In other words, a) background temperature (to the extent it's even meaningful) is much warmer than Earth's surface and b) cooling is much, much more difficult than on Earth.
Fun fact, though: make your radiator hotter and you can dump just as much energy as, if not more than, you typically would via convective cooling. At 1400 °C (just below the melting point of steel) you can shed about 450 kW of heat per square meter; all you need is a really fancy heat pump!
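For reference, that figure falls straight out of the Stefan–Boltzmann law. A minimal sanity check in Python, assuming a near-ideal emitter (emissivity ≈ 1) radiating to a ~0 K background:

```python
# Stefan-Boltzmann: radiated flux P/A = eps * sigma * T^4.
# Assumes an ideal black body (eps = 1) radiating to a ~0 K background.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

def radiated_flux(temp_c: float, emissivity: float = 1.0) -> float:
    """Radiated power per square meter for a radiator at temp_c (Celsius)."""
    t_k = temp_c + 273.15  # convert to kelvin
    return emissivity * SIGMA * t_k ** 4

print(f"{radiated_flux(1400) / 1000:.0f} kW/m^2")  # ~444 kW/m^2 at 1400 C
```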
- Earth temperatures are variable, and radiative cooling to the sky only works at night
- The required radiator area is much smaller for the space installation
- The engineering is simple: CPU -> cooler -> liquid -> pipe -> radiator. We're assuming no constraint on capex, so we can omit heat pumps (rough numbers in the sketch below this list)
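To put rough numbers on that chain: a minimal steady-state sketch of the pumped-liquid loop, assuming a water-like coolant and a 20 K temperature rise across the loop (both values are illustrative assumptions, not from the thread):

```python
# In steady state the coolant loop must carry away exactly what the chips dissipate.
# Required mass flow: m_dot = P / (c_p * dT). Parameters below are illustrative guesses.
P = 8e6      # heat load, W (the 8 MW datacenter discussed elsewhere in the thread)
CP = 4186.0  # specific heat of a water-like coolant, J / (kg * K)
DT = 20.0    # assumed coolant temperature rise from CPU to radiator, K

m_dot = P / (CP * DT)
print(f"{m_dot:.0f} kg/s of coolant")  # ~96 kg/s -- a serious pumping job
```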
>vacuum is a fucking terrible heat convector
Yes, we're talking about radiation, not convection.
Even optimistically, capex goes up by a lot to reduce opex, which means you need a really, really long breakeven time, which means a long stretch where nothing breaks. How many months of reduced electricity costs are wiped out if you have to send a tech to orbit?
Oh, and don't forget the radiation slowly destroying all your transistors. Does that count as opex? Can you break even before your customers start complaining about data corruption?
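To make the breakeven argument concrete, a toy calculation with entirely made-up numbers (none of these figures are from the thread; they only show the shape of the problem):

```python
# Toy breakeven model. Every number here is a hypothetical placeholder.
extra_capex = 50e6            # extra cost to build + launch vs. ground, $ (made up)
monthly_opex_savings = 0.5e6  # power/cooling savings per month, $ (made up)
service_mission = 10e6        # one on-orbit repair visit, $ (made up)

print(extra_capex / monthly_opex_savings)      # 100 months just to break even
print(service_mission / monthly_opex_savings)  # one repair erases 20 months of savings
```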
And a kilowatt from one square meter is awful. You can do far more than that with access to an atmosphere, never mind water.
You need to rework your physical equipment quite substantially to make up for the fact that you can't shed 70-90% of the heat the same way you can down here on Earth.
https://en.wikipedia.org/wiki/External_Active_Thermal_Contro...
The fact that people aren’t using something isn’t evidence that it’s not possible, or even that it’s not a great idea; it could be that a practical application didn’t exist before, or that someone enterprising enough hasn’t come along yet.
Assuming that this is the right order of magnitude, the 8 MW datacenter discussed upthread would require ~8000 m^2 of radiator, plus a fancy way of getting the heat there.
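A quick check of that area figure, assuming the ~1 kW/m^2 flux mentioned above, which is roughly what a near-ideal radiator at ~90 °C emits (and ignoring solar and Earth-shine heating, which only make things worse):

```python
# Radiator area A = P / flux. A near-ideal black body at ~90 C emits ~1 kW/m^2,
# so an 8 MW load needs on the order of 8000 m^2 of radiator.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

P = 8e6                       # heat load, W
t_rad = 90 + 273.15           # assumed radiator temperature, K
flux = SIGMA * t_rad ** 4     # ~986 W/m^2, ignoring absorbed sunlight
print(f"{P / flux:.0f} m^2")  # ~8100 m^2
```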
A kilowatt is nothing. The workstation on my desk can sustain 1 kW.
Radiative cooling is great for achieving temperatures a bit below ambient at night when you don’t have any modern refrigeration equipment. That’s about all. It’s used in space applications because it’s literally the only option.
Everyone keeps talking past each other on this, it seems.
“Generating power in space is easy, but rejecting heat is hard!”
Yes.
“That means you’d need huge radiators!”
Yes.
OK, we’re back to “how expensive/reliable is your giant radiator with a data center attached?”
We don’t know yet, but with low launch costs, it isn’t obviously crazy.