zlacker

[return to "Data centers in space makes no sense"]
1. beloch+kK 2026-02-03 23:33:46
>>ajyoon+(OP)
I would not assume cooling has been worked out.

Space is a vacuum, i.e. the lack-of-a-thing that makes a thermos great at keeping your drink hot. A satellite is, if nothing else, a fantastic thermos. A data center in space would necessarily rely entirely on cooling by radiation, unlike a terrestrial data center, which can make use of convection and conduction. You can't just pipe heat out into the atmosphere or build a heat exchanger. You can't exchange heat with vacuum. You can only radiate heat into it.

Heat is going to limit the compute that can be done in a satellite data center, and radiative cooling solutions are going to massively increase weight. It makes far more sense to build data centers in the Arctic.
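
A rough sketch with the Stefan-Boltzmann law shows the scale of the problem (my assumptions, not established figures: a two-sided flat panel radiating at 300 K with emissivity 0.9, absorbed sunlight ignored, which only flatters the result):

    # Back-of-the-envelope radiator sizing via the Stefan-Boltzmann law.
    # Assumptions (mine, not measured figures): two-sided flat panel,
    # radiating at 300 K, emissivity 0.9, absorbed sunlight ignored.
    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

    def radiator_area_m2(power_w, temp_k=300.0, emissivity=0.9, sides=2):
        """Panel area needed to radiate power_w watts to deep space."""
        flux_per_side = emissivity * SIGMA * temp_k**4  # W/m^2
        return power_w / (flux_per_side * sides)

    for mw in (1, 50, 650):
        print(f"{mw:>4} MW -> {radiator_area_m2(mw * 1e6):,.0f} m^2 of panel")

That's roughly 1,200 m^2 per megawatt: dissipating 50 MW takes on the order of six hectares of radiator, before you count the mass of the panels, the plumbing to move heat into them, or the launches to get it all up there.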

Musk is up to something here. This could be another hyperloop (i.e. A distracting promise meant to sabotage competition). It could be a legal dodge. It could be a power grab. What it will not be is a useful source of computing power. Anyone who takes this venture seriously is probably going to be burned.

2. lancew+SS 2026-02-04 00:21:10
>>beloch+kK
It's an exit for the 5th-best social network and the 10th-best (or worse) AI company, selling them off to a decent company.

It probably increases Elon's share of the combined entity.

It delivers on a promise to investors that he will make money for them, even as the underlying businesses are lousy.

3. gpt5+AZ 2026-02-04 01:01:18
>>lancew+SS
I'm confused about the level of conversation here. Can we actually run the math on heat dissipation and feasibility?

A Starlink satellite uses about 5 kW of solar power, and it needs to dissipate around that amount (plus the sunlight hitting it) just to operate. There are around 10,000 Starlink satellites already in orbit, which means the constellation is already dissipating something like 50 MW (in a rough, back-of-the-envelope feasibility sense).
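
In code, that estimate (the per-satellite figure and the count are the rough numbers above, not official specs):

    # Rough constellation-wide dissipation: ~5 kW per satellite,
    # ~10,000 satellites in orbit (both are back-of-the-envelope
    # numbers from this comment, not official figures).
    power_per_sat_w = 5_000
    sat_count = 10_000
    print(f"Total: {power_per_sat_w * sat_count / 1e6:.0f} MW")  # -> 50 MW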

Isn't 50 MW already, by itself, equivalent to the energy consumption of a typical hyperscaler cloud?

Why is Starlink feasible while other computation is not? Starlink is also already financially viable. Wouldn't this all become significantly cheaper as orbital launch vehicles improve?

4. hirsin+B51 2026-02-04 01:43:28
>>gpt5+AZ
Simply put, no: 50 MW is not the typical hyperscaler cloud size. It's not even the size of a typical single datacenter.

A single AI rack consumes 60 kW, and there is apparently a single DC that alone consumes 650 MW.

When Microsoft puts in a DC, the machines come in units of a "stamp", i.e. a couple of racks together. These aren't scaled by dollars or square feet, but by the MW.

And on top of that... that's a bunch of satellites not even trying to crunch data at top speed. Nowhere near the right order of magnitude.
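
To make the gap concrete (same rough numbers: 5 kW per satellite, 60 kW per rack, the one reported 650 MW site):

    # Order-of-magnitude comparison using the numbers in this thread:
    # ~5 kW per Starlink satellite, ~60 kW per AI rack, ~650 MW for
    # the single large DC mentioned above.
    sat_w, rack_w, big_dc_w = 5e3, 60e3, 650e6
    print(f"Satellites per AI rack:     {rack_w / sat_w:.0f}")     # 12
    print(f"AI racks in a 50 MW budget: {50e6 / rack_w:.0f}")      # 833
    print(f"Satellites to match 650 MW: {big_dc_w / sat_w:,.0f}")  # 130,000

Matching that one terrestrial site would take an entire Starlink-scale constellation dedicated to nothing but compute.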

5. tensor+Wn1 2026-02-04 04:22:33
>>hirsin+B51
How much of that power is radiated as the radio waves it sends?
6. hirsin+4y1 2026-02-04 06:07:48
>>tensor+Wn1
Good point - the comms satellites aren't even "keeping" some of that energy, while a DC would. I _am_ now curious about the relationship between bandwidth and wattage, but I'm willing to bet that less than 1% of the total energy dissipation on one of these DC satellites would be in the form of satellite-to-earth broadcast (keeping in mind that sat-to-sat broadcast would presumably be something of a wash).
7. adrian+aD2 2026-02-04 14:29:55
>>hirsin+4y1
I am willing to bet that more than 10% of the electrical energy consumed by the satellite is converted into transmitted microwaves.

There are many power consumers in a satellite, e.g. radio receivers, lasers, computers and motors, whose consumed energy is eventually converted into heat, but the radio transmitter of a communications satellite must take a big fraction of the average consumed power.

The radio transmitter itself has high efficiency, much greater than 50% and possibly greater than 90%, so only a small fraction of the electrical power consumed by the transmitter is converted into heat; most of it is radiated in the microwave signal that goes to Earth's surface.

8. tullia+fK2 2026-02-04 15:03:09
>>adrian+aD2
Unfortunately this is not the case. The amplifiers on the transmit-side phased arrays are about 10% efficient (perhaps 12% on a good day), but the amps represent only ~half the power consumption of the transmit phased arrays. The beamformers and processors are 0% efficient (all of their power ends up as heat), and the receive-side phased arrays are of course 0% efficient as well.
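
Putting those figures together (the transmit array's share of total satellite power is my placeholder assumption, not a figure from this thread):

    # Fraction of satellite power that actually leaves as RF, using
    # the figures above: PAs ~10% efficient, PAs ~half the transmit
    # array's draw. The transmit array's share of total bus power is
    # an ASSUMPTION for illustration.
    pa_efficiency = 0.10   # RF out / DC in for the power amps
    pa_share_of_tx = 0.5   # amps are ~half of the TX array's power
    tx_share_of_sat = 0.5  # assumed: TX arrays take half the bus power
    rf_fraction = pa_efficiency * pa_share_of_tx * tx_share_of_sat
    print(f"RF leaving the satellite: {rf_fraction:.1%}")  # -> 2.5%

Even with a generous transmit share, only a few percent of the power escapes as signal; essentially everything else still has to be radiated as heat.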
9. klaff+7y3 2026-02-04 18:37:19
>>tullia+fK2
I'm curious. I think the whole thing (space-based compute) is infeasible and stupid for a bunch of reasons, but even a class-A amplifier has a theoretical efficiency limit of 50%, and I thought we used class-C amplifiers (with practical efficiencies above 50%) in FM/FSK/etc. applications, in which amplitude distortion can be filtered away. What puts these systems down at 10%?