1) New technology improves heat-radiation efficiency in vacuum
2) New technology reduces the waste heat generated by compute
All the takes I've seen focus on #1, but I'm starting to wonder about #2... specifically spintronics and photonic chips.
In every conversation I've seen play out on Hacker News about compute in space, the same objection comes up: "it's unviable because cooling is so inefficient."
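To make "inefficient" concrete: in vacuum the only way to shed heat is radiation, so radiator area scales directly with waste heat via the Stefan-Boltzmann law. A minimal sketch, where the emissivity and radiator temperature are assumed round numbers:

```python
# Back-of-envelope: radiator area needed to reject waste heat in vacuum.
# Radiation is the only heat path up there, so area follows Stefan-Boltzmann:
#   P = epsilon * sigma * A * T^4   (ignoring absorbed sunlight/Earthshine)
SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W/(m^2 K^4)
EMISSIVITY = 0.9     # assumed: a good radiator coating
T_RADIATOR = 300.0   # assumed: radiator temperature, kelvin (~27 C)

def radiator_area_m2(waste_heat_w: float) -> float:
    """One-sided radiator area needed to dump `waste_heat_w` watts to space."""
    return waste_heat_w / (EMISSIVITY * SIGMA * T_RADIATOR**4)

# A 100 MW data center vs. one producing 1% of the heat (the photonics/
# spintronics hope):
for heat_w in (100e6, 1e6):
    print(f"{heat_w / 1e6:>5.0f} MW waste heat -> "
          f"{radiator_area_m2(heat_w):,.0f} m^2 of radiator")
```

At 300 K that works out to roughly 2,400 m^2 of radiator per megawatt, which is why every thread lands on cooling; cut the waste heat 100x and the radiator shrinks 100x with it.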
Which got me thinking: what if cooling needs dropped by orders of magnitude? Then I learned about photonic chips and spintronics.
After that frankly society-destabilizing miracle of inventing competitive photonic processing, your goal of operating data centers in space becomes a tractable economic problem:
Pros:
- You get a continuous ~1.37 kW/m^2 of sunlight instead of an intermittent ~1.0 kW/m^2 peak (see the back-of-envelope after this list)
- Any reasonable spatial volume is essentially zero-cost
Cons:
- Small latency disadvantage
- You have to launch all of your hardware into a polar orbit (realistically dawn-dusk sun-synchronous, to keep the panels continuously lit)
- On-site servicing becomes another economic problem
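Rough daily-energy math behind that first pro; the orbit is assumed dawn-dusk sun-synchronous, and the ground capacity factor is an assumption for a good solar site:

```python
# Daily energy per square meter of collector: orbit vs. a good ground site.
SPACE_FLUX_KW = 1.37      # solar constant above the atmosphere, kW/m^2
GROUND_PEAK_KW = 1.0      # peak insolation at the surface, kW/m^2
GROUND_CAPACITY = 0.20    # assumed: night, weather, and sun angle combined

space_kwh_day = SPACE_FLUX_KW * 24
ground_kwh_day = GROUND_PEAK_KW * 24 * GROUND_CAPACITY
print(f"space : {space_kwh_day:.1f} kWh/m^2/day")
print(f"ground: {ground_kwh_day:.1f} kWh/m^2/day")
print(f"advantage: {space_kwh_day / ground_kwh_day:.1f}x")
```

Call it a ~7x energy advantage per square meter of collector, with no storage needed to ride out the night.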
So it's totally reasonable for the conversation to revolve around cooling. We know SpaceX can probably direct around $1T into converting methane into delta-V to make the economics work, but cooling is the difference between that kind of money buying maybe one DC or 100 DCs.
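To put the $1T in perspective, here's a sketch where every figure is an assumption (Starship cost-per-kg projections alone span an order of magnitude), with per-DC launch mass dominated by radiators in the hot-chip case:

```python
# What does ~$1T of launch budget buy? Every figure here is an assumption;
# Starship $/kg projections alone span roughly $100 to $1,000+.
BUDGET_USD = 1e12
COST_PER_KG = 500.0          # assumed $/kg to LEO

# Assumed per-DC launch mass; radiators dominate when the chips run hot.
DC_MASS_HOT_KG = 1e9         # conventional chips: a vast radiator farm
DC_MASS_COOL_KG = 20e6       # 100x less waste heat: radiators mostly gone

mass_kg = BUDGET_USD / COST_PER_KG
print(f"lift capacity: {mass_kg / 1e6:,.0f} kt")
print(f"hot-chip DCs : {mass_kg / DC_MASS_HOT_KG:,.0f}")   # ~2
print(f"low-heat DCs : {mass_kg / DC_MASS_COOL_KG:,.0f}")  # ~100
```

The absolute numbers are guesses; the point is that radiator mass is the knob that turns one DC into a hundred.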
If the heat produced by our chips suddenly drops by two orders of magnitude, we can fit two orders of magnitude more compute in the same volume. That is going to be destabilizing in some way, at the very least because you get today's compute in 1% of today's data center square footage; alternatively, you get 100x the compute in today's data center footprint. That's like going from dial-up to fiber.
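The footprint arithmetic, under the assumption that cooling capacity per rack is the binding constraint on density (the rack and chip numbers are illustrative):

```python
# Footprint arithmetic, assuming cooling capacity per rack is what caps
# compute density (illustrative numbers throughout).
RACK_COOLING_KW = 40.0     # assumed: heat a rack's cooling can reject
UNIT_HEAT_KW = 1.0         # assumed: ~1 kW of waste heat per accelerator
HEAT_REDUCTION = 100       # the hoped-for two orders of magnitude

units_today = RACK_COOLING_KW / UNIT_HEAT_KW
units_after = RACK_COOLING_KW / (UNIT_HEAT_KW / HEAT_REDUCTION)
print(f"accelerators per rack today: {units_today:,.0f}")   # 40
print(f"accelerators per rack after: {units_after:,.0f}")   # 4,000: 100x the
                                                            # compute, same floor
```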