Radiators can be made as large as desired within the shade of the solar panels, so the designer can practically set arbitrarily low temperatures above the background temperature of the universe.
Yes, you can overcome this with enough radiator area. Which costs money, adds weight, and takes up space, which costs more money.
Nobody is saying the idea of data centers in space is impossible. It's obviously very possible. But it doesn't make even the slightest bit of economic sense. Everything gets way, way harder and there's no upside.
I don't think dissipating heat would be an issue at all. I think the cost of launch is the main bottleneck; cooling would just be a small overhead on the cost of energy. Not a fundamental problem.
Either that or you're talking out of your ass.
FYI, a single modern rack draws roughly twice the power of the entire ISS, in a much, much smaller package, and you'll need thousands of them. You'd need 500-1000 m² of radiator per rack, and that alone would weigh several tonnes...
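If you want to sanity-check that figure, here's a quick Stefan-Boltzmann back-of-envelope in Python (the rack power, emissivity, and temperature are assumed round numbers for illustration, not specs):

    # Ideal flat-plate radiator sizing via Stefan-Boltzmann.
    # All inputs are illustrative assumptions.
    SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
    P_rack = 150e3    # assumed rack power, W (order of a modern AI rack)
    T_rad = 300.0     # radiator temperature, K (26 C)
    eps = 0.9         # assumed emissivity of the radiator coating

    # Double-sided flat plate radiating to deep space, no incoming flux:
    area = P_rack / (2 * eps * SIGMA * T_rad**4)
    print(f"{area:.0f} m^2 per rack")  # ~181 m^2, an ideal lower bound

Real designs lose a factor of a few to view factors, fin efficiency, absorbed Earth IR/albedo, and the delta-T needed to get heat from the chips to the panel, which is how you land in the 500-1000 m² range.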
You'll also have to actively cool your gigantic solar panel array.
No need to apply at NASA; on the contrary, if you don't believe in the Stefan-Boltzmann law, feel free to apply for a Nobel Prize with your favorite crank theory of physics.
Also, this assumes a flat surface radiating on both sides. Another commenter in this thread brought up a pyramid shape, which could work.
Finally, these GPUs are designed for terrestrial data centers, where power is limited and heat sinking is abundant. For space data centers you can imagine better radiators or silicon that runs hotter. Crypto miners often run ASICs very hot.
I just don't understand why, every time this topic is brought up, everyone on HN wants to die on the hill that cooling is not possible. It is! If you do the math, the primary issue is clearly the cost of launch.
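A heavily hedged toy version of that math (every number below is an assumed order of magnitude, not a quote):

    # Launch cost per rack: illustrative assumptions only.
    launch_cost_per_kg = 3000.0  # USD/kg to LEO, rough current order of magnitude
    mass_per_rack = 6000.0       # kg: rack + several tonnes of radiator + structure
    print(f"~${launch_cost_per_kg * mass_per_rack / 1e6:.0f}M launch cost per rack")
    # ~$18M per rack just to get it up there, before you buy any hardware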
My example is optimized not for minimal radiator surface area, but for minimal mathematical and physical knowledge required to understand feasibility.
Your numbers are different because you chose 82 C (355 K) instead of my 26 C (300 K).
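The gap between the two estimates is just the T^4 in Stefan-Boltzmann; a two-line check:

    # Radiated flux scales as T^4, so a hotter radiator needs less area.
    T_hot, T_cold = 355.0, 300.0  # 82 C vs 26 C
    print((T_hot / T_cold) ** 4)  # ~1.96: about half the area at 82 C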
Near normal operating temperatures, hardware lifetime roughly doubles for every 10 C (10 K) decrease in temperature (this does not hold indefinitely, of course).
You still need to move the heat from the GPU to the radiator, so my example of 26 C at the radiator just leaves a lot of room against criticism ;)
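For anyone who wants that rule of thumb as a formula, here's a minimal sketch (an Arrhenius-style heuristic, valid only near normal operating temperatures, using the thread's two temperatures purely as illustration):

    # Lifetime roughly doubles per 10 K decrease near normal operating temps.
    def lifetime_factor(t_ref_c, t_c):
        return 2 ** ((t_ref_c - t_c) / 10.0)

    print(lifetime_factor(82, 26))  # ~48x longer at 26 C than at 82 C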