zlacker

[return to "xAI joins SpaceX"]
1. rybosw+u5[view] [source] 2026-02-02 22:10:52
>>g-mork+(OP)
> The basic math is that launching a million tons per year of satellites generating 100 kW of compute power per ton would add 100 gigawatts of AI compute capacity annually, with no ongoing operational or maintenance needs. Ultimately, there is a path to launching 1 TW/year from Earth.

> My estimate is that within 2 to 3 years, the lowest cost way to generate AI compute will be in space.

This is so obviously false. For one thing, in what fantasy world would the ongoing operational and maintenance needs be zero?
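To be fair, the headline arithmetic in the quote does at least multiply out; it's the "no ongoing operational or maintenance needs" part that's fantasy. A quick sanity check in Python, taking the quoted figures at face value:

    # Back-of-envelope check of the quoted numbers
    tons_per_year = 1_000_000   # "a million tons per year"
    kw_per_ton = 100            # "100 kW of compute power per ton"

    added_gw_per_year = tons_per_year * kw_per_ton / 1e6  # kW -> GW
    print(added_gw_per_year)    # 100.0, matching the quoted 100 GW/year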

2. moomoo+l7[view] [source] 2026-02-02 22:17:11
>>rybosw+u5
They'll just be decommissioned and burn up on reentry. Nvidia will make space-grade GPUs on a 2-3 year cycle.
3. pantal+G7[view] [source] 2026-02-02 22:18:45
>>moomoo+l7
They don't need to be space grade; consumer hardware will do just fine.

For AI, a random bit flip doesn't matter much.
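
A minimal sketch of the benign case, assuming the flip lands in a low-order mantissa bit of an fp32 weight (the weight value here is made up):

    import struct

    def flip_bit(x, bit):
        # Reinterpret a float32 as its bit pattern, flip one bit, reinterpret back.
        (as_int,) = struct.unpack("<I", struct.pack("<f", x))
        (flipped,) = struct.unpack("<f", struct.pack("<I", as_int ^ (1 << bit)))
        return flipped

    w = 0.0123  # hypothetical model weight
    print(w, "->", flip_bit(w, 0))  # bit 0 (low mantissa): ~1e-7 relative change,
                                    # lost in training noise anyway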

4. q3k+da[view] [source] 2026-02-02 22:28:01
>>pantal+G7
Only if that bit flip happens somewhere in your actual data, vs. some GPU pipeline register that then locks up the entire system until a power cycle. Or causes a wrong address to be fetched. Or causes other nasty silent errors. Or...

Try doing fault injection on a chip sometime. You'll see it's significantly easier to cause a crash / reset / hang than to just flip data bits.
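
And even when a flip does land in "actual data", the bit position decides everything. Same kind of sketch as above (same made-up fp32 weight), now flipping different bits of the same value:

    import struct

    def flip_bit(x, bit):
        # Reinterpret a float32 as its bit pattern, flip one bit, reinterpret back.
        (as_int,) = struct.unpack("<I", struct.pack("<f", x))
        (flipped,) = struct.unpack("<f", struct.pack("<I", as_int ^ (1 << bit)))
        return flipped

    w = 0.0123  # hypothetical model weight
    for bit in (0, 22, 30):  # low mantissa, top mantissa, high exponent
        print(f"bit {bit:2d}: {w} -> {flip_bit(w, bit)}")
    # bit  0: negligible change
    # bit 22: ~30% change in the weight
    # bit 30: the weight silently becomes ~4e36 -- one flip, and it
    #         propagates through every layer downstream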

'rad-triggered bit flips don't matter with AI' is a lie told by people who have obviously never done any digital design in their lives.
