
[return to "Stargate Project: SoftBank, OpenAI, Oracle, MGX to build data centers"]
1. non-+Q1[view] [source] 2025-01-21 22:39:51
>>tedsan+(OP)
Any clues as to how they plan to invest $500 billion? What infrastructure are they planning that will cost that much?
2. Traine+03[view] [source] 2025-01-21 22:44:12
>>non-+Q1
I'll make a wild guess that they will be building data centers and maybe robotics labs. They are starting with $100B of committed money, mostly from SoftBank, though probably not transacted yet.

> building new AI infrastructure for OpenAI in the United States

The carrot is probably something like: we will build enough compute to make a superintelligence that will solve all the problems, ???, profit.

3. K0balt+Rj[view] [source] 2025-01-22 00:28:36
>>Traine+03
If we look at the processing requirements in nature, I think the main trend in AI going forward will be doing more with less, not less with more, which is where current scaling is headed.

Thermodynamic neural networks may also basically turn everything on its ear, especially if we figure out how to scale them like NAND flash.

If anything, I would estimate that this is a space-race type effort to “win” the AI “wars”. In the short term, it might work. In the long term, it’s probably going to result in a massive glut in accelerated data center capacity.

The trend of technology is towards doing better than natural processes, not doing it 100,000x less efficiently. I don’t think AI will be an exception.

If we look at what is theoretically possible using thermodynamic wells with current model architectures, for instance, we could (in theory) make a network that applies 1T parameters in something like 1 cm². Back of the napkin, it would use about 20 watts and generate a few thousand tokens/s.
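
For a sense of scale, here is that arithmetic as a minimal Python sketch. The parameter count, power, and throughput are the figures above; the 2-FLOPs-per-parameter rule of thumb and the GPU comparison point are rough assumptions for illustration, not measurements.

    # Energy arithmetic for the estimate above. All inputs are either
    # the comment's figures or labeled rough assumptions.
    params = 1e12          # 1T parameters (figure above)
    power_w = 20.0         # ~20 W device power (figure above)
    tokens_per_s = 2000.0  # "a few thousand" tokens/s (assume 2000)

    joules_per_token = power_w / tokens_per_s          # 0.01 J/token
    ops_per_token = 2 * params                         # ~2 FLOPs/parameter/token (rule of thumb)
    joules_per_op = joules_per_token / ops_per_token   # ~5e-15 J, i.e. ~5 fJ/op

    # Rough digital-accelerator comparison point (assumed, order of
    # magnitude only): ~700 W for ~1e15 dense FP16 FLOP/s.
    gpu_joules_per_op = 700.0 / 1e15                   # ~7e-13 J, i.e. ~0.7 pJ/op

    print(f"thermodynamic-well estimate: {joules_per_op:.1e} J/op")
    print(f"GPU rough figure:            {gpu_joules_per_op:.1e} J/op")
    print(f"implied gap: ~{gpu_joules_per_op / joules_per_op:.0f}x")

At those assumed numbers the per-operation gap comes out to roughly two orders of magnitude.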

Operational thermodynamic wells have already been demonstrated in silico. There are scaling challenges, cooling requirements, etc., but AFAIK no theoretical roadblocks to scaling.

Obviously, theoretical limits don’t translate directly into results, but they do correlate strongly with the trend.

So the real question is, what can we build that can only be done if there are hundreds of millions of NVIDIA GPUs sitting around idle in ten years? Or alternatively, if those systems are depreciated and available on secondary markets?

What does that look like?

4. pillef+QT2[view] [source] 2025-01-22 20:20:06
>>K0balt+Rj
What is a thermodynamic well? Couldn't find much on it.
5. K0balt+TC3[view] [source] 2025-01-23 02:05:38
>>pillef+QT2
Extropic (and others) are working on it. It’s a very fast and efficient way to do the large matrix and state-sampling problems associated with LLMs and ML in general. It does the complex matrix algebra in a single “gate”, as an analog system.
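
To make “complex matrix algebra in a single gate” concrete, here is a minimal software sketch of the general idea behind thermodynamic-computing proposals: let a noisy analog system relax to equilibrium, then read the answer off its statistics. This illustrates the concept (an Ornstein-Uhlenbeck process whose stationary mean solves Ax = b), not Extropic’s actual design; all constants here are made up for the example.

    # Simulate the Ornstein-Uhlenbeck process
    #     dx = -(A x - b) dt + sqrt(2/beta) dW
    # For symmetric positive-definite A, its stationary mean is A^{-1} b,
    # i.e. the solution of A x = b. In hardware, that relaxation would be
    # one analog "settle" rather than a long sequence of multiply-adds.
    import numpy as np

    rng = np.random.default_rng(0)

    n = 8
    M = rng.standard_normal((n, n))
    A = M @ M.T + n * np.eye(n)    # symmetric positive-definite, so stable
    b = rng.standard_normal(n)

    dt, beta = 1e-3, 1e4           # step size, inverse temperature (assumed)
    burn_in, steps = 20_000, 100_000

    x = np.zeros(n)
    mean = np.zeros(n)
    for t in range(steps):
        noise = rng.standard_normal(n) * np.sqrt(2 * dt / beta)
        x += -(A @ x - b) * dt + noise   # Euler-Maruyama update
        if t >= burn_in:
            mean += x
    mean /= steps - burn_in

    print("time-averaged state:", np.round(mean, 4))
    print("np.linalg.solve:    ", np.round(np.linalg.solve(A, b), 4))

With beta large the noise is small and the time average lands on the direct solve to several decimals; the “computation” is just relaxation plus measurement.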

Extropic update on building the ultimate substrate for generative AI https://twitter.com/Extropic_AI/status/1820577538529525977
