But to be serious: Intel creates software too (compilers, for example), and Nvidia provides software as well. It's also nice to sell licenses.
It _is_ a hype bubble, but it is also an S-curve. Intel has missed the AI boat so far; if they want to catch up, I would encourage them to try. Building marginally better x86 chips might not cut it anymore.
Is this just marketing?
Maybe they mean the extra VRAM needed for agentic AI? But then the sane thing to say would be that they'll offer more compute for AI.
It's just an unhinged thing for a chip manufacturer to say.
It feels like the whole magnificent seven thing, and the way they are holding up an overvalued stock market, is driving some desperate decision-making. Like another $12 billion for xAI to buy more Nvidia GPUs; does xAI have any revenue at all?
software tools are essential to driving chip adoption; it's one reason why Nvidia got ahead (CUDA)
Yeah, they paved the road, but it's a surprise that it led to so many good destinations.
And besides, I think you can probably guess that AI is the most hardware-oriented the industry has been in decades.
1) Attention compute scales quadratically with context length, and the KV cache grows linearly, which still adds up fast.
2) "Agentic" AI requires a shit-tonne of context IME. Like a horrifying amount. Tool definitions alone can add up to thousands upon thousands of tokens, plus schemas and a lot of back-and-forth context use between tool calls. If you just import a moderately complicated OpenAPI/Swagger schema and use it as-is, you will probably run into the hundreds of thousands of tokens within a few tool calls.
3) Finally, compute actually isn't the bottleneck, it's memory bandwidth (rough numbers sketched below).
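To put rough numbers on that, here's a back-of-the-envelope sketch; the model shape, HBM bandwidth, and context length below are my own illustrative assumptions (roughly a 70B-parameter GQA model on an H100-class card), not figures from anyone upthread:

    # Back-of-the-envelope: KV-cache size and bandwidth-bound decode speed.
    # Model shape is an assumption (roughly Llama-2-70B-like: 80 layers, GQA with
    # 8 KV heads, head_dim 128, fp16), as is the ~3.35 TB/s HBM figure.

    def kv_cache_bytes(tokens, layers=80, kv_heads=8, head_dim=128, bytes_per_elem=2):
        # Keys and values (hence the 2x), stored per layer for every token in context.
        return 2 * layers * kv_heads * head_dim * bytes_per_elem * tokens

    def decode_tokens_per_sec(weight_bytes, kv_bytes, bandwidth_bytes_per_sec):
        # During decode, each new token streams (roughly) all weights plus the whole
        # KV cache through memory once, so bandwidth sets the ceiling, not FLOPs.
        return bandwidth_bytes_per_sec / (weight_bytes + kv_bytes)

    ctx = 200_000                 # "hundreds of thousands of tokens" of agent context
    kv = kv_cache_bytes(ctx)
    weights = 70e9 * 2            # ~70B params at 2 bytes each (fp16/bf16)
    bandwidth = 3.35e12           # ~3.35 TB/s HBM (assumed H100-SXM-class)

    print(f"KV cache at {ctx:,} tokens: {kv / 1e9:.1f} GB")   # ~65.5 GB
    print(f"Decode ceiling: ~{decode_tokens_per_sec(weights, kv, bandwidth):.1f} tok/s")   # ~16 tok/s

Even with generous assumptions, a couple hundred thousand tokens of KV cache is tens of gigabytes on top of the weights, and per-sequence decode speed is limited by how fast you can stream all of that through HBM, not by how many FLOPs the chip can do.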
There is a massive opportunity for someone to snipe Nvidia on inference at least. Inference is becoming pretty 'standardized', at least with the current state of play. If someone can come along with a cheaper GPU with a lot of VRAM and a lot of memory bandwidth, Nvidia's software moat for inference is far smaller than it is for CUDA as a whole. I think AMD are very close to reaching that, FWIW.
I suspect training and R&D will remain more in Nvidia's sphere, but if Intel got its act together, there is definitely room for competition here.
- google ai-maxxed and wiping the floor after a slow start
- apple owning the smartphone space right as desktops are becoming genuinely irrelevant
- microsoft owning the corporate space right as local and on-site is becoming genuinely irrelevant
- nvidia building gigashovels for the gold rush
- meta... well, we'll see how the 'superintelligence' turns out. meta is the weakest bet of the bunch.
- aws... doing cloud stuff, i guess. also not a clear story.
if anyone got lucky, it's zuck. between vr, the metaverse, and now his thick-rimmed goggles, he appears to be genuinely clueless, riding the waves of his facebook moonshot.
the mag7 are gonna be called the giga7 by 2030.
you think the stocks are overheated? you haven't seen anything yet. what we call a 'bubble' today will appear to have been a gentle ramp-up for the real bubble that's waiting for us in 2030.
the printing will continue, btw. and the debt will hit $100 trillion by 2030. and the US will be even more powerful than previously thought possible.
long pax americana, long orange fool, long whoever succeeds him, long the deep state.