Ultimately, the breakthrough in AI will come either from eliminating bottlenecks in computing so that we can simulate many more neurons far more cheaply (in other words, 2025-level technology scaled up will be neither necessary nor sufficient), or from some fundamental research discovery, such as a new paradigm beyond the transformer. Either way, these feel like theoretical discoveries: whoever makes them first, the other "side" can trivially steal or absorb the information.
ASI is basically a god. It is the ultimate solution (or the ultimate problem): it will push us to the singularity and either create a utopia or drive humanity to extinction. Imagine someone so smart that they would win every single Nobel Prize available, making multiple such discoveries in a single year. Now multiply that person's intelligence by 100 (most likely more, but 100 is already hard enough to grasp). There's no point in investing in anything else. An investment in ASI is an investment in everything (though it could be a bad one, depending on the outcome).
The government is banking on being able to control it, which is also pretty funny. It's like a pet hamster thinking it can dictate what a human does.