You raise a valid concern, but you're presuming we'll stay under the OpenAI/Anthropic/etc. oligopoly forever. I don't think that will be the status quo in the long term. There is demand for different kinds of LLMs trained on different data, and there is demand for the hardware to run them. For example, the new Mac Studio can be configured with 512 GB of unified memory, which is enough to run a quantized build of the ~671B-parameter DeepSeek model locally. So in the future I could see people training their own LLMs to be experts at their language/framework of choice.
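Quick back-of-the-envelope check on why that fits (a rough sketch; assumes the ~671B-parameter DeepSeek weights at roughly 4-bit quantization):

```python
# Rough memory estimate for hosting a ~671B-parameter model locally.
# Assumption: ~4-bit quantization, i.e. about 0.5 bytes per weight.
params = 671e9
bytes_per_param = 0.5
weights_gb = params * bytes_per_param / 1e9
print(f"~{weights_gb:.0f} GB of weights")  # ~336 GB, which fits in 512 GB of unified memory
```

That still leaves headroom for the KV cache and the OS, which is why a maxed-out Mac Studio can host it at all.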
Of course you could disagree with my prediction and argue that these big tech companies are going to build MASSIVE GPU farms the size of a Tesla Gigafactory, running godlike AI that nobody can compete with. But if we get to that point, I feel like we will have bigger problems than "AI React code is better than AI SolidJS code".