>I have a strong suspicion that the reason Amazon, Google and so on are not particularly interested in building GPT-scale transformers is that they know they can do it anytime - they are just waiting for others to pave the path to actually good stuff.
Google has been hyping Gemini since the spring (and not delivering it).
Amazon's Titan model is not quite there yet either.