The board probably took a look at updated burn-rate projections, saw that they had 6 months of runway, saw that they didn't have enough GPUs, saw that Llama and Mistral and whatever other open-source models are awesome and run on personal computers, and thought to themselves - why the hell are we spending so much goddamn money? For $20-a-month memberships? So bots can auto-sign up for accounts, never prepay, burn compute, and skip the bill?
Then Grok gets released on Twitter, and they're left wondering - what exactly is it that we do that's so much better that we're spending 100x what cheapo Musk is?
Historically I've been a backend and distributed-systems engineer, but integrating GPT-4 into my workflows has unlocked an awe-inspiring ability to lay down fat beads of UI-heavy code in both professional and personal contexts.
But it's still an L3: you've gotta ask the right questions and doubt every line it produces until it compiles and the tests pass.