I think the adage that "a solution needs to be 10x better than existing solutions to make someone switch" applies here.
Saying something performs slightly better than the industry-standard offering (OpenAI) means that OpenAI is going to laugh all the way to the bank. Everyone will just use their APIs over anything else.
I'm excited about the LLM space, and I can barely keep up with the model names, much less all the techniques for fine-tuning. A customer is going to have an even worse time.
No one will ever get fired for buying OpenAI (now that IBM is dead, and probably sad that Watson never made a dent).
I do use Mistral for all my personal projects but I'm not sure that is going to have the same effect on the industry as open source software did in the past.
It's already superior to OpenAI because it doesn't require an API. You can run the model on your own hardware, in your own datacenter, and your data is guaranteed to remain confidential. Creating a one-off fine-tune is a different story than permanently joining your company at the hip to OpenAI.
I know in our bubble, in the era of Cloud, it's easy to send confidential company data to some random API on the Internet and not worry about it, but that's absolutely not the case for anyone in Healthcare, Government, or even normal companies that are security conscious. For them, OpenAI was never a valid consideration in the first place.
Education and research without gatekeepers in academia and industry complaining about their book sales or prestige titles being made obsolete.
A whole lot of use cases that break us out of having to kowtow to experts who were merely born before us and are trying to monopolize the exploration of science and technology.
To that end I’m working on a GPU-accelerated client backed by local AI, with NeRFs and Gaussian splatting built in.
The upside to being an EE with an MSc in math: most of my money comes from engineering real things. I don’t have skin in the cloud CRUD app/API game and don’t see a reason to spend money propping up middlemen who, given my skills and abilities, don’t add value.
Programmers can go explore syntax art in their parents’ basement again. I’m tired of 1970s semantics and everyone with a DSL thinking it’s the best thing ever to happen to computing as a field of inquiry.
Like all industries, big tech is monopolized by aging rent seekers. Disrupting by divesting from it is my play now.