zlacker

1. skepti+(OP)[view] [source] 2024-02-14 02:35:28
>>mfigui+M3
Frankly, OpenAI seems to be losing its luster, and fast.

Plugins were a failure. GPTs are a little better, but I still don't see the product market fit. GPT-4 is still king, but not by that much any more. It's not even clear that they're doing great research, because they don't publish.

GPT-5 has to be incredibly good at this point, and I'm not sure that it will be.

2. mfigui+M3[view] [source] 2024-02-14 03:08:18
3. fennec+1H[view] [source] 2024-02-14 10:27:15
>>skepti+(OP)
I mean, they just happened to train the biggest, most fine-tuned model on the most data out of everyone, I guess.

Transformers were invented with the support of Google (by the researchers, not by Google).

The open community has been creating better and better models through a group effort. It's like how ML itself works: it's way easier to try 100,000 ideas at a small scale than a couple of ideas at a large scale.
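That breadth-vs-depth point can be sketched with a toy simulation (my own illustration, not from the thread; the budget and cost numbers are made up): under a fixed compute budget, many cheap small-scale experiments explore far more of the idea space than a couple of expensive large-scale runs, so the best result found tends to be better.

```python
import random

random.seed(0)

def best_of(n_trials: int) -> float:
    # Each trial samples one idea's "quality" from U(0, 1);
    # more trials -> higher expected maximum (E[max] = n / (n + 1)).
    return max(random.random() for _ in range(n_trials))

BUDGET = 100_000                      # total compute units (hypothetical)
SMALL_COST, LARGE_COST = 1, 50_000    # cost per experiment at each scale

many_small = best_of(BUDGET // SMALL_COST)  # 100,000 small-scale tries
few_large = best_of(BUDGET // LARGE_COST)   # 2 large-scale tries

print(f"best of 100,000 small-scale ideas: {many_small:.5f}")
print(f"best of 2 large-scale ideas:       {few_large:.5f}")
```

The toy model ignores that results often don't transfer from small to large scale, which is exactly the bet the big labs are making in the other direction.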
