zlacker

1. skepti+(OP)[view] [source] 2024-02-14 02:35:28
>>mfigui+M3
Frankly, OpenAI seems to be losing its luster, and fast.

Plugins were a failure. GPTs are a little better, but I still don't see the product-market fit. GPT-4 is still king, but not by that much any more. It's not even clear that they're doing great research, because they don't publish.

GPT-5 has to be incredibly good at this point, and I'm not sure that it will be.

2. mfigui+M3[view] [source] 2024-02-14 03:08:18
3. roody1+Sf[view] [source] 2024-02-14 04:59:52
>>skepti+(OP)
Running Ollama with an 80 GB Mistral model works as well as, if not better than, ChatGPT 3.5. This is a good thing for the world IMO, as the magic is no longer held by OpenAI alone. The speed at which competitors have caught up in just the last 3 months is astounding.
4. huyter+5h[view] [source] 2024-02-14 05:16:30
>>roody1+Sf
But no one cares about 3.5. It’s an order of magnitude worse than 4. An order of magnitude is a lot harder to catch up with.
5. sjwhev+il[view] [source] 2024-02-14 06:10:42
>>huyter+5h
What Mistral has though is speed, and with speed comes scale.
6. spacem+Tm[view] [source] 2024-02-14 06:31:21
>>sjwhev+il
Who cares about speed if you’re wrong?

This isn’t a race to write the most lines of code or the most lines of text. It’s a race to write the most correct lines of code.

I’ll wait half an hour for a response if I know I’m getting at least staff-engineer-tier code for every question.

7. popinm+Ao[view] [source] 2024-02-14 06:52:08
>>spacem+Tm
For the tasks my group is considering, even a 7B model is adequate.

Sufficiently accurate responses can be fed into other systems downstream and cleaned up. Even code responses can benefit from this by restricting output tokens using the grammar of the target language, or iterating until the code compiles successfully.
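The iterate-until-it-compiles idea above can be sketched in a few lines. This is just an illustration, not anyone's actual pipeline: it uses Python's `ast` module as a cheap stand-in for "the compiler", and the hypothetical `first_compilable` helper takes a list of candidate snippets standing in for successive model responses.

```python
import ast

def first_compilable(candidates):
    """Return the first candidate snippet that parses as valid Python.

    'candidates' stands in for successive LLM drafts; a real pipeline
    would re-prompt the model with the error message on each failure
    instead of iterating over a fixed list.
    """
    for src in candidates:
        try:
            ast.parse(src)   # syntax check only, nothing is executed
            return src
        except SyntaxError:
            continue         # in practice: feed the error back, retry
    return None              # give up after exhausting the retry budget
```

For compiled targets you'd swap `ast.parse` for an actual compiler invocation and inspect its exit code; the loop structure stays the same.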

And for a decent number of LLM-enabled use cases, the functionality unlocked by these models is novel. When you're going from 0 to 1, people will just be amazed that the product exists.
