zlacker

[parent] [thread] 4 comments
1. roody1+(OP)[view] [source] 2024-02-14 05:19:13
Yeah, but for how long… at this rate I would expect some of the freely distributed models to hit GPT-4 levels in as little as 3-6 months.
replies(2): >>int_19+I7 >>huyter+7v2
2. int_19+I7[view] [source] 2024-02-14 06:58:21
>>roody1+(OP)
I heard claims like that 6 months ago.

But so far nobody is even in the same ballpark. And not just freely distributed models, but proprietary ones backed by big money, as well.

It really makes one wonder what kind of secret sauce OpenAI has. Surely it can't just be all that compute Microsoft bought them, since Google could easily match that, and yet...

replies(1): >>qetern+IU
3. qetern+IU[view] [source] [discussion] 2024-02-14 14:42:38
>>int_19+I7
> But so far nobody is even in the same ballpark.

Miqu is pretty good. Sure, it's a leak... but there's nothing special there. It's just a 70B llama2 finetune.

replies(1): >>int_19+vs2
4. int_19+vs2[view] [source] [discussion] 2024-02-14 22:06:22
>>qetern+IU
By the standards of other llama2 finetunes, sure. Compared to GPT-4, I stand by my previous assertion.
5. huyter+7v2[view] [source] 2024-02-14 22:21:45
>>roody1+(OP)
An order of magnitude means they're going to take 20 times longer to get to GPT-4. So maybe on the order of 40-60 months from this point.