zlacker

[parent] [thread] 2 comments
1. cridde+(OP)[view] [source] 2024-05-15 12:12:11
Isn't access to massive datasets and computation the moat? If you and your very talented friends wanted to build something like GPT-4, could you?

It's going to get orders of magnitude less expensive, but for now, the capital requirements feel like a pretty deep moat.

replies(1): >>Gud+YR
2. Gud+YR[view] [source] 2024-05-15 16:39:46
>>cridde+(OP)
How do you know massive datasets are required? Just because that's how current LLMs operate doesn't mean it's necessarily the only solution.
replies(1): >>datame+h51
3. datame+h51[view] [source] [discussion] 2024-05-15 17:40:39
>>Gud+YR
Then the resources needed to discover an alternative to brute-forcing a large model are a huge barrier.

I think academia and startups are currently better suited to optimizing TinyML and edge-AI hardware/compilers/frameworks, etc.
