zlacker

[parent] [thread] 3 comments
1. jedrek+(OP)[view] [source] 2024-05-15 10:52:48
AI? Yes.

LLMs pretending to be AI? No.

replies(1): >>trasht+Wf
2. trasht+Wf[view] [source] 2024-05-15 12:47:19
>>jedrek+(OP)
What you call "AI" is generally called AGI. LLMs are already a kind of AI, just not general enough to fully replace all humans.

We don't know if full AGI can be built using just current technology (like transformers) given enough scale, or if one or more fundamental breakthroughs are needed beyond scale alone.

My hypothesis has always been that AGI will arrive roughly when compute power and model size match the human brain. That means models of about 100 trillion params (roughly the brain's synapse count), which is not that far away now.
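
A back-of-envelope sketch of what that scale implies (a minimal sketch; the fp16 bytes-per-param and the Chinchilla-style tokens-per-param and FLOPs-per-param figures are rules of thumb I'm assuming, not measurements):

    # Rough sizing for a ~100-trillion-parameter model.
    # All constants below are illustrative assumptions.
    PARAMS = 100e12              # ~100T params, ~human synapse count
    BYTES_PER_PARAM = 2          # assuming fp16/bf16 weights

    print(PARAMS * BYTES_PER_PARAM / 1e12, "TB of weights")  # ~200 TB

    # Chinchilla-style rule of thumb: ~20 training tokens per parameter,
    # ~6 FLOPs per parameter per token (forward + backward).
    TOKENS = 20 * PARAMS
    print(f"{6 * PARAMS * TOKENS:.1e} training FLOPs")       # ~1.2e30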

replies(1): >>chx+lk1
3. chx+lk1[view] [source] [discussion] 2024-05-15 18:03:34
>>trasht+Wf
> We don't know if full AGI can be built using just current technology (like transformers) given enough scale,

We absolutely do, and the answer is such a resounding no that it's not even funny.

replies(1): >>trasht+8m1
4. trasht+8m1[view] [source] [discussion] 2024-05-15 18:13:05
>>chx+lk1
Actually, we really don't. When GPT-3.5 was released, it was a massive surprise to many, exactly because they didn't believe simply scaling up transformers would end up producing something like that.

Now, using transformers doesn't mean they have to be assembled like LLMs. There are other ways to stitch them together to solve a lot of other problems.
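
For instance, here's a toy sketch (assuming PyTorch; all module names and sizes are made up for illustration) of the same transformer block reused as a plain bidirectional set encoder with a classification head, ViT-style, rather than as a causal LM:

    import torch
    import torch.nn as nn

    # The same transformer "lego piece", stitched into a classifier
    # instead of a causal language model. Names/sizes are illustrative.
    class PatchClassifier(nn.Module):
        def __init__(self, d_model=256, nhead=8, num_layers=4, num_classes=10):
            super().__init__()
            layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers)
            self.head = nn.Linear(d_model, num_classes)

        def forward(self, x):                # x: (batch, seq_len, d_model)
            h = self.encoder(x)              # no causal mask: full attention
            return self.head(h.mean(dim=1))  # pool over sequence, classify

    logits = PatchClassifier()(torch.randn(2, 16, 256))  # 2 samples, 16 "patches"
    print(logits.shape)                                  # torch.Size([2, 10])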

We may very well have the basic types of lego pieces needed to build AGI. We won't know until we try to build all the brain's capacities into a model on the order of a few hundred trillion parameters.

And if we actually lack some types of pieces, they may even be available by then.

[go to top]