zlacker

[parent] [thread] 10 comments
1. Taylor+(OP)[view] [source] 2024-05-15 06:43:22
No I’m with you on this. Next token prediction does lead to impressive emergent phenomena. But what makes people people is an internal drive to attend to our needs, and an LLM exists without that.

A real AGI should be something you can drop into a humanoid robot, and it would basically live as an individual, learning from every moment and every day, growing and changing with time.

LLMs can’t even count the number of letters in a sentence.

replies(4): >>vinter+p3 >>astran+w9 >>kgeist+rg >>sebast+Xh1
2. vinter+p3[view] [source] 2024-05-15 07:18:59
>>Taylor+(OP)
By that AGI definition, AGI is probably quite possible and reachable - but also pointless: there are no good reasons to "use" it, and many good reasons not to.
3. astran+w9[view] [source] 2024-05-15 08:19:14
>>Taylor+(OP)
LLMs could count the number of letters in a sentence if you stopped tokenizing them first.
replies(1): >>HarHar+pu
4. kgeist+rg[view] [source] 2024-05-15 09:37:26
>>Taylor+(OP)
>LLMs can’t even count the number of letters in a sentence.

It's a consequence of tokenization. They "see" the world through tokens, and tokenization rules depend on the specific middleware you're using. It's like making someone blind and then claiming they are not intelligent because they can't tell red from green. That's just how they perceive the world, and it says nothing about their intelligence.
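To make this concrete, here's a minimal sketch with a toy hand-rolled vocabulary (not any real tokenizer's - real BPE vocabularies are far larger): the model receives opaque token IDs, so the letter count of a word is simply not present in its input.

```python
# Toy greedy longest-match tokenizer over a made-up vocabulary.
# A real LLM tokenizer (e.g. BPE) is more sophisticated, but the
# point is the same: the model sees IDs, not characters.
vocab = {"straw": 0, "berry": 1, "st": 2, "raw": 3}

def tokenize(word, vocab):
    """Greedily match the longest vocabulary piece at each position."""
    tokens = []
    i = 0
    while i < len(word):
        for j in range(len(word), i, -1):
            if word[i:j] in vocab:
                tokens.append(vocab[word[i:j]])
                i = j
                break
        else:
            raise ValueError(f"no token covers {word[i:]!r}")
    return tokens

print(tokenize("strawberry", vocab))  # [0, 1] - two opaque IDs, not ten letters
```

From the model's perspective, "strawberry" is the sequence [0, 1]; counting the r's in it requires knowledge about the spelling of each token that the input itself doesn't carry.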

replies(1): >>Otomot+yG
5. HarHar+pu[view] [source] [discussion] 2024-05-15 11:54:57
>>astran+w9
tokenization is not the issue - these LLMs can all break a word into letters if you ask them.
6. Otomot+yG[view] [source] [discussion] 2024-05-15 13:11:20
>>kgeist+rg
But it limits them; they can't be AGI then, because a child who can count could do it :)
7. sebast+Xh1[view] [source] 2024-05-15 16:05:26
>>Taylor+(OP)
You seem generally intelligent. Can you tell how many letters are in the following sentence?

"هذا دليل سريع على أنه حتى البشر الأذكياء لا يمكنهم قراءة ”الرموز“ أو ”الحروف“ من لغة لم يتعلموها."

[The Arabic reads: "This is quick proof that even intelligent humans cannot read the 'symbols' or 'letters' of a language they have not learned."]

replies(2): >>omeze+J32 >>lewhoo+Nr2
8. omeze+J32[view] [source] [discussion] 2024-05-15 20:07:35
>>sebast+Xh1
I counted very quickly, but 78? I learned Arabic in kindergarten; I'm not sure what your point was. There are Arabic spelling bees and an alphabet song, just like in English.

The comment you replied to was saying that LLMs trained on English can't count letters in English.

replies(1): >>sebast+9V4
9. lewhoo+Nr2[view] [source] [discussion] 2024-05-15 22:34:18
>>sebast+Xh1
Is this even a fair comparison? Are we asking an LLM to count letters in an alphabet it never saw?
replies(1): >>trucul+Ix2
10. trucul+Ix2[view] [source] [discussion] 2024-05-15 23:30:20
>>lewhoo+Nr2
Yes, it sees tokens. Asking it to count letters is a little bit like asking that of someone who never learned to read or write and only learned language through speech.
11. sebast+9V4[view] [source] [discussion] 2024-05-16 20:26:57
>>omeze+J32
LLMs aren't trained in English with the same granularity that you and I are.

So my analogy here stands: OP was trained in "reading human language" with Roman letters as the basis of his understanding, and it would be a significant challenge (fairly unrelated to intelligence level) for OP to parse an Arabic sentence of the same meaning.

Or:

You learned Arabic, great (it's the next language I want to learn so I'm envious!). But from the LLM point of view, should you be considered intelligent if you can count Arabic letters but not Arabic tokens in that sentence?
