zlacker

[return to "Language is not essential for the cognitive processes that underlie thought"]
1. Animat+Du5[view] [source] 2024-10-19 19:21:01
>>orcul+(OP)
This is an important result.

The actual paper [1] says that functional MRI (which measures which parts of the brain are active by sensing blood flow) indicates that different brain hardware is used for non-language and language functions. This has been suspected for years, but now there's an experimental result.

What this tells us for AI is that we need something else besides LLMs. It's not clear what that something else is. But, as the paper mentions, low-end mammals and the corvids lack language but have substantial problem-solving capability. That's seen down at squirrel and crow size, where the brains are tiny. So if someone figures out how to do this, it will probably take less hardware than an LLM.

This is the next big piece we need for AI. No idea how to do this, but it's the right question to work on.

[1] https://www.nature.com/articles/s41586-024-07522-w.epdf?shar...

◧◩
2. KoolKa+aC5[view] [source] 2024-10-19 20:19:58
>>Animat+Du5
> What this tells us for AI is that we need something else besides LLMs.

Basically we need multimodal LLMs (terrible naming, since it's not really an LLM at that point, but still).

◧◩◪
3. Animat+ZE5[view] [source] 2024-10-19 20:44:10
>>KoolKa+aC5
I don't know what we need. Nor does anybody else, yet. But we know what it has to do. Basically what a small mammal or a corvid does.

There's been progress. Look at this 2020 work on neural-net-controlled drone acrobatics.[1] That's going in the right direction.

[1] https://rpg.ifi.uzh.ch/docs/RSS20_Kaufmann.pdf

◧◩◪◨
4. fuzzfa+ZG5[view] [source] 2024-10-19 21:01:46
>>Animat+ZE5
You could say language is just the "communication module," but there has to be a whole underlying interface where non-verbal thoughts are modulated/demodulated into the language expected for communication, whether or not communicating is actually on the agenda.
◧◩◪◨⬒
5. NoMore+XU5[view] [source] 2024-10-19 23:28:26
>>fuzzfa+ZG5
In these discussions, I always knee-jerk into thinking "why don't they just look inward at their own minds?" But the truth is, most people don't have much to gaze upon internally... they're the meat equivalent of an LLM that can sort of sound like it makes sense. These are the people always bragging about how they have an "internal monologue" and claiming that those who don't are aliens or psychotics or something.

The only reason humans have that "communication module" is because that's how you model other humans you speak to. It's a faculty for rehearsing what you're going to say to other people, and how they'll respond to it. If you have any profound thoughts at all, you find that your spoken language is insufficient even to transcribe your thoughts; some "mental tokens" have no short phrases that describe them.

The only real thoughts you have are non-verbal. You can see this sometimes in struggling schoolchildren who have learned all the correct words to regurgitate, but for whom those words never really clicked. The mildly clever teachers always assume that if they thoroughly drill the terminology, it will eventually be linked with the concepts themselves and the students will have fully learned it. What's really happening is that there's not enough mental machinery underneath for those words to ever have anything to link up with.

[go to top]