The actual paper [1] says that functional MRI (which measures which parts of the brain are active by sensing blood flow) indicates that different brain hardware is used for language and non-language functions. This has been suspected for years, but now there's an experimental result.
What this tells us for AI is that we need something else besides LLMs. It's not clear what that something else is. But, as the paper mentions, low-end mammals and corvids lack language yet have substantial problem-solving capability. That's seen down at squirrel and crow size, where the brains are tiny. So if someone figures out how to do this, it will probably take less hardware than an LLM.
This is the next big piece we need for AI. No idea how to do this, but it's the right question to work on.
[1] https://www.nature.com/articles/s41586-024-07522-w.epdf?shar...
Not to over-hype LLMs, but I don't see why this result implies that. AI doesn't need to do things the same way evolved intelligence does.
Imagine trying to limit, control, or explain a being without familiar cognitive structures.
Is there a reason to care about such unfamiliar modalities of cognition?
Anything that doesn't have a spine, I'm pretty sure.
Also, if we look at just the auditory modality, tons of creatures are deaf and don't need it.
> Imagine trying to limit, control, or explain a being without familiar cognitive structures.
I don't see why any of that affects whether it's intelligent.
Presumably they have some sort of biological input processing or sensory inputs. They don't eat data.