zlacker

[return to "Chomsky on what ChatGPT is good for (2023)"]
1. newAcc+56[view] [source] 2025-05-25 17:55:34
>>mef+(OP)
[flagged]
2. Smaug1+b8[view] [source] 2025-05-25 18:10:32
>>newAcc+56
From some Googling and use of Claude (and from summaries of the suggestively titled "Impossible Languages" by Moro, linked from https://en.wikipedia.org/wiki/Universal_grammar ), it looks like he's referring to languages that violate the laws constraining which languages humans are innately capable of learning. But it's very unclear why "machine M is capable of learning more complex languages than humans" implies anything about the linguistic competence or the intelligence of machine M.
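To make the idea concrete: in the literature Moro draws on, "impossible languages" are often described as ones whose rules ignore hierarchical structure, e.g. forming a question by mirror-reversing the word order. This is a toy sketch of my own (not from Moro's book) contrasting such a structure-independent rule with a crudely structure-dependent one:

```python
# Illustrative only: an "impossible" rule operates on linear word order,
# which no attested human language does, though a machine can learn it easily.

def impossible_question(sentence: str) -> str:
    """A structure-independent rule: reverse the linear order of words."""
    return " ".join(reversed(sentence.split()))

def english_like_question(sentence: str) -> str:
    """A (very crude) structure-dependent rule: front the auxiliary 'is'."""
    words = sentence.split()
    if "is" in words:
        words.remove("is")
        return "is " + " ".join(words)
    return sentence

s = "the cat is on the mat"
print(impossible_question(s))    # mat the on is cat the
print(english_like_question(s))  # is the cat on the mat
```

The point of the contrast is that both rules are trivially computable, but only one kind shows up in human languages.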
3. foobar+Fa[view] [source] 2025-05-25 18:28:39
>>Smaug1+b8
It doesn't, it just says that LLMs are not useful models of the human language faculty.
4. specia+4h[view] [source] 2025-05-25 19:16:12
>>foobar+Fa
This is where I'm stuck.

For other commenters: as I understand it, Chomsky is talking about well-defined grammars, languages, and production systems. Think Hofstadter's Gödel, Escher, Bach. Not a "folk" understanding of language.

I have no understanding or intuition, not even a fingernail grasp, of how an LLM generates "sentences" that read as though they were created with a generative grammar.

Is anyone comparing and contrasting these two different techniques? Being a noob, I wouldn't even know where to start looking.

I've gleaned that some people are using LLMs/GPT to emit abstract syntax trees (vs a mere stream of tokens) to serve as input to formal grammars (e.g. programming source code). That sounds awesome. And something I might some day sorta understand.
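The token-stream-vs-AST distinction is easy to see with Python's own stdlib (this is ordinary parsing, not LLM output, but it shows what the two representations look like):

```python
import ast
import io
import tokenize

src = "x = 1 + 2"

# A mere stream of tokens: flat, no structure.
tokens = [t.string
          for t in tokenize.generate_tokens(io.StringIO(src).readline)
          if t.string.strip()]
print(tokens)  # ['x', '=', '1', '+', '2']

# An abstract syntax tree: hierarchical, suitable as input to formal tooling.
tree = ast.parse(src)
print(ast.dump(tree.body[0]))  # an Assign node containing a BinOp subtree
```

In the token stream, `1 + 2` is just three adjacent items; in the AST, it is a `BinOp` node nested inside an `Assign`, which is what formal tools actually consume.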

I've also gleaned that, given sufficient computing power, future LLMs will be trained on tokenized words (vs just character sequences), which would bring the two strategies closer...? I have no idea.
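For what it's worth, current LLMs already sit between those two extremes: they typically use learned subword vocabularies (e.g. BPE), not raw characters or whole words. A greedy longest-match sketch over a hand-picked vocabulary (illustrative only; real BPE learns its vocabulary from data):

```python
# Hypothetical mini-vocabulary, chosen by hand just to show the middle ground
# between character-level and word-level tokenization.
VOCAB = ["un", "believ", "able", "cat", "s", " "]

def greedy_subword(text: str) -> list[str]:
    """Greedily match the longest vocabulary piece at each position;
    fall back to single characters for anything out of vocabulary."""
    out, i = [], 0
    while i < len(text):
        for piece in sorted(VOCAB, key=len, reverse=True):
            if text.startswith(piece, i):
                out.append(piece)
                i += len(piece)
                break
        else:
            out.append(text[i])
            i += 1
    return out

print(greedy_subword("unbelievable cats"))
# ['un', 'believ', 'able', ' ', 'cat', 's']
```

So the unit an LLM predicts is neither a character nor a word but a learned fragment, which is part of why its output is hard to map onto grammar-style word categories.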

(Am noob, so forgive my poor use of terminology. And poor understanding of the tech, too.)
