For other commenters: as I understand it, Chomsky is talking about well-defined grammars, languages, and production systems. Think Hofstadter's Gödel, Escher, Bach. Not a "folk" understanding of language.
I have no understanding or intuition, not even a fingernail grasp, of how an LLM generates "sentences" that read as though they were produced by a generative grammar.
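To make my confusion concrete, here's the only mental model I've got so far: generation as next-token prediction in a loop. A toy sketch in Python, with a bigram counter standing in for the actual neural network; this is my guess at the shape of the thing, not GPT's real mechanism:

```python
import random
from collections import defaultdict

corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count which token follows which (the "model" -- a crude stand-in
# for the learned next-token probabilities in a real LLM).
follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def generate(start: str, max_tokens: int = 10) -> str:
    out = [start]
    for _ in range(max_tokens):
        candidates = follows.get(out[-1])
        if not candidates:
            break
        out.append(random.choice(candidates))  # sample the next token
    return " ".join(out)

print(generate("the"))  # e.g. "the cat sat on the rug ."
```

No grammar anywhere in that loop, yet the output looks sentence-ish. That's the part that breaks my brain.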
Is anyone comparing and contrasting these two techniques? Being a noob, I wouldn't even know where to start looking.
I've gleaned that some people are using LLMs/GPT to emit abstract syntax trees (vs a mere stream of tokens), checked against formal grammars (e.g. for programming source code). That sounds awesome, and like something I might someday sorta understand. A sketch of what I think that looks like is below.
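If I've understood the idea, it's something like "constrained decoding": the model proposes tokens, and a grammar vetoes anything syntactically illegal. A made-up toy; the grammar table and `model_propose` are my inventions for illustration, not any real library's API:

```python
# Tiny "grammar": which tokens may follow which (a stand-in for a
# real BNF/parser state machine).
ALLOWED_AFTER = {
    "START": {"print"},
    "print": {"("},
    "(":     {"1", "2", "x"},
    "1": {")"}, "2": {")"}, "x": {")"},
    ")":     {"END"},
}

def model_propose(_context):
    # Pretend model: candidate tokens with scores. Note "+" is
    # proposed but never legal here, so the grammar filters it out.
    return {"print": 0.2, "(": 0.2, ")": 0.2,
            "1": 0.1, "2": 0.1, "x": 0.1, "+": 0.1}

def constrained_generate() -> str:
    state, out = "START", []
    while True:
        legal = ALLOWED_AFTER[state]
        if legal == {"END"}:
            break
        # Keep only the proposals the grammar allows right now.
        candidates = {t: s for t, s in model_propose(out).items() if t in legal}
        token = max(candidates, key=candidates.get)  # greedy pick
        out.append(token)
        state = token
    return "".join(out)

print(constrained_generate())  # always syntactically valid, e.g. "print(1)"
```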
I've also gleaned that, given sufficient computing power, training data for future LLMs will be tokenized into whole words (vs just character sequences). Would that bring the two strategies closer? I have no idea.
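One thing I did pick up along the way, for what it's worth: as I understand it, today's GPT tokenizers already sit somewhere between characters and whole words, using subword units (byte-pair encoding). You can peek at the boundaries with OpenAI's tiktoken library:

```python
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # a GPT-4-era encoding
tokens = enc.encode("ungrammaticality")
print(tokens)                             # token ids (integers)
print([enc.decode([t]) for t in tokens])  # subword chunks -- neither
                                          # single characters nor whole
                                          # words (exact split depends
                                          # on the vocabulary)
```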
(Am noob, so forgive my poor use of terminology. And poor understanding of the tech, too.)
LLMs (like GPT) and grammars (like Backus–Naur Form) are two different kinds of generative (production) systems, right?
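Here's what I mean by the grammar side, for contrast with the next-token loop above: a BNF-style system generates by expanding explicit rewrite rules top-down. A toy grammar of my own devising:

```python
import random

# Nonterminals map to lists of possible productions; anything not in
# the table is a terminal (an actual word).
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"]],
    "N":  [["cat"], ["dog"], ["mat"]],
    "V":  [["sat", "on"], ["chased"]],
}

def expand(symbol: str) -> list[str]:
    if symbol not in GRAMMAR:             # terminal: emit the word itself
        return [symbol]
    production = random.choice(GRAMMAR[symbol])
    return [word for part in production for word in expand(part)]

print(" ".join(expand("S")))  # e.g. "the dog chased the mat"
```

Same job (produce sentences), but driven by explicit structure instead of statistics. That's the contrast I'm asking about.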
You've been (heroically) explaining Chomsky's criticism of LLMs to other noobs: grammars (theoretically) explain how humans do language, which is very different from how ChatGPT (a "stochastic parrot") does language. Right?
Since GPT mimics human language so convincingly, I've been wondering whether there's any overlap between these two generative systems.
Especially once the (tokenized) training data for GPTs is word-based instead of just snippets of characters.
Because I notice grammars everywhere and GPT is still magic to me. Maybe I'd benefit if I could understand GPTs in terms of grammars.