Chomsky introduced his theory of language acquisition, according to which children have an inborn, biologically encoded universal grammar
https://psychologywriting.com/skinner-and-chomsky-on-nature-...

Yes, maybe we can reproduce that learning process in LLMs, but that doesn't mean the LLMs imitate only the nurture part (it might as well be just fine-tuning) and not the nature part.
An airplane is not an explanation for a bird's flight.
Nature, for an LLM, is its design: the architecture (the computation graph), the initial weights, and so on.
Environment, for an LLM, is what happens during training.
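To make that split concrete, here is a minimal toy sketch in PyTorch (my own illustration, not from anyone in this thread): everything defined before the training step is the "nature" (architecture plus starting weights), and each call to the training step is an exposure to the "environment". The model shape, layer sizes, and the name train_step are all hypothetical.

```python
import torch
import torch.nn as nn

# "Nature": fixed before any data is seen --
# the architecture (computation graph) and the initial weights.
torch.manual_seed(0)
model = nn.Sequential(        # toy next-byte predictor, purely illustrative
    nn.Embedding(256, 32),    # byte IDs -> vectors
    nn.Flatten(),
    nn.Linear(32, 256),       # vector -> logits over next byte
)

# "Environment": everything that happens during training --
# the data the model sees and the update rule applied to it.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

def train_step(context_ids, next_ids):
    """One exposure to the 'environment': a batch of observed text."""
    logits = model(context_ids)
    loss = loss_fn(logits, next_ids)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage: learn to predict the next byte from the previous one.
ctx = torch.randint(0, 256, (8, 1))  # batch of single-token contexts
nxt = torch.randint(0, 256, (8,))    # "observed" next tokens
print(train_step(ctx, nxt))
```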
LLMs are capable of learning grammar entirely from their environment, which suggests that infants are too. That's bad for Chomsky's position that the basics of grammar are baked into human DNA.
LLMs have really damaged Chomsky's positions in multiple ways: nothing performs even close to an LLM in language generation, yet LLMs didn't grow a UG for natural languages; they did develop a shared logic for non-natural languages and abstract concepts; the dataset needs to be heavily English-biased for the model to be fluent in English; the parameter count needs to be truly massive, in the hundreds of billions; and so on.
Those are all circumstantial evidence at best, a random assortment of statements that aren't even appropriate to bring into the discussion, all meaningless, in the sense that the open hand of a person, held along the line from where they stand to the center of the nearest opening in a wall, would be meaningless.
Do you even understand Chomsky's position?
To me this text looks like his Baghdad Bob moment. Silly but right and noble. What else is it?
Ironically, these days you can just throw this text at ChatGPT and have it debloat or critique transcripts like this. Worse results than taking the time to read it yourself, but it gives you validation, if that's what's needed.