Chomsky introduced his theory of language acquisition, according to which children are born with an innate, biologically encoded universal grammar.
https://psychologywriting.com/skinner-and-chomsky-on-nature-...

Yes, maybe we can reproduce that learning process in LLMs, but that doesn't mean the LLMs imitate only the nurture part (it might as well just be fine-tuning) and not the nature part.
An airplane is not an explanation for a bird's flight.
Nature, for an LLM, is its design: the architecture (the computational graph), the starting weights, and so on.
Environment, for an LLM, is what happens during training.
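To make the analogy concrete, here is a minimal sketch, with a toy bigram model standing in for an LLM. All names and numbers are invented for illustration; the point is only where the nature/nurture line falls.

```python
import random
from collections import defaultdict  # stdlib only; no ML framework needed for the sketch

# --- Nature: the design, fixed before any data is seen ---
random.seed(0)
vocab = ["the", "cat", "dog", "sat", "ran"]
# starting weights: a tiny random count for every possible bigram
weights = {w1: {w2: random.random() * 0.01 for w2 in vocab} for w1 in vocab}

# --- Nurture: everything that happens during training ---
corpus = "the cat sat the dog ran the cat ran".split()
for w1, w2 in zip(corpus, corpus[1:]):
    weights[w1][w2] += 1.0  # the update rule acting on environmental data

# After training, behavior reflects both: the bigram structure the model
# was "born" with, and the statistics of its environment.
def next_word(w):
    options = weights[w]
    return max(options, key=options.get)

print(next_word("the"))  # "cat" -- the corpus's most frequent successor
```

Everything above the training loop is nature; everything in and after it is nurture.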
LLMs are capable of learning grammar entirely from their environment, which suggests that infants may be too. That is bad for Chomsky's position that the basics of grammar are baked into human DNA.
At the very least, we now have a proof-of-concept that a computer can learn grammar in a sophisticated way, from the ground up.
We have yet to code something procedural that approaches the same calibre via a hard-coded universal grammar.
That may not obliterate Chomsky's position, but it looks bad.
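For contrast, this is roughly what the hard-coded, procedural alternative looks like: a hand-written grammar expanded by rule. The grammar below is a toy invented for this sketch, not any real proposal; its output is capped by whatever rules the author thought to write down, whereas an LLM's grammar is learned from data.

```python
import random

# A hand-coded phrase-structure grammar: every rule is authored, none learned.
grammar = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V"], ["V", "NP"]],
    "N":  [["bird"], ["plane"], ["child"]],
    "V":  [["flies"], ["sees"], ["imitates"]],
}

def expand(symbol):
    if symbol not in grammar:          # terminal word: emit it as-is
        return [symbol]
    production = random.choice(grammar[symbol])
    return [word for part in production for word in expand(part)]

# prints a random toy sentence, e.g. "the bird sees the plane"
print(" ".join(expand("S")))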
Again, that LLMs can learn to compose sophisticated texts from training alone does not close the case on Chomsky's position.
However, it is a piece of evidence against it. By Occam's razor, it suggests that a hardwired universal grammar is the weaker hypothesis: if grammar can be learned from exposure alone, the extra assumption of innate grammatical machinery does no work.