However, we now have a proof-of-concept that a computer can learn grammar in a sophisticated way, from the ground up.
We have yet to build a procedural system based on a hard-coded universal grammar that approaches the same calibre.
That may not obliterate Chomsky's position, but it looks bad for it.
Again, that LLMs can learn to compose sophisticated texts from training alone does not close the case on Chomsky's position.
However, it is a piece of evidence against it. It does suggest, by Occam's razor, that a hardwired universal grammar is the weaker theory: if grammar can be learned from exposure alone, there is less need to posit innate grammatical machinery.