While there are some things in this I find myself nodding along to, I can't help but feel it's a really old take that is super vague and hand-wavy. The truth is that all of the progress in machine learning is absolutely science. We understand extremely well how to make neural networks learn efficiently; it's why the data leads anywhere at all. Backpropagation and gradient descent are extraordinarily powerful. Not to mention all the "just engineering" of making chips crunch incredible amounts of numbers.
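To make concrete what "learn efficiently" means here, a minimal sketch of gradient descent on a toy linear model; the data, learning rate, and iteration count below are all invented for illustration, not anything from the article:

```python
import numpy as np

# Toy setup: noisy observations of a known linear function.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))          # 100 samples, 3 features
true_w = np.array([2.0, -1.0, 0.5])    # weights we hope to recover
y = X @ true_w + rng.normal(scale=0.1, size=100)

w = np.zeros(3)   # start from an arbitrary guess
lr = 0.1          # learning rate (hypothetical, chosen for the toy problem)
for _ in range(200):
    grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
    w -= lr * grad                          # step downhill along the gradient
print(w)  # ends up close to true_w
```

Nothing magical in the sketch, but the same loop, scaled up via backpropagation through millions of parameters, is the "science" being dismissed.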
Chomsky is extremely ungenerous about the progress, and pretty flippant about what this stuff can do.
I think we should probably stop listening to Chomsky; he hasn't said anything here that he hasn't already said a thousand times over the decades.
To be fair, the article is from two years ago, which, when talking about LLMs in this age, arguably does count as "old", maybe even "really old".
If you ask me the previous question, I can introspect/query my memory and tell with 100% certainty whether I know it or not, lossy compression aside. An LLM will just reply based on how likely a yes answer is, with no regard to whether it actually has that knowledge.
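A toy sketch of that point, with made-up logits standing in for a real model's output over its next token; the answer falls out of whichever continuation is most probable, with no memory lookup anywhere:

```python
import numpy as np

# Hypothetical next-token logits after the prompt "Do you know X?".
# These numbers are invented; a real model would produce them from its weights.
logits = {"yes": 2.1, "no": 1.3, "maybe": 0.2}

vals = np.array(list(logits.values()))
probs = np.exp(vals - vals.max()) / np.exp(vals - vals.max()).sum()  # softmax
answer = list(logits)[int(probs.argmax())]
print(dict(zip(logits, probs.round(3))), "->", answer)
# "yes" wins simply because its logit is highest, regardless of whether
# the model actually stores the fact being asked about.
```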