Intelligence is what allows one to understand phrases and construct meaning from them, e.g. "the paper is yellow." An AI would need a concept of paper, of yellow, and of the verb "to be." LLMs just mash samples together and form a rough map of what can be thrown into one bucket or another, with no concept of anything and no understanding.
Basically, an AI would be someone capable of at least minimal criticism. An LLM is someone who just sits in front of the TV and has knee-jerk reactions without an ounce of analytical thought. QED.
That doesn't clarify anything; you've only shuffled the confusion around, moving it into 'understand' and 'meaning'. What does it mean to understand yellow? An LLM or another person could tell you things like "Yellow? Why, that's the color of lemons," or give you a dictionary definition, but does that demonstrate 'understanding', whatever that is?
It's all a philosophical quagmire, made worse because for some people it's a matter of faith that human minds are fundamentally different from anything a soulless machine could possibly do. But these aren't important questions anyway, and for the same reason: whether or not the machine 'understands' what it means for paper to be yellow, it can still perform tasks that relate to the yellowness of paper. Ask an LLM to write a coherent poem about yellow paper and it easily can. Whether or not it 'understands' has no real relevance to practical engineering.