If we define "understanding" the way we define "useful", that is, not as an innate attribute but as something relative to a goal, then a good imitation or even a rudimentary model can get very far. ChatGPT has "understood" a lot of things I have thrown at it, be it algorithms, nutrition, basic calculations, transformations between text formats, where I'm stuck in my personal development journey, or how to politely address people in the email I'm about to write.
> What if our "understanding" is just unlocking another level in a model?
I believe it is: understanding is basically an illusion. Impressions are built from perception and thinking, then extrapolated over the unknown. And just look how far that has gotten us!