Even more strangely, giving a statistical model symbolic input lets it build up a context that then shapes its symbolic output in a way that depends on some level of "understanding" the instructions.
We "train" this model on raw symbolic data and it extracts the inherent semantic structure without any human ever embedding in the code anything resembling letters, words, or the like. It's as if Chomsky's elusive universal language is semantic structure itself.