It’s fascinating, but we aren’t going to understand intelligence this way. Emergent phenomena belong to complexity theory, and we don’t have the maths for them. Our ignorance in this space is vast.
When I was young, I remember a common refrain: “will a brain ever be able to understand itself?” Perhaps not, but the drive towards understanding is still a worthy goal in my opinion. We need some breakthroughs in the study of complexity theory first.
The same argument holds for "AI" too. We don't understand a damn thing about neural networks.
There's more: we don't even care to understand them, so long as understanding is irrelevant to exploiting them.