Try experimenting with immersing your brain in preservatives and staining it with heavy metals, and see whether you would still be able to write a comment like the one above.
No wonder monkey methods continue to unveil monkey cognition.
I think we all do, every day.
It’s fascinating, but we aren’t going to understand intelligence this way. Emergent phenomena are part of complexity theory, and we don’t have any maths for them. Our ignorance in this space is vast.
When I was young, I remember a common refrain being “will a brain ever be able to understand itself?”. Perhaps not, but the drive towards understanding is still a worthy goal in my opinion. We need to make some breakthroughs in the study of complexity theory.
On the second point, the failure of OpenWorm to model the very well-mapped-out C. elegans (~300 neurons) says a lot.
The same argument holds for "AI" too. We don't understand a damn thing about neural networks.
There's more: we don't care to understand them, so long as understanding is irrelevant to exploiting them.
Yes, which is why the current explosion in practical application isn’t very interesting.
> we don't care to understand them as long as it's irrelevant to exploiting them.
For some definition of “we”, I’m sure that’s true. We don’t need to understand things to make practical use of them. Great cathedrals were built without science and mathematics. Still, once we do have the science and mathematics, exponential advancement generally results.
Yes, we figured out how to build aircraft.
But that cannot be compared to a bird flying, in terms of either efficiency or elegance.