[return to "Cubic millimetre of brain mapped at nanoscale resolution"]
1. throwu+J7[view] [source] 2024-05-09 22:41:26
>>geox+(OP)
> The 3D map covers a volume of about one cubic millimetre, one-millionth of a whole brain, and contains roughly 57,000 cells and 150 million synapses — the connections between neurons.

This is great and provides a hard data point for some napkin math on how big a neural network model would have to be to emulate the human brain. 150 million synapses / 57,000 neurons is an average of about 2,632 synapses per neuron. The adult human brain has 100 (±20) billion, or 1e11, neurons, so assuming that average synapses-per-neuron ratio holds, that's about 2.6e14 total synapses.

Assuming 1 parameter per synapse, that'd make the minimum viable model roughly 150 times larger than the state-of-the-art GPT-4 (going by the rumored 1.8e12 parameters). I don't think that's granular enough, though: we'd need to assume 10-100 ion channels per synapse and, I think, at least 10 parameters per ion channel, putting the number closer to 2.6e16+ parameters, or 4+ orders of magnitude bigger than GPT-4.
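
For what it's worth, here's the same napkin math as a quick script. Every constant is an assumption taken from this comment (the sample counts from the article, the 1e11 neuron figure, the rumored GPT-4 size, the per-synapse and per-channel multipliers), not a measured value:

    # Napkin math only; all constants are assumptions from the comment above.
    sample_synapses = 150e6                               # synapses in the 1 mm^3 sample
    sample_cells = 57_000                                 # cells in the 1 mm^3 sample
    synapses_per_neuron = sample_synapses / sample_cells  # ~2,632

    human_neurons = 1e11                                  # ~100 billion neurons
    human_synapses = human_neurons * synapses_per_neuron  # ~2.6e14 synapses

    gpt4_params = 1.8e12                                  # rumored GPT-4 parameter count
    print(f"1 param/synapse: {human_synapses / gpt4_params:.0f}x GPT-4")   # ~150x

    # Finer-grained version: 10-100 ion channels per synapse, ~10 params per channel
    channels_per_synapse = 10                             # lower end of 10-100
    params_per_channel = 10
    detailed_params = human_synapses * channels_per_synapse * params_per_channel
    print(f"ion-channel level: {detailed_params:.1e} params "
          f"({detailed_params / gpt4_params:.0e}x GPT-4)")                 # ~2.6e16, ~1e4x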

There are other problems of course, like implementing neuroplasticity, but it's a fun ballpark calculation. Computing power should get there around 2048: >>38919548

2. throw3+mn[view] [source] 2024-05-10 01:18:29
>>throwu+J7
Or you can subscribe to Geoffrey Hinton's view that artificial neural networks are actually much more efficient than real ones, more or less the opposite of what we've believed for decades, namely that artificial neurons were just a poor model of the real thing.

Quote:

"Large language models are made from massive neural networks with vast numbers of connections. But they are tiny compared with the brain. “Our brains have 100 trillion connections,” says Hinton. “Large language models have up to half a trillion, a trillion at most. Yet GPT-4 knows hundreds of times more than any one person does. So maybe it’s actually got a much better learning algorithm than us.”

GPT-4's connections, at the density of this brain sample, would occupy a volume of about 5 cubic centimeters; that is, roughly 1% of a human cortex. And yet GPT-4 is able to speak more or less fluently in about 80 languages, translate, write code, imitate the writing styles of hundreds, maybe thousands, of authors, and converse about everything from philosophy to cooking to science to the law.
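
As a rough check on that number (assuming Hinton's half-trillion-to-one-trillion connection counts, the ~150 million synapses per mm^3 density of the mapped sample, and a cortical volume of roughly 500 cm^3, the last being my own assumption rather than something from the thread):

    # Back-of-envelope volume check; connection counts and cortex volume are assumptions.
    synapse_density_per_mm3 = 150e6          # synapses per mm^3 in the mapped sample
    cortex_volume_cm3 = 500                  # rough human cortical volume (assumption)

    for connections in (0.5e12, 1.0e12):     # "half a trillion, a trillion at most"
        volume_cm3 = connections / synapse_density_per_mm3 / 1000  # 1 cm^3 = 1000 mm^3
        share = volume_cm3 / cortex_volume_cm3
        print(f"{connections:.1e} connections -> {volume_cm3:.1f} cm^3 "
              f"(~{share:.1%} of cortex)")
    # ~3.3 and ~6.7 cm^3, i.e. the same order as the 5 cm^3 / ~1% figure above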

3. dsalfd+0G[view] [source] 2024-05-10 05:45:55
>>throw3+mn
"Efficient" and "better" are very different descriptors of a learning algorithm.

The human brain does what it does using about 20W. LLM power usage is somewhat unfavourable compared to that.

4. startu+w74[view] [source] 2024-05-11 16:19:44
>>dsalfd+0G
It is using about 20W, and then that same person takes a single airplane ride between the coasts. And watches a movie on the way.