zlacker

[return to "Cubic millimetre of brain mapped at nanoscale resolution"]
1. throwu+J7[view] [source] 2024-05-09 22:41:26
>>geox+(OP)
> The 3D map covers a volume of about one cubic millimetre, one-millionth of a whole brain, and contains roughly 57,000 cells and 150 million synapses — the connections between neurons.

This is great and provides a hard data point for some napkin math on how big a neural network model would have to be to emulate the human brain. 150 million synapses / 57,000 neurons is an average of ~2,632 synapses per neuron. The adult human brain has 100 (±20) billion, i.e. ~1e11, neurons, so assuming that synapses-per-neuron ratio holds, that's ~2.6e14 total synapses.

Assuming 1 parameter per synapse, that'd make the minimum viable model roughly 150 times larger than the state-of-the-art GPT-4 (going by the rumored 1.8e12 parameters). I don't think that's granular enough, though: if we assume 10-100 ion channels per synapse and at least 10 parameters per ion channel, the number lands closer to 2.6e16+ parameters, or 4+ orders of magnitude bigger than GPT-4.

There are other problems of course, like implementing neuroplasticity, but it's a fun ballpark calculation. Computing power should get there around 2048: >>38919548
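A quick sketch of the arithmetic above. All inputs are the assumptions stated in this comment (the sample counts from the article, the ~1e11 neuron estimate, the rumored GPT-4 size, and the low-end 10-channels-per-synapse figure):

```python
# Napkin math: scale the 1 mm^3 sample up to a whole human brain.

synapses_sample = 150e6    # synapses in the mapped cubic millimetre
neurons_sample = 57_000    # cells in the sample (treated here as neurons)
neurons_brain = 1e11       # ~100 billion neurons in an adult brain (assumed)

synapses_per_neuron = synapses_sample / neurons_sample   # ~2,632
synapses_brain = synapses_per_neuron * neurons_brain     # ~2.6e14

gpt4_params = 1.8e12       # rumored GPT-4 parameter count (assumed)

# Scenario 1: one parameter per synapse
ratio_simple = synapses_brain / gpt4_params              # ~150x GPT-4

# Scenario 2: 10 ion channels per synapse, 10 parameters per channel
params_detailed = synapses_brain * 10 * 10               # ~2.6e16

print(f"{synapses_per_neuron:.0f} synapses/neuron, "
      f"{synapses_brain:.2e} total synapses, "
      f"{ratio_simple:.0f}x GPT-4, "
      f"{params_detailed:.2e} detailed params")
```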

2. creer+eV2[view] [source] 2024-05-10 22:38:29
>>throwu+J7
Yes and no on the order of magnitude required for decent AI: there is still (that I know of) very little hard data on information density in the human brain. What there is points at entire sections that can sometimes be destroyed, or actively removed, while "general intelligence" is conserved.

Rather than "humbling", I think the result is very encouraging: it points at major imaging/modeling progress, and it gives hard numbers on an intelligence implementation that is very efficient in some respects (power, overall size) and inefficient in others (cable management, and probably redundancy and permanence). The numbers are large but might be pretty solid.

Don't know about upload though...
