zlacker

[parent] [thread] 4 comments
1. consum+(OP)[view] [source] 2024-01-08 22:55:38
> We can't even simulate all of the chemical processes inside a single cell. We don't even know all of the chemical processes. We don't know the function of most proteins.

Brain > Cell > Molecules(DNA and otherwise) > Atoms > Sub-atomic particles...

Potentially dumb question, but how deeply do we need to understand the underlying components to simulate a flatworm brain?

replies(2): >>BlarfM+l1 >>shpong+I3
2. BlarfM+l1[view] [source] 2024-01-08 23:01:38
>>consum+(OP)
While I believe there are some biological processes that rely on quantum entanglement and the like, none have been found in the brain. So likely somewhere just above the molecule level (chemical gradients and diffusion timings in cells certainly have an effect).
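
To give a sense of what simulating "just above the molecule level" could look like, here's a minimal sketch of a chemical gradient diffusing along a 1D slice of a cell, stepped with a forward-Euler finite-difference update. The diffusion coefficient, grid spacing, and time step are illustrative placeholders, not measured values:

    import numpy as np

    # Toy 1D diffusion of a chemical concentration along a cell.
    # D, dx, dt are illustrative placeholders (not measured values);
    # D*dt/dx**2 = 0.1 keeps the explicit scheme stable.
    def diffuse(c, D=1e-9, dx=1e-6, dt=1e-4, steps=1000):
        c = c.copy()
        for _ in range(steps):
            lap = (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
            lap[0] = lap[-1] = 0.0      # hold the two ends fixed (crude boundary)
            c += D * dt * lap
        return c

    c0 = np.zeros(100)
    c0[50] = 1.0                        # point release mid-cell
    print(diffuse(c0).max())            # the peak spreads out over ~0.1 s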
3. shpong+I3[view] [source] 2024-01-08 23:13:21
>>consum+(OP)
Who knows! I'm sure it depends on how accurately you want to simulate a flatworm brain.

I think current AI research has shown that simply representing a brain as a neural network (e.g. fully connected, simple neurons) is not sufficient for AGI.
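
To make concrete what "fully connected, simple neurons" means here, a minimal sketch (layer sizes, weights, and inputs are arbitrary, purely for illustration):

    import numpy as np

    # A "simple neuron" in the sense above: a weighted sum plus a squashing
    # nonlinearity. Note how much of a biological neuron (ion channels,
    # dendritic geometry, neuromodulators) this abstraction leaves out.
    rng = np.random.default_rng(0)

    def fully_connected(x, n_out):
        W = rng.normal(size=(x.shape[0], n_out))   # every input feeds every output
        return np.tanh(x @ W)

    x = rng.normal(size=64)           # stand-in for sensory input
    h = fully_connected(x, 128)       # hidden layer of simple neurons
    y = fully_connected(h, 8)         # output layer
    print(y.shape)                    # (8,)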

replies(1): >>mewpme+tq
4. mewpme+tq[view] [source] [discussion] 2024-01-09 01:52:16
>>shpong+I3
How has it shown that, exactly, if just a year ago we had such a huge advance in terms of intelligence?
replies(1): >>shpong+wg2
5. shpong+wg2[view] [source] [discussion] 2024-01-09 16:46:03
>>mewpme+tq
Estimates put GPT-4's parameter count at ~1.7 trillion, roughly 20-fold more than the ~85 billion neurons in a human brain. To me this suggests that naively building a 1:1 (or even 20:1) representation of simple neurons is insufficient for AGI.
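
The back-of-the-envelope arithmetic behind that ratio (both figures are rough public estimates, not confirmed numbers):

    # Rough ratio of estimated GPT-4 parameters to human neurons.
    gpt4_params = 1.7e12        # ~1.7 trillion (unconfirmed estimate)
    human_neurons = 8.5e10      # ~85 billion
    print(gpt4_params / human_neurons)   # ~20.0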