zlacker

[parent] [thread] 7 comments
1. coldco+(OP)[view] [source] 2019-12-13 17:09:29
While this approach might seem dated and strange, at some point something will begin to approach the ability to do general learning like a human. I just wonder how long we have to wait.
replies(2): >>The_ra+O9 >>yters+Qg
2. The_ra+O9[view] [source] 2019-12-13 18:16:39
>>coldco+(OP)
Eternity? If all we do is wait...

Almost nobody is really working on AGI, and this is the main issue. A notable counterexample is John Carmack's recent move to AGI work.

3. yters+Qg[view] [source] 2019-12-13 19:06:45
>>coldco+(OP)
What if general learning is uncomputable?
replies(1): >>Reraro+dH
4. Reraro+dH[view] [source] [discussion] 2019-12-13 22:05:11
>>yters+Qg
General learning is uncomputable; the idealized form is called Solomonoff induction. You don't need general learning, you need something at least as powerful as the mess in a human brain.
replies(2): >>yters+P91 >>lorepi+Bp1
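[For context on the uncomputability claim: Solomonoff induction weights every hypothesis by the length of the shortest programs producing it, via the Solomonoff prior. A minimal sketch in standard notation, where U is a universal prefix Turing machine and the sum runs over programs p whose output begins with the string x:]

```latex
M(x) = \sum_{p \,:\, U(p) = x\ast} 2^{-\ell(p)}
```

[Evaluating M(x) exactly requires knowing which programs halt, so it is only lower-semicomputable; the AIXI agent built on it inherits this uncomputability, and only resource-bounded approximations such as AIXItl can actually be run.]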
5. yters+P91[view] [source] [discussion] 2019-12-14 04:19:57
>>Reraro+dH
What if the mess in the human brain is powered by an uncomputable transcendent mind?
replies(1): >>Reraro+0s1
6. lorepi+Bp1[view] [source] [discussion] 2019-12-14 10:21:49
>>Reraro+dH
Can you provide some references on "General learning is uncomputable"? Thanks.
replies(1): >>Reraro+xr1
7. Reraro+xr1[view] [source] [discussion] 2019-12-14 11:05:25
>>lorepi+Bp1
https://en.wikipedia.org/wiki/AIXI
8. Reraro+0s1[view] [source] [discussion] 2019-12-14 11:14:30
>>yters+P91
first read this as "uncomputable translucent mind" and I loved the imagery