zlacker

[return to "Cyc: History's Forgotten AI Project"]
1. blueye+Fp[view] [source] 2024-04-17 22:46:18
>>iafish+(OP)
Cyc is one of those bad ideas that won't die, and which keeps getting rediscovered on HN. Lenat wasted decades of his life on it. Knowledge graphs like Cyc are labor intensive to build and difficult to maintain. They are brittle in the face of change, and useless if they cannot represent the underlying changes of reality.
2. thesz+SB[view] [source] 2024-04-18 00:24:07
>>blueye+Fp
Lenat was able to produce a superhuman-performing AI in the early 1980s [1].

[1] https://voidfarer.livejournal.com/623.html

You can label it "bad idea" but you can't bring LLMs back in time.

3. goatlo+6Q[view] [source] 2024-04-18 02:57:17
>>thesz+SB
Why didn't it ever have the impact that LLMs are having now? Or that DeepMind has had? Cyc never passed the Turing Test or became a superhuman chess or Go player, yet it has had much more time to become successful.
4. eru+wX[view] [source] 2024-04-18 04:36:29
>>goatlo+6Q
I'm very skeptical of Cyc and other symbolic approaches.

However I think they have a good excuse for 'Why didn't it ever have the impact that LLMs are having now?': lack of data and lack of compute.

And it's the same excuse that neural networks themselves have: back in those days, we just didn't have enough data, and we didn't have enough compute, even if we had the data.

(Of course, we learned in the meantime that neural networks benefit a lot from extra data and extra compute. Whether that can be brought to bear on Cyc-style symbolic approaches is another question.)

5. thesz+391[view] [source] 2024-04-18 07:06:20
>>eru+wX
Usually, an LLM's output is passed through beam search [1], which is about as symbolic as one can get.

[1] https://www.width.ai/post/what-is-beam-search

You can even get a 3-gram model to output better text predictions if you combine it with beam search.
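To make the point concrete, here is a minimal beam search sketch. The `score_next` function and the toy bigram table are hypothetical stand-ins for any model's next-token log-probabilities (an LLM, or even an n-gram model); the search itself is pure symbolic bookkeeping over hypotheses.

```python
import math

def beam_search(score_next, start, beam_width, max_len):
    """Keep the beam_width highest-scoring partial sequences at each step.

    score_next(seq) returns a list of (token, log_prob) candidates for
    the next token; it stands in for whatever model you plug in.
    """
    # Each beam entry: (sequence, cumulative log-probability)
    beams = [([start], 0.0)]
    for _ in range(max_len):
        candidates = []
        for seq, logp in beams:
            for tok, tok_logp in score_next(seq):
                candidates.append((seq + [tok], logp + tok_logp))
        # Prune to the top beam_width hypotheses by total log-probability
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams

# Toy "model": a hypothetical bigram table standing in for a real LM.
TABLE = {
    "<s>": [("the", math.log(0.6)), ("a", math.log(0.4))],
    "the": [("cat", math.log(0.5)), ("dog", math.log(0.5))],
    "a":   [("cat", math.log(0.9)), ("dog", math.log(0.1))],
    "cat": [("sat", math.log(1.0))],
    "dog": [("ran", math.log(1.0))],
}

def score_next(seq):
    return TABLE.get(seq[-1], [("</s>", 0.0)])

best, best_logp = beam_search(score_next, "<s>", beam_width=2, max_len=3)[0]
print(best)  # → ['<s>', 'a', 'cat', 'sat']
```

Note that greedy decoding would have committed to "the" at the first step; the beam keeps the lower-probability prefix "a" alive long enough for its higher-probability continuation to win overall.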
