> Perhaps their time will come again.
That seems quite likely, once the hype around LLMs has calmed down. I hope Cyc's data will still be available then, ideally open-sourced.
> https://muse.jhu.edu/pub/87/article/853382/pdf
Unfortunately paywalled; does anyone have a downloadable copy?
The hype around LLMs is not the reason systems like Cyc have been abandoned.
It's pretty interesting to see comments like this, as if deep nets weren't the underdog for decades. Do you think they were the first choice? The creator of Cyc spent decades on it, and he's dead now. We use modern NNs today because they simply work that much better.
GOFAI was abandoned in NLP long before the likes of GPT, because the non-deep-net alternatives were simply that bad. It has nothing to do with any recent LLM hype.
If the problem space lacks clear definitions and unambiguous axioms, non-deep-net alternatives fall apart.
I'm not sure deep nets are the key here. I see the key as being lots of data and statistical modeling, instead of trying to fit what's happening into neat black-and-white categories.
Btw, I don't think GOFAI is all that good even in domains with clear definitions and unambiguous axioms: it took neural nets to beat the best humans at the very clearly defined game of Go. And neural-net approaches have also soundly beaten the best traditional chess engines. (Traditional engines have caught up a lot since then; competition is good for development, of course.)
I suspect part of the problem for GOFAI is that all the techniques that actually work get relabelled as just 'normal algorithms', like A* or dynamic programming, and no longer bear the (GOF)AI label.
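A* is a nice illustration of that relabelling: it started out as a heuristic AI search technique and now reads as a plain textbook algorithm. A minimal sketch (the grid, walls, and Manhattan heuristic below are just illustrative assumptions, not anything from the thread):

```python
import heapq

def a_star(start, goal, neighbors, heuristic):
    """Classic A*: shortest path from start to goal, or None if unreachable.
    `neighbors(n)` yields (next_node, step_cost) pairs; `heuristic(n)`
    must never overestimate the remaining cost (admissible)."""
    frontier = [(heuristic(start), 0, start, [start])]
    best_cost = {start: 0}
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if cost > best_cost.get(node, float("inf")):
            continue  # stale queue entry, already found a cheaper route
        for nxt, step in neighbors(node):
            new_cost = cost + step
            if new_cost < best_cost.get(nxt, float("inf")):
                best_cost[nxt] = new_cost
                heapq.heappush(
                    frontier,
                    (new_cost + heuristic(nxt), new_cost, nxt, path + [nxt]),
                )
    return None

# Toy 3x3 grid, 4-connected, with a wall blocking the direct route.
walls = {(1, 0), (1, 1)}

def neighbors(p):
    x, y = p
    for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
        if 0 <= nx < 3 and 0 <= ny < 3 and (nx, ny) not in walls:
            yield (nx, ny), 1

# Manhattan distance to the goal (2, 0) as the admissible heuristic.
path = a_star((0, 0), (2, 0), neighbors, lambda p: abs(p[0] - 2) + abs(p[1]))
```

With the wall at x=1 blocking rows 0 and 1, the only shortest route detours through the top row, so `path` is the 7-node walk from (0, 0) to (2, 0).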
(Tangent: that's very similar to philosophy, where every time we turn something into a proper science, we relabel it from 'natural philosophy' to something like 'physics'. John von Neumann was one of those recent geniuses who liberated large swaths of knowledge from the dark kludges of the philosophy ghetto.)