zlacker

[return to "Obituary for Cyc"]
1. vannev+14[view] [source] 2025-04-08 19:44:13
>>todsac+(OP)
I would argue that Lenat was at least directionally correct in understanding that sheer volume of data (in Cyc's case, rules and facts) was the key to eventually achieving useful intelligence. I have to confess that I once criticized the Cyc project for creating an ever-larger pile of sh*t and expecting a pony to emerge, but that's sort of what has happened with LLMs.
2. baq+3j[view] [source] 2025-04-08 21:29:24
>>vannev+14
https://ai-2027.com/ postulates that a good enough LLM will rewrite itself using rules and facts... sci-fi, but so is chatting with a matrix multiplication.
3. joseph+cm[view] [source] 2025-04-08 21:53:49
>>baq+3j
I doubt it. The human mind is a probabilistic computer, at every level. There’s no set definition of what a chair is. It’s fuzzy. Some things are obviously in the category, and some are at the periphery of it. (E.g., is a stool a chair? Is a log next to a campfire a chair? How about a tree stump in the woods?) This kind of fuzzy reasoning is the rule, not the exception, when it comes to human intuition.

There’s no way to use “rules and facts” to express concepts like “chair”, “grass”, “face”, or “justice”, or really anything at all. Any project trying to use deterministic symbolic logic to represent the world fundamentally misunderstands cognition.
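The crisp-rule vs. graded-membership contrast can be sketched in a few lines of Python. The features and weights below are hypothetical, chosen purely to illustrate the point, not drawn from Cyc or any real model:

```python
# A symbolic rule: an object either is or is not a chair. Edge cases
# (stools, logs, stumps) are rejected outright.
def is_chair_rule(obj):
    return obj.get("legs", 0) == 4 and obj.get("has_back", False)

# A graded score: membership is a matter of degree. The weights here are
# made up for illustration; a learned model would fit them from data.
def chair_score(obj):
    score = 0.0
    score += 0.4 if obj.get("sittable", False) else 0.0
    score += 0.3 if obj.get("has_back", False) else 0.0
    score += 0.2 if obj.get("legs", 0) in (3, 4) else 0.0
    score += 0.1 if obj.get("made_for_sitting", False) else 0.0
    return round(score, 2)

stool = {"sittable": True, "legs": 3, "made_for_sitting": True}
stump = {"sittable": True}

# The rule calls both non-chairs; the score says the stool is
# "more chair" than the stump (0.7 vs 0.4).
print(is_chair_rule(stool), is_chair_rule(stump))
print(chair_score(stool), chair_score(stump))
```

The point of the sketch is that the rule-based predicate gives the same flat "no" to a stool and a stump, while the graded score preserves the intuition that one is closer to the category's center than the other.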

4. jgalt2+An[view] [source] 2025-04-08 22:05:06
>>joseph+cm
> The human mind is a probabilistic computer, at every level.

Fair enough, but an airplane's wing is not very similar to a bird's wing.

5. joseph+3r[view] [source] 2025-04-08 22:37:15
>>jgalt2+An
That argument would hold a lot more weight if Cyc could fly. But as this article points out, decades of work and millions of dollars have utterly failed to get it off the ground.