zlacker

[return to "Obituary for Cyc"]
1. vannev+14[view] [source] 2025-04-08 19:44:13
>>todsac+(OP)
I would argue that Lenat was at least directionally correct in understanding that sheer volume of data (in Cyc's case, rules and facts) was the key in eventually achieving useful intelligence. I have to confess that I once criticized the Cyc project for creating an ever-larger pile of sh*t and expecting a pony to emerge, but that's sort of what has happened with LLMs.
◧◩
2. baq+3j[view] [source] 2025-04-08 21:29:24
>>vannev+14
https://ai-2027.com/ postulates that a good enough LLM will rewrite itself using rules and facts... sci-fi, but so is chatting with a matrix multiplication.
◧◩◪
3. joseph+cm[view] [source] 2025-04-08 21:53:49
>>baq+3j
I doubt it. The human mind is a probabilistic computer, at every level. There’s no set definition for what a chair is. It’s fuzzy. Some things are obviously in the category, and some are at the periphery of it. (E.g., is a stool a chair? Is a log next to a campfire a chair? How about a tree stump in the woods? Etc.) This kind of fuzzy reasoning is the rule, not the exception, when it comes to human intuition.

There’s no way to use “rules and facts” to express concepts like “chair” or “grass”, or “face” or “justice” or really anything. Any project trying to use deterministic symbolic logic to represent the world fundamentally misunderstands cognition.

◧◩◪◨
4. cybera+SG[view] [source] 2025-04-09 01:40:41
>>joseph+cm
> This kind of fuzzy reasoning is the rule, not the exception when it comes to human intuition.

That is indeed true. But we do have classic fuzzy logic, and it can be used to answer these questions. E.g. a "stool" may be a "chair", but an "automobile" definitely is not.

Maybe the symbolic logic approach could work if it's connected with ML? Maybe we could use a neural network to plot a path through the sea of assertions? Cyc really seems like something that could benefit the world if it were made open under some reasonable conditions.
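To make the fuzzy-logic point concrete, here's a minimal sketch of graded category membership in Python. Everything here is a toy illustration of the general idea, not anything from Cyc: the membership degrees are invented for the example, and the min/max connectives are just the standard Zadeh operators.

```python
# Toy fuzzy membership: instead of a boolean "is a chair", each
# object gets a degree in [0, 1]. Degrees below are invented.
chair_membership = {
    "armchair": 1.0,
    "stool": 0.6,
    "tree stump": 0.3,
    "automobile": 0.0,
}

def is_chair(obj, threshold=0.5):
    """Crisp yes/no answer obtained by thresholding the fuzzy degree."""
    return chair_membership.get(obj, 0.0) >= threshold

# Standard (Zadeh) fuzzy connectives: AND = min, OR = max, NOT = 1 - x.
def fuzzy_and(a, b):
    return min(a, b)

def fuzzy_or(a, b):
    return max(a, b)

def fuzzy_not(a):
    return 1.0 - a

print(is_chair("stool"))       # True  (0.6 >= 0.5)
print(is_chair("automobile"))  # False (0.0 <  0.5)
```

So a stool comes out "sort of a chair" and an automobile comes out "definitely not", which matches the intuition above, though the numbers themselves still have to come from somewhere.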

◧◩◪◨⬒
5. joseph+t01[view] [source] 2025-04-09 06:09:54
>>cybera+SG
> That is indeed true. But we do have classic fuzzy logic, and it can be used to answer these questions. E.g. a "stool" may be a "chair", but an "automobile" definitely is not.

I’m not convinced that classical fuzzy logic will ever solve this - at least not if every concept needs to be explicitly programmed in. What a “chair” is subtly changes between a furniture store and a campsite. Are you going to have someone explicitly, manually program in all of those nuances? No way! And without that subtlety, you aren’t going to end up with a system that’s as smart as chatgpt. Challenge me on this if you like, but we can play this game with just about any word you can name - more or less everything except pure mathematics.
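The scaling objection can be sketched in a few lines of Python. This is a hypothetical toy, with invented (word, context) degrees, but it shows why hand-programming the nuances blows up: every word-context pair needs its own entry, and anything not enumerated falls back to "the system knows nothing it wasn't told."

```python
# Context-dependent membership: every (object, context) pair needs
# its own hand-written degree. All numbers are invented for the toy.
chair_degree = {
    ("stool", "furniture store"): 0.7,
    ("stool", "campsite"): 0.9,
    ("log", "furniture store"): 0.0,
    ("log", "campsite"): 0.6,
}

def degree(obj, context):
    # Anything not explicitly enumerated defaults to 0 --
    # the system has no way to generalize to unseen contexts.
    return chair_degree.get((obj, context), 0.0)

# With V words and C contexts, a hand-built table needs on the order
# of V * C entries -- and the set of contexts is open-ended.
print(degree("log", "campsite"))  # 0.6
print(degree("log", "beach"))     # 0.0 -- never programmed in
```

The learned-model alternative is to estimate these degrees from data instead of enumerating them, which is essentially the point about modern ML below.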

And by the way, modern ML approaches understand all of those nuances just fine. It’s not clear to me what value - if any - symbolic logic / expert systems provide that chatgpt isn’t perfectly capable of learning on its own already.
