1. cybera+(OP) 2025-04-09 01:40:41
> This kind of fuzzy reasoning is the rule, not the exception when it comes to human intuition.

That is indeed true. But we do have classic fuzzy logic, and it can be used to answer these questions. E.g. a "stool" may be a "chair", but an "automobile" definitely is not.
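
As a toy sketch of what that looks like in code (the membership grades here are invented for illustration, not taken from any real ontology):

    # Classic fuzzy logic: concepts get a membership grade in [0, 1]
    # instead of a hard yes/no. Grades below are invented examples.
    CHAIR_MEMBERSHIP = {
        "armchair": 1.0,
        "stool": 0.6,       # maybe a chair
        "bench": 0.4,
        "automobile": 0.0,  # definitely not a chair
    }

    def is_chair(thing, threshold=0.5):
        """Return the fuzzy grade plus a defuzzified yes/no."""
        grade = CHAIR_MEMBERSHIP.get(thing, 0.0)
        return grade, grade >= threshold

    print(is_chair("stool"))       # (0.6, True)
    print(is_chair("automobile"))  # (0.0, False)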

Maybe the symbolic logic approach could work if it were connected with ML? Maybe we could use a neural network to plot a path through the sea of assertions? Cyc really seems like something that could benefit the world if it were made open under some reasonable conditions.
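
For instance, something like this: keep the assertions as a symbolic graph and let a learned relevance model pick which facts to expand first. A toy sketch, with simple word overlap standing in for the neural scorer (the assertions are invented, not from Cyc):

    import heapq

    # Tiny stand-in for a Cyc-style assertion base (invented facts).
    ASSERTIONS = {
        "stool": ["stool is furniture", "stool supports sitting"],
        "chair": ["chair is furniture", "chair supports sitting"],
        "furniture": ["furniture is an artifact"],
    }

    def relevance(fact, query):
        # A real system would use a neural encoder here; word overlap
        # (Jaccard similarity) is just a runnable placeholder.
        f, q = set(fact.split()), set(query.split())
        return len(f & q) / len(f | q)

    def plot_path(query, start, max_facts=10):
        """Best-first walk: expand the most query-relevant facts first."""
        frontier = [(-1.0, start)]
        seen, path = set(), []
        while frontier and len(path) < max_facts:
            _, concept = heapq.heappop(frontier)
            for fact in ASSERTIONS.get(concept, []):
                if fact in seen:
                    continue
                seen.add(fact)
                path.append(fact)
                next_concept = fact.split()[-1]  # naive linking by last word
                heapq.heappush(frontier, (-relevance(fact, query), next_concept))
        return path

    print(plot_path("is a stool a chair", "stool"))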

replies(1): >>joseph+Bj
2. joseph+Bj 2025-04-09 06:09:54
>>cybera+(OP)
> That is indeed true. But we do have classic fuzzy logic, and it can be used to answer these questions. E.g. a "stool" may be a "chair", but an "automobile" definitely is not.

I’m not convinced that classical fuzzy logic will ever solve this - at least not if every concept needs to be explicitly programmed in. What a “chair” is subtly changes between a furniture store and a campsite. Are you going to have someone explicitly, manually program all of those nuances in? No way! And without that subtlety, you aren’t going to end up with a system that’s as smart as ChatGPT. Challenge me on this if you like, but we can play this game with just about any word you can name - more or less everything except pure mathematics.
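
Just to make the scaling problem concrete, hand-coding even two contexts already looks like this (the numbers are invented), and every new context means another table someone has to write:

    # Context-dependent fuzzy membership, written by hand. The grades
    # are invented; the point is that each context needs its own table.
    CHAIR_BY_CONTEXT = {
        "furniture_store": {"stool": 0.5, "bar_stool": 0.8, "log": 0.0},
        "campsite":        {"stool": 0.9, "bar_stool": 0.2, "log": 0.6},
        # "office", "theater", "doll house", ... and on and on.
    }

    def chair_grade(thing, context):
        return CHAIR_BY_CONTEXT.get(context, {}).get(thing, 0.0)

    print(chair_grade("log", "campsite"))         # 0.6, a log works as a seat here
    print(chair_grade("log", "furniture_store"))  # 0.0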

And by the way, modern ML approaches handle all of those nuances just fine. It’s not clear to me what value - if any - symbolic logic / expert systems provide that ChatGPT isn’t perfectly capable of learning on its own already.
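
You can even see this directly: a contextual model gives "chair" a different vector depending on the sentence around it, with no hand-written rules. A quick sketch with an off-the-shelf BERT (the model choice is mine, nothing special about it):

    import torch
    from transformers import AutoTokenizer, AutoModel

    tok = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    def chair_vector(sentence):
        """Embedding of the token "chair" in its sentence context."""
        ids = tok(sentence, return_tensors="pt")
        hidden = model(**ids).last_hidden_state[0]
        pos = ids["input_ids"][0].tolist().index(
            tok.convert_tokens_to_ids("chair"))
        return hidden[pos]

    store = chair_vector("she tried out a chair at the furniture store")
    camp = chair_vector("he unfolded his camping chair next to the tent")
    sim = torch.nn.functional.cosine_similarity(store, camp, dim=0)
    print(sim.item())  # close to 1 but not equal: same word, shifted meaning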
