zlacker

[parent] [thread] 4 comments
1. m0zg+(OP)[view] [source] 2019-12-13 20:04:46
How do you deal with the fact that human knowledge is probabilistic? I.e. that it's actually mostly "belief" rather than a "fact", and the "correct" answer heavily depends on the context in a somewhat Bayesian way. Best I can tell we don't yet have math to model this in any kind of a comprehensive way.
replies(1): >>catpol+Y2
2. catpol+Y2[view] [source] 2019-12-13 20:24:53
>>m0zg+(OP)
Cyc has a few ways of dealing with fallibility.

Cyc doesn't do anything Bayesian like assigning specific probabilities to individual beliefs. IIRC they tried something like that, and it ran into two problems: nobody felt very confident attaching any particular precise number to a prior, and the inference chains can be so long and involve so many assertions that anything less than probability 1 for most assertions would leave conclusions with very low confidence levels.
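
To give a sense of the chain-length problem with a back-of-the-envelope calculation (my own illustration, not anything from Cyc's docs): if you just multiply probabilities along a chain, even high per-assertion confidence collapses quickly:

    # Naive confidence of a conclusion that depends on a chain of n
    # independent assertions, each held with probability p, is p**n.
    for p in (0.99, 0.95, 0.90):
        for n in (10, 50, 100):
            print(f"p={p}, chain length {n}: {p**n:.4f}")
    # e.g. 0.95**50 is about 0.08 - almost no confidence left.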

As to what they actually do, there are a few approaches.

I know that, for one thing, there are coarse-grained epistemic levels of belief built into the representation system - some predicates have "HighLikelihoodOf___" or "LowLikelihoodOf___" versions that enable very rough probabilistic reasoning, which (it's argued - I have no position on this) is actually closer to the kind of folk-probabilistic thinking humans actually do.
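
Roughly, the idea is something like this (my own toy Python sketch - the names and structure are invented for illustration, not Cyc's actual vocabulary or API):

    # Attach a small set of qualitative likelihood levels to assertions
    # instead of numeric probabilities.
    LIKELY, UNLIKELY, CERTAIN = "LIKELY", "UNLIKELY", "CERTAIN"

    kb = [
        (CERTAIN,  ("isa", "Platypus", "Mammal")),
        (LIKELY,   ("laysEggs", "Platypus")),  # roughly a "HighLikelihoodOf..." predicate
        (UNLIKELY, ("laysEggs", "Cat")),       # roughly a "LowLikelihoodOf..." predicate
    ]

    def likelihood_of(assertion):
        # Return the coarse epistemic level attached to an assertion, if any.
        for level, a in kb:
            if a == assertion:
                return level
        return None

    print(likelihood_of(("laysEggs", "Platypus")))  # LIKELY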

Also, Cyc can use non-monotonic logic, which I think is fairly unusual among commercial inference engines. I won't give the best explanation here, but effectively, Cyc can treat some assertions as "generally" true while allowing certain exceptions, which makes it easy to express a lot of facts in a way that's similar to human reasoning. In general, mammals don't lay eggs, so you can assert that mammals don't lay eggs. But you can also mark that assertion as non-monotonic, with exceptions (e.g. platypuses).
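
In sketch form (again my own Python toy, not Cyc's engine), default reasoning looks like a class-level rule that a more specific exception can override:

    defaults = {("Mammal", "laysEggs"): False}     # in general, mammals don't lay eggs
    exceptions = {("Platypus", "laysEggs"): True}  # the monotreme exception
    isa = {"Platypus": "Mammal", "Cat": "Mammal"}

    def lays_eggs(kind):
        # The most specific knowledge wins; otherwise fall back to the class default.
        if (kind, "laysEggs") in exceptions:
            return exceptions[(kind, "laysEggs")]
        return defaults.get((isa.get(kind, kind), "laysEggs"))

    print(lays_eggs("Cat"))       # False (default applies)
    print(lays_eggs("Platypus"))  # True  (exception overrides the default)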

Finally - and this isn't strictly about probabilistic reasoning, but it helps represent other kinds of non-absolute reasoning - knowledge in Cyc is always contextualized. The knowledge base is divided up into "microtheories": contexts within which assertions are taken to be both true and relevant - very little is assumed to be true always, across the board. This lets them represent a lot of different topics, conflicting theories, or even fictional worlds - there are various microtheories used for reasoning about events in popular media franchises, where the same laws of physics might not apply.
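
A very loose sketch of the idea (names and structure invented for illustration; this is not Cyc's actual API):

    # Assertions live inside a context ("microtheory") and are only
    # consulted when a query is asked within that context.
    microtheories = {
        "RealWorldPhysicsMt": {("droppedObject", "falls"): True},
        "CartoonPhysicsMt":   {("droppedObject", "falls"): False},  # not until it looks down
    }

    def holds(mt, subject, predicate):
        # Nothing is assumed to hold across all microtheories.
        return microtheories.get(mt, {}).get((subject, predicate))

    print(holds("RealWorldPhysicsMt", "droppedObject", "falls"))  # True
    print(holds("CartoonPhysicsMt", "droppedObject", "falls"))    # False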

replies(2): >>m0zg+s8 >>guicho+Tii
3. m0zg+s8[view] [source] [discussion] 2019-12-13 21:01:01
>>catpol+Y2
Thank you for the answer. I thought it was simpler than that; I'm glad that assumption was wrong.

I understand that any practical system of this kind would have to be very coarse, but even at the coarse level, does it have any kind of "error bar" indicator, to show how "sure" it is of the possibly incorrect answer? And can it come up with pertinent questions to narrow things down to a more "correct" answer?

replies(1): >>catpol+ja
4. catpol+ja[view] [source] [discussion] 2019-12-13 21:10:50
>>m0zg+s8
I'm not sure I can answer that in a satisfying way, just because my memory is fallible. How unsure the system is about something (to the extent that can be coarsely represented) certainly shows up in the results, and I suspect the underlying search heuristics tend to prioritize assertions with a higher represented confidence level.

The latter sounds like something Doug Lenat has wanted for years, though I think it mostly comes up in cases where the available information is ambiguous rather than unreliable. There are various knowledge-entry schemes in which Cyc dynamically generates more questions to ask the user, to disambiguate or find relevant information.

5. guicho+Tii[view] [source] [discussion] 2019-12-22 04:47:46
>>catpol+Y2
What is your opinion on the popular trend, especially in ML, of making everything probabilistic instead of using default logic? For example, does it make sense to say "mammals don't lay eggs" holds with 98% probability because of the platypus exception?