This failure on cases that fall outside what the system knows is a problem with lots of AI, not just expert systems. Neural nets mostly do awfully on problems outside their training data, assuming they can generate an answer at all. If you train a neural net to order drinks from Starbucks and one of its orders fails with the server telling it "We are out of soy milk," chances are quite high its subsequent order will also contain soy milk.
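A minimal sketch of that failure mode, assuming a policy whose behavior is frozen at training time (everything here, including the hypothetical `LEARNED_MILK_PREFERENCES` table, is illustrative rather than any real system):

```python
import random

# Hypothetical learned preferences baked in at training time. The
# distribution heavily favors soy milk, and nothing below can change it.
LEARNED_MILK_PREFERENCES = {"soy": 0.9, "oat": 0.05, "whole": 0.05}

def order_drink() -> str:
    """Sample an order from the fixed distribution learned in training."""
    milks = list(LEARNED_MILK_PREFERENCES)
    weights = list(LEARNED_MILK_PREFERENCES.values())
    milk = random.choices(milks, weights=weights)[0]
    return f"latte with {milk} milk"

for attempt in range(3):
    print(f"Attempt {attempt + 1}: {order_drink()}")
    # The server's reply comes back, but the policy has no state or
    # parameter the feedback can update, so it is effectively ignored
    # and the next order is drawn from the same soy-heavy distribution.
    feedback = "We are out of soy milk"
```

The point of the sketch is structural: the feedback arrives, but there is no pathway by which it can alter the learned distribution, so the system keeps reproducing the training-time behavior.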