Wikipedia's overview: <https://en.wikipedia.org/wiki/Cyc>
Project / company homepage: <https://cyc.com/>
Its failure is no shade against Doug. Somebody had to try it, and I'm glad it was one of the brightest guys around. I think he clung to it long after it was clear that it wasn't going to work out, but breakthroughs do happen. (The current round of machine learning is itself a revival of a technique that had been abandoned, but the people who stuck with it anyway discovered the tricks that made it go.)
I suppose you'd architect it as a layer. The model wants to say something, and the ontology layer says, "No, that's stupid, say something else". The ontology layer can also recognize ontology-like statements in the output and use them to build and evolve the ontology.
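Very roughly, a toy sketch of that gating idea, assuming the model is reduced to a `generate()` callback and the ontology to a bag of subject/predicate/object triples. Every name here is made up for illustration; it's not any real Cyc or LLM API.

```python
from __future__ import annotations

from typing import Callable, Optional

Triple = tuple[str, str, str]  # toy representation: (subject, predicate, object)

class OntologyLayer:
    """Keeps a small store of accepted facts and known falsehoods."""

    def __init__(self) -> None:
        self.facts: set[Triple] = set()
        self.negations: set[Triple] = set()

    def consistent(self, triple: Triple) -> bool:
        # The "no, that's stupid" check: veto anything the ontology rejects.
        return triple not in self.negations

    def absorb(self, triple: Triple) -> None:
        # Ontology-like statements get folded back in, so the ontology evolves.
        self.facts.add(triple)

def gated_generate(generate: Callable[[str], Triple],
                   ontology: OntologyLayer,
                   prompt: str,
                   max_tries: int = 5) -> Optional[Triple]:
    """Ask the model for a statement; let the ontology layer veto it and retry."""
    for _ in range(max_tries):
        candidate = generate(prompt)
        if ontology.consistent(candidate):
            ontology.absorb(candidate)
            return candidate
        prompt += " (that was rejected; say something else)"
    return None
```

Real consistency checking would of course mean inference over the ontology, not a set lookup, but the shape is the same: the generator proposes, the ontology layer disposes.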
It would be even more interesting built into the visual/image models.
I have no idea if that's any kind of real progress, or if it's merely filtering out the dumb stuff. A good service, to be sure, but still not "AGI", whatever the hell that turns out to be.
Unless it turns out to be the missing element that puts it over the top. If I had any idea, I wouldn't have been working with Cyc in the first place.