zlacker

1. jfenge (OP) 2023-09-01 20:02:27
I'm sure somebody somewhere is working on it. I've already seen articles on teaching LLMs to offload math problems onto a separate module, rather than trying to solve them through the murk of the neural network.
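The math offload can be as dumb as a dispatch layer sitting in front of the model. A toy sketch (nothing here is any real framework's API; the regex parser is purely illustrative):

```python
import re

def offload_math(prompt: str):
    """Route simple arithmetic to a deterministic module instead of the LLM.

    Returns the computed answer, or None to fall through to the neural model.
    Hypothetical example only; real tool-use routing is far more involved.
    """
    m = re.fullmatch(r"\s*(-?\d+)\s*([+\-*])\s*(-?\d+)\s*", prompt)
    if m is None:
        return None  # not arithmetic: let the model handle it
    a, op, b = int(m.group(1)), m.group(2), int(m.group(3))
    return {"+": a + b, "-": a - b, "*": a * b}[op]
```

So `offload_math("12 * 34")` gets an exact answer from the arithmetic module, while anything non-numeric returns None and goes to the model as usual.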

I suppose you'd architect it as a layer. It wants to say something, and the ontology layer says, "No, that's stupid, say something else". The ontology layer can recognize ontology-like statements and use them to build and evolve the ontology.
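A toy version of that veto loop, with a hand-rolled is-a graph standing in for the ontology (every name and rule here is invented for illustration):

```python
class OntologyLayer:
    """Vetoes statements that contradict its is-a graph and absorbs
    new ontology-like statements. A sketch, not a serious design."""

    def __init__(self):
        # seed facts; the layer grows this as it reads
        self.isa = {"penguin": "bird", "bird": "animal"}

    def _kinds_of(self, thing):
        # All kinds reachable up the is-a chain (cycle-safe).
        kinds, node = set(), thing
        while node in self.isa and self.isa[node] not in kinds:
            node = self.isa[node]
            kinds.add(node)
        return kinds

    def vet(self, statement):
        # Only recognizes "X is a Y"; anything else passes unchallenged.
        parts = statement.lower().rstrip(".").split(" is a ")
        if len(parts) != 2:
            return True
        x, y = parts[0].strip(), parts[1].strip()
        if x in self.isa and y not in self._kinds_of(x):
            return False  # "No, that's stupid, say something else"
        self.isa.setdefault(x, y)  # evolve the ontology
        return True
```

So `vet("penguin is a fish")` gets rejected against the seed facts, while `vet("dog is a mammal")` is new information and gets absorbed, after which "dog is a fish" would be vetoed too.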

It would be even more interesting built into the visual/image models.

I have no idea if that's any kind of real progress, or if it's merely filtering out the dumb stuff. A good service, to be sure, but still not "AGI", whatever the hell that turns out to be.

Unless it turns out to be the missing element that puts it over the top. If I had any idea I wouldn't have been working with Cyc in the first place.
