zlacker

[parent] [thread] 11 comments
1. optima+(OP)[view] [source] 2023-09-01 21:26:35
The best thing Cycorp could do now is open source its accumulated database of logical relations so it can be ingested by some monster LLM.

What's the point of all that data collecting dust and accomplishing not much of anything?

replies(4): >>vtr132+Jb >>adastr+jc >>xpe+3x >>zozbot+JG
2. vtr132+Jb[view] [source] 2023-09-01 23:05:20
>>optima+(OP)
I think the military will take over his work. Snowden documents revealed that Cyc was being used to come up with terror attack scenarios.
3. adastr+jc[view] [source] 2023-09-01 23:08:54
>>optima+(OP)
It seems the direction of flow would be the opposite: LLMs are a great source of logical data for Cyc-like things. Distill your LLM into logical statements, then run your Cyc algorithms on it.
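Roughly something like this (a toy Python sketch; extract_facts is a made-up stand-in for whatever prompting/parsing step actually pulls triples out of the model, and the inference is just a naive transitive closure, nothing like a real Cyc engine):

    # Toy sketch of "distill, then reason". extract_facts() is hypothetical:
    # in practice you'd prompt the model for (subject, relation, object)
    # triples and parse them; here they're hard-coded.
    def extract_facts(llm_output: str):
        return [("Dog", "isa", "Mammal"), ("Mammal", "isa", "Animal")]

    def infer_isa(facts):
        # Naive transitive closure over "isa" -- a stand-in for the far more
        # careful inference a Cyc-like engine would run.
        known = set(facts)
        changed = True
        while changed:
            changed = False
            for (a, r1, b) in list(known):
                for (c, r2, d) in list(known):
                    if r1 == r2 == "isa" and b == c and (a, "isa", d) not in known:
                        known.add((a, "isa", d))
                        changed = True
        return known

    print(infer_isa(extract_facts("...")))  # includes ("Dog", "isa", "Animal")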
replies(2): >>xpe+rp >>creer+3q2
4. xpe+rp[view] [source] [discussion] 2023-09-02 01:48:25
>>adastr+jc
> It seems the direction of flow would be the opposite: LLMs are a great source of logical data for Cyc-like things. Distill your LLM into logical statements, then run your Cyc algorithms on it.

This is hugely problematic. If you get the premises wrong, false conclusions will follow no matter how valid the reasoning.

LLMs can play many roles around this area, but their output cannot be trusted with significant verification and validation.

replies(1): >>xpe+kA1
5. xpe+3x[view] [source] 2023-09-02 03:41:14
>>optima+(OP)
> The best thing Cycorp could do now is open source its accumulated database of logical relations...

This is unpersuasive without laying out your assumptions and reasoning.

Counterpoints:

(a) It would be unethical for such a knowledge base to be put out in the open without considerable guardrails and appropriate licensing. The details matter.

(b) Cycorp gets some funding from the U.S. Government; this changes both the set of options available and the calculus of weighing them.

(c) Not all nations have equivalent values. Unless one is a moral relativist, these differences should not be treated as equivalent or irrelevant. As such, despite the flaws of U.S. values and some horrific decision-making throughout history, there are known worse actors and states. Such parties would make worse use of an extensive human-curated knowledge base.

replies(1): >>skylyz+uV1
6. zozbot+JG[view] [source] 2023-09-02 06:21:03
>>optima+(OP)
OpenCyc is already a thing and there's been very little interest in it. These days we also have general-purpose semantic KBs like Wikidata, which are available for free and go way beyond what Cyc or OpenCyc was trying to do.
7. xpe+kA1[view] [source] [discussion] 2023-09-02 15:35:27
>>xpe+rp
*without
8. skylyz+uV1[view] [source] [discussion] 2023-09-02 17:46:45
>>xpe+3x
An older version of the database is already available for download, but that's not the approach you want for common sense anyway; no one needs to remember that a "dog is not a cat".
replies(1): >>xpe+em6
9. creer+3q2[view] [source] [discussion] 2023-09-02 21:24:22
>>adastr+jc
Statements distilled from an LLM would not be logically sound. That's (one of) the main issues with LLMs, and it would make logical inference over those statements impossible with current systems.

That's one of the principal features of Cyc: it's carefully built by humans to be (essentially) logically sound, so that inference can then be run over the fact base. Making that stuff logically sound made for a very detailed and fussy knowledge base, and that in turn made it difficult for mere civilians to expand or even understand. Cyc is NOT simple.
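To make that concrete, here's a toy Python illustration (the facts and the disjointWith check are invented for the example, not how Cyc actually stores or checks anything): one contradictory assertion in the base and any inference built on it is unsafe.

    facts = {
        ("Dog", "isa", "Mammal"),
        ("Dog", "isa", "Robot"),              # the kind of noise an LLM might emit
        ("Mammal", "disjointWith", "Robot"),
    }

    def contradictions(kb):
        # Flag any term asserted to be an instance of two disjoint classes.
        bad = []
        for (x, r1, a) in kb:
            for (y, r2, b) in kb:
                if r1 == r2 == "isa" and x == y and (
                        (a, "disjointWith", b) in kb or (b, "disjointWith", a) in kb):
                    bad.append((x, a, b))
        return bad

    print(contradictions(facts))  # flags Dog as an instance of two disjoint classes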

replies(1): >>varjag+Ss2
10. varjag+Ss2[view] [source] [discussion] 2023-09-02 21:48:32
>>creer+3q2
Cyc is built to be locally consistent but global KB consistency is an impossible task. Lenat stressed that in his videos over and over.
replies(1): >>creer+D93
11. creer+D93[view] [source] [discussion] 2023-09-03 07:39:06
>>varjag+Ss2
My "essentially" was doing some work there. It's been years but I remember something like "within a context" as the general direction? Such as within an area of the ontology (because - by contrast to LLMs - there is one) or within a reasonning problem, that kind of thing.

By contrast, LLMs for now are embarrassing, with inconsistent nonsense provided within one answer, or an answer that doesn't recognize the context of the problem. Say, the working domain is a food label and the system doesn't recognize that or doesn't stay within it.

12. xpe+em6[view] [source] [discussion] 2023-09-04 14:25:44
>>skylyz+uV1
You are probably referring to OpenCyc. It provides much more value than your comment suggests.

I'd recommend that more people take a look and compare its approach against others. https://en.wikipedia.org/wiki/CycL is compact and worth a read, especially the concept of "microtheories".
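For flavor, a loose Python rendering of the microtheory idea (the contexts and facts here are invented, not actual Cyc content): the same assertion can be true in one context and absent or false in another, and queries are always asked relative to a context.

    kb = {
        "NaiveBiologyMt":  {("Whale", "isa", "Fish")},
        "ModernBiologyMt": {("Whale", "isa", "Mammal")},
    }

    def holds(mt, fact):
        # An assertion is only true *within* a given microtheory (context).
        return fact in kb.get(mt, set())

    print(holds("NaiveBiologyMt",  ("Whale", "isa", "Fish")))   # True in that context
    print(holds("ModernBiologyMt", ("Whale", "isa", "Fish")))   # False there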
