Obituary for Cyc
1. pfdiet+W3
2025-04-08 19:43:44
>>todsac+(OP)
It would be cool to try to generate Cyc-style "knowledge" automatically from LLMs.
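A rough sketch of what that mining step could look like. The s-expression output format, the predicates, and the call_llm stub are all illustrative assumptions, not anything Cyc or a particular LLM vendor actually ships:

    # Hypothetical sketch: mining Cyc-style assertions from an LLM.
    # `call_llm` is a placeholder for whatever chat-completion client you use.
    import re

    PROMPT = (
        "List common-sense facts about '{concept}' as s-expressions, one per line, "
        "using only the predicates is-a, part-of, and used-for. "
        "Example: (is-a pizza food)"
    )

    # Accept only simple (predicate subject object) triples.
    SEXPR = re.compile(r"^\((is-a|part-of|used-for)\s+[\w-]+\s+[\w-]+\)$")

    def call_llm(prompt: str) -> str:
        """Placeholder: send `prompt` to an LLM and return its text reply."""
        raise NotImplementedError

    def mine_assertions(concept: str) -> list[str]:
        """Ask the model for facts, keep only lines that parse as simple triples."""
        reply = call_llm(PROMPT.format(concept=concept))
        return [line.strip() for line in reply.splitlines() if SEXPR.match(line.strip())]

The filtering step matters more than the prompt: anything that doesn't parse as a triple gets dropped rather than trusted.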
2. eob+O5
2025-04-08 19:57:44
>>pfdiet+W3
Or vice versa - perhaps some subset of the "thought chains" of Cyc's inference system could be useful training data for LLMs.
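A minimal sketch of that direction, assuming a hypothetical ProofStep record for one exported inference step (Cyc's real export format is not shown here):

    # Hypothetical sketch: turning Cyc-like inference chains into fine-tuning
    # examples. The ProofStep structure and the rendering are illustrative.
    import json
    from dataclasses import dataclass

    @dataclass
    class ProofStep:
        rule: str            # e.g. "transitivity of is-a"
        premises: list[str]  # e.g. ["(is-a mozzarella cheese)", "(is-a cheese food)"]
        conclusion: str      # e.g. "(is-a mozzarella food)"

    def to_training_example(step: ProofStep) -> dict:
        """Render one inference step as a prompt/completion pair."""
        prompt = "Given: " + " and ".join(step.premises) + ". What follows?"
        completion = f"By {step.rule}: {step.conclusion}"
        return {"prompt": prompt, "completion": completion}

    def write_jsonl(steps: list[ProofStep], path: str) -> None:
        """Dump the rendered steps as JSONL, one training example per line."""
        with open(path, "w") as f:
            for step in steps:
                f.write(json.dumps(to_training_example(step)) + "\n")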
3. eurode+m8
2025-04-08 20:12:52
>>eob+O5
When I first learned about LLMs, what came to mind was some sort of "meeting of the minds" with Cyc. 'Twas not to be, apparently.
4. imglor+Jh
2025-04-08 21:18:39
>>eurode+m8
I view Cyc's role there as a RAG for common sense reasoning. It might prevent models from advising glue on pizza.
    (is-a 'pizza 'food)
    (not (is-a 'glue 'food))
    (for-all i ingredients (assert-is-a i 'food))
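As a loose illustration of the RAG-as-guardrail idea, with a toy in-memory fact set standing in for Cyc (the fact base and predicate names are made up):

    # Hypothetical sketch: look up each candidate ingredient in a tiny
    # Cyc-like fact base before letting a suggestion through.
    FACTS = {
        ("is-a", "pizza", "food"),
        ("is-a", "mozzarella", "food"),
        ("is-a", "glue", "adhesive"),
    }

    def is_a(thing: str, kind: str) -> bool:
        return ("is-a", thing, kind) in FACTS

    def check_ingredients(ingredients: list[str]) -> list[str]:
        """Return the ingredients the fact base cannot confirm are food."""
        return [i for i in ingredients if not is_a(i, "food")]

    # A guard like this could reject "glue" before the answer reaches the user:
    print(check_ingredients(["mozzarella", "glue"]))  # ['glue']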
5. jes519+Oj
2025-04-08 21:35:45
>>imglor+Jh
Sure, but the bigger models don't make these trivial mistakes, and I'm not sure that translating the LLM's English sentences into Lisp and trying to check them would be more accurate than just training the models better.