zlacker

[parent] [thread] 0 comments
1. famous+(OP)[view] [source] 2025-04-09 19:51:05
Global funding would never have been capped at $200M for LMs, because they were obviously useful from the get-go and only got more useful with more investment.

Forget CYC, forget LLMs. We abandoned symbolic AI for neural networks in NLP long before the advent of the science-fiction-esque transformer LLMs. That's how poorly the symbolic approach performed.

It wasn't for lack of trying, either. NNs were the underdogs. Some of the greatest minds desperately wanted the symbolic approach to be a valid one and tried for literally decades. While I wouldn't call it a 'failure', it just couldn't handle anything fuzzy outside a rigidly defined problem space — which is unfortunate, seeing as fuzziness is exactly the kind of intelligence that actually exists in the real world.
