mcphag+ (OP) 2025-04-09 19:02:53
> Quite a lot. Look back at the size of the teams working on language models at IBM, Microsoft, Google, etc, and think about all the decades of language model research going back to Shannon and quantifying the entropy of English.

I wonder at what point the money spent on LLMs matched the $200 million that was ultimately spent on CYC.

> Funnily enough, the more grandiose use-cases of LMs actually were envisioned all the way back at the beginning!

Oh, I know—but those grandiose use cases still have yet to materialize, despite the time and money spent. But the smaller scale use cases have borne fruit.

> there's an incredible science fiction story you've never heard of which takes language models, quite literally, as the route to a Singularity, from 1943. You really have to read it to believe it: "Fifty Million Monkeys", Jones 1943

Thanks, I'll read that.

> If you read the whole OP, which I acknowledge is quite a time investment, I think Yuxi makes a good case for why Lenat culturally aimed for the 'boil the ocean' approach and how they refused to do more incremental small easily-benchmarked applications as distractions and encouraging deeply flawed paradigms and how they could maintain it for so long.

I read a good chunk of it, but yeah, not the whole way through.
