zlacker

[return to "For algorithms, a little memory outweighs a lot of time"]
1. whatev+ti[view] [source] 2025-05-21 21:31:16
>>makira+(OP)
Lookup tables with precalculated results for the win!

In fact, I don’t think we would need processors anymore if we centrally stored the result of every operation our processors have ever performed.

Now fast retrieval is another problem for another thread.
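
Rough sketch of the idea in Python (expensive_op and the input range are made up, just to show the shape of it):

    # Trade memory for time: pay for the computation once, answer queries by lookup.
    def expensive_op(n: int) -> int:
        return sum(i * i for i in range(n))   # stand-in for a costly computation

    TABLE = {n: expensive_op(n) for n in range(10_000)}   # the precalculation, paid once

    def fast_op(n: int) -> int:
        return TABLE[n]   # O(1) retrieval instead of recomputing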

2. EGreg+SE[view] [source] 2025-05-22 01:27:03
>>whatev+ti
You’re not wrong

Caching LLM responses to recurring queries, e.g. FAQs, can save a lot of token credits.
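
Something like this, roughly (complete_with_llm is just a made-up stand-in for whatever API you actually call):

    # Cache answers to recurring questions so repeat FAQs cost zero tokens.
    def complete_with_llm(prompt: str) -> str:
        return f"(model answer to: {prompt})"   # placeholder, not a real API call

    _cache: dict[str, str] = {}

    def answer(question: str) -> str:
        key = " ".join(question.lower().split())       # normalize case/whitespace
        if key not in _cache:
            _cache[key] = complete_with_llm(question)  # only pay tokens on a miss
        return _cache[key]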

AI is basically solving a search problem, and the models are just approximations of the data - like linear regression or Fourier transforms.

The training is basically your precalculation. The key is that it precalculates a model with billions of parameters rather than overfitting to an exact memorized set of answers hehe
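
A toy version of that analogy (made-up data, with a least-squares fit standing in for training):

    # "Training is precalculation": fit once, then every query is a cheap
    # evaluation of the fitted approximation, not a fresh computation.
    import numpy as np

    x = np.linspace(0, 10, 100)
    y = 3.0 * x + 2.0 + np.random.normal(0, 0.5, x.shape)   # made-up noisy data

    slope, intercept = np.polyfit(x, y, 1)   # one-time "training" step

    def predict(x_new: float) -> float:
        return slope * x_new + intercept     # cheap approximation of the data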
