zlacker

[return to "For algorithms, a little memory outweighs a lot of time"]
1. LPisGo+Y9 2025-05-21 20:27:55
>>makira+(OP)
I think it's very intuitive that more space beats the pants off more time.

In time O(n) you can touch at most O(n) cells on a tape, but a tape of length n (over a 2-symbol alphabet) has 2^n possible configurations, so you can represent far more with n space than you can explore in n time.
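
A toy way to see the counting (my own sketch, not from the article): enumerate every configuration of a 4-cell binary tape and compare it with the 4 cells a machine can even touch in 4 steps.

    # Toy illustration: an n-cell binary tape has 2**n distinct
    # configurations, while a machine running for n steps can visit
    # at most n cells.
    from itertools import product

    n = 4
    configs = list(product("01", repeat=n))  # every possible tape content
    print(len(configs))  # 16 = 2**n states addressable with n *space*
    print(n)             # 4 cells is all that n steps of *time* can touch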

2. frollo+8q 2025-05-21 22:35:11
>>LPisGo+Y9
Also, the O(1) random-memory-access assumption makes it easy to take memory for granted. Really it's something like O(n^(1/3)) when you're scaling the computer to the size of the problem, since signals have to cross a 3D volume that grows with n, and you can see this in practice in datacenters.

I forget the name of the O(1) access model. Not UMA, something else.
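
Back-of-envelope sketch of where the n^(1/3) comes from (my own toy numbers, assuming fixed bit density and fixed signal speed; worst_case_distance is just a name I made up): n bits packed into a d-dimensional region of side ~n^(1/d) put the farthest bit at distance proportional to n^(1/d), so worst-case access latency scales the same way.

    # Assumptions: fixed bit density, fixed signal speed. n bits fill
    # a d-dimensional region of side ~ n**(1/d), so the worst-case
    # wire length (and hence access latency) grows like n**(1/d).
    def worst_case_distance(n_bits, d):  # hypothetical helper
        return n_bits ** (1.0 / d)

    for n in (10**6, 10**9, 10**12):
        print(n, worst_case_distance(n, 3))  # 3D packing: ~n^(1/3)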

3. cperci+3t 2025-05-21 23:05:55
>>frollo+8q
O(n^(1/2)) really, since data centers are 2-dimensional, not 3-dimensional.

(Quite aside from the practical "we build on the surface of the earth" constraint, heat dissipation limits you to an essentially 2-dimensional circuit in 3-space.)
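
Plugging d=2 vs d=3 into the same packing sketch (again my own toy numbers, not from the thread): at 10^12 bits, a 2D layout's worst-case signal distance is about 100x a 3D one's.

    # Toy comparison: worst-case signal distance for n bits laid out
    # in d dimensions scales as n**(1/d).
    n = 10**12
    print(n ** 0.5)        # 2D layout: ~1e6
    print(n ** (1.0 / 3))  # 3D layout: ~1e4, about 100x shorter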

4. frollo+mw 2025-05-21 23:38:52
>>cperci+3t
If you have rows of racks of machines, isn't that 3 dimensions? A machine can be on top of, behind, or next to another that it's directly connected to. And the components inside have their own non-uniform memory access.

Or if you're saying heat dissipation scales with surface area and is therefore 2D, I don't know. I'd think water cooling makes it more about volume, but I'm not an expert on that.
