zlacker

[return to "For algorithms, a little memory outweighs a lot of time"]
1. LPisGo+Y9 2025-05-21 20:27:55
>>makira+(OP)
I think it is very intuitive that more space beats the pants off of more time.

In time O(n) you can use O(n) cells on a tape, but a tape of length n already has 2^n possible configurations (over a 2-symbol alphabet), so you can do far more with n space than with n time.
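
(A back-of-the-envelope sketch of that counting argument, just the arithmetic in Python:)

    # A computation that runs for n steps visits at most n+1 configurations
    # along its own path, while n tape cells over a 2-symbol alphabet can
    # hold 2**n distinct contents.
    for n in (10, 20, 40, 80):
        print(f"n={n:3d}: steps ~ {n}, distinct tape contents = {2**n}")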

◧◩
2. frollo+8q 2025-05-21 22:35:11
>>LPisGo+Y9
Also, the O(1) random memory access assumption makes it easy to take memory for granted. Really it's something like O(n^(1/3)) when you're scaling the computer to the size of the problem, and you can see this in practice in datacenters.
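
(Rough sketch of where that exponent comes from; my own toy model, nothing standardized:)

    # Pack N bits at fixed density into 3-D space: volume ~ N, so the
    # farthest bit sits ~ N**(1/3) away, and a random access pays roughly
    # that much signal-propagation delay.
    for N in (10**6, 10**9, 10**12):
        print(f"N={N:.0e}: relative access latency ~ {N ** (1/3):,.0f}")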

I forget the name of the O(1) access model. Not UMA, something else.

◧◩◪
3. cperci+3t 2025-05-21 23:05:55
>>frollo+8q
O(n^(1/2)) really, since data centers are 2-dimensional, not 3-dimensional.

(Quite aside from the practical "we build on the surface of the earth" consideration, heat dissipation limits you to an essentially 2-dimensional circuit in 3-space.)
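
(Same toy packing argument, one dimension down: area ~ N, so the worst-case wire run, and hence access latency, grows like N**(1/2):)

    # 2-D version of the packing sketch, side by side with the 3-D one.
    for N in (10**6, 10**9, 10**12):
        print(f"N={N:.0e}: 3-D ~ {N ** (1/3):,.0f}, 2-D ~ {N ** 0.5:,.0f}")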

◧◩◪◨
4. frollo+mw 2025-05-21 23:38:52
>>cperci+3t
If you have rows of racks of machines, isn't that 3 dimensions? A machine can be on top of, behind, or next to another that it's directly connected to. And the components inside have their own non-uniform memory access.

Or if you're saying that heat dissipation scales with surface area and is therefore 2-D, I don't know. I'd have thought water cooling makes it more about volume, but I'm not an expert on that.

◧◩◪◨⬒
5. manwe1+4K 2025-05-22 02:26:55
>>frollo+mw
That example is still two-dimensional in the limit, since you can keep building outwards (adding buildings) but not upwards (adding floors).
◧◩◪◨⬒⬓
6. frollo+CR 2025-05-22 04:03:16
>>manwe1+4K
You can add floors though. Some datacenters are 8 stories with cross-floor network fabrics.
◧◩◪◨⬒⬓⬔
7. immibi+2i1 2025-05-22 09:18:20
>>frollo+CR
When you get to, say, 100,000 stories, you can't build more stories. At that point your computer costs more than a century of the Earth's GDP, so talking about theoretical scaling laws is moot anyway. Push further and you run out of the sun's power output, so you build a Dyson sphere, and eventually you use all of that power too.
◧◩◪◨⬒⬓⬔⧯
8. frollo+X12 2025-05-22 15:40:53
>>immibi+2i1
Oh right, so the height is practically a constant. Square root for sure then.
◧◩◪◨⬒⬓⬔⧯▣
9. LPisGo+lV2 2025-05-22 20:32:25
>>frollo+X12
All algorithms are O(1) in this case, then: on a physical machine every resource maxes out eventually, so everything becomes a constant.
◧◩◪◨⬒⬓⬔⧯▣▦
10. frollo+9d3 2025-05-22 22:10:06
>>LPisGo+lV2
You pick which things are constant and which are variable. If you're scaling a supercomputer to fit a problem, the height maxes out quickly and can be treated as constant, while the other two dimensions stay variable.
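
(A sketch of that bookkeeping, with made-up density and height-cap constants rather than real datacenter numbers:)

    import math

    # Once height is capped at a constant, n machines need floor area ~ n,
    # so the worst-case cable run between two machines grows like sqrt(n).
    def worst_case_run(n, machines_per_m3=1.0, height_cap_m=30.0):
        floor_area = n / (machines_per_m3 * height_cap_m)  # m^2, grows ~ n
        return math.sqrt(2 * floor_area)                   # floor diagonal, ~ sqrt(n)

    for n in (10**3, 10**6, 10**9):
        print(f"n={n:.0e}: worst-case run ~ {worst_case_run(n):,.0f} m")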