zlacker

[return to "1 kilobyte is precisely 1000 bytes?"]
1. pjdesn+Fw 2026-02-03 19:01:03
>>surpri+(OP)
I had a computer architecture prof (a reasonably accomplished one, too) who thought that all CS units should be binary, e.g. Gigabit Ethernet should be quoted as 0.93 Gibit/s (about 954 Mibit/s), not 1000 Mbit/s.

I disagreed strongly - I think X-per-second should be decimal, to correspond with hertz, but for quantities binary seems better. (Modern CS papers tend to use MiB, GiB, etc. as abbreviations for the binary units.)
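
To make the mismatch concrete, a quick back-of-the-envelope sketch (plain Python, assuming Gigabit Ethernet's nominal 10^9 bit/s line rate):

    rate_bps = 1_000_000_000    # Gigabit Ethernet: 10^9 bits per second (decimal giga)
    print(rate_bps / 2**20)     # ~953.67 -> about 954 Mibit/s in binary mega
    print(rate_bps / 2**30)     # ~0.9313 -> about 0.93 Gibit/s in binary giga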

Fun fact - for a long time consumer SSDs had roughly 7.37% over-provisioning, because that's what you get when you put X GB (binary) of raw flash into a box and advertise it as X GB (decimal) of usable storage. (In practice a bit less, since a few blocks of that raw flash would likely be DOA.) With TLC, QLC, and SLC-mode caching in modern drives, the numbers aren't as simple anymore, though.
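
For the curious, the 7.37% falls straight out of the prefix ratio; a rough sketch (the 512 GB drive size is just an example, and real firmware reserves a bit more or less):

    raw_bytes        = 512 * 2**30      # 512 GiB of raw flash (example size)
    advertised_bytes = 512 * 10**9      # sold as 512 GB (decimal)
    spare            = raw_bytes - advertised_bytes
    print(spare / advertised_bytes)     # ~0.0737 -> roughly 7.37% "free" over-provisioning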

2. soneil+6P 2026-02-03 20:20:31
>>pjdesn+Fw
This is the bit (sic) that drives me nuts.

RAM had binary sizing for perfectly practical reasons. Nothing else did (until SSDs inherited RAM's architecture).

We apply it to all the wrong things mostly because the first home computers had nothing but RAM, so binary sizing was the only explanation that was ever needed. And 50 years later we're sticking to that story.

3. krater+Ww1 2026-02-04 00:12:38
>>soneil+6P
Nope. The first home computers like the C64 had RAM and sectors on disk, which in the case of the C64 meant 256 bytes per sector. And there it is again: a power of two, just a smaller one than 1024.

Only later did some marketing assholes figure they could sell their hard drives better by lying about the size, and weasel out of the legal issues by redefining the units.
