zlacker

[return to "1 kilobyte is precisely 1000 bytes?"]
1. waffle+pC 2026-02-03 19:24:06
>>surpri+(OP)
The author decidedly has expert syndrome: they deny both the history and the rationale behind memory-unit nomenclature. Memory measurements evolved around the binary organization used in computing architectures. A proud French pedant might agree with the decimal normalization of memory units discussed, since it aligns more closely with the metric system and may have benefits for laypeople, but it fails to account for how memory is partitioned in historic and modern computing.
2. ozozoz+mY 2026-02-03 21:05:19
>>waffle+pC
It’s not them denying it, it’s the LLM that generated this slop.

All they had to say was that KiB et al. were introduced in 1998 and that adoption has been slow.

And not “but a kilobyte can be 1000,” as if it’s an effort issue.

3. kevin_+Uv1 2026-02-04 00:06:15
>>ozozoz+mY
They are managed by different standards organizations (the decimal SI prefixes by the BIPM, the binary Ki/Mi/Gi prefixes by the IEC), and one doesn't like the other encroaching on its turf. "kilo" has only one official meaning: a base-10 scalar of 1000.
4. dietr1+oP1 2026-02-04 02:15:15
>>kevin_+Uv1
I don't think of base 10 being meaningful in binary computers. Indexing 1k entries needs 10 bits regardless of whether you want 1000 or 1024, and base 10 leaves some awkward holes.

In my mind base 10 only became relevant when disk drive manufacturers came up with disks of "weird" sizes (maybe they needed to reserve some space for internals, or the platters just didn't like powers of two) and realised that a base-10 system gave them better-looking marketing numbers. Who wants a 2.7TB drive when you can get a 3TB* drive for the same price?
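
Both claims check out as back-of-the-envelope arithmetic; a quick C sketch, illustrative only (compile with -lm; the "3 TB" figure is just the marketing example above):

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        /* ceil(log2(n)) address bits for n entries: 10 either way */
        printf("bits to index 1000 entries: %d\n", (int)ceil(log2(1000.0)));
        printf("bits to index 1024 entries: %d\n", (int)ceil(log2(1024.0)));

        /* a marketed "3 TB" (decimal) drive, reported in binary units */
        double marketed_bytes = 3e12;
        printf("3 TB = %.2f TiB\n", marketed_bytes / pow(2.0, 40.0)); /* ~2.73 */
        return 0;
    }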

5. fc417f+sZ1 2026-02-04 03:42:22
>>dietr1+oP1
> I don't think of base 10 being meaningful in binary computers.

They communicate via the network, right? And telephony has always counted in base-10 bits, as opposed to base-2 quantities of eight-bit bytes, IIUC. So these two schemes have always been in tension.
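
Take the classic 64 kbit/s voice channel: kilo there is decimal 1000, and converting to memory-style units is already awkward. A sketch in C, assuming 8-bit bytes:

    #include <stdio.h>

    int main(void) {
        /* a DS0 voice channel: 8000 samples/s x 8 bits, decimal kilo */
        double bits_per_s = 64000.0;
        double bytes_per_s = bits_per_s / 8.0;   /* assuming 8-bit bytes */
        printf("64 kbit/s = %.0f bytes/s = %.4f KiB/s\n",
               bytes_per_s, bytes_per_s / 1024.0);
        return 0;
    }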

So at some point the Ki, Mi, etc. prefixes were introduced along with the b vs B suffixes, and that settled the issue nearly three decades ago. So why is this on the HN front page?!

A better question might be: why do we privilege the 8-bit byte? Shouldn't KiB officially have a subscript 8 on the end?

6. purple+Wk2 2026-02-04 07:14:31
>>fc417f+sZ1
To be fair, the octet as the byte has been dominant for decades. POSIX even has the definition "A byte is composed of a contiguous sequence of 8 bits." I would wager many software engineers don't even know that non-octet bytes were ever a thing, given that college CS curricula typically just teach that a byte is 8 bits.

I found some search results about Texas Instruments' digital signal processors using 16-bit bytes, and came across this blog post from 2017 about implementing 16-bit bytes in LLVM: https://embecosm.com/2017/04/18/non-8-bit-char-support-in-cl.... Not sure if they actually implemented it, but it was surprising to me that non-octet bytes still exist, albeit in a very limited manner.
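
For what it's worth, C itself never promises octets: <limits.h> only guarantees CHAR_BIT >= 8, and on those TI DSPs it is reportedly 16. A trivial check:

    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        /* CHAR_BIT: bits per byte on this implementation; >= 8 per the C standard */
        printf("bits per byte here: %d\n", CHAR_BIT);
        return 0;
    }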

Do you know of any other uses for bytes that are not 8 bits?

7. zineke+Aq2 2026-02-04 08:01:45
>>purple+Wk2
> Do you know of any other uses for bytes that are not 8 bits?

For "bytes" as the term-of-art itself? Probably not. For "codes" or "words"? 5 bits are the standard in Baudot transmission (in teletype though). 6- and 7-bit words were the standards of the day for very old computers (ASCII is in itself a 7-bit code), especially on DEC-produced ones (https://rabbit.eng.miami.edu/info/decchars.html).
