Which is the reality. "kilobyte" means "1000 bytes". There's no possible discussion over this fact.
Many people have been using it wrong for decades, but its literal value did not change.
You are free to intend only one meaning in your own communication, but you may sometimes find yourself being misunderstood: that, too, is reality.
In fact, this is the only case I can think of where that has ever happened.
You can say that one meaning is more correct than the other, but that doesn't make the other meaning vanish from existence.
https://www-cs-faculty.stanford.edu/~knuth/news99.html
And he was right.
Context is important.
"K" is an excellent prefix for 1024 bytes when working with small computers, and a metric shit ton of time has been saved by standardizing on that.
When you get to bigger units, marketing intervenes, and, as other commenters have pointed out, we have the storage standard of MB == 1000 * 1024.
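That mixed MB == 1000 * 1024 convention can be checked with quick arithmetic; the "1.44 MB" floppy is my illustrative example here (the comment above only names the convention itself):

```python
# The "1.44 MB" floppy actually holds 1440 * 1024 bytes. That is 1.44 MB
# only if "MB" means the mixed unit 1000 * 1024 bytes.
MB_MIXED = 1000 * 1024            # 1,024,000 bytes
floppy_bytes = 1440 * 1024        # 1,474,560 bytes

print(floppy_bytes / MB_MIXED)    # 1.44 exactly (the marketing number)
print(floppy_bytes / 10**6)       # ~1.47 in decimal megabytes
print(floppy_bytes / 2**20)       # ~1.41 in binary mebibytes
```

Same capacity, three different "megabyte" figures, depending on which definition you pick.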
But why that particular mix? Partly marketing, certainly, but also because KB had already been standardized as 1024 bytes.
> Which is the reality. "kilobyte" means "1000 bytes". There's no possible discussion over this fact.
You couldn't be more wrong. Absolutely nobody talks about 8K bytes of memory and means 8000.
Now, it depends.
E.g., Merriam-Webster lists both, with the 1,024 B definition even listed first. Wiktionary lists the 1,024 B definition too, though it is tagged as "informal".
As a prescriptivist myself, I would love it if the world could standardize on kilo = 1000 and kibi = 1024, but that'll likely take some time … plus the introduction of the binary prefixes to the wider public, who I don't think are generally aware of them, plus some large companies deciding to use the terms, which they likely won't, since companies seem to prefer low-grade perpetual confusion over the short-term confusion of a switch.
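The kilo/kibi split above amounts to choosing a base when formatting byte counts. A minimal sketch (the `human` helper and its output strings are my own, not from any library):

```python
# SI prefixes scale by 1000; IEC binary prefixes scale by 1024.
SI  = (1000, ["B", "kB", "MB", "GB", "TB"])
IEC = (1024, ["B", "KiB", "MiB", "GiB", "TiB"])

def human(n, base, units):
    """Scale n down by `base` until it fits, e.g. 1_474_560 -> '1.41 MiB'."""
    n = float(n)
    for unit in units[:-1]:
        if n < base:
            return f"{n:.2f} {unit}"
        n /= base
    return f"{n:.2f} {units[-1]}"

print(human(1_474_560, *SI))   # '1.47 MB'
print(human(1_474_560, *IEC))  # '1.41 MiB'
```

Same number of bytes, two honest labels; the perpetual confusion comes from using the SI spelling with the IEC base.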
If we are talking about kilobytes, it could just as easily be the opposite.
Unless you were referring to only contracts which you yourself draft, in which case it'd be whatever you personally want.
This is a myth. The first IBM hard drive, in 1956, stored 5,000,000 characters, before "byte" was even in common usage. Drives have always been base10; it's not a conspiracy.
Drives are base10, lines are base10, clocks are base10, pretty much everything but RAM is base10. Base2 is the exception, not the rule.
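The drive/RAM split above is also why a drive's advertised capacity never matches what some OSes report; the 500 GB figure is my example, not from the thread:

```python
# A "500 GB" drive: the maker counts in base-10 gigabytes,
# while an OS reporting in binary divides by 2**30 instead.
drive_bytes = 500 * 10**9

print(drive_bytes / 10**9)  # 500.0  (what the box says)
print(drive_bytes / 2**30)  # ~465.66 (what a binary-reporting OS shows)
```

Nothing is missing; the two numbers just use different definitions of "GB".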
Yeah, I already knew that, lol.
But thanks for bringing it to my attention. :-)