* Yeah, I read the article. Regardless of the IEC's noble attempt, in all my years of working with people and computers I've never heard anyone actually pronounce MiB (or write it out in full) as "mebibyte".
It doesn't matter. "kilo" means 1000. People are free to use it wrong if they wish.
“Kilo” can mean different things in different contexts, and neither usage is more or less correct as long as both parties understand each other and are consistent in their usage.
90 mm floppy disks. https://jdebp.uk/FGA/floppy-discs-are-90mm-not-3-and-a-half-...
Which I have taken to calling 1440 KiB – accurate and pretty recognizable at the same time.
It's also stupid because it's rare that anyone outside of programming even needs to care exactly how many bytes something is. At the scales at which kilobyte, megabyte, gigabyte, terabyte, etc. are used, the smaller values are pretty much insignificant details.
If you ask for a kilogram of rice, you probably care more that this 1 kg of rice is the same as the last 1 kg of rice you got; you probably wouldn't even care how many grams that is. Similarly, if you order 1 ton of rice, do you care exactly how many grams it is, or do you just care that this 1 ton is the same as that 1 ton?
This whole stupidity started because hard disk manufacturers wanted to make their drives sound bigger than they actually were. At the time, everybody buying hard disks knew about this deception and just put up with it. We'd buy their 2GB drive and think to ourselves, "OK so we have 1.86 real GB". And that was the end of it.
Can you just imagine if manufacturers started advertising computers as having 34.4GB of RAM? Everybody would know it was nonsense and call it 32GB anyway.
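For anyone who wants to check the arithmetic, here's a quick sketch (plain Python; the constants are just the SI/IEC definitions, nothing vendor-specific):

    # Decimal (SI) vs binary (IEC) units, in bytes.
    GB = 10**9    # gigabyte
    GiB = 2**30   # gibibyte

    # A "2 GB" drive measured in binary gigabytes:
    print(2 * GB / GiB)   # ~1.86 -- the "real GB" of the old joke

    # 32 GiB of RAM advertised in decimal gigabytes:
    print(32 * GiB / GB)  # ~34.36 -- the silly-sounding "34.4 GB"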
That being said, I think the difference between MiB and MB is niche for most people.
> All words are made up.
Yes, and the made up words of kilo and kibi were given specific definitions by the people who made them up:
* https://en.wikipedia.org/wiki/Metric_prefix
* https://en.wikipedia.org/wiki/Binary_prefix
> […] as long as both parties understand and are consistent in their usage to each other.
And if they don't? What happens then?
Perhaps it would be easier to use word definitions as they are set down in standards and regulations, so that context is less of an issue.
Good for them. People make up their own definitions for words all the time. Some of those people even try to get others to adopt their definition. Very few are ever successful. Because language is about communicating shared meaning. And there is a great deal of cultural inertia behind the kilo = 2^10 definition in computer science and adjacent fields.
Sectors per track and tracks per side are subject to change. Moreover, a different filesystem may have non-linear growth of the MFT/superblock, giving a different overhead.
Kilo was generally understood to mean one thousand long before it was adopted by a standards committee. I know the French love to try and prescribe the use of language, but in most of the world words just mean what people generally understand them to mean; and that meaning can change.
Before all that nonsense, it was crystal clear: a megabyte in storage was unambiguously 1024 x 1024 bytes --- with the exception of crooked mass storage manufacturers.
There was some confusion, to be sure, but the partial success of the attempt to redefine the prefixes to their power-of-ten meanings has caused more confusion.
We agree on meaning in order to communicate and make progress without endless debate and confusion.
SI is pretty clear for a reason.
We decidedly do not do that. There's a whole term for new terms that arbitrarily get injected or redefined by new people: "slang". I don't understand a lot of the terms teenagers say now, because there's lots of slang that I don't know because I don't use TikTok and I'm thirty-something without kids so I don't hang out with teenagers.
I'm sure it was the same when I was a teenager, and I suspect this has been going on since antiquity.
New terms are made up all the time, but there are plenty of times when existing words get redefined. An easy one: I say "cool" all the time, but generally I'm not talking about temperature when I say it. If I said "cool" to refer to something that I like in 1920s America, they would say that's not the correct use of the word.
SI units are useful, but ultimately colloquialisms exist and will always exist. If I say kilobyte and mean 1024 bytes, and if the person on the other end knows that I mean 1024 bytes, that's fine and I don't think it's "nihilistic".
https://en.wikipedia.org/wiki/Language_planning
(Then you could decide what you think about language planning.)
Can’t use a dictionary, those bastards try to get us to adopt their definitions.
Fair enough.
1000 watts is a kilowatt
1000 hertz is a kilohertz
1000 metres is a kilometre
1000 litres is a kilolitre
1000 joules is a kilojoule
1000 volts is a kilovolt
1000 newtons is a kilonewton
1000 pascals is a kilopascal
1024 bytes is a kilobyte, because that's what we're used to and we don't want to change to a new prefix
That page is part right and part wrong.
It is right in claiming that "3.5-inch" floppies are actually 90 mm.
It is wrong in claiming that the earlier "5.25-inch" floppies weren't metric.
"5.25-inch" floppies are actually 130 mm, as standardised in ECMA-78 [0].
"8-inch" floppies are actually 200 mm, as standardised in ECMA-69 [1].
Actually, there are a few different ECMA standards for 130 mm and 200 mm floppies – the physical dimensions are the same, but they use different recording mechanisms (FM vs MFM; those of a certain age may remember MFM as "double density", and those even older may remember FM as "single density") and come in single-sided and double-sided variants. (A sketch of the two encodings follows the references below.)
[0] ECMA-78: Data interchange on 130 mm flexible disk cartridges using MFM recording at 7 958 ftprad on 80 tracks on each side, June 1986: https://ecma-international.org/publications-and-standards/st...
[1] ECMA-69: Data interchange on 200 mm flexible disk cartridges using MFM recording at 13 262 ftprad on both sides, January 1981: https://ecma-international.org/publications-and-standards/st...
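For the curious: the FM/MFM difference is just a change in how clock pulses are interleaved with data bits on the disk surface. Here's a rough sketch of the two encodings (my own illustration of the general idea, not lifted from either standard):

    def fm_encode(bits):
        # FM ("single density"): every data bit is preceded by a clock pulse of 1.
        out = []
        for b in bits:
            out += [1, b]
        return out

    def mfm_encode(bits, prev=0):
        # MFM ("double density"): a clock pulse of 1 appears only between two
        # consecutive 0 data bits. That guarantees wider spacing between flux
        # transitions, so the bit cells can be half the size of FM's.
        out = []
        for b in bits:
            out += [1 if (prev == 0 and b == 0) else 0, b]
            prev = b
        return out

    data = [1, 0, 0, 1, 0]
    print(fm_encode(data))   # [1,1, 1,0, 1,0, 1,1, 1,0]
    print(mfm_encode(data))  # [0,1, 0,0, 1,0, 0,1, 0,0]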
I'm pretty sure any linguist will agree with this definition. All language normalisation is an afterthought.
Inability to communicate isn't what we observe because, as I already stated, meaning is shared. Dictionaries are one way shared meaning can be developed, as are textbooks, software source code, circuits, documentation, and any other artifact which links the observable with language. All of that is collectively labeled culture, the mass of which I analogized with inertia so as to avoid oversimplifications like yours.
My point is that one person's definition does not a culture make, and that adoption of new word definitions is inherently a group cultural activity which requires time, effort, and the willingness of the group to participate. People must be convinced the change is an improvement on some axis. Dictation of a definition from on high is as likely to result in the word meaning the exact opposite in popular usage as not. Your comment seems to miss any understanding or acknowledgement that a language is a living thing, owned by the people who speak it, and useful for speaking about the things which matter most to them, and that credible dictionaries generally don't accept words or definitions until widespread use can be demonstrated.
It seems like some of us really want human language to work like rule-based computer languages. Or think they already do. But all human languages come free with a human in the loop, not a rules engine.
It's even more of a downer when a comment completely fails to make sense like that, but I'll try to add something.
Of course one chart does not an expert make; I don't understand half of it, but at least I've worked with 3.5" floppies since they first came out.
3.5" floppies are "soft-sectored" media, and the drives were usually capable of handling non-standard arrangements too. What made non-standard numbers of sectors uncommon was that they would require software most people were not using. DOS and Windows simply prepared virgin magnetic media with 2880 sectors, or reformatted them that way, and that was about it.
PCs were already popular when the 3.5" size came out, and most of the time floppies were not virgin magnetic media; they were purchased pre-formatted with 2880 sectors (of 512 bytes each) already on the entire disk, of which fewer were available for user data because a number of sectors are used up by FAT filesystem overhead.
On the chart you see the 1440 KB designation, since each sector is counted as half a "kilobyte".
512 bytes is pretty close to half a kilobyte ain't it?
(The oddball 1680 KB and 1720 KB formats squeezed more sectors onto the same size media; most people couldn't easily copy them without using an alternative to DOS or Windows. They were sometimes used for games or installation media.)
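To make the arithmetic concrete, a little sketch (the 1680/1720 sector counts are my back-calculation from the stated capacities, not taken from the chart):

    SECTOR = 512  # bytes per sector

    for sectors in (2880, 3360, 3440):
        size = sectors * SECTOR
        print(sectors, "sectors =", size, "bytes =", size // 1024, "KiB")

    # 2880 * 512 = 1474560 bytes = 1440 KiB -- the familiar "1.44 MB" floppy
    # (that "1.44 MB" label mixes units: it's 1440 x 1024 bytes, which is
    # neither 1.44 decimal megabytes nor 1.44 MiB)
    # 3360 * 512 = 1720320 bytes = 1680 KiB
    # 3440 * 512 = 1761280 bytes = 1720 KiB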
With Windows, when partitioning your drive, if you want a 64 GB volume you would likely choose 64000 MB in either the GUI or Diskpart. Each of these GB is exactly 2880000 sectors, for some reason ;)
But that's the size of the whole physical partition, whether it contains only zeros or a file system. Then, when you format it, the NTFS filesystem has its own overhead.
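As a sketch of how the sector counts fall out under both the decimal and binary readings of "MB" (which reading a given tool uses varies, which is rather the point of this whole thread):

    SECTOR = 512
    MB, MiB = 10**6, 2**20

    size_entered = 64000  # what you'd type for a "64 GB" volume

    print(size_entered * MB  // SECTOR)  # 125000000 sectors if MB is decimal
    print(size_entered * MiB // SECTOR)  # 131072000 sectors if MB is binary

Either way, the filesystem's own bookkeeping then comes out of that total.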