This allows you to craft ELF binaries on a modern distro that will run on "older" distros. This is critical for games and game engines. There is a significant one-time upfront effort to select an "old" glibc ABI.
The quick-and-dirty alternative is to keep a toolchain on the side configured to link against an "old" glibc.
The article misses the critical -static-libstdc++ option for C++ applications (the C++ ABI is hell on earth), but it does cover -static-libgcc and dynamic loading of system-interface shared libraries.
The Linux kernel goes to a lot of effort to not break user space, at least for non-exotic core features and syscalls. It seems like a lot of user-space in Linux-land does not make the same effort.
It's particularly bad when it's the C library doing this, since that's at the center of the dependency graph for almost everything.
What the article misses, I believe, is that glibc is maintained and extended under an entirely different community and development model. Windows remains compatible over decades because Microsoft (a) is the sole distributor, and (b) puts an immense effort into backwards compat. In Linux userspace, it's simply a non-goal across distributions. If you want to ship a binary for a particular distro, you need to build the binary on / for that distro; even within a distro, a major release bump (or especially a major release downgrade) may break a binary.
Ultimately, it's a consequence of Conway's Law. Microsoft is the sole distributor of Windows, so they can enforce compatibility with an iron fist, and there are people working for Microsoft whose pay depends on said compatibility. With "Linux" in general, there is no common authority to appeal to, and (again) most vendors don't even promise a seamless userspace upgrade path from one major release to another.
This is unfixable; it will never change -- as long as independent parties are permitted to distribute different operating systems yet call them all "Linux".
Ship multiple binaries, or distribute the source code (and let users build it).
EDIT: you'll notice that "ship multiple binaries" is what distros (especially commercial distros) do. They maintain separate branches, backport fixes to old branches, and employ software maintenance engineers to focus on this kind of work. If you want to target multiple major releases, this is what you have to do, too.
If you (as a commercial ISV) target a commercial distro with long-term support, and can convince your users to use / license the same distro, you'll have a good, stable development experience. You only need to port like once every decade, when you jump major releases.
The Linux user base / the Linux market is fragmented; that's the whole goal. The technical proliferation / inconsistency is just a consequence. Unless you take away the freedom of users to run their own flavors of "Linux", there won't be a uniform Linux target.
In a way, it's funny to even expect otherwise. Why do you expect to ship the same binaries when the foundations are diverse, with no standardization authority that all Linux distributors recognize as such? And even POSIX is an API spec, not an ABI spec.
And, any authority that controls binary aspects will immediately accrue political capital. This is exactly what shouldn't happen in Linux. The fact that anyone can fork (or start) a distro, and contribute to the chaos, is good for freedom.
Frankly, I do not understand who would think glibc symbols themselves are the challenge here. Even if you statically link glibc, there's zero guarantee the syscalls will be present in the older Linux (cue .ABI-tag failures). Or even ELF format changes (e.g. GNU-style hashes). The simple solution is to build on the older Linux (and glibc).
In my long experience with ancient binaries, glibc has almost never been the problem, and its ability to _run_ ancient binaries is nothing short of excellent; even Linux itself is more of a problem than glibc is (for starters, paths to everything in /proc and /sys change every other half-decade).
One of my side projects is building a toolchain to enable C++ cross-compile using the Zig header/source libs.
I didn't love Zig as a Clang++ replacement because it has a bit too much magic, and it might go away. But the underlying library code is a godsend.
It’s an abomination that Linux uses system libraries when building. Catastrophically terrible and stupid decision.
It should be trivial for any program to compile and specify any arbitrary previous version of glibc as the target.
Linux got this so incredibly wildly wrong. It’s a shame.
I tend to stay on the oldest supported version of Windows until they drop support and haven't ever seen an application that wouldn't run because it's built on a newer version of Windows.
Having to build and maintain a binary package separately for each version of the same distro probably isn't that appealing to them.
If things go well, it's even better than that: if you target e.g. RHEL 8, there's a very good chance your binaries will work on RHEL 9 and a decent shot at RHEL 10 with zero changes (though of course you should test every version you want to support). The same goes for Ubuntu 20.04/22.04/24.04/... and Debian/SUSE/whatever. Backwards incompatibilities can happen, but within a single stable distro they're not that common, so a lazy ISV can probably go more than a decade before they really need to port forward.
(Incidentally, this isn't a hypothetical: I once had the joy of working on software that targeted RHEL 5, and those binaries ran on RHEL/CentOS 7 without any problems.)
It should be trivial to cross-compile from Windows to Linux, for any distro and for any ancient version of glibc.
It is not trivial.
Here is a post describing the mountain range of bullshit that Zig had to move to enable trivial cross-compile and backwards targeting. https://andrewkelley.me/post/zig-cc-powerful-drop-in-replace...
Linux is far and away the worst offender out of Linux, Mac, and Windows. By leaps and bounds.
On EL it's easier, now you would just support 2 or 3 of EL7, EL8, and EL9.
As an example of something I use, Xfdtd only officially supports one version of Ubuntu and 2 versions of EL https://www.remcom.com/system-requirements#xfdtd-system-requ...
In practice, it wasn't too hard to get it running on EL9 or Fedora either...
I hit this bug with the "TinyGlade" video game (extremely good, BTW), which is written in Rust, and the dev and I ran into it together. Namely: you'd better have a libgcc with the right ABI. And I can tell you, this has been a HUGE issue since Valve started distributing games more than a decade ago.
And that's why we have package managers and distro maintainers/packagers. You'll get no help from the community if your stuff is proprietary, just the way it is. Ship the code, distros will pick it up and do the packaging for you to make it available in their distro. It's part of the free software culture that surrounds the Linux ecosystem.
If you absolutely must ship proprietary software, then target an enterprise distro. Ship it for RHEL or Ubuntu LTS and you get, at least, 10 years of a stable base.