> it remains the norm regardless. Both CMake and Autotools use this approach
Yes, it is the norm (in certain circles, but not others). No, it doesn't have to be that way. CMake and Autotools are both genuinely awful. The world would be a better place if Bazel/Buck2/similar were the norm.
I find it extraordinarily bizarre how defensive Linux people get when I say "the status quo is actually bad, but it doesn't have to be!".
> there's half a century of code out there written in languages without that, and explicit feature testing will be with us as long as that code is.
I think you radically overestimate how difficult of a problem this is. It doesn't have to be this way. I promise you!
> native cross-compilation is also not a language feature.
Correct. That's why I called it a feature of a build system. The C/C++ standards famously do not even attempt to describe a build system. This has resulted in many things that really ought to be language features being left implementation-defined instead.
> so it's not a good idea to dismiss it as a bad thing.
"Because of" vs. "despite."
I didn't say Autotools and CMake weren't awful, I said that approach exists for a reason. You seem not to understand; maybe it's my poor communication skills. And I don't appreciate being called "Linux people," because I am not one, thanks. Not sure why personal attacks are in play here.
The thing you're missing about Bazel, Buck2, Zig, and all these tools that don't require the compile-testing approach is that they are not portable. They support Windows, macOS, and Linux by special-casing each platform, and even the Linux support is restricted to a few mainstream distros. This is a perfectly valid technical decision, and one I support, but it was not possible for decades.
Now people take Linux for granted, but until relatively recently you had dozens of almost-compatible Unix clones from dozens of vendors, of which Linux was but one. All of them had to be, essentially, written from scratch or based on a BSD release, with the differentiating features implemented on top by a given vendor. Since different developers had different interpretations and priorities for "Unix compatibility," this led to a combinatorial explosion of possible software configurations even just at the libc level, let alone the kernel APIs. The Single UNIX Specification and POSIX standards were assembled to try to correct this, but there was no way to go back and un-differentiate all these Unix clones. So developers targeted SUS or POSIX, and mostly got what they needed, but something had to account for all the little differences that remained. This was the purpose of Autotools, and CMake was written because Autotools sucked.
So, these things you're praising, which avoid that whole scene, are able to do so because, for better or worse, Linux won. The other Unix clones have either died out completely (SCO Unix, etc.), are dying out quickly (Solaris, etc.), or have stabilized into their own distinct platforms (FreeBSD, macOS, etc.).
It has nothing to do with the build system. It has everything to do with platform consistency. That's why Plan 9 can do it without any build system beyond mk (a weird clone of Make), and why the FAANG companies can crank out build systems as get-me-promoted projects -- they're targeting three stable operating systems, instead of an uncountable number of weird almost-compatible monsters.
But again, from this terrible Unix landscape emerged a remarkably robust and portable set of tools (gcc, the GNU coreutils, etc.) which became the de facto standard across all the Unix clones. People liked them because they worked the same on every Unix, and because (thanks to Autotools) they could be built on every Unix. Because they had this versatile build system, the GNU tools were the easiest to port when Linux came around. Because they were the main tools running on Linux, and people had experience with them from other Unix clones, Linux adoption was quick and easy. And so it grew in popularity...
So yes, I mean "because of." It's easy to hate on complicated old rickety shit like Autotools, and I will be happy when I never have to deal with it again, but pretending it has no value and typing in "ewww" is needlessly dismissive and denies the actual importance this code had in getting us to a place where we can live without it.
First of all, thanks for engaging thoughtfully! The "Linux people" comment was a bit rude, sorry. If it helps it was meant as a general observation and not super targeted at you. Anyhow.
> they are not portable. [...] it was not possible for decades.
I could be missing something, but I don't think this is true at all. The definition of "portable" can be quite fuzzy.
Supporting many different platforms is easy. Coming from gamedev, the target platform list looks something like: Windows, macOS, SteamOS, Linux, Android, iOS, PS4, PS5, XSX, Switch. A given codebase may have supported another 10 to 20 consoles, depending on how old it is. Supporting all the different consoles is quite a bit more work than supporting all the niche *nix flavors.
Building anything "new" probably requires building it at least three times. You make mistakes, hit edge cases, learn what's important, and eventually build something that's pretty good. It's a process. One perspective is "the approach exists for a reason". Another perspective is "damn, we made a ton of mistakes, I really wish we knew then what we know now!".
> I said that approach exists for a reason
My core thesis here is that the reason is a bad one. It was a mistake. Folks didn't know better back then. That's ok. No shade on the individuals. But it doesn't mean it's a good design. In a parallel universe maybe the better design would have been made before a bad one. Such is life.
> They support Windows, MacOS, and Linux
Publicly. Internally Buck2 supports dozens of weird hardware platforms and OS variants. Adding support for a new hardware or OS platform doesn't have to be hard!
I swear compiling computer programs doesn't have to be hard! Build systems don't have to try to compile and see if it fails or not!
As one example, Windows handles linking a dynamic library much, much better than GCC/Clang on Linux. All you need is a header and a thin import static lib that only contains function stubs. GCC made a mistake by expecting a fully compiled .so to be available. The libc headers across all the different weird Unix variants are 99% identical. Supporting a new variant or version should be as simple as grabbing headers and import lib. Zig jumps through hoops to pre-produce exactly that. It'd be even better if libc headers were amalgamated into a single header. Two files per variant. Easy peasy.
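To make the "thin import lib" point concrete, here's a rough sketch of the moral equivalent on the ELF side. Everything here is made up for illustration (libfoo and its functions don't exist), and toolchains like Zig generate this kind of stub automatically rather than having you hand-write C:

```c
/* stub_libfoo.c -- a hand-rolled "import library" for an imaginary libfoo.
 *
 * Sketch only. Compile it into a throwaway shared object whose soname matches
 * the real library, then link against it:
 *
 *   cc -shared -fPIC -Wl,-soname,libfoo.so.1 -o libfoo.so stub_libfoo.c
 *   cc -o app app.c -L. -lfoo
 *
 * The bodies below never run; they only exist so the linker can resolve the
 * symbols and record a dependency on libfoo.so.1. At run time the loader
 * picks up the real libfoo.so.1 installed on the target system.
 */

/* Signatures must match the real libfoo header (foo.h, also imaginary). */
int  foo_init(void)             { return 0; }
int  foo_process(const char *s) { (void)s; return 0; }
void foo_shutdown(void)         { }
```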
"The libc headers across all the different weird Unix variants are 99% identical."
This is both true and not applicable. The headers are nearly identical. The behaviors of the systems are not. Functions with the same name do different things, functions have different names but have slightly-incompatible shims with the name you expect, the headers have the function primitives but those functions are stubbed to NOPs in the actual library, and a million other things. These differences, furthermore, are not documented. All of the platforms you listed are meticulously documented and many of them have SDKs available. This, again, is not something that existed for many decades.
If libc headers being close were so meaningful, Widevine binaries would work on musl libc. They do not. Modern build systems work around these incompatibilities by declaring incompatible platforms unsupported and ignoring them. That's a perfectly valid business decision that makes everyone's lives easier! But it's not the only approach, and other approaches are valid too.
Meanwhile, if you want complex software to accurately and correctly support a broad array of platforms, you use Autotools to compile-and-see what behavior you're dealing with, and then something like libtool to polyfill the differences. Doing this is not the result of ignorance, but the result -- borne of hard experience -- of trying to get code working on divergent undocumented platforms.
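For anyone who hasn't looked under the hood, the compile-and-see step boils down to building a tiny probe program, recording whether it worked as a macro, and shimming in a fallback when the answer is no. A rough sketch; HAVE_STRLCPY follows the conventional macro-naming style, but the probe and fallback here are illustrative, not lifted from any real configure script:

```c
/* What a configure-time probe effectively does: try to compile and link
 *
 *     #include <string.h>
 *     int main(void) { char b[4]; return (int)strlcpy(b, "x", sizeof b); }
 *
 * and, if that succeeds, write "#define HAVE_STRLCPY 1" into config.h.
 * The code below then papers over the difference. */
#include <stdio.h>
#include <stddef.h>
#include <string.h>

#ifndef HAVE_STRLCPY
/* Fallback for libcs without strlcpy: copy with truncation, always
 * NUL-terminate, and return the length of the source (the BSD contract). */
static size_t strlcpy(char *dst, const char *src, size_t size)
{
    size_t len = strlen(src);
    if (size > 0) {
        size_t n = (len < size) ? len : size - 1;
        memcpy(dst, src, n);
        dst[n] = '\0';
    }
    return len;
}
#endif

int main(void)
{
    char buf[8];
    strlcpy(buf, "feature tests and polyfills", sizeof buf);
    printf("%s\n", buf);  /* prints "feature": truncated but NUL-terminated */
    return 0;
}
```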
I don't really understand your point in the last paragraph. Neither GCC nor Clang handle linking. GNU ld or gold or some linker is invoked to do that.
> if you want complex software to accurately and correctly support a broad array of platforms, you use Autotools to compile-and-see what behavior you're dealing with, and then something like libtool to polyfill the differences. Doing this is not the result of ignorance, but the result -- borne of hard experience -- of trying to get code working on divergent undocumented platforms.
Yes, when you have differences in behavior you need to create an abstraction layer that behaves in a single, unified way.
"The Right Thing" is to do that once ever for a given target. Write a bootstrap tool if you want. Or let one person spend one week testing and implementing shims for a brand new platform. Which is what you need to do anyways for embedded platforms that can't host the compiler. In any case don't force every user to run overly complex and brittle scripts when the result never changes for a given target.
Oh hey, this is exactly what the source article argues! At least I'm not alone.
> Neither GCC nor Clang handle linking. GNU ld or gold or some linker is invoked to do that.
Bleh. ld is part of the GNU toolset. Pretend I said "standard Linux linking behavior". As you said, Linux's popularity stems from a collection of tools that work together nicely. The point I tried and failed to make is "compiling and linking doesn't have to be hard". Implementing an abstraction layer for a new platform isn't particularly hard either. Probably best to forget I said it.
Yeah, it's definitely a Chesterton's Fence problem; GP is dismissing Autotools because he has never had to deal with the problem which Autotools solves.
"because of" vs "despite of"