
What are the differences between Autotools, Cmake and Scons?


This topic is already discussed on the SCons wiki; see scons.org/wiki/SconsVsOtherBuildTools. There is also a very similar discussion thread on the Ubuntu Forums.
You can also check this PDF: www-alt.gsi.de/documents/DOC-2007-Sep-17-1.pdf; it lists pros and cons and some details about each tool.

Giel

In truth, Autotools' only real 'saving grace' is that it is what all the GNU projects are largely using.

Issues with Autotools:

Truly ARCANE m4 macro syntax, combined with verbose, twisted shell scripting for "compatibility" tests, etc.

If you're not paying attention, you will break cross-compilation support. (It should clearly be noted that Nokia came up with Scratchbox/Scratchbox2 to side-step highly broken Autotools build setups for Maemo/MeeGo.) If you have fixed, static paths in your tests for any reason, you're going to break cross-compile support, because configure won't honor your sysroot specification and will pull things from your host system instead. If you break cross-compile support, your code becomes unusable for things like OpenEmbedded, and it makes life "fun" for distributions trying to build their releases with a cross-compiler instead of on target.
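To make the pitfall concrete, here is a sketch of the kind of configure.ac test being described; the library name, symbol, and path are invented for illustration:

```
dnl BROKEN: a fixed host path ignores any --sysroot the cross
dnl toolchain was given, so configure finds the *host's* copy.
AC_CHECK_FILE([/usr/lib/libfoo.so], [have_foo=yes], [have_foo=no])

dnl Better: let the compiler do the probing, since it honors the
dnl cross toolchain's sysroot. (AC_CHECK_FILE in fact refuses to
dnl run when cross-compiling, for exactly this reason.)
AC_CHECK_LIB([foo], [foo_init], [have_foo=yes], [have_foo=no])
```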

Does a HUGE amount of testing for problems with ancient, broken compilers that NOBODY uses for anything resembling production these days. Unless you're building something like glibc, libstdc++, or GCC on a truly ancient version of Solaris, AIX, or the like, the tests are a waste of time and a source of many, many potential breakages like the ones mentioned above.

It is pretty much a painful experience to get an Autotools setup to build usable code for a Windows system. (While I've little use for Windows, it is a serious concern if you're developing purportedly cross-platform code.)

When it breaks, you're going to spend HOURS chasing your tail trying to sort out what whoever wrote the scripting got wrong. (In fact, that is what I'm trying to do for work right now as I type this; or rather, I'm trying to rip out Autotools completely, and I doubt there's enough time in the rest of this month to sort the mess out. Apache Thrift has one of those BROKEN build systems that won't cross-compile.)

"Normal" users are actually NOT going to just do "./configure; make". For many things, they're going to pull a package provided by someone else, out of a PPA or from their distribution vendor. "Normal" users aren't devs and in many cases aren't grabbing tarballs; presuming otherwise is snobbery on everyone's part. The typical consumers of tarballs are devs doing things, so they're the ones who get slammed with the brokenness when it's there.

It works...most of the time...is all you can say about Autotools. It's a system that solves several problems that really only concern the GNU project...for their base, core toolchain code. (Edit (05/24/2014): It should be noted that this type of concern is a potentially BAD thing to be worrying about. Heartbleed partially stemmed from this thinking, and on correct, modern systems you really don't have any business dealing with much of what Autotools corrects for. GNU probably needs to do a cruft removal of the codebase in light of what happened with Heartbleed.) You can use it for your project, and it might work nicely for a smallish project that you don't expect to work anywhere except Linux, or where the GNU toolchain is clearly working correctly. The statement that it "integrates nicely with Linux" is quite the bold claim and quite incorrect. It integrates reasonably well with the GNU toolsuite and solves the problems that IT has with its goals.

This is not to say that there's no problems with the other options discussed in the thread here.

SCons is more of a replacement for Make/GMake/etc. and looks pretty nice, all things considered. However...

It is still really more of a POSIX-only tool. You could probably get MinGW to build Windows stuff with it more easily than with Autotools, but it's still really geared toward POSIX work, and you'd need to install Python and SCons to use it.

It has issues doing cross-compilation unless you're using something like Scratchbox2.

Admittedly slower and less stable than CMake, by their own comparison. They come up with half-hearted negatives for CMake compared to SCons (e.g. that the POSIX side needs make/gmake to build). (As an aside, if you need THAT much extensibility over other solutions, you should ask yourself whether your project's too complicated...)

The examples given for CMake in this thread are a bit bogus.

However...

You will need to learn a new language.

There's counter-intuitive things if you're used to Make, SCons, or Autotools.

You'll need to install CMake on the system you're building for.

You'll need a solid C++ compiler if you don't have pre-built binaries for it.

In truth, your goals should dictate what you choose here.

Do you need to deal with a LOT of broken toolchains to produce a valid working binary? If yes, you may want to consider Autotools, being aware of the drawbacks I mentioned above. CMake can cope with a lot of this, but it worries about it less than Autotools does. SCons can be extended to worry about it, but it's not an out-of-the-box answer there.

Do you have a need to worry about Windows targets? If so, Autotools should be quite literally out of the running. If so, SCons may/may not be a good choice. If so, CMake's a solid choice.

Do you have a need to worry about cross-compilation (Universal apps/libraries, things like Google Protobufs, Apache Thrift, etc. SHOULD care about this...)? If so, Autotools might work for you so long as you don't need to worry about Windows, but you're going to spend lots of time maintaining your configuration system as things change on you. SCons is almost a no-go right at the moment unless you're using Scratchbox2- it really doesn't have a handle on cross-compilation and you're going to need to use that extensibility and maintain it much in the same manner as you will with Automake. If so, you may want to consider CMake since it supports cross-compilation without as many of the worries about leaking out of the sandbox and will work with/without something like Scratchbox2 and integrates nicely with things like OpenEmbedded.
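As a concrete sketch of what CMake's cross-compilation support looks like, here is a minimal toolchain file; the target triplet and sysroot path are made up for illustration:

```cmake
# arm-toolchain.cmake -- hypothetical ARM cross-toolchain file
set(CMAKE_SYSTEM_NAME Linux)
set(CMAKE_SYSTEM_PROCESSOR arm)

set(CMAKE_C_COMPILER   arm-linux-gnueabihf-gcc)
set(CMAKE_CXX_COMPILER arm-linux-gnueabihf-g++)

# Search for headers and libraries only inside the sysroot,
# never on the host; this is the "not leaking out of the
# sandbox" behavior mentioned above.
set(CMAKE_FIND_ROOT_PATH /opt/sysroots/armhf)
set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)
```

It would be used as `cmake -DCMAKE_TOOLCHAIN_FILE=arm-toolchain.cmake ..`; the project's own CMakeLists.txt needs no changes.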

There is a reason many, many projects are ditching qmake, Autotools, etc. and moving over to CMake. So far, I can cleanly expect a CMake based project to either drop into a cross-compile situation or onto a VisualStudio setup or only need a small amount of clean up because the project didn't account for Windows-only or OSX-only parts to the codebase. I can't really expect that out of an SCons based project- and I fully expect 1/3rd or more Autotools projects to have gotten SOMETHING wrong that precludes it building right on any context except the host building one or a Scratchbox2 one.


The section mentioning Heartbleed detracts from this frustrated rant. The problem there was OpenSSL NOT using a configurator-type package to detect broken mallocs, but instead re-implementing system libraries and defeating the platform developers, whose libraries would have detected the flaws much sooner. One reason to port programs is that they become higher quality as they become less dependent on the little assumptions you don't even realise you're making.
The thing is that it doesn't detract from it at all. With Autotools, you're worrying about compensating FOR things like that; those re-implementations are an EXPANSION of that very thinking, taking it to the next level as it were. Viewed against that backdrop, it doesn't detract from it at all. You didn't finger the root cause in your reasoning, just what happened. I'm fingering the exact THINKING that brought it to that, along with Shellshock and a few others like it. If it's broken, you should FIX it. If you can't, you should ask why you keep it going.
I don't doubt that autotools and scons suck, but this answer does a poor job of noting the cons of CMake (my preferred build system, only because of the very, very sad state of build systems). Basically, CMake is a fractal of inconsistency, with built-in support for individual edge cases rather than some core abstraction that handles most things. It's definitely the duct tape and baling wire of programming. I'm sure I'm doing something wrong, but it doesn't support Visual Studio unless you find one VS project per CMakeLists.txt acceptable (I'm not a VS user, but my Windows friends tell me it's bad).
Unlike other hated programming tools, there isn't sufficient material to make a book called, "CMake, the good parts" (maybe a pamphlet or a blog post), and it's more of a fractal of bad design than PHP.
@JAB I'm told by VS users that this is not idiomatic. In fact, CMake was replaced as the build system for our project because no one could figure out how to produce said "idiomatic" VS Project files.
William Pursell

An important distinction must be made between who uses the tools. CMake is a tool that must be used by the user when building the software. The autotools are used to generate a distribution tarball that can be built using only the standard tools available on any SUS-compliant system. In other words, if you are installing software from a tarball that was built using the autotools, you are not using the autotools. On the other hand, if you are installing software that uses CMake, then you are using CMake and must have it installed to build the software.
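The distinction is visible in what each kind of user actually types; a generic sketch, with `foo-1.0` standing in for any project:

```
# Autotools tarball: a shell, a compiler, and make suffice.
tar xf foo-1.0.tar.gz && cd foo-1.0
./configure --prefix=/usr/local
make
make install

# CMake project: the cmake binary itself must be installed first.
tar xf foo-1.0.tar.gz && cd foo-1.0
mkdir build && cd build
cmake -DCMAKE_INSTALL_PREFIX=/usr/local ..
make
make install
```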

The great majority of users do not need to have the autotools installed on their box. Historically, much confusion has been caused by developers distributing malformed tarballs that force the user to run autoconf to regenerate the configure script; this is a packaging error. More confusion has been caused by the fact that most major Linux distributions install multiple versions of the autotools, when they should not be installing any of them by default. Even more confusion is caused by developers attempting to use a version control system (e.g. CVS, Git, SVN) to distribute their software rather than building tarballs.


True, though you make it sound like it's bad to use a VCS to distribute software. If the contents of a checkout or clone are the same as the contents of a tarball, why would you recommend tarballs?
@Toor I do not understand your question. All of the projects I work on produce a tarball which is distinct from the VCS checkout. Keeping the two distinct helps with reliable distribution.
It is true that a lot of projects ask people to use the VCS to get the latest release, but that is only appropriate for developers. Unfortunately, many users fail to understand the distinction between the release tarball and the VCS. Attempting to build from the VCS imposes more dependencies. The user may need asciidoc or help2man or doxygen or, importantly, the correct autotool combination. This has been a huge marketing problem for the autotools. Users wrongly think they need autoconf installed because they do not understand the difference between the tarball and the VCS.
Why can't a project distribute CMake's output (project files and Makefiles) for common platforms, e.g. Windows, Linux, and Mac OS X? It seems like the same situation as with lex/yacc tools: you include the generated C code so it can be built on platforms without the tools.
While I think the points mentioned in this discussion are correct, I believe the discussion is missing the point. The user needs to install a whole bunch of other things anyway, so whether they additionally need to install CMake should not be a big issue. I would worry more about libraries, the compiler toolchain, and the other tools needed to build the software.
user502515

It's not about GNU coding standards.

The current benefit of the autotools — specifically when used with automake — is that they integrate very well with building Linux distributions.

With CMake, for example, it's always "was it -DCMAKE_CFLAGS or -DCMAKE_C_FLAGS that I need?" No, it's neither; it's -DCMAKE_C_FLAGS_RELEASE. Or -DCMAKE_C_FLAGS_DEBUG. It's confusing; in autoconf, it's just ./configure CFLAGS="-O0 -ggdb3" and you have it.
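Side by side, the two invocations being contrasted (the flag values are arbitrary examples):

```
# Autoconf: flags are plain arguments to configure.
./configure CFLAGS="-O0 -ggdb3"

# CMake: which variable applies depends on the build type.
# CMAKE_C_FLAGS is always used; the _DEBUG/_RELEASE variants
# are appended only for the matching CMAKE_BUILD_TYPE.
cmake -DCMAKE_C_FLAGS="-O0 -ggdb3" ..
cmake -DCMAKE_BUILD_TYPE=Debug -DCMAKE_C_FLAGS_DEBUG="-O0 -ggdb3" ..
```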

In integration with build infrastructures, scons has the problem that you cannot use make %{?_smp_mflags}, _smp_mflags in this case being an RPM macro that roughly expands to the parallelism of the build host (the admin may set it). People pass things like -jNCPUS through their environment this way. With scons that doesn't work, so packages using scons may only get built serially in distros.
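For reference, the packaging idiom in question; a skeletal RPM spec fragment, with all package details omitted:

```
%build
%configure
# _smp_mflags expands to something like -j8 on the build host;
# it only helps when the build is actually driven by make.
make %{?_smp_mflags}
```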


+1, and the world would be a better place if this were true. Sadly, ./configure CFLAGS=-O0 often fails with packages that overwrite CFLAGS in the makefile and instead require the user to run ./configure --enable-debug (e.g. tmux).
It's no more confusing than Autotools and as William rightly points out, it gets busted all to hell with an improperly framed CFLAGS = "" in the makefile. Quite simply this is another one of those "old saw" items. And the CMake examples are bogus...again... You can do the first if you're not specifying RELEASE or DEBUG. FAIL.
Community

What is important to know about the Autotools is that they are not a general build system - they implement the GNU coding standards and nothing else. If you want to make a package that follows all the GNU standards, then Autotools are an excellent tool for the job. If you don't, then you should use Scons or CMake. (For example, see this question.) This common misunderstanding is where most of the frustration with Autotools comes from.


GNU standards can be disabled in Automake with AUTOMAKE_OPTIONS = foreign in your Makefile.am. (Or with --foreign on the automake command line, e.g. in your autogen.sh. I think almost everyone uses this.)
Yes, but that only controls whether Automake enforces superficial GNU rules like whether a README file is required. It doesn't change the basic premise of building a GNU-style tarball with rules like installcheck and distclean. If you try to change that sort of behavior while still using Autotools, then you're just wasting your time.
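For completeness, the option under discussion as it is actually spelled (note there is no leading dash inside Makefile.am):

```
# In Makefile.am:
AUTOMAKE_OPTIONS = foreign

# Or, equivalently, in configure.ac:
AM_INIT_AUTOMAKE([foreign])
```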
They (in theory) allow all software packages to be built and installed in exactly the same way.
They, in theory, allow you to deal with BROKEN compilers and OSes more appropriately than other tools do, and to apply superficial rules like the ones @ptomato highlighted there. If you're using a modern tool (say GCC or clang, for example) on a modern operating system with nothing really goofy, say Linux or a current *BSD, you don't NEED the "rules" there. They actually make much, much more work for you, and honestly can't help with cross-compilation. Nearly half of all usage these days is for embedded Linux and *BSD. WHY go with something that can't help you there?
Not sure why you downvoted the answer, since you seem to acknowledge it in your comment...?
sleske

While from a developer's point of view CMake is currently the easiest to use, from a user's perspective the autotools have one big advantage:

The autotools generate a single-file configure script, and all the files needed to generate it are shipped with the distribution. It is easy to understand and fix with the help of grep/sed/awk/vi. Compare this to CMake, where a lot of files are found in /usr/share/cmak*/Modules, which can't be fixed by the user unless he has admin access.

So, if something does not quite work, it can usually easily be "fixed" by using Standard Unix tools (grep/sed/awk/vi etc.) in a sledgehammer way without having to understand the buildsystem.

Have you ever dug through your CMake build directory to find out what is wrong? Compared to the simple shell script, which can be read from top to bottom, following the generated CMake files to find out what is going on is quite difficult. Also, with CMake, adapting the FindFoo.cmake files requires not only knowledge of the CMake language but might also require superuser privileges.


I don't think problems with Autotools are "easily fixed" in comparison with CMake...
Because autotools is a complex system (just like CMake). If you think Autotools are "easily fixed" (there may be reasons to think this), it would be good IMHO to explain in the answer in what way problems are easier to fix.
The autotools generate a single-file configure script, and all the files needed to generate it are shipped with the distribution. It is easy to understand and fix with the help of grep/sed/awk/vi. Compare this to CMake, where a lot of files are found in /usr/share/cmak*/Modules, which can't be fixed by the user unless he has admin access. Have you ever dug through your CMake build directory to find out what is wrong? Compared to the simple shell script, which can be read from top to bottom, following the generated CMake files to find out what is going on is quite difficult.
Yes, that makes sense, thanks. I took the liberty of adding it to your answer. Feel free to revert :-).
You can always use your own local version of cmake; there's no reason to use what's installed at the system level. For anyone doing cross-platform work, that's completely normal; it's certainly the way I use cmake most of the time.