Have you tried defining CMAKE_INSTALL_PREFIX [cmake.org], CMAKE_INSTALL_RPATH [cmake.org] / CMAKE_SKIP_RPATH [cmake.org], etc., when configuring the software you are trying to build? I'd say they pretty much fix whatever problems you are running into. BTW, CMake also supports "make install DESTDIR=/whatever"
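In concrete terms, a configure-and-stage invocation along those lines might look like the sketch below. The source path, package name, and stow directory are hypothetical; the cache variables and DESTDIR behavior are the documented CMake ones named above.

```shell
cmake -DCMAKE_INSTALL_PREFIX=/usr/site \
      -DCMAKE_INSTALL_RPATH=/usr/site/lib \
      /path/to/source
make
make install DESTDIR=/export/apps/stow/pkg-1.0
# clients that mount the stowed tree as /usr/site then see
# exactly the prefix the binaries were configured for
```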
And there's worse -- supporting multiple simultaneous build targets. Most of my stuff I build as {optimized, debug, optimized-profiling} x {gfortran/gcc, g95/gcc, sunf95/suncc, ifort/icc}, for a set of 12 simultaneous build targets. Conventional build systems do not support multiple simultaneous build targets well.
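The usual workaround, sketched below, is one out-of-tree build directory per {build type} x {toolchain} combination; the directory names and the commented-out configure step are hypothetical.

```shell
# One build directory per combination; configure each one separately.
root=$(mktemp -d)
for cfg in optimized debug optimized-profiling; do
  for tc in gfortran-gcc g95-gcc sunf95-suncc ifort-icc; do
    mkdir -p "$root/build/$cfg-$tc"
    # a per-directory configure step would go here, e.g.
    # (cd "$root/build/$cfg-$tc" && "$srcdir"/configure FC=... CC=...)
  done
done
ls "$root/build" | wc -l    # 12 build targets
```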
That's like saying an airline that flies to everywhere in the world except the US, India and China is not really an 'international' airline. Not correct. Cross-platform essentially means that it runs on multiple important platforms, not necessarily including the largest one.
Why invent Makefile-writing scripts or even programs when make and Makefiles can easily do all that is required for cross-platform (and cross-target) compilation?
http://sourceforge.net/projects/mk-configure/ [sourceforge.net]
Autotools do not need a book (Score:5, Insightful)
... they should be replaced by something else.
Re: (Score:1, Troll)
Re:Autotools do not need a book (Score:5, Interesting)
I often need to install software in an environment that's different from where it's going to be run, e.g. I install it on a file server, where the target directory is /export/apps/stow/, and then I use "stow" to put it in /export/apps, which clients mount as /usr/site, so they see "/usr/site/bin/", and are set up to look in /usr/site/lib for libraries, and so forth.
I don't know if this is intrinsic to newer build schemes or not, but my recent experience has been that "old-style" (autotools-based) packages work just fine, they interoperate well with stow, accept the "--prefix" argument to configure, and work just fine for the clients. CMake-based packages tend to hard-code path names into start-up scripts, which then break on the clients, which view the app in a different hierarchy -- they don't have /export, in particular.
Now, it may well be that these are badly-written cmake scripts, and cmake is perfectly capable of doing it right, I honestly don't know. But it seems to me that cmake (and Python's easy-install, and every other new build scheme I've run across in the past few years) are all part of a new generation of tools which really want the build-time and run-time environments to be the same, because they're built around the "single-user isolated workstation" model.
But it's not true. Lots of us still have centralized file servers that use NFS exports to make centrally-managed Linux applications available to many clients. The new tools make some things easier, but this, they make harder.
Also, uphill through the snow both ways, and we liked it, get off my lawn, kids today don't know nothin', no respect I tell you.
Re:Autotools do not need a book (Score:4, Informative)
I've run into the same problem with cmake. I don't really have that problem with python tools as python's virtual environment tools [python.org] seem to handle things nicely. Tools such as pip [python.org] natively handle virtual environments, automatically installing into it when one is active.
Also, there are lots of nice wrappers [python.org] to work with python's tools, for developers, such as gogo [bitbucket.org].
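The virtual-environment workflow described above can be sketched as follows; the stdlib venv module stands in here for the virtualenv tool, the path is hypothetical, and --without-pip just keeps the sketch self-contained (real use would keep pip).

```shell
# Create and activate an isolated environment, then check that the
# interpreter now resolves paths inside it.
python3 -m venv --without-pip /tmp/demo-env
. /tmp/demo-env/bin/activate
python -c 'import sys; print(sys.prefix)'   # now points inside the environment
# with the environment active, "pip install <pkg>" would install into it
```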
Re: (Score:3, Interesting)
"I don't really have that problem with python tools as python's virtual environment tools seem to handle things nicely. Tools such as pip natively handle virtual environments, automatically installing into it when one is active."
Which is another aspect of the very same problem. So their solution to ignoring that they should segregate feature development from bug fixing, and that they should treat API stability as an almost sacred cow, is "reinventing" the "statically linked environment"?
Try to install two disparaged
Re: (Score:2)
I don't know if this is instrinsic to newer build schemes or not, but my recent experience has been that "old-style" (autotools-based) packages work just fine, they interoperate well with stow, accept the "--prefix" argument to configure, and work just fine for the clients.
Not true. Not true at all.
I used stow for many many years. And it's true, a lot of packages just worked. You'd do a "configure --prefix /run/time/target", then "make install prefix=/install/time/target", and it would work. But this is
Re: (Score:2)
Other build-package problems (Score:2)
Re: (Score:2)
Hi Urban Garlic,
The most important thing to know is that the --prefix argument (for a correctly designed configure script which follows the conventions) indicates the run-time installation directory of the program. I.e. the path given to --prefix may actually be compiled into the program and then used by that program at run-time.
The average free software user compiles the program on the same machine where he will run it. And so the prefix is also the place where the program is copied during "make install".
T
Re: (Score:2)
You must install into some temporary directory
I'm sure he knows this since he is already using Stow [gnu.org]. Stow works pretty well for having multiple versions of different software packages built from source and installed simultaneously, and having a proper package management system for it all. Though these days I use Checkinstall [asic-linux.com.mx] - having the final package as a .deb or .rpm makes it a bit easier to distribute the built packages.
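The staged-install pattern this sub-thread describes can be simulated with a toy Makefile: the prefix baked in at configure time is the run-time location, while DESTDIR only relocates the files at install time for stow or checkinstall to pick up. All paths and the "demo" file are hypothetical.

```shell
# Toy stand-in for a package configured with prefix=/usr/site.
stage=$(mktemp -d); work=$(mktemp -d)
printf 'prefix = /usr/site\ninstall:\n\tmkdir -p $(DESTDIR)$(prefix)/bin\n\ttouch $(DESTDIR)$(prefix)/bin/demo\n' > "$work/Makefile"
# DESTDIR moves the staging root; the compiled-in prefix is untouched.
make -C "$work" install DESTDIR="$stage"
ls "$stage/usr/site/bin"    # demo
```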
Re: (Score:2, Insightful)
"Now, it may well be that these are badly-written cmake scripts, and cmake is perfectly capable of doing it right"
The critiques of the book make the point that "the Autotools are not simply a set of tools but foremost the encoded embodiment of a set of practices and expectations in the way software should be packaged the GNU way".
It's not that CMake can or can't be properly used to provide a platform-independent build environment; it's that it lacks the graybeard experience. As the old motto says, "tho
Re: (Score:2, Funny)
because they are not (still) aware of the whole landscape and its corner-cases
Youngsters can undervalue knowledge of "the whole landscape and its corner-cases", but old farts can also overvalue it. The beauty of new tools like CMake is that they can leave the past behind, and stop worrying about corner cases on obsolete platforms. At one time I was an absolute expert on MSDOS and Windows 3.1. I leave that off my resume now because it no longer has any practical value, and it just makes me look old :-(
Re:Autotools do not need a book (Score:4, Funny)
Re: (Score:2)
Actually building on multiple platforms without maintaining separate build files for each is the problem...
CMake was created to build Kitware's other products, most notably VTK and ITK. To date, I've built both, and other things built on top of them, on three platforms, with several variations: GCC on Linux, both 32-bit and 64-bit, MinGW and Visual C on Windows. I don't need to install anything else apart from CMake and the compiler (and associated Make package) on each of those platforms, run it once, and the
Re: (Score:2)
In most cases, the MSVS project files are a courtesy for the people who are probably going to be using MSVS. I remember piles of 1990s open-source projects which ran on Windows, and everyone was asking for the project files. What is this other compiler you speak of - Cygwin isn't a compiler, it's a fake unix environment, you can't make a Windows program without MSVS, they say. Plus, whoever is doing the Windows port probably uses MSVS anyway.
For a true open-source solution, they would provide the command
Re: (Score:2)
The other compiler is MinGW32 [mingw.org] - there are others, like Borland C, and ICC...
I think CMake supports them all.
There are still piles of projects running on Windows. Right now, I'm on Windows (work machine), with KDE 4.4, Inkscape, Gimp, VTK and ITK among other things installed. Remember, lots of devs work on Windows, and quite a large body of users too. Not to mention other environments, like embedded systems which might or might not be able to work with full autotools...
Re: (Score:2)
Indeed, but a book can make it easier to develop such a thing.
Re: (Score:1, Interesting)
They have been.
CMake, among others, has effectively replaced autotools. It's FAR easier to deal with, cross-platform, fast, will build makefiles, Visual Studio solutions, and Xcode projects, and supports testing and other things.
There are some other ones around too like Scons, but the point is, anyone starting a new project now with autotools is a dolt or a masochist or both.
Autotools is dead. Let's let it be buried in peace, please.
Re: (Score:3, Insightful)
If that were true you wouldn't have needed to say it
Re: (Score:2)
The efficient markets hypothesis as applied to software. If it was better, we'd already use it!
Re:Autotools do not need a book (Score:5, Interesting)
I've just recently been in the situation of selecting a build system for a project with an existing codebase. I looked at the obvious alternatives, including cmake.
In the end, I chose autotools.
When you're doing a non-trivial project, cmake isn't any less complicated than autoconf and automake - if your build is complex, you have to deal with that complexity somewhere after all. And there are a lot more and better resources around for using autotools than cmake, for figuring out odd corner cases. If you have a somewhat odd build requirement, chances are somebody else has already solved it using autotools.
From my experience so far, most of what people dislike about using autotools comes from Automake. But Automake is of course completely optional to use, and Autoconf - which provides most of the benefits - was made to be standalone. If you have a system with existing makefiles, it makes a lot of sense to simply use Autoconf to configure the app and the makefiles and leave Automake alone.
This is a lengthy but really illuminating document on using the autotools, that specifically goes through using autoconf alone and on how to adapt an existing project: http://www.freesoftwaremagazine.com/books/agaal/brief_introduction_to_gnu_autotools/ [freesoftwaremagazine.com]
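A minimal sketch of that Autoconf-without-Automake arrangement (file contents illustrative, not taken from the linked article) needs only a short configure.ac:

```
# configure.ac (illustrative)
AC_INIT([myapp], [1.0])
AC_PROG_CC
AC_CONFIG_FILES([Makefile])
AC_OUTPUT
```

Running ./configure then rewrites a hand-maintained Makefile.in - e.g. lines like "CC = @CC@" and "prefix = @prefix@" - into the final Makefile, with Automake nowhere involved.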
Re: (Score:1)
"In case you didn't notice, the article you linked to was written by the same guy who wrote the book being reviewed, John Calcote."
Ah, you're right. Time to buy that book, then.
Re: (Score:1)
"From my experience so far, most of what people dislike about using autotools comes from Automake. But Automake is of course completely optional to use, and Autoconf - which provides most of the benefits - was made to be standalone. If you have a system with existing makefiles, it makes a lot of sense to simply use Autoconf to configure the app and the makefiles and leave Automake alone."
That's until you go into building shared libraries for multiple platforms. Then automake and libtool are your saviors. Also, I find automated header dependency tracking nice to use.
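For the shared-library case mentioned here, the Automake+Libtool side really is compact; a hypothetical Makefile.am could be just:

```
# Makefile.am (illustrative; library name and sources are made up)
lib_LTLIBRARIES = libfoo.la
libfoo_la_SOURCES = foo.c bar.c
libfoo_la_LDFLAGS = -version-info 1:0:0
```

Libtool then handles the per-platform library naming and linking details, and Automake's generated rules include the automatic header dependency tracking noted above.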
Re: (Score:3, Interesting)
I agree -- autoconf is independent, and does a great job handling system configuration stuff without involving automake -- but I think you're being a bit unfair to automake.
For projects that "fit" automake, it's actually a wonderful tool, as it allows a highly concise description of the package contents and dependencies, with almost zero fat and overhead, and does pretty much all the typical boilerplate stuff (convenience targets, separate build-directory support, installation, automatic dependency generation).
Re: (Score:1)
"For projects that "fit" automake, it's actually a wonderful tool, ..."
You're absolutely right, and it seems to save a huge amount of headaches when you can use it. Note though that I'm adapting an existing codebase that needs a certain structure to be buildable using non-Unix tools and systems. While I can get a more or less buildable version using Automake, there's enough oddities to deal with that adapting the previous hand-rolled makefile seems preferable.
Even better would be this (Score:2)
decided to make the experience easier to newcomers by sharing his years of experience and carefully crafted bag of tricks.
Even better would be reading that this gentleman had gotten behind efforts to make working with the tools easier. Simply teaching me tricks, though welcome, is not good enough. Working with the tool(s) still is difficult.
Re: (Score:2)
Re: (Score:1, Informative)
Oh but I might still buy a copy. Just to wipe my @ss with. I can't begin to think of the hours I've wasted debugging build failures of this heap of cr@p.
mk-configure (Score:1)
Re: (Score:2)
Over the years I also had made a meta-makefile which does much of what I needed.
Latest version is on github at
But even then quite often I just use qmake!
--jeffk++
Re: (Score:2)
Re: (Score:2)
My thoughts exactly. Autotools is/are abominations.
Re: (Score:2)