My project is a library and automake is configured to build it and test it. There is also an additional target which builds a demo application for my library; it's defined in Makefile.am as EXTRA_PROGRAMS. I'd like to be able to install it with make install or similar. Is there a way to do that while still keeping this target optional (simply listing it in bin_PROGRAMS would make it required)?
The usual way to do this sort of thing is to have configure substitute the value into bin_PROGRAMS conditionally. In your Makefile.am this would look like:
bin_PROGRAMS = main-program $(test_program)
EXTRA_PROGRAMS = test-program
Then in configure.in you'd do something like:
if mumble; then
test_program=test-program
fi
AC_SUBST(test_program)
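Here mumble is just a placeholder for whatever condition should enable the extra program. As a minimal sketch, assuming you gate it behind a hypothetical --enable-demo switch, the configure.ac fragment could look like:

AC_ARG_ENABLE([demo],
  [AS_HELP_STRING([--enable-demo], [also build and install the demo program])])
AS_IF([test "x$enable_demo" = "xyes"], [test_program=test-program])
AC_SUBST([test_program])

With that in place, ./configure --enable-demo makes test-program part of bin_PROGRAMS (and thus installed by make install), while a plain ./configure leaves it as an optional EXTRA_PROGRAMS target.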
I've run into a problem with automake that I can't seem to find a clean solution for. It seems like it should be possible (even simple), but nothing simple works.
Basically the problem I have is with a source file that includes an autogenerated header file. I can add the dependencies to generate the header file just fine, and once the header exists, everything works, as automake's automatic dependency generation takes care of everything. The problem is that the first time you run make in a clean tree, the dependency files don't exist, so make doesn't know to generate the header file, which makes compiling the file that includes the header fail without generating any dependencies. It's a chicken-and-egg problem -- you need to manually tell (auto)make to build the header file.
The obvious solution is just to add a dependency for the header to the Makefile.am file, but that doesn't work, since having a dependency for a target overrides automake's automatic rule generation, as the docs say:
Note that Automake does not make any distinction between rules with commands and rules that only specify dependencies. So it is not possible to append new dependencies to an automake-defined target without redefining the entire rule.
For now I've hacked around the problem by 'hiding' the dependency from automake, but this only works with GNU make:
Makefile.am:
bin_PROGRAMS = foo
foo_SOURCES = main.c foobar.c baz.c
gen.h: system.spec
...command to regen gen.h
# foobar.c #includes gen.h, so it needs to exist prior to compiling foobar.c
$(eval foo-foobar.o: gen.h)
This does the trick, but seems ugly. Is there a better automake-safe way of doing this?
Automake supplies BUILT_SOURCES to solve this problem. Files added to this are built before ordinary compilations are done -- it is specifically intended for generated headers and sources.
In your case this should suffice:
BUILT_SOURCES = gen.h
A sample Makefile.am would look like this:
bin_PROGRAMS = foo
foo_SOURCES = main.c foobar.c baz.c
nodist_foo_SOURCES = gen.h
BUILT_SOURCES = gen.h
CLEANFILES = gen.h
gen.h: Makefile system.spec
command to regen gen.h
#above line should begin with a <TAB>
Automake 1.14 is causing us a few issues. At first, automake errored with the complaint:
warning: source file 'X' is in a subdirectory but option 'subdir-objects' is disabled
So I enabled subdir-objects, but now it isn't recompiling some files. For example, let's say
src/a/foo.c is compiled in SUBDIR a, but in src/b I would like to compile it again with different preprocessor flags. However, since ../a/foo.o already exists, make doesn't rebuild it. This is because subdir-objects changes am_b_OBJECTS to look for ../a/foo.o instead of foo.o. Is there a way I can get around the original complaint and instruct make to build the file a second time with the appropriate preprocessor flags? This all worked on previous versions of automake.
I would settle for executing rm ../a/foo.o before compiling src/b, but I don't know how to edit the Makefile.am to make that happen.
This happens when you use subdir-objects on the same tree from different Makefile.am files. Since automake can't see that you're using the same source file with different parameters, it assumes the object was already built correctly.
The proper solution is not to use separate Makefile.am files, but to rephrase the build system as non-recursive automake; with per-target flags it will then compile foo.c into a distinct object for each program (e.g. a-foo.o and b-foo.o) instead of one shared foo.o.
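A minimal sketch of what that might look like in a single top-level Makefile.am (the program names and flags are made up for illustration):

AUTOMAKE_OPTIONS = subdir-objects
bin_PROGRAMS = a b

a_SOURCES = src/a/foo.c
a_CPPFLAGS = -DVARIANT_A

b_SOURCES = src/a/foo.c
b_CPPFLAGS = -DVARIANT_B

Because each program defines its own per-target CPPFLAGS, automake derives a separate object name per program (src/a/a-foo.o and src/a/b-foo.o), so the two compilations of foo.c no longer collide.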
I've added code to an existing large application and need to make GLib a requirement, as my code relies on it. For development, I just manually edited the Makefile to add
-lglib-2.0
to the LIBS= variable, and
-I/usr/include/glib-2.0 -I/usr/lib64/glib-2.0/include $<
to the line starting with ${CC}.
However, I am at a loss for how to make this permanent/portable in the app -- i.e., when someone runs ./configure in the future, the resulting Makefile should also include the above (as appropriate, since these flags depend on pkg-config output, I've learned). The codebase I updated includes the following files from the GNU toolchain:
Makefile.in
Makefile.manual
config.h.in
configure
configure.in
I only have a handful of CS degrees and a few years of development experience, so the GNU toolchain remains utterly impenetrable to me. :-/ From googling around, I'm under the impression there should also be a configure.ac file or something where I should add a macro for requiring GLib, but no such file is included in the package, and I'm at the point of learned helplessness with the whole automake/autoconf/configure/makefile business. Thanks in advance for any advice or pointers!
You should not edit any generated files manually. This includes the final Makefile used to build the application.
In configure.ac (configure.in is just the older name for the same file), every dependency is listed, so the check for GLib should go in there. From this file, your final configure shell script is generated.
GLib provides a pkg-config description, so you almost always want to use that to get the correct compile and link flags.
Combining pkgconfig and Autotools is just a matter of calling the PKG_CHECK_MODULES macro. The Autotools Mythbuster is an excellent source that describes how to do it.
In the end it boils down to adding these lines to your configure.ac:
PKG_PROG_PKG_CONFIG
PKG_CHECK_MODULES([GLIB], [glib-2.0])
and these lines to your Makefile.am:
foo_CFLAGS = $(GLIB_CFLAGS)
foo_LDADD = $(GLIB_LIBS)
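One caveat, given the file listing in the question: if the project only ships a hand-written Makefile.in and no Makefile.am (i.e. plain autoconf without automake), PKG_CHECK_MODULES still helps, because it AC_SUBSTs the GLIB_CFLAGS and GLIB_LIBS variables. You would then reference the substituted values in Makefile.in yourself, roughly like this (the left-hand variable names are whatever your Makefile.in already uses):

CFLAGS = @CFLAGS@ @GLIB_CFLAGS@
LIBS = @LIBS@ @GLIB_LIBS@

When ./configure generates Makefile from Makefile.in, it replaces the @...@ placeholders with whatever pkg-config reported on that machine, which is exactly the portability you're after.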
The last sentence in the article caught my eye:

[F]or C/C++ developers and students interested in learning to program in C/C++ rather than users of Linux. This is because the compiling of source code is made simple in GNU/Linux by the use of the 'make' command.
I have always used gcc to compile my C/C++ programs, and javac to compile my Java programs. I have only used make to install programs on my computer via configure / make / make install.
It seems that you can apparently compile all of your programs with the make command.
What is the difference between make and gcc?
Well ... gcc is a compiler, make is a tool to help build programs. The difference is huge. You can never build a program purely using make; it's not a compiler. What make does is introduce a separate file of "rules" that describes how to go from source code to finished program. It then interprets this file, figures out what needs to be compiled, and calls gcc for you. This is very useful for larger projects, with hundreds or thousands of source code files, and for keeping track of things like compiler options, include paths, and so on.
gcc compiles and/or links a single file. It supports multiple languages, but does not know how to combine several source files into a non-trivial, running program -- you will usually need at least two invocations of gcc (compile and link) to create even the simplest of programs.
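For example, building a toy two-file program by hand might look like this (the file names are made up):

gcc -c main.c                  # compile: produces main.o
gcc -c foo.c                   # compile: produces foo.o
gcc -o myprog main.o foo.o     # link: produces the executable myprog

make exists largely to remember commands like these and rerun only the ones whose inputs have changed.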
The Wikipedia page on GCC describes it as a "compiler system":
The GNU Compiler Collection (usually shortened to GCC) is a compiler system produced by the GNU Project supporting various programming languages.
make is a "build tool" that invokes the compiler (which could be gcc) in a particular sequence to compile multiple sources and link them together. It also tracks dependencies between various source files and object files that result from compilation of sources and does only the operations on components that have changed since last build.
GNU make is one popular implementation of make. Its description reads as follows:
Make is a tool which controls the generation of executables and other non-source files of a program from the program's source files.
Make gets its knowledge of how to build your program from a file called the makefile, which lists each of the non-source files and how to compute it from other files.
gcc is a C compiler: it takes a C source file and creates machine code, either in the form of unlinked object files or as an actual executable program, which has been linked to all object modules and libraries.
make is useful for controlling the build process of a project. A typical C program consists of several modules (.c) and header files (.h). It would be time-consuming to always compile everything after you change anything, so make is designed to only compile the parts that need to be re-compiled after a change.
It does this by following rules created by the programmer. For example:
foo.o: foo.c foo.h
cc -c foo.c
This rule tells make that the file foo.o depends on the files foo.c and foo.h, and if either of them changes, it can be rebuilt by running the command on the second line. (The above is not actual syntax: make wants the commands indented by a TAB character, which I can't do in this editing mode. Imagine it's there, though.)
make reads its rules from a file that is usually called a Makefile. Since these files are (traditionally) written by hand, make has a lot of magic to let you shorten the rules. For example, it knows that a foo.o can be built from a foo.c, and it knows what the command to do so is. Thus, the above rule could be shortened to this:
foo.o: foo.h
A small program consisting of three modules might have a Makefile like this:
mycmd: main.o foo.o bar.o
$(CC) $(LDFLAGS) -o mycmd main.o foo.o bar.o
foo.o: foo.h bar.h
bar.o: bar.h
make can do more than just compile programs. A typical Makefile will have a rule to clean out unwanted files:
clean:
rm -f *.o core mycmd
Another rule might run tests:
check: mycmd
./mycmd < test.input > test.output
diff -u test.correct test.output
A Makefile might "build" documentation: run a tool to convert documentation from some markup language to HTML and PDF, for example.
A Makefile might have an install rule to copy the binary program it builds to wherever the user or system administrator wants it installed.
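A minimal sketch of such an install rule, reusing the program from the example above and an illustrative destination (as before, the command line must really be indented with a TAB):

install: mycmd
    install -m 755 mycmd /usr/local/bin/mycmd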
And so on. Since make is generic and powerful, it is typically used to automate the whole process from unpacking a source tarball to the point where the software is ready to be used by the user.
There is a whole lot to learn about make if you want to learn it fully. The GNU version of make has particularly good documentation: http://www.gnu.org/software/make/manual/ has it in various forms.
Make often uses gcc to compile a multitude of C or C++ files.
Make is a tool for building any complex system where there are dependencies between the various system components, by doing the minimal amount of work necessary.
If you want to find out all the things make can be used for, the GNU make manual is excellent.
make uses a Makefile in the current directory to apply a set of rules to its input arguments. make also knows some default rules, so it can run even if it doesn't find a Makefile (or similar) in the current directory. The default rule for .cpp files happens to call gcc on many systems.
Notice that you don't call make with input file names but rather with rule names, which reflect the output. So calling make xyz will try to execute rule xyz, which by default builds a file xyz (for example from a source file xyz.cpp).
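As a small illustration, assuming GNU make and a lone source file, with no Makefile in the directory at all:

$ ls
xyz.cpp
$ make xyz
g++     xyz.cpp   -o xyz

make found no Makefile, fell back on its built-in rules, and invoked the compiler itself to produce xyz from xyz.cpp.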
gcc is a compiler like javac. You give it source files, it gives you a program.
make is a build tool. It takes a file that describes how to build the files in your project based on dependencies between files, so when you change one source file, you don't have to rebuild everything (like if you used a build script). make usually uses gcc to actually compile source files.
make is essentially an expert system for building code. You set up rules for how things are built, and what they depend on. Make can then look at the timestamps on all your files and figure out exactly what needs to be rebuilt at any time.
gcc is the "gnu compiler collection". There are many languages it supports (C, C++, Ada, etc depending on your setup), but still it is just one tool out of many that make may use to build your system.
You can use make to compile your C and C++ programs by calling gcc or g++ in your makefile to do all the compilation and linking steps, allowing you to do all these steps with one simple command. It is not a replacement for the compiler.
'gcc' is the compiler - the program that actually turns the source code into an executable. You have to tell it where the source code is, what to output, and various other things like libraries and options.
'make' is more like a scripting language for compiling programs. It's a way to hide all the details of compiling your source (all those arguments you have to pass to the compiler). You script all of the details once in the Makefile, so you don't have to type them every time for every file. It will also do nifty things like only recompiling source files that have been updated, and handling dependencies (if I recompile this file, I will then need to recompile THAT file).
The biggest difference is that make is Turing complete (Are makefiles Turing complete?) while gcc is not.
Let's take the gcc compiler for example.
It only knows how to compile the given .cpp file into .o file given the files needed for compilation to succeed (i.e. dependencies such as .h files).
However, those dependencies create a graph. e.g., b.o might require a.o in the compilation process which means it needs to be compiled independently beforehand.
Do you, as a programmer, want to keep track of all those dependencies and run the compilations in the right order for your target .o file to build?
Of course not. You want something to do that task for you.
Those are build tools -- tools that make the build process (i.e. producing artifacts like .o files) easier. One such tool is make.
I hope that clarifies the difference :)