I have been having trouble getting my makefiles to work the way I want. First off, I would like to say this is POSIX make, as in http://www.opengroup.org/onlinepubs/009695399/utilities/make.html - I need my build system to work with both the BSDs and GNU (Linux).
What I want is a zero-maintenance makefile: it should compile all .c and .asm files in src/, place the object files in objs/, and then link everything in objs/ into a binary.
I can do a lot, but I can't get it to separate the source and obj files.
I am OK with this requiring a little inline shell scripting (using the POSIX-defined /bin/sh), but I just cannot get the dependencies to work right: I want an object file rebuilt only if its source file is newer.
The closest I have come is this:
${C_OBJS}: ${HDRS} ${*:objs/%=src/%}.c
    ${CC} ${CFLAGS} -c ${*:objs/%=src/%}.c -o $*.o
The problem is that I must still specify C_OBJS=objs/foo.o and so on, and it is also just barely outside POSIX, so it builds with BSD make but not with GNU make.
The POSIX version of make does not explicitly support file names with slashes in them, nor does it make provision for separating source files in a different directory from the object files. And, as noted by @caskey, it does not support any notation using '%' characters, though it notes that such rules exist and recommends that they be reserved for use as metacharacters.
Consequently, you probably cannot do what you want with standard POSIX make.
In practice, you can often do what you seek with specific implementations of make, but the resulting makefile has limited portability.
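For example, with GNU make's wildcard, patsubst and pattern-rule extensions (none of which are POSIX), the C half of what the question asks could be handled roughly as sketched below; the PROG name is made up, and HDRS is assumed to list the headers as in the question's own fragment:
# GNU-make-only sketch: compile every src/*.c into objs/*.o, then link them.
PROG := prog
SRCS := $(wildcard src/*.c)
OBJS := $(patsubst src/%.c,objs/%.o,$(SRCS))

$(PROG): $(OBJS)
    $(CC) $(LDFLAGS) $(OBJS) -o $@

objs/%.o: src/%.c $(HDRS)
    mkdir -p objs
    $(CC) $(CFLAGS) -c $< -o $@
The .asm files would need a second wildcard/pattern-rule pair of the same shape, and BSD make would want its own spelling of the same idea.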
Consider using a makefile generation system of some sort - CMake or the autotools (autoconf, libtool, automake, etc.). Or one of the many reworkings of the basic concepts of make:
scons
ant
cake
cook
bras
...and a dozen I've forgotten or not heard of...
POSIX make doesn't support constructs like this?
objs/%.o : src/%.c
    ${CC} ${CFLAGS} -c $< -o $@
I am very new to makefiles and I am facing a very basic problem: my Makefile doesn't detect the changes I make to my source file. The first time I generate the consoleapp binary from my source file, I get the expected output. But when I change the source file and run make again, it says
make: 'consoleapp' is up to date
So what do I have to change in the Makefile so that it detects my changes?
Below is my Makefile:
consoleapp:
    g++ consoleapp.cpp -o consoleapp

clean:
    rm -rf *.o consoleapp
This is my source file:
#include <iostream>
using namespace std;

int main()
{
    cout << "I am ok \n"; // I am changing this line again after giving make
    return 0;
}
make relies on the makefile author to tell it what each target's prerequisites are -- that is, which other targets or files affect the construction of the target in question, so that if they are newer or themselves out of date then the target is out of date and should be rebuilt. As your other answer already indicates, you do not designate any prerequisites for your targets, so make considers them out of date if and only if they don't exist at all.
That's actually problematic for both targets, albeit in different ways. For target consoleapp, which represents an actual file that you want to build, the failure to specify any prerequisites yields the problem you ask about: make does not recognize that changes to the source file necessitate a rebuild. The easiest way to fix that would be to just add the source file name to the recipe's header line, after the colon:
consoleapp: consoleapp.cpp
    g++ consoleapp.cpp -o consoleapp
Generally speaking, however, it is wise to minimize duplication in your makefile code, and to that end you can use some of make's automatic variables to avoid repeating target and prerequisite names in your rule's recipe. In particular, I recommend always using $@ to designate the rule's target inside its recipe:
consoleapp: consoleapp.cpp
    g++ consoleapp.cpp -o $@
It's a bit more situational for prerequisites. In this case, all the prerequisites are source files to be compiled, and furthermore there is only one. If you are willing to rely on GNU extensions then in the recipe you might represent the sources via either $< (which represents the first prerequisite), or as $^ (which represents the whole prerequisite list, with any duplicates removed). For example,
consoleapp: consoleapp.cpp
    g++ $^ -o $@
If you are not using GNU make, however, or if you want to support other people who don't, then you are stuck with some repetition here. You can still save yourself some effort, especially in the event of a change to the source list, by creating a make variable for the sources and duplicating that instead of duplicating the source list itself:
consoleapp_SRCS = consoleapp.cpp

consoleapp: $(consoleapp_SRCS)
    g++ $(consoleapp_SRCS) -o $@
I mentioned earlier that there are problems with both of your rules. But what could be wrong with the clean rule, you may ask? It does not create a file named "clean", so its recipe will be run every time you execute make clean, just as you want, right? Not necessarily. Although that rule does not create a file named "clean", if such a file is created by some other means then suddenly your clean rule will stop working, as that file will be found already up to date with respect to its (empty) list of prerequisites.
POSIX standard make has no solution for that, but GNU make provides for it with the special target .PHONY. With GNU make, any targets designated as prerequisites of .PHONY are always considered out of date, and the filesystem is not even checked for them. This is exactly to support targets such as clean, which are used to designate actions to perform that do not produce persistent artifacts on the file system. Although that's a GNU extension, it is portable in the sense that it uses standard make syntax and the target's form is reserved for extensions, so a make that does not support .PHONY in the GNU sense is likely either to just ignore it or to treat it as an ordinary rule:
.PHONY: clean
clean:
    rm -rf *.o consoleapp
That is because your target has no dependencies. Use something like the following, which makes the binary depend on the listed .cpp file in the current directory, so it is rebuilt whenever that file changes:
SRCS=consoleapp.cpp

consoleapp: $(SRCS)
    g++ $< -o $@
This question and its answer explain the importance of link command-line order.
However, I deal with a lot of makefiles containing lines like
$(CC) $(LDFLAGS) $^ -o $@
Apparently, commands like these just work on some systems, but not on mine. Is there a way to work around this behaviour other than finding and patching all the Makefiles written like this?
I am using gcc (Ubuntu/Linaro 4.6.3-1ubuntu5) 4.6.3
EDIT for clarification:
I do a lot of patching and integration as part of my job (usually Buildroot or LTIB), and I come across many makefiles written like this. Some example compile commands on the web follow the same pattern, too.
So, the problem is that some idiot has created a makefile that puts all the library options before the .o file. Presumably there is some broken toolchain out there that doesn't care, but that doesn't help you.
There are only two options here:
Fix the makefile.
Create a compiler wrapper that reorders the options. Such a wrapper would be broken in general, but could make it work for the exact pattern used by your makefiles.
I did wonder if --start-group would help you here, but a quick experiment suggests not.
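If you do go with fixing the makefiles, the usual GNU make convention is to keep linker search options in LDFLAGS and move the libraries themselves into LDLIBS, which the built-in link rule places after the object files; a minimal sketch, with made-up TARGET and OBJS names:
# Libraries (e.g. -lfoo) go in LDLIBS so they come after the objects on the link line.
$(TARGET): $(OBJS)
    $(CC) $(LDFLAGS) $^ -o $@ $(LDLIBS)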
We use Microsoft NMAKE to compile a large number of native C++ and some Intel Fortran files. Typically the makefiles contains lines such as this (for each file):
$(LINKPATH)\olemisc.obj : ole2\olemisc.cpp $(OLEMISC_DEP)
    $(CCDEBUG) ole2\olemisc.cpp
    $(GDEPS) ole2\olemisc.cpp
OLEMISC_DEP =\
e:\ole2\ifaceole.hpp\
e:\ole2\cpptypes.hpp\
etc.
It works fine, but compiles one file at a time. We would like to take advantage of multi core processors and compile more than one file at a time. I would appreciate some advice about the best way to make that happen, please. Here is what I have so far.
One: GNU make lets you execute parallel jobs using the --jobs=2 option, for example, and that works fine with GCC (we can't use GCC, sadly). But Microsoft's NMAKE does not seem to support such an option. How compatible would the two make programs be, and if we did start using GNU make, can you run two cl.exe processes at the same time? I would expect them to complain about the PDB (debug) file being locked, or does one of the newer cl.exe command-line arguments get you around that?
Two: cl.exe has a /MP (build with multiple processes) flag, which lets you compile multiple files at the same time if passed together via the command line, for example:
cl /MP7 a.cpp b.cpp c.cpp d.cpp e.cpp
But using this would require changes to the makefile. Our makefiles are generated by our own program from other files, so I can easily change what we put in them. But how do you combine the dependencies from different .cpp files in the makefile so that they get compiled together via one cl.exe call? Each .obj is a different target with its own set of commands to make it?
Or do I change the makefile to not call cl.exe, but rather some other little executable that we write, which then collects a series of .cpp files together and shells out to cl.exe, passing multiple arguments? That would work and seems doable, but also seems overly complicated, and I can't see anyone else doing that.
Am I missing something obvious? There must be a simpler way of accomplishing this?
We do not use Visual Studio or a solution file to do the compiles, because the list of files is extensive, we have a few special items in our makefiles, and theoretically do not want to be overly tied to MS C++ etc.
I thoroughly recommend GNU make on Windows. I tend to use Cygwin make, as the environment it creates tends to be very portable to Unix-like platforms (Mac and Linux for a start). Compiling with the Microsoft toolchain, in parallel, with 100% accurate dependencies and full CPU usage, works very well. You have other requirements though.
As far as your nmake question goes, look up batch-mode inference rules in the manual. Basically, nmake is able to call the C compiler once, passing it a whole load of C files in one go. Thus you can use the compiler's /MP... type switches.
Parallel compiling built into the compiler? Pah! Horribly broken I say. Here is a skeleton anyway:
OBJECTS = a.obj b.obj c.obj

f.exe: $(OBJECTS)
    link $** -o $@

$(OBJECTS): $$(@R).c

# "The only syntactical difference from the standard inference rule
# is that the batch-mode inference rule is terminated with a double colon (::)."
.c.obj::
    cl -c /MP4 $<
EDIT
If each .obj has its own dependencies (likely!), then you simply add these as separate dependency lines (i.e., they don't have any shell commands attached).
a.obj: b.h c.h ../include/e.hpp
b.obj: b.h ../include/e.hpp
...
Often such boilerplate is generated by another tool and !INCLUDEd into the main makefile. If you are clever, then you can generate these dependencies for free as you compile. (If you go this far, then nmake starts to creak at the seams and you should maybe change to GNU make.)
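As one sketch of that arrangement (the deps.mak name is made up, and producing it - for instance by post-processing cl's /showIncludes output in your makefile-generating tool - is left out here), the generated dependency lines can live in a separate file that the main makefile pulls in:
# deps.mak would contain one prerequisite-only line per object, e.g.
#   $(LINKPATH)\olemisc.obj : e:\ole2\ifaceole.hpp e:\ole2\cpptypes.hpp
!IF EXIST(deps.mak)
!INCLUDE deps.mak
!ENDIF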
One further consideration to keep in mind here is this: You basically have to define one batch rule for each path and extension. But if you have two files with the same name in two different source directories with a batch inference rule for both of those directories, the batch rule might not pick the one you want.
Basically the make system knows it needs to make a certain obj file, and as soon as it finds an inference rule that lets it do that, it will use it.
The workaround is not to have files with duplicate names, and if that can't be avoided, don't use inference or batch rules for those files.
Ok, I spent some time this morning working on this, and thanks to bobbogo, I got it to work. Here are the exact details for anyone else who is considering this:
The old-style makefile, which compiles one file at a time, has tons of this:
$(LINKPATH)\PS_zlib.obj : zlib\PS_zlib.cpp $(PS_ZLIB_DEP)
    $(CC) zlib\PS_zlib.cpp

$(LINKPATH)\ioapi.obj : zlib\minizip\ioapi.c $(IOAPI_DEP)
    $(CC) zlib\minizip\ioapi.c

$(LINKPATH)\iowin32.obj : zlib\minizip\iowin32.c $(IOWIN32_DEP)
    $(CC) zlib\minizip\iowin32.c
Note that each file is compiled one at a time. So now you want to use the fancy Visual Studio 2010 /MP switch "/MP[n] use up to 'n' processes for compilation" to compile multiple files at the same time. How? Your makefile needs to make use of batch inference rules in nmake, as follows:
$(LINKPATH)\PS_zlib.obj : zlib\PS_zlib.cpp $(PS_ZLIB_DEP)
$(LINKPATH)\ioapi.obj : zlib\minizip\ioapi.c $(IOAPI_DEP)
$(LINKPATH)\iowin32.obj : zlib\minizip\iowin32.c $(IOWIN32_DEP)
#Batch inference rule for extension "cpp" and path "zlib":
{zlib}.cpp{$(LINKPATH)}.obj::
    $(CC) $(CCMP) $<
#Batch inference rule for extension "c" and path "zlib\minizip":
{zlib\minizip}.c{$(LINKPATH)}.obj::
    $(CC) $(CCMP) $<
In this case, elsewhere, we have
CCMP = /MP4
Note that nmake batch inference rules do not support wildcards or spaces in the paths. I found some decent nmake documentation somewhere that states that you need to create a separate rule for every extension and source-file location; you cannot have one rule if the files are in different locations. Also, files that use #import cannot be compiled with /MP.
We have a tool that generates our makefiles, so it now also generates the batch inference rules.
But it works! The time to compile one large dll went from 12 minutes down to 7 minutes! Woohoo!
Is there a mechanism in make to allow for default global implicit rules that are available anywhere, similar to the built-in rules?
Make provides some built-in implicit rules for compiling C/C++/Fortran files, without even requiring a Makefile for simple cases. However, when compiling other languages (e.g. Go programming language files), a Makefile is always required. I would like to extend my Make environment to have implicit rules available by default.
This is not normally desirable, as it would cause your Makefile to be less portable; it wouldn't work on somebody else's machine if they didn't have it set up that way.
However, if you want to do this, create a "global" Makefile somewhere with your default rules for Go files, then add its path to the MAKEFILES environment variable. This global Makefile will be processed before any Makefile when you run "make", just as if you had included its source at the top of the file.
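As a sketch (the file name, the GOC variable, and the use of gccgo are all just illustrative assumptions, not part of the answer above), such a global file could hold a GNU-make pattern rule and be activated from your shell profile with export MAKEFILES=$HOME/.global-rules.mk:
# $HOME/.global-rules.mk -- a default rule for Go sources, using gccgo,
# which can compile a single .go file to an object file.
GOC = gccgo

%.o: %.go
    $(GOC) -c $< -o $@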
I'm assuming you're referring to the fact that you can do
make hello.o
and make will automatically know how to make the .o from a .c file (or indeed from a .f or .p, if one exists) - but you want to do this for custom file types (say, building a .bar from a .foo).
The most portable way of doing this is as follows (in your Makefile):
.SUFFIXES: .foo .bar

.foo.bar:
    foo2bar -in $< -out $@
The first line (.SUFFIXES) warns make that you'll be treating these as special suffixes; the second line says "here's a recipe for making a .bar from a .foo". The third line gives the command for doing this - $< and $@ get changed by make to the input and output filenames.
NOTE: The indent for the third line MUST be a tab character.
A much more flexible method, that only works with GNU make, is to use its support for implicit rules. If you can guarantee you'll be using GNU make then this is probably to be recommended.
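For reference, the GNU-make pattern-rule version of the same .foo-to-.bar recipe would be just this (a sketch; no .SUFFIXES line is needed):
%.bar: %.foo
    foo2bar -in $< -out $@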
While I agree with dmazzoni, I just thought I'd add my make recipe for a Go Makefile:
# Include default Golang Make magic
include $(GOROOT)/src/Make.$(GOARCH)

# Hack the following line
your_program: your_program.$O
    $(LD) -o $@ $^

# Compiles .go-files into architecture-specific binaries
%.$O: %.go
    $(GC) -o $@ $^

clean:
    rm your_program *.$O
(Note: the $O is DOLLAR + UPPERCASE-o - not zero!)
While I haven't tested it on all the machines I have available, I believe it should port fairly well.
I am new to Automake and I am attempting to compile without linking. My goal is to generate a simple Makefile as shown below using Automake.
CFLAG = -Wall

build: Thread.o

Thread.o: Thread.cc Thread.h
    g++ $(CFLAG) -c Thread.cc

clean:
    rm -f *.o
My attempt so far has brought me to the following Makefile.am.
noinst_PROGRAMS = thread
thread_SOURCES = Thread.cc
EXTRA_DIST= Thread.h
How can I simulate my original Makefile?
One way to do this is to fool Automake by providing a link command that does not link:
thread_LINK = true
Other than that, I wouldn't be surprised if Automake did not have such a feature.
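Put together with the Makefile.am from the question, the trick looks something like this (a sketch; the per-target _LINK variable overrides the command Automake uses to link that program, and true simply succeeds without producing a binary):
noinst_PROGRAMS = thread
thread_SOURCES = Thread.cc
EXTRA_DIST = Thread.h

# Replace the link step with a no-op so only the objects get built.
thread_LINK = true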
For your example, you can just ask Automake to build your .o file directly, e.g.:
$ make Thread.o
I believe this is an implicit rule, so you won't see it in the output Makefile.
In general, Automake generates variables containing all the objects required for each executable or library target. It's pretty straightforward to use them in your Makefile, since it just generates their names by appending _OBJECTS to the target name. You could make your own target in Makefile.am like this:
build-thread: $(thread_OBJECTS)
Then you could build just Thread.o (and any other objects needed for thread) like this:
$ make build-thread
Or if you had multiple targets foo, bar, and baz, you could make your compile-only target in Makefile.am like this:
build: $(foo_OBJECTS) $(bar_OBJECTS) $(baz_OBJECTS)
The only pain here is that you'll need to maintain this list yourself based on the targets in your Makefile.am. You can invoke it at the command line like this:
$ make build
Automake is not designed to produce object files; it will build either programs or libraries.
It's hard to answer your question without knowing why you'd want to compile a single object file and not something else. Maybe there is a cleaner answer to your "real" problem.
A Makefile.am you could write is
noinst_LIBRARIES = libThread.a
libThread_a_SOURCES = Thread.cc Thread.h # No need to put headers in EXTRA_DIST
The resulting Makefile would build a library libThread.a containing only libThread.o, and because *.a libraries are just collections of object files there is no linking involved.
The above Makefile.am also causes the emitted Makefile to contain rules to compile libThread.o, so you can add a build: rule if you like.
If you really want Automake to emit this compile rule, but not build the library, you could go with
EXTRA_LIBRARIES = libThread.a # EXTRA here means "output build rules but don't
# build unless something depends on it".
libThread_a_SOURCES = Thread.cc Thread.h
build: Thread.$(OBJEXT)
Now you are explicitly requiring the file Thread.$(OBJEXT) to be built only when you type make build, as in your original Makefile.
(Automake uses .$(OBJEXT) rather than .o to support extensions like .obj in DOS variants.)
First off, Automake is a tool for automatically generating Makefiles; make in and of itself is a whole different beast (and I'm pretty sure that what you were looking for was a make solution).
Here's the easiest GNU based Makefile to accomplish what you want:
all: Thread.o
This fills in something (by default) like the following (please change 4-space whitespace to hard tabs):
all: Thread.o

Thread.o: Thread.cc
    $(COMPILE.cpp) $(OUTPUT_OPTION) $<
The COMPILE.cpp and OUTPUT_OPTION macros of course expand by default to GNU-make-specified values and aren't portable; $< is AT&T make standard syntax, though, according to pmake(1)'s manpage.
GNU make has a concept of implicit vs. explicit rules, patterns, suffixes, etc. that you could use, but they're not portable to all versions of make; that's why a portable Makefile is spelled out plainly in terms of targets and variables, since POSIX doesn't describe many of the desired scenarios for how one should write a Makefile.
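For instance, a fully spelled-out, portable version of the same build (a sketch that simply mirrors the Makefile at the top of the question, using the conventional CXX/CXXFLAGS variable names) would be:
CXX = g++
CXXFLAGS = -Wall

all: Thread.o

Thread.o: Thread.cc Thread.h
    $(CXX) $(CXXFLAGS) -c Thread.cc

clean:
    rm -f *.o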
Run gmake -p for more details and take a look at the texinfo manual for gmake in the topic of implicit, explicit rules, patterns, suffixes, etc.