Makefile builds target even if prerequisites haven't changed - makefile

Most makefiles have a structure such as this:
.PHONY: prebuild
all: $(TARGET)
prebuild: Makefile
$(shell DEPDIR=$(DEPDIR) mkdir -p $(DEPDIR)/../common >/dev/null)
# do other work related to preparing for the object files to be built such as run a script to modify a header file included by $(TARGET).c
$(TARGET): $(TARGET).c prebuild
$(CC) $(CFLAGS) -o $(TARGET) $(TARGET).c
The implicit rules know how to build $(TARGET).o from $(TARGET).c, and don't do any work if $(TARGET).o is already newer than $(TARGET).c. This is what happens when make is run multiple times without changing the source file.
However, building the all target above will seemingly always rerun the $(CC) $(CFLAGS) -o $(TARGET) $(TARGET).c link to link the application and create the application binary. This happens even if that binary already exists and doesn't need to be recreated. In some larger projects, this process can take a long time (tens of seconds), which is sometimes not desirable.
Edit #1: The issue has to do something with an extra phony target that I do want to run ONCE before the object files are built. In my case, I'm running a script which takes Makefile variables and possibly updates a header file that is included in the C file. But, if the Makefile doesn't change, the prebuild target isn't run. However, the $(TARGET) target is still run even if prebuild doesn't do anything (for instance, because Makefile wasn't changed). FYI: because of the structure of my build system, I have prebuild run always because my build system is used for a variety of applications that can dynamically redefine prebuild.
How can this Makefile be restructured to not do the linking again when not necessary?
Edit #2:
Here's a simplified example that seems to illustrate my issue:
Before running, create a new directory and touch a b
.PHONY: prebuild main all
all: main
prebuild: a Makefile
@echo prebuild ran
main: prebuild
@echo main ran
When I run, I get this output:
prebuild ran
main ran
This is what happens no matter how many times I run make, even though neither the prerequisite a nor the Makefile changes. What I expect is that prebuild doesn't run (because a and Makefile haven't changed) and that main also doesn't run because prebuild didn't. Clearly, I'm misunderstanding something.

The problem is that extra phony dependency triggering your rebuild: a .PHONY prerequisite is always considered out of date, so any target that lists it as a normal prerequisite is rebuilt every time.
Try this:
.PHONY: all
OUTPUTDIR=common/
TARGET=finalexe
all: $(OUTPUTDIR)/$(TARGET)
$(OUTPUTDIR)/$(TARGET): $(TARGET).c | $(OUTPUTDIR)
$(CC) $(CFLAGS) -o $@ $(TARGET).c
$(OUTPUTDIR):
mkdir -p $@
In the above example, 'finalexe' will be created only if (a) it doesn't yet exist or (b) finalexe.c was modified. Because $(OUTPUTDIR) appears after the |, it is an order-only prerequisite, so its timestamp is never checked.
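If you also need the prebuild step from the question, the same order-only trick applies to it. Here is a minimal sketch, assuming the prebuild work lives in a hypothetical script ./prebuild.sh; $(TARGET), $(CC) and $(CFLAGS) are taken from the question:
.PHONY: all prebuild
all: $(TARGET)

# prebuild is phony, so its recipe runs on every invocation, but because
# it appears after the | it is order-only: its being "out of date" never
# propagates to $(TARGET), so no needless relink happens.
prebuild:
	./prebuild.sh   # hypothetical script that may rewrite a header

$(TARGET): $(TARGET).c | prebuild
	$(CC) $(CFLAGS) -o $@ $(TARGET).c
If the script really does rewrite a header, list that header (or pick it up via -MMD) as a normal prerequisite of $(TARGET), so the relink happens exactly when the header changes.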

Related

Creating multiple executables in Makefile

I'm fairly new to makefiles. I want to compile multiple executables through my makefile, and my understanding was that a target with multiple entries would run its recipe for each entry. My example is:
$(EXE): $(OBJS)
g++ -o $@ $< -L$(LIBPATH) -lOSApi -lrt -lpthread
My EXE variable contains all files that should be created, something like: prog1 prog2 and so on. My OBJS contains prog1.o prog2.o and so on.
When running make I create all the .o files perfectly, but only one executable. I have tried replacing $@ with $(EXE) and such, but no luck so far.
Any help would be appreciated.
EDIT:
I found the solution through MadScientist, who suggested to add an all target, and then changing my executable target to:
$(EXE): % : %.o
g++ -o $@ $< -L$(LIBPATH) -lOSApi -lrt -lpthread
.PHONY: all clean
all: $(EXE)
which, to my understanding, makes every target in my EXE variable dependent on its corresponding .o file.
It would help greatly if you provided a full (small) sample. In the question you don't show us what the definition of EXE or OBJS is which makes it hard to say exactly.
Also, please be sure to format your question correctly.
By default make only builds the FIRST target in the makefile. It doesn't build ALL the targets in the makefile. So, if EXE contains multiple targets and the first rule in your makefile is $(EXE) : ... then only the first target in that list will be built.
You should add a new target before the above, saying that you want the default to build all the exe's. You can call it anything you like but the convention is to call it all:
all: $(EXES)
(you can also add a .PHONY: all for safety). Now the first target in the makefile is all, and as prerequisites it will build all the targets in the EXES variable.
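Putting that together with the question's setup, a minimal sketch might look like this (the values of EXE, OBJS and LIBPATH are assumptions for illustration; the .o files themselves come from make's built-in %.o: %.cpp rule, as in the question):
EXE     := prog1 prog2
OBJS    := $(EXE:%=%.o)
LIBPATH := ./lib    # assumed location of libOSApi

.PHONY: all clean
all: $(EXE)         # first rule, so a plain 'make' builds every executable

# Static pattern rule: each program depends only on its own object file.
$(EXE): %: %.o
	g++ -o $@ $< -L$(LIBPATH) -lOSApi -lrt -lpthread

clean:
	rm -f $(EXE) $(OBJS)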

Delete targets with recipes failed in Makefile

I tried to use the .DELETE_ON_ERROR target in my makefile in order to delete both the $(OBJ)
and executable files if a recipe fails, but it doesn't work. If I introduce an error into any source file, then an error occurs while compiling via the pattern rule and make stops. The old object file is still in its place, but I expect .DELETE_ON_ERROR to remove it.
Can anyone test the code? Could -include $(DEP) or the -DDBG flag have an influence? The goal is to delete both the .o file that failed and the executable.
OUTPUT = executable
CPP := $(shell find $(SRC) -type f -name "*.cpp")
OBJ := $(CPP:.cpp=.o)
DEP := $(OBJ:.o=.d)
CXX := g++
CXXFLAGS =-MMD -MP -DDBG
INCLUDES = -I.
.DELETE_ON_ERROR :
$(OUTPUT): $(OBJ)
$(CXX) $^ -o $@
%.o: %.cpp
$(CXX) $(CXXFLAGS) $(INCLUDES) -c $< -o $@
-include $(DEP)
.PHONY : clean
clean:
rm -rf $(OBJ) $(DEP)
EDIT: According to Ondrej K.'s solution, to fix this problem you need to add a @touch command before the compiler invocation so that the object file counts as changed (the docs read "delete the target of a rule if it has changed"). So the code should look like this:
%.o: %.cpp
@touch $@
$(CXX) $(CXXFLAGS) $(INCLUDES) -c $< -o $@
Not sure what failure you're seeing, but I'm afraid there really isn't a good way for you to do that. The .o files and the executable ($(OUTPUT)) are separate rules; if the latter fails, the former are already out of consideration. See the documentation:
.DELETE_ON_ERROR:
If .DELETE_ON_ERROR is mentioned as a target anywhere in the makefile, then make will delete the target of a rule if it has changed and its recipe exits with a nonzero exit status, just as it does when it receives a signal. See Errors in Recipes.
In other words, if the rule producing a binary object failed after the .o target itself got updated, make would prune the changed file. But if your executable did not link, make won't go back and delete the object files.
Not sure it'd be nice, but if you really needed to, you could probably achieve this by refactoring your makefile to build the executable and the objects directly from the source prerequisites in a single rule with a single recipe. The obvious downside: such a rule means a single .c file change causes everything to be recompiled (basically negating a substantial benefit of using make).
EDIT: I'll expand on the comment a bit to clarify. What you seem to want is: in case there is a broken .c file and compilation fails, remove the old .o file. That is quite clearly not how .DELETE_ON_ERROR works, though. If the .o file already got updated and then the rule failed, it would remove it ("delete the target of a rule if it has changed"), but with the syntax problem you mention, the compiler fails before it ever produces an .o file.
So you could, for instance, update your (pattern) rule for compilation so that it first touches (effectively updates the timestamp on) the .o file and then tries to compile. After the compiler call fails, make would consider the target of the failed rule to have been updated and remove it. Alternatively, you could change the rule to first rm the expected .o file, in which case you wouldn't need .DELETE_ON_ERROR at all (and since the rule only runs when the relevant sources changed, it's not as terrible as it sounds). Neither way is exactly clean, but both lead towards the behavior I understand you're describing.
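For illustration, the rm variant could look like this in the question's pattern rule (just a sketch: the old object is removed before the compiler runs, so a failed compile can never leave a stale .o behind):
%.o: %.cpp
	@rm -f $@    # drop any stale object first
	$(CXX) $(CXXFLAGS) $(INCLUDES) -c $< -o $@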
It is possible that the compiler crashes while writing the output file. In this case there is a corrupt output file that is newer than its sources. Make will stop due to the error, but on the next run it won't recompile the output file, as it is newer than its sources, and the build will fail again and again at that step.
With the .DELETE_ON_ERROR rule, make will delete the output file if the compiler (or whatever build step failed) exits with an error after touching (and corrupting) the output file, so it will be recompiled on the next run. (If the compiler failed without touching the old output file, it would be recompiled on the next run anyway.)

Makefile doesn't rebuild the obj's when the CFLAGS are modified?

As we know, the binary depends on the objects and the objects depend on the .c files (assuming a C project). Let's say I have an env.mk file. This file has a flag like 'export NO_DISPLAY=YES'. In the main Makefile, I have the following:
ifeq ($(NO_DISPLAY),YES)
CFLAGS += -D__DISPLAY_DISABLE
endif
Obviously, env.mk is included in the main makefile. Whenever I change the value of 'NO_DISPLAY', make never rebuilds the executable. However, it works fine when the .o files are deleted. I understand the reason: the objects depend on the .c and .h files, and since those are not modified, make sees nothing to rebuild. But I would like make to rebuild the code if the CFLAGS value changes. How can I do it? Please note, I don't want to delete the objects and rebuild them.
target_dbg: $(patsubst ./src/%.c,./obj_dbg/%.o,$(wildcard ./src/*.c))
#echo "Target main rule__dbg $(NPROCS)"
$(CC) $(patsubst ./src/%.c,./obj_dbg/%.o,$(wildcard ./src/*.c)) $(LIBS) -o gif_dbg
./obj_dbg/%.o: ./src/%.c ./include/*.h
#echo "I am called first..dbg"
#mkdir -p ./obj_dbg
#$(CC) $(CFLAGS) -E $<
$(CC) $(CFLAGS) $(LDFLAGS) -DDEBUG -c $< -o $#
Any help will be appreciated.
Make simply works by examining timestamps on files. You hardly want every build artefact to depend on your Makefile (at least not while actively developing it) but if you seriously want Make to handle this dependency, you could put the CFLAGS definition in a secondary file buildflags.mk, include it from the main Makefile, and make all object files depend on buildflags.mk.
I hardly think anybody would actually do this in practice, though. There will always be situations where the only way to be sure you get a clean build is to flush everything and start over. Make sure you have good and up-to-date realclean and/or distclean targets, and make sure you remember to use them when you make fundamental changes to your build infrastructure. Having a nightly build job (or similar) which starts the build from a completely clean slate -- e.g. by checking out a new copy into a temporary directory -- is also obviously a good idea.
Alternatively, or additionally, include a copy of the build flags as a static string in each object file, so you can verify them later, perhaps using a --help option or similar.
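A sketch of the buildflags.mk idea (the file name comes from this answer; the rule is adapted from the question): put the flag definition in its own file and list that file as a prerequisite of every object. Note this only helps when the flag change is made by editing buildflags.mk itself (or env.mk, if you list that file instead).
# buildflags.mk -- the compile-flag definition lives here
CFLAGS += -D__DISPLAY_DISABLE

# Makefile
include buildflags.mk

# Every object also depends on buildflags.mk, so a change to the flags
# file makes the objects out of date even though no .c or .h was touched.
./obj_dbg/%.o: ./src/%.c ./include/*.h buildflags.mk
	@mkdir -p ./obj_dbg
	$(CC) $(CFLAGS) $(LDFLAGS) -DDEBUG -c $< -o $@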
You could use make's -B option to force a rebuild each time you change your CFLAGS. See this answer.

parallel make: two targets depend on the same prerequisite, what happens?

I need to build the sources into a binary file and a static library.
Here is an example (I replaced recipes with ';' for brevity):
objects := a.o b.o ...
.PHONY: all build build_lib
all: build build_lib
build: bin/app
build_lib: bin/libapp.a
bin/app: $(objects) ;
bin/libapp.a: $(objects) ;
obj/%.o: %.cpp ;
Will there be problems with parallel build? Can two make processes try to rebuild the same *.o file at once, making a broken build?
I guessed that, they can, so I rewritten the code this way:
objects := a.o b.o ...
.PHONY: all build build_lib
all: $(objects) | bin/app bin/libapp.a
build: bin/app
build_lib: bin/libapp.a
bin/app: $(objects) ;
bin/libapp.a: $(objects) ;
obj/%.o: %.cpp ;
But the --debug=b output still shows:
Processing target file `all'.
File `all' does not exist.
Processing target file `bin/app'.
File `bin/app' does not exist.
Processing target file `obj/client.o'.
Need to rebuild target `obj/client.o'.
...
File `sb_all' does not exist.
File `bin/app' does not exist.
File `bin/libapp.a' does not exist.
File `sb_all' does not exist.
File `bin/app' does not exist.
File `bin/libapp.a' does not exist.
...
Need to rebuild target `bin/app'.
g++ -lgd ...
Need to rebuild target `bin/libapp.a'.
ar ...
File `all' does not exist.
Target file `all' rebuilt successfully.
So it seems that my $(objects) prerequisites are not built before the order-only prerequisites, or am I reading the output incorrectly? And did I need this change anyway?
No, there is no problem with it. Make will not have any problem with parallelism and multiple targets (in the same instance of make) depending on the same prerequisite. If you have recursive instances of make and multiple different make instances try to build the same target you'll have problems.
Order-only doesn't have any impact on parallelism at all. Make will still invoke things in parallel if possible. The only way to impact the order in which rules are run is to declare a prerequisite relationship between those targets. Here you're just saying that both the higher-level targets must be built before the all target, so that doesn't do anything to reduce parallelism.
Luckily, as I said above, you don't have to force any ordering by hand. As long as your makefile correctly defines the dependency relationship between any two targets, make will handle the bigger picture just fine.
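So the first version of the makefile is already parallel-safe once real recipes are filled in (the question replaced them with ';' for brevity). A sketch, with hypothetical recipes and the object paths adjusted to match the obj/ pattern rule; the bin/ and obj/ directories are assumed to exist:
objects := obj/a.o obj/b.o

.PHONY: all build build_lib
all: build build_lib
build: bin/app
build_lib: bin/libapp.a

# Both final targets share $(objects); even under -j, each .o is compiled
# exactly once, and only then do the link and the archive run (in parallel).
bin/app: $(objects)
	$(CXX) $^ -o $@

bin/libapp.a: $(objects)
	ar rcs $@ $^

obj/%.o: %.cpp
	$(CXX) -c $< -o $@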

How do you implement a Makefile that remembers the last build target?

Let's say you have a Makefile with two pseudo-targets, 'all' and 'debug'. The 'debug' target is meant to build the same project as 'all', except with some different compile switches (like -ggdb, for example). Since the targets use different compile switches, you obviously need to rebuild the entire project if you switch between the two. But GNU make doesn't naturally recognize this.
So if you type make all you'll get
Building ...
...
Then if you type make debug, you get
make: Nothing to be done for `debug'.
So my question is: how do you implement a clean solution in the Makefile to notice that the last build used a different pseudo-target, or different compile switches, than the one you want currently? If they are different, the Makefile would rebuild everything.
Put the build products into different directory trees (whilst keeping one copy of the source of course). That way you are always just a short compile from an up-to-date build, be it debug or release (or even others). No possibility of confusion either.
EDIT
Sketch of the above.
src := 1.c 2.c 3.c
bare-objs := ${src:%.c=%.o}
release-objs := ${bare-objs:%=Release/%}
debug-objs := ${bare-objs:%=Debug/%}
Release/prog: ${release-objs}
Debug/prog: ${debug-objs}
${release-objs}: Release/%.o: %.c # You gotta lurve static pattern rules
gcc -c $< -o $@
${debug-objs}: Debug/%.o: %.c
gcc -c $< -o $@
Release/prog Debug/prog:
gcc $^ -o $@
.PHONY: all
all: Release/prog ; echo $@ Success
.PHONY: debug
debug: Debug/prog ; echo $@ Success
(Disclaimer: not tested, nor even run through make.)
There you go. It's even -j safe, so you can do make -j5 all debug. There is a lot of obvious boilerplate just crying out for tidying up.
Keeping variant sets of object files (as in bobbogo's solution) is probably the best way, but if for some reason you don't want to do that, you can use empty files as markers, to indicate which way you last built the executable:
%-marker:
@rm -f $(OBJECTS) *-marker
@touch $@
debug: GCCFLAGS += -ggdb
debug: SOMEOTHERFLAG = WHATEVER
all debug: % : %-marker
@echo making $@
@$(MAKE) -S GCCFLAGS='$(GCCFLAGS)' SOMEOTHERFLAG='$(SOMEOTHERFLAG)' main
There are other variants on this idea; you could have a small file containing the flag settings, which the makefile would build and include. That would be clever, but not really any cleaner than this.
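A sketch of that variant (flags.mk is a made-up name): the current flags are recorded in a file that is rewritten only when they actually differ from what was recorded, and every object depends on that file.
GCCFLAGS ?= -O2

.PHONY: force
# The recipe runs on every invocation (force is phony), but it rewrites
# flags.mk only when the flags differ from the recorded ones, so the
# file's timestamp moves only on a real flag change.
flags.mk: force
	@echo '$(GCCFLAGS)' | cmp -s - flags.mk || echo '$(GCCFLAGS)' > flags.mk

%.o: %.c flags.mk
	gcc $(GCCFLAGS) -c $< -o $@
Running make GCCFLAGS='-O2 -ggdb' after a plain make then recompiles everything once; repeating the same command rebuilds nothing.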
The only clean solution is to incorporate the difference into the target names.
E.g. you can define a variable $(DEBUG) and consistently use it in all targets that depend on the compile step.
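A minimal sketch of that approach (the build-$(DEBUG) layout and the prog- name are invented for illustration): the value of DEBUG becomes part of the object and binary paths, so debug and release artifacts never clobber each other.
DEBUG ?= 0
ifeq ($(DEBUG),1)
CFLAGS += -ggdb
endif

OBJDIR := build-$(DEBUG)
objs   := $(patsubst %.c,$(OBJDIR)/%.o,$(wildcard *.c))

prog-$(DEBUG): $(objs)
	gcc $(CFLAGS) $^ -o $@

$(OBJDIR)/%.o: %.c | $(OBJDIR)
	gcc $(CFLAGS) -c $< -o $@

$(OBJDIR):
	mkdir -p $@
A plain make builds build-0/ and prog-0; make DEBUG=1 builds build-1/ and prog-1 without touching the release objects.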
