GNU Make: Batching multiple targets in single command invocation

Consider the following setup:
$ touch 1.src 2.src 3.src
$ cat Makefile
%.dst: %.src
	@convert -o "$@" "$<"
We can compile our .src files into .dst files by running make 1.dst 2.dst 3.dst which calls the convert (just a placeholder) tool three times.
This setup is fine if there is little overhead in calling convert. However, in my case, it has a startup penalty of a few seconds for every single call. Luckily, the tool can convert multiple files in a single call while paying the startup penalty only once, i.e. convert -o '{}.dst' 1.src 2.src 3.src.
Is there a way in GNU make to specify that multiple src files should be batched into a single call to convert?
Edit: To be more precise, this is the feature I am looking for: say that 1.dst is already newer than 1.src, so it doesn't need to be recompiled. If I run make 1.dst 2.dst 3.dst, I would like GNU make to execute convert -o '{}.dst' 2.src 3.src.
A quick and dirty way would be to create a .PHONY rule that simply converts all src files to dst files, but that way I would convert every src file each and every time. Furthermore, specifying dst files as prerequisites in other rules would also no longer be possible.
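For illustration, that quick and dirty variant would look roughly like this (just a sketch with the placeholder convert tool):
SRC_FILES = 1.src 2.src 3.src

.PHONY: convert-all
convert-all: $(SRC_FILES)
	convert -o '{}.dst' $(SRC_FILES)
It pays the startup cost only once, but it reconverts every file on every run and gives other rules no real .dst files to depend on.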
Thanks in advance!

If you have GNU make 4.3 or above, you can use grouped targets like this:
DST_FILES = 1.dst 2.dst 3.dst
SRC_FILES = $(DST_FILES:.dst=.src)
all: $(DST_FILES)
$(DST_FILES) &: $(SRC_FILES)
	convert -o '{}.dst' $?
	@touch $(DST_FILES)
If your convert is only updating some of the targets then you need the explicit touch to update the rest.
Here's a way that might work when passing goals on the command line; change DST_FILES to:
DST_FILES := $(or $(filter %.dst,$(MAKECMDGOALS)),1.dst 2.dst 3.dst)
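With that change, the goals you name on the command line select which files the grouped rule manages; a sketch of the expected expansion (not verified output):
$ make 2.dst 3.dst     # DST_FILES expands to "2.dst 3.dst", SRC_FILES to "2.src 3.src"
$ make                 # no .dst goals given, so DST_FILES falls back to "1.dst 2.dst 3.dst"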

Is there a way in GNU make to specify that multiple src files should be batched into a single call to convert?
It is possible, but messy, to write make rules for build steps that produce multiple targets with a single run of the recipe, such that the recipe is executed just once if any of the targets needs to be updated. However, you clarify that
[if] 1.dst is already newer than 1.src [, and] I run make 1.dst 2.dst 3.dst, I would like GNU make to execute convert -o '{}.dst' 2.src 3.src.
That's a slightly different problem. You can use the $? automatic variable in a recipe to get the prerequisites that are newer than the rule's target, but for that to serve the purpose, you need a rule with a single target.
Here's one slightly convoluted way to make it work:
DST_FILES = 1.dst 2.dst 3.dst
SRC_FILES = $(DST_FILES:.dst=.src)
$(DST_FILES): dst.a
	ar x $< $@
dst.a: $(SRC_FILES)
	convert -o '{}.dst' $?
	x='$?'; ar cr $@ $${x//src/dst}
The dst.a archive serves as the one target with all the .src files as prerequisites, so as to provide a basis for use of $?. Additionally, it provides a workaround for the problem that whenever that target is updated, it becomes newer than all the then-existing .dst files: .dst files that are out of date with respect to the archive but not with respect to the corresponding .src file are extracted from the archive instead of being rebuilt from scratch.
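As a rough sketch of the intended behaviour: if 1.dst is up to date with respect to 1.src but 2.src and 3.src have changed, then make 1.dst 2.dst 3.dst should run approximately these commands:
convert -o '{}.dst' 2.src 3.src
x='2.src 3.src'; ar cr dst.a 2.dst 3.dst
ar x dst.a 1.dst
ar x dst.a 2.dst
ar x dst.a 3.dst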

Related

Makefile doesn't detect changes in source files

I am very new to makefiles and I am facing a very basic problem: my Makefile doesn't detect changes I make to the source files. When I first generate the consoleapp binary from my source file, I get the expected output. But when I change the source file and run make again, it says
make: 'consoleapp' is up to date
So what changes do I have to make to the makefile so that it detects my changes?
Below is my Makefile:
consoleapp:
	g++ consoleapp.cpp -o consoleapp
clean:
	rm -rf *.o consoleapp
This is my source file:
#include <iostream>
using namespace std;
int main()
{
    cout << "I am ok \n"; // I am changing this line again after giving make
    return 0;
}
make relies on the makefile author to tell it what each target's prerequisites are -- that is, which other targets or files affect the construction of the target in question, so that if they are newer or themselves out of date then the target is out of date and should be rebuilt. As your other answer already indicates, you do not designate any prerequisites for your targets, so make considers them out of date if and only if they don't exist at all.
That's actually problematic for both targets, albeit in different ways. For target consoleapp, which represents an actual file that you want to build, the failure to specify any prerequisites yields the problem you ask about: make does not recognize that changes to the source file necessitate a rebuild. The easiest way to fix that would be to just add the source file name to the rule's header line, after the colon:
consoleapp: consoleapp.cpp
	g++ consoleapp.cpp -o consoleapp
Generally speaking, however, it is wise to minimize duplication in your makefile code, and to that end you can use some of make's automatic variables to avoid repeating target and prerequisite names in your rule's recipe. In particular, I recommend always using $@ to designate the rule's target inside its recipe:
consoleapp: consoleapp.cpp
	g++ consoleapp.cpp -o $@
It's a bit more situational for prerequisites. In this case, all the prerequisites are source files to be compiled, and furthermore there is only one. If you are willing to rely on GNU extensions then in the recipe you might represent the sources via either $< (which represents the first prerequisite), or as $^ (which represents the whole prerequisite list, with any duplicates removed). For example,
consoleapp: consoleapp.cpp
	g++ $^ -o $@
If you are not using GNU make, however, or if you want to support other people who don't, then you are stuck with some repetition here. You can still save yourself some effort, especially in the event of a change to the source list, by creating a make variable for the sources and duplicating that instead of duplicating the source list itself:
consoleapp_SRCS = consoleapp.cpp
consoleapp: $(consoleapp_SRCS)
	g++ $(consoleapp_SRCS) -o $@
I mentioned earlier that there are problems with both of your rules. But what could be wrong with the clean rule, you may ask? It does not create a file named "clean", so its recipe will be run every time you execute make clean, just as you want, right? Not necessarily. Although that rule does not create a file named "clean", if such a file is created by some other means then suddenly your clean rule will stop working, as that file will be found already up to date with respect to its (empty) list of prerequisites.
POSIX standard make has no solution for that, but GNU make provides for it with the special target .PHONY. With GNU make, any targets designated as prerequisites of .PHONY are always considered out of date, and the filesystem is not even checked for them. This is exactly to support targets such as clean, which are used to designate actions to perform that do not produce persistent artifacts on the file system. Although that's a GNU extension, it is portable in the sense that it uses standard make syntax and the target's form is reserved for extensions, so a make that does not support .PHONY in the GNU sense is likely either to just ignore it or to treat it as an ordinary rule:
.PHONY: clean
clean:
	rm -rf *.o consoleapp
That happens because your target has no prerequisites. Use something like the following, which makes the binary depend on its source files so that it is rebuilt when they change:
SRCS=consoleapp.cpp
consoleapp: $(SRCS)
	g++ $< -o $@

How to force a certain group of targets to always run sequentially?

Is there a way how to ask gmake to never run two targets from a set in parallel?
I don't want to use .NOTPARALLEL, because it forces the whole Makefile to be run sequentially, not just the required part.
I could also add dependencies so that one depends on another, but then (apart from being ugly) I'd need to build all of them in order to build the last one, which isn't necessary.
The reason why I need this is that (only a) part of my Makefile invokes ghc --make, which takes care of its dependencies itself. And it's not possible to run it in parallel on two different targets, because if the two targets share some dependency, they can rewrite each other's .o file. (But ghc is fine with being called sequentially.)
Update: To give a specific example. Let's say I need to compile two programs in my Makefile:
prog1 depends on prog1.hs and mylib.hs;
prog2 depends on prog2.hs and mylib.hs.
Now if I invoke ghc --make prog1.hs, it checks its dependencies, compiles both prog1.hs and mylib.hs into their respective object and interface files, and links prog1. The same happens when I call ghc --make prog2.hs. So if the two commands run in parallel, one will overwrite the other's mylib.o, causing it to fail badly.
However, I don't want prog1 to depend on prog2 or vice versa, because they should be compilable separately. (In reality they're very large, with a lot of modules, and compiling all of them slows development considerably.)
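For concreteness, a simplified sketch of the kind of rules involved (the real Makefile is much larger):
prog1: prog1.hs mylib.hs
	ghc --make prog1.hs

prog2: prog2.hs mylib.hs
	ghc --make prog2.hs
With make -j prog1 prog2, both recipes can start at the same time and both will try to write mylib.o and mylib.hi.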
Hmmm, could do with a bit more information, so this is just a stab in the dark.
Make doesn't really support this, but you can sequential-ise two targets in a couple of ways. First off, a real use for recursive make:
targ1: ; recipe1...
targ2: ; recipe2...
both-targets:
	${MAKE} targ1
	${MAKE} targ2
So here you can just make -j both-targets and all is fine. Fragile though, because make -j targ1 targ2 still runs in parallel. You can use dependencies instead:
targ1: ; recipe1...
targ2: | targ1 ; recipe2...
Now make -j targ1 targ2 does what you want. Disadvantage? make targ2 will always try to build targ1 first (sequentially). This may (or may not) be a show-stopper for you.
EDIT
Another unsatisfactory strategy is to explicitly look at $MAKECMDGOALS, which lists the targets you specified on the command-line. Still a fragile solution as it is broken when someone uses dependencies inside the Makefile to get things built (a not unreasonable action).
Let's say your makefile contains two independent targets targ1 and targ2. Basically they remain independent until someone specifies on the command-line that they must both be built. In this particular case you break this independence. Consider this snippet:
$(and $(filter targ1,${MAKECMDGOALS}),$(filter targ2,${MAKECMDGOALS}),$(eval targ1: | targ2))
Urk! What's going on here?
Make evaluates the $(and)
It first has to expand $(filter targ1,${MAKECMDGOALS})
Iff targ1 was specified, it goes on to expand $(filter targ2,${MAKECMDGOALS})
Iff targ2 was also specified, it goes on to expand the $(eval), forcing the serialization of targ1 and targ2.
Note that the $(eval) expands to nothing (all its work was done as a side-effect), so that the original $(and) always expands to nothing at all, causing no syntax error.
Ugh!
[Now that I've typed that out, the considerably simpler prog2: | $(filter prog1,${MAKECMDGOALS}) occurs to me. Oh well.]
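Spelled out, that simpler variant might look like this (a sketch; the order-only prerequisite is added only when prog1 is also a command-line goal):
prog1: prog1.hs mylib.hs
	ghc --make prog1.hs

prog2: prog2.hs mylib.hs | $(filter prog1,${MAKECMDGOALS})
	ghc --make prog2.hs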
YMMV and all that.
I'm not familiar with ghc, but the correct solution would be to get the two runs of ghc to use different build folders, then they can happily run in parallel.
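For example, something along these lines (untested sketch; -outputdir is the GHC flag that redirects the .o and .hi files, so the two builds no longer share them):
prog1: prog1.hs mylib.hs
	mkdir -p build/prog1
	ghc --make -outputdir build/prog1 -o prog1 prog1.hs

prog2: prog2.hs mylib.hs
	mkdir -p build/prog2
	ghc --make -outputdir build/prog2 -o prog2 prog2.hs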
Since I got stuck at the same problem, here is another pointer in the direction that make does not provide the functionality you describe:
From the GNU Make Manual:
It is important to be careful when using parallel execution (the -j switch; see Parallel Execution) and archives. If multiple ar commands run at the same time on the same archive file, they will not know about each other and can corrupt the file.
Possibly a future version of make will provide a mechanism to circumvent this problem by serializing all recipes that operate on the same archive file. But for the time being, you must either write your makefiles to avoid this problem in some other way, or not use -j.
What you are attempting, and what I was attempting (using make to insert data into a SQLite3 database), suffer from the exact same problem.
I needed to separate the compilation from the other steps (cleaning, building dirs and linking), as I wanted to run the compilation on multiple cores with the -j flag.
I managed to solve this with different makefiles including and calling each other. Only the "compile" makefile runs in parallel with all the cores; the rest of the process is synchronous.
I divided my makefile into 3 separate scripts:
settings.mk: contains all the variables and flag definitions
makefile: has all the targets except the compilation one (it has the .NOTPARALLEL directive). It calls compile.mk with the -j flag
compile.mk: contains only the compile operation (without .NOTPARALLEL)
In settings.mk I have:
CC = g++
DB = gdb
RM = rm
MD = mkdir
CP = cp
MAKE = mingw32-make
BUILD = Debug
DEBUG = true
[... all other variables and flags needed, directories etc ...]
In makefile I have the link and compilation targets like this:
include .makefiles/settings.mk
[... OTHER TARGETS (clean, directories etc)]
compilation:
	@echo Compilation
	@$(MAKE) -f .makefiles/compile.mk --silent -j 8 -Oline
#Link
$(TARGET): compilation
	@echo -e Linking $(TARGET)
	@$(CC) $(LNKFLAGS) -o $(TARGETDIR)/$(TARGET) $(OBJECTS) $(LIBDIRS) $(LIB)
#Non-File Targets
.PHONY: all prebuild release rebuild clean resources directories run debug
.NOTPARALLEL: all
# include dependency files (*.d) if available
-include $(DEPENDS)
And this is my compile.mk:
include .makefiles/settings.mk
#Default
all: $(OBJECTS)
#Compile
$(BUILDDIR)/%.$(OBJEXT): $(SRCDIR)/%.$(SRCEXT)
	@echo -e Compiling: $<
	@$(MD) -p $(dir $@)
	@$(CC) $(COMFLAGS) $(INCDIRS) -c $< -o $@
#Non-File Targets
.PHONY: all
# include dependency files (*.d) if available
-include $(DEPENDS)
So far, it's working.
Note that I'm calling compile.mk with the -j flag AND -Oline so that parallel processing doesn't mess up the output.
Any syntax coloring should be set in the main makefile script, since the -O flag invalidates escape color codes.
I hope it can help.
I had a similar problem so ended up solving it on the command line, like so:
make target1; make target2
to force it to do the targets sequentially.

GNU Make to list modified source files since last invocation of make?

I know that
Make figures out automatically which files it needs to update, based on which source files have changed. It also automatically determines the proper order for updating files, in case one non-source file depends on another non-source file.
As a result, if you change a few source files and then run Make, it does not need to recompile all of your program. It updates only those non-source files that depend directly or indirectly on the source files that you changed.
Now I want to know whether I can ask Make to list out these modified sources?
You'll need a dummy file which uses all of your sources as prerequisites:
mod_list: foo.c bar.cc baz.cpp
	@echo modified sources: $?
	@touch $@
You can keep the list of sources as a separate variable:
WATCHED_SOURCES = foo.c bar.cc baz.cpp
mod_list: $(WATCHED_SOURCES)
	@echo modified sources: $?
	@touch $@
Or use a wildcard to look at all sources present:
WATCHED_SOURCES = $(wildcard *.c *.cc *.cpp *.whatever)
mod_list: $(WATCHED_SOURCES)
	@echo modified sources: $?
	@touch $@
One easy way is to use the dry run option to make, which is either -n or --dry-run or a couple of other choices, depending in part on exactly which implementation you are using. This tells you what make would do if executed, which will, inter alia, show you which source files it would recompile.
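For example (the output shown is purely illustrative; the actual commands depend on your makefile):
$ make --dry-run
g++ -c -o foo.o foo.cpp
g++ -o program foo.o bar.o baz.o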
It's all in the man pages.

Code generation and make rule expansion

Assume I have a make rule:
.PHONY: gen
gen: auto.template
	generate-sources auto.template
that creates a bunch of files, for example auto1.src, auto2.src, auto3.src and so on.
If I now have rules to build targets from *.src files, like this:
$(patsubst %.src,%.target,$(wildcard *.src)): %.target: %.src
	build $< > $@
How can I tell make to first execute the gen rule and then expand the preconditions for the second rule template? GNU extensions are welcome.
Note: I would like to keep it in one make invocation; A trivial solution to this would be to put the second rule in a secondary Makefile.secondrun and call $(MAKE) -f Makefile.secondrun after gen was processed. But I was wondering if there is a better option.
Building off Beta's answer, here's how you can do it using makefile remaking in GNU make, which is not the same thing as recursive make. Rather, it updates an included makefile using a rule in the main makefile, then restarts the original make instance. This is how *.d dependency files are typically generated and used.
# Get the list of auto-generated sources. If this file doesn't exist, or if it is older
# than auto.template, it will get built using the rule defined below, according to the
# standard behavior of GNU make. If autosrcs.mk is rebuilt, GNU make will automatically
# restart itself after autosrcs.mk is updated.
include autosrcs.mk
# Once we have the list of auto-generated sources, getting the list of targets to build
# from them is a simple pattern substitution.
TARGETS=$(patsubst %.src,%.target,$(AUTO_SRCS))
all: $(TARGETS)
# Rule describing how to build autosrcs.mk. This generates the sources, then computes
# the list of autogenerated sources and writes that to autosrcs.mk in the form of a
# make variable. Note that we use *shell* constructs to get the list of sources, not
# make constructs like $(wildcard), which could be expanded at the wrong time relative
# to when the source files are actually created.
autosrcs.mk: auto.template
	./generate-sources auto.template
	echo "AUTO_SRCS=`echo *.src`" > autosrcs.mk
# How to build *.target files from *.src files.
%.target: %.src
	@echo 'build $< > $@'
Short answer: you can't. Make determines all of the rules it will have to execute before it executes any rule.
Longer answer: maybe you can. As you say, you can use recursive Make explicitly, or surreptitiously by, say, building a file which your makefile will include (I'm looking at you, Jack Kelly). Or if you could somehow obtain a list of the files which gen will build, you could write a rule around that. Or you could take a leap of faith like this:
%.target: %.src
	build $< > $@
%.src: gen;

makefile conditionals

Note: using MinGW's make (should be GNU make)
I have a couple of -include statements in my makefile to import dependencies which were generated using g++ -MM. However, I would like to do this only when necessary. I have several different build targets, and I don't want all of their respective dependency files to be included, since this takes a while (suppose I'm running make clean: no need to include them in this case).
Here's the format of my makefile.
DEPS_debug = $(patsubst %.cpp,build_debug/%.d,$(SRC))
OBJ_debug = $(patsubst %.cpp,build_debug/%.o,$(SRC))
all: program_debug
-include $(DEPS_debug) #make: include: Command not found
program_debug: $(OBJ_debug)
	$(CC) $(CFLAGS) $(OBJ_debug) -o $@
If you really don't want to include those files needlessly, you have a couple of options:
You can put in a conditional as Diego Sevilla suggests (but I would recommend using MAKECMDGOALS so that you can write a more flexible version, specific to targets, e.g. you'll include foo.d if and only if you're making foo.o).
You can use make recursively (heresy!), invoking $(MAKE) for each target object, using a makefile that includes that target's dependencies.
But actually including the file takes negligible time; it's the rebuilding of the file (automatic for any included file that's out of date) that takes time.
If needless rebuilding is what you want to avoid, you can use a very clever trick. When must foo.d be rebuilt? Only when something about foo has changed. But in that case foo.o must also be rebuilt. So don't have a separate rule for foo.d, just rebuild it as a side effect of making foo.o. That way you can include all dependency files and not waste time rebuilding them if they aren't needed.
EDIT:
I'm astounded that merely including these files can add 2-3 seconds to make clean. My last paragraph is off the mark, so let me expand on the first two options.
If all is the only target for which these files should be included, and you make all from the command line (and not e.g. make all tests tarball install kitchenSink), then this will do it:
ifeq ($(MAKECMDGOALS),all)
-include $(DEPS_debug)
endif
Note that this will not include foo.d if you make foo.o. You can write a more sophisticated conditional, something like
$(foreach targ,$(MAKECMDGOALS),$(eval $(call include_deps,$(targ)))...
but that's pretty advanced, so let's get a simple version working first.
If you'd rather avoid the conditional and use recursive Make, the simplest way is to split the makefile in two:
makefile:
all:
	$(MAKE) -f makefile.all
clean:
	rm whatever
...other rules
makefile.all:
DEPS_debug = $(patsubst %.cpp,build_debug/%.d,$(SRC))
OBJ_debug = $(patsubst %.cpp,build_debug/%.o,$(SRC))
-include $(DEPS_debug)
all: program_debug
program_debug: $(OBJ_debug)
	$(CC) $(CFLAGS) $(OBJ_debug) -o $@
Indenting a line by a TAB makes make think it's a command to be passed to the shell (as you found out). It doesn't work that way.
The - in front of include suppresses errors that might result from DEPS_debug not existing (e.g. when running clean or release without having had a dependency-file-generating call first). Since DEPS_debug is not a dependency of those rules (clean / release), your dependency files do not get generated when you call them, and everything is fine. I don't really see the problem you're having - you don't have to make the include conditional.
Perhaps you'd like to change your approach, though. Instead of having a separate *.d target, with a separate -M preprocessor pass, you might want to try something like -MMD -MP which generates the dependency files inline during code generation, in your standard *.c -> *.o pass.
(I know this sounds completely wrong at first, but when you think about it, it makes sense. Makefile logic is a bit backwards that way, unless you're familiar with functional programming.)
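A minimal sketch of that approach, using the variable names from the question (assuming $(CC) is g++ or another GCC-compatible compiler):
# Dependency files are produced as a side effect of compilation: -MMD writes
# build_debug/foo.d next to build_debug/foo.o, and -MP adds phony targets so
# deleted headers don't break the build.
build_debug/%.o: %.cpp
	$(CC) $(CFLAGS) -MMD -MP -c $< -o $@

# Include whatever .d files exist so far; the leading '-' ignores missing ones.
-include $(DEPS_debug)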
includes are independent of the rules, as they are makefile directives, not compilation commands. You can, however, use makefile conditionals based on special makefile variables such as MAKECMDGOALS, which is set to the goals given on the command line:
ifeq ($(MAKECMDGOALS),all)
-include whatever
endif
This includes the files only when all is the goal explicitly given on the command line (if no goal is given, MAKECMDGOALS is empty). You can change the condition to check for the exact goals for which you want the other sub-makefiles included.
