I'm trying to run these simple makefile commands but get the error 'Nothing to be done for 'all''.
FILES = file1.c file2.c file3.c
all:test
test:
for file in $(FILES);
do
echo $$file;
done
The target test has no prerequisites and therefore no reason to be rebuilt, and the target all inherits that. It has a recipe, but it should also list FILES as its prerequisites. What you're doing appears to be ingredients-first, but test is the target, and working backwards from targets is what make is best at. You may benefit from an article called "Auto-Dependency Generation", which takes the opposite approach (you appear to think like I do).
test: $(FILES)
Then you could do something like the following:
$(FILES:.c=.o): %.o: %.c
$(CC) -c -o $@ $<
The first part is a set of possible targets, the list of objects corresponding to the list of sources, and the second part is a specific but nameless object (it will assume the name of the corresponding source). Later on, the target, e.g. test, can be the name of your executable, taking these objects as both its prerequisites and the objects to link statically. For my purposes I typically use shared libraries, but that is irrelevant to the question at hand.
Edit: untested, will revise if issues ensue
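Pulling those pieces together, a minimal sketch of the corrected makefile might look like this (equally untested; OBJS is just a name I'm introducing here for the object list, and recipe lines must start with a tab):
FILES = file1.c file2.c file3.c
OBJS = $(FILES:.c=.o)

all: test

# static pattern rule: each object is built from its corresponding source
$(OBJS): %.o: %.c
	$(CC) -c -o $@ $<

# link the objects into the executable
test: $(OBJS)
	$(CC) -o $@ $^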
Related
I have inherited a large, branched project that requires a volatile set of .a archives $(LIB_FILES) to be included in the link target, located in some directories $(LIB_DIRS). I can write an expression like this:
LIBDEP = $(foreach ldir, $(LIB_DIRS), \
$(filter $(addprefix %/, $(LIB_FILES)), $(wildcard $(ldir)/* )))
The problem is that they might not exist at the moment of make's invocation; they would be built by invoking $(MAKE) inside another target's rule, which is a prerequisite of the link step.
The other problem is that the actual list of files to be created varies with external factors determined at their build steps, so I can't hard-code it properly without turning the makefile into a spaghetti mess, and said variable is not re-evaluated at the moment the link command is invoked.
I suspect that the $(eval) function can be used somehow, but the manual is not very forthcoming, and I haven't found examples of its use in this way.
Toolchain: GCC and binutils, make 3.81
Another solution is to create an explicit dependency of your make script on the output of the step which currently creates the variable $(LIB_FILES). This is what the manual deals with in the chapter How makefiles are remade, and it aims at the technique make is best at, namely deriving dependencies from the existence and timestamps of files (instead of variables). The following hopefully depicts your situation, with the process of deducing a new set of libraries simulated by the two variables $(LIBS_THIS_TIME) and $(LIB_CONFIG_SET).
LIBS_THIS_TIME = foo.a:baz.a:bar.a
LIB_CONFIG_SET = $(subst :,_,$(LIBS_THIS_TIME))
include libdeps.d
linkstep:
@echo I am linking $^ now
touch $@
libdeps.d: $(LIB_CONFIG_SET)
-rm libdeps.d
$(foreach lib,$(subst :, ,$(LIBS_THIS_TIME)),echo linkstep: $(lib) >> libdeps.d;)
$(LIB_CONFIG_SET):
touch $@
If make finds that libdeps.d is not up to date with your current library configuration, it is remade before make executes any other rule, even though it is not the first target in the makefile. This way, if your build process creates a new or different set of libraries, libdeps.d is remade first, and only then does make carry on with the other targets in your top makefile, now with the correct dependency information.
It sometimes happens that you need to invoke make several times in succession. One possibility to do this is to use conditionals:
ifeq ($(STEP),)
all:
<do-first-step>
$(MAKE) STEP=2 $@
else ifeq ($(STEP),2)
all:
<do-second-step>
$(MAKE) STEP=3 $@
else ifeq ($(STEP),3)
all:
<do-third-step>
endif
In each step you can generate new files and have them existing for the next step.
For some reason make is recompiling the source files that are outside my
project tree every time. I don't normally have such files, but I noticed it while I was testing this Makefile and I would like to understand why. It behaves normally with respect to the source files inside my project tree. Here are my rules. First I have a function
src_to_obj = $(basename $(addprefix $(build_dir)/,$(1))).o
Then I populate the list of program_objects by calling foreach on the list of sources and applying the above function to each one.
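A sketch of what that population step might look like, assuming the source list is held in a variable named program_sources (the other names mirror the ones above):
# apply src_to_obj to every source file to get the object list
program_objects := $(foreach src,$(program_sources),$(call src_to_obj,$(src)))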
$(bin_dir)/$(program_name): $(program_objects)
$(linker) $(link_flags) $(CPPFLAGS) $(LDFLAGS) $(TARGET_ARCH) \
$^ $(OUTPUT_OPTION)
define build_template =
$$(call src_to_obj,$(1)) : $(1) $$(call dependency_file,$(1))
# make dependency and build directories
$$(compile) $$(OUTPUT_OPTION) $$<
# rename dependency
endef
$(foreach source,$(program_sources),\
$(eval $(call build_template,$(source))))
As I said, this works just fine for the source files in the project directory. For example, the Makefile lives in ~/Projects/foo and most of the source code is in ~/Projects/foo/src. But if I pull in a file A.cc from the home directory, then that file is recompiled every time I run make.
EDIT:
Following Mad Scientist's suggestion, I ran make -d and examined the output. That made the problem clear. Basically, if I feed this a source file ~/bad.cc, then the rule looks like:
build/~/bad.o : ~/bad.cc .deps/~/bad.d
...
However, my script for the rule is creating the file .deps/home/jim/bad.d. Therefore, every time I run make the dependency file is "not there" and the object file has to be remade. I am using the exact same expressions to refer to the file name in the script. However I am guessing that the shell is expanding the ~ to /home/jim. So somehow I have to escape the ~ in the recipe. Thanks to Mad Scientist for pointing out the -d option for make.
I don't have time to make a short self contained correct example right now, but if I can't solve this problem I will come up with one and post it here later.
EDIT EDIT:
So I solved the problem with the following hack:
program_sources := $(shell echo $(program_sources))
This pre-expands all the names (the shell turns ~ into /home/jim once), so they are never different in the shell.
Is there a way how to ask gmake to never run two targets from a set in parallel?
I don't want to use .NOTPARALLEL, because it forces the whole Makefile to be run sequentially, not just the required part.
I could also add dependencies so that one depends on another, but then (apart from being ugly) I'd need to build all of them in order to build the last one, which isn't necessary.
The reason why I need this is that (only a) part of my Makefile invokes ghc --make, which takes care of its dependencies itself. And it's not possible to run it in parallel on two different targets, because if the two targets share some dependency, they can rewrite each other's .o file. (But ghc is fine with being called sequentially.)
Update: To give a specific example. Let's say I need to compile two programs in my Makefile:
prog1 depends on prog1.hs and mylib.hs;
prog2 depends on prog2.hs and mylib.hs.
Now if I invoke ghc --make prog1.hs, it checks its dependencies, compiles both prog1.hs and mylib.hs into their respective object and interface files, and links prog1. The same happens when I call ghc --make prog2.hs. So if the two commands run in parallel, one will overwrite the other's mylib.o, causing it to fail badly.
However, I need prog1 not to depend on prog2, nor vice versa, because they should be compilable separately. (In reality they're very large, with a lot of modules, and requiring them all to be compiled slows development considerably.)
Hmmm, could do with a bit more information, so this is just a stab in the dark.
Make doesn't really support this, but you can sequential-ise two targets in a couple of ways. First off, a real use for recursive make:
targ1: ; recipe1...
targ2: ; recipe2...
both-targets:
${MAKE} targ1
${MAKE} targ2
So here you can just make -j both-targets and all is fine. Fragile though, because make -j targ1 targ2 still runs in parallel. You can use dependencies instead:
targ1: ; recipe1...
targ2: | targ1 ; recipe2...
Now make -j targ1 targ2 does what you want. Disadvantage? make targ2 will always try to build targ1 first (sequentially). This may (or may not) be a show-stopper for you.
EDIT
Another unsatisfactory strategy is to explicitly look at $MAKECMDGOALS, which lists the targets you specified on the command-line. Still a fragile solution as it is broken when someone uses dependencies inside the Makefile to get things built (a not unreasonable action).
Let's say your makefile contains two independent targets targ1 and targ2. Basically they remain independent until someone specifies on the command-line that they must both be built. In this particular case you break this independence. Consider this snippet:
$(and $(filter targ1,${MAKECMDGOALS}),$(filter targ2,${MAKECMDGOALS}),$(eval targ1: | targ2))
Urk! What's going on here?
Make evaluates the $(and)
It first has to expand $(filter targ1,${MAKECMDGOALS})
Iff targ1 was specified, it goes on to expand $(filter targ2,${MAKECMDGOALS})
Iff targ2 was also specified, it goes on to expand the $(eval), forcing the serialization of targ1 and targ2.
Note that the $(eval) expands to nothing (all its work was done as a side-effect), so that the original $(and) always expands to nothing at all, causing no syntax error.
Ugh!
[Now that I've typed that out, the considerably simpler prog2: | $(filter prog1,${MAKECMDGOALS})
occurs to me. Oh well.]
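Spelled out with the targets from the first snippet, that simpler version would look something like this (a sketch, not tested):
targ1: ; recipe1...
targ2: | $(filter targ1,${MAKECMDGOALS})
targ2: ; recipe2...
Now make -j targ1 targ2 serialises the two, while a plain make targ2 no longer drags targ1 in.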
YMMV and all that.
I'm not familiar with ghc, but the correct solution would be to get the two runs of ghc to use different build folders, then they can happily run in parallel.
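For example, something along these lines (just a sketch; it assumes a GHC new enough to support -outputdir, which redirects the .o and .hi files so the two builds no longer share them):
prog1: prog1.hs mylib.hs
	ghc --make prog1.hs -outputdir build/prog1 -o prog1

prog2: prog2.hs mylib.hs
	ghc --make prog2.hs -outputdir build/prog2 -o prog2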
Since I got stuck on the same problem, here is another pointer in the direction that make does not provide the functionality you describe:
From the GNU Make Manual:
It is important to be careful when using parallel execution (the -j switch; see Parallel Execution) and archives. If multiple ar commands run at the same time on the same archive file, they will not know about each other and can corrupt the file.
Possibly a future version of make will provide a mechanism to circumvent this problem by serializing all recipes that operate on the same archive file. But for the time being, you must either write your makefiles to avoid this problem in some other way, or not use -j.
What you are attempting, and what I was attempting (using make to insert data into a SQLite3 database), suffer from the exact same problem.
I needed to separate the compilation from the other steps (cleaning, building directories and linking), because I wanted to run the compilation across multiple cores with the -j flag.
I managed to solve this with different makefiles including and calling each other. Only the "compile" makefile runs in parallel with all the cores; the rest of the process is synchronous.
I divided my makefile into 3 separate scripts:
settings.mk: contains all the variables and flag definitions
makefile: has all the targets except the compilation one (it has the .NOTPARALLEL directive). It calls compile.mk with the -j flag
compile.mk: contains only the compile operation (without .NOTPARALLEL)
In settings.mk I have:
CC = g++
DB = gdb
RM = rm
MD = mkdir
CP = cp
MAKE = mingw32-make
BUILD = Debug
DEBUG = true
[... all other variables and flags needed, directories etc ...]
In makefile I have the link and compilation targets like this:
include .makefiles/settings.mk
[... OTHER TARGETS (clean, directories etc)]
compilation:
@echo Compilation
@$(MAKE) -f .makefiles/compile.mk --silent -j 8 -Oline
#Link
$(TARGET): compilation
@echo -e Linking $(TARGET)
@$(CC) $(LNKFLAGS) -o $(TARGETDIR)/$(TARGET) $(OBJECTS) $(LIBDIRS) $(LIB)
#Non-File Targets
.PHONY: all prebuild release rebuild clean resources directories run debug
.NOTPARALLEL: all
# include dependency files (*.d) if available
-include $(DEPENDS)
And this is my compile.mk:
include .makefiles/settings.mk
#Default
all: $(OBJECTS)
#Compile
$(BUILDDIR)/%.$(OBJEXT): $(SRCDIR)/%.$(SRCEXT)
@echo -e Compiling: $<
@$(MD) -p $(dir $@)
@$(CC) $(COMFLAGS) $(INCDIRS) -c $< -o $@
#Non-File Targets
.PHONY: all
# include dependency files (*.d) if available
-include $(DEPENDS)
So far, it's working.
Note that I'm calling compile.mk with the -j flag AND -Oline so that parallel processing doesn't mess up the output.
Any syntax coloring can be set in the main makefile script, since the -O flag invalidates color escape codes.
I hope it helps.
I had a similar problem so ended up solving it on the command line, like so:
make target1; make target2
to force it to do the targets sequentially.
Note: using MinGW's make (should be GNU make)
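A small variant of the same idea (my own tweak, not part of the original answer): with && instead of ; the second build only runs if the first one succeeded:
make target1 && make target2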
I have a couple of -include statements in my makefile to import dependencies that were generated using g++ -MM. However, I would like to do this only when necessary. I have several different build targets and I don't want all of their respective dependency files to be included, since this takes a while (suppose I'm running make clean: there's no need to include them in that case).
Here's the format of my makefile.
DEPS_debug = $(patsubst %.cpp,build_debug/%.d,$(SRC))
OBJ_debug = $(patsubst %.cpp,build_debug/%.o,$(SRC))
all: program_debug
-include $(DEPS_debug) #make: include: Command not found
program_debug: $(OBJ_debug)
$(CC) $(CFLAGS) $(OBJ_debug) -o $@
If you really don't want to include those files needlessly, you have a couple of options:
You can put in a conditional as Diego Sevilla suggests (but I would recommend using MAKECMDGOALS so that you can write a more flexible version, specific to targets, e.g. you'll include foo.d if and only if you're making foo.o).
You can use make recursively (heresy!), invoking $(MAKE) for each target object, using a makefile that includes that target's dependencies.
But actually including the file takes negligible time, it's the rebuilding of the file (automatic for any included file that's out of date) that takes time.
If needless rebuilding is what you want to avoid, you can use a very clever trick. When must foo.d be rebuilt? Only when something about foo has changed. But in that case foo.o must also be rebuilt. So don't have a separate rule for foo.d; just rebuild it as a side effect of making foo.o. That way you can include all dependency files and not waste time rebuilding them if they aren't needed.
EDIT:
I'm astounded that merely including these files can add 2-3 seconds to make clean. My last paragraph is off the mark, so let me expand on the first two options.
If all is the only target for which these files should be included, and you make all from the command line (and not e.g. make all tests tarball install kitchenSink), then this will do it:
ifeq ($(MAKECMDGOALS),all)
-include $(DEPS_debug)
endif
Note that this will not include foo.d if you make foo.o. You can write a more sophisticated conditional, something like
$(foreach targ,$(MAKECMDGOALS),$(eval $(call include_deps,$(targ)))...
but that's pretty advanced, so let's get a simple version working first.
If you'd rather avoid the conditional and use recursive Make, the simplest way is to split the makefile in two:
makefile:
all:
$(MAKE) -f makefile.all
clean:
rm whatever
...other rules
makefile.all:
DEPS_debug = $(patsubst %.cpp,build_debug/%.d,$(SRC))
OBJ_debug = $(patsubst %.cpp,build_debug/%.o,$(SRC))
-include $(DEPS_debug)
all: program_debug
program_debug: $(OBJ_debug)
$(CC) $(CFLAGS) $(OBJ_debug) -o $@
Indenting a line by a TAB makes make think it's a command to be passed to the shell (as you found out). It doesn't work that way.
The - in front of include suppresses errors that might result from DEPS_debug not existing (e.g. when running clean or release without having had a dependency-file-generating call first). Since DEPS_debug is not a dependency of those rules (clean / release), your dependency files do not get generated when you call them, and everything is fine. I don't really see the problem you're having - you don't have to make the include conditional.
Perhaps you'd like to change your approach, though. Instead of having a separate *.d target, with a separate -M preprocessor pass, you might want to try something like -MMD -MP, which generates the dependency files inline during code generation, in your standard *.c -> *.o pass.
(I know this sounds completely wrong at first, but when you think about it, it makes sense. Makefile logic is a bit backwards that way, unless you're familiar with functional programming.)
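A sketch of what that could look like for the layout in this question (assuming g++/GCC: with -o build_debug/foo.o, the -MMD flag writes build_debug/foo.d as a side effect, and -MP adds dummy targets so deleted headers don't break the build):
# compile and refresh the .d file in one pass
build_debug/%.o: %.cpp
	$(CC) $(CFLAGS) -MMD -MP -c $< -o $@

# include whichever dependency files already exist; nothing is rebuilt separately
-include $(DEPS_debug)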
includes are independent of the rules, as they are makefile indications, not compilation indications. You can, however, use makefile conditionals based on special makefile variables such as MAKECMDGOALS, which holds the goals given on the command line:
ifeq ($(MAKECMDGOALS),all)
-include whatever
endif
This is included only when all is the goal given on the command line. (Note that MAKECMDGOALS is empty when make is run without arguments, even if all is the default goal.) You can change the condition to specify the exact goal you want to check before including other sub-makefiles.
When I change a Makefile, its rules may have changed, so they should be reevaluated, but make doesn't seem to think so.
Is there any way to say, in a Makefile, that all of its targets, no matter which, depend on the Makefile itself?
(Regardless of its name.)
I'm using GNU make.
This looks like one more simple, useful, logical thing that Make should be able to do, but can't.
Here is a workaround. If the clean rule is set up correctly, Make can execute it whenever the makefile has been altered, using an empty dummy file as a marker.
-include dummy
dummy: Makefile
@touch $@
@$(MAKE) -s clean
This will work for most targets, that is targets that are actual files and that are removed by clean, and any targets that depend on them. Side-effect targets and some PHONY targets will slip through the net.
Since GNU make version 4.3 this is possible with the use of these two special variables:
.EXTRA_PREREQS
To add new prerequisites to every target
MAKEFILE_LIST
To get the path of the makefile
To have every target depend on the current make file:
Put the following line near the top of the file (before any include, since included files are appended to MAKEFILE_LIST):
.EXTRA_PREREQS:= $(abspath $(lastword $(MAKEFILE_LIST)))
To have every target depend on the current make file and also the make files which were included
Put the following line at the end of your file:
.EXTRA_PREREQS+=$(foreach mk, ${MAKEFILE_LIST},$(abspath ${mk}))
The only answer I know to this is to add makefile explicitly to the dependencies. For example,
%.o: %.c makefile
$(CC) $(CFLAGS) -c $<