Preface:
Yes, my makefiles are written badly.
No, I/we didn't write them; we inherited this code base from another company.
I want to know if it's possible to fix my problem WITHOUT rewriting them.
Question
Is there a way to reference targets from another makefile and use those as prerequisites?
Say you have:
all: libs binary
binary: # I need to add prereqs here
blah
blah2
blah3
For binary, I need targets from other makefiles as prereqs.
I cannot just include those makefiles, and therefore those targets, because those makefiles define identical variables but with different values.
Is it possible to do something like:
binary: C:/mk1:foo C:/mk2:bar
blah
blah2
blah3
UPDATE
In case it's not clear, makefiles C:/mk1 and C:/mk2 are part of the same makefile project, which is executed via some top-level makefile with make --jobs=X, so in theory all makefiles could be built in parallel.
Sometimes Recursive Make [duhn-duhn-duhnnnn!] is the right tool for the job:
binary: foo bar
blah
blah2
blah3
.PHONY: foo bar
foo:
$(MAKE) -f mk1 $@
bar:
$(MAKE) -f mk2 $@
The .PHONY declaration forces Make to execute those rules and invoke the other makefiles to (perhaps) rebuild foo and bar even if files by those names already exist (because this makefile doesn't know what prerequisites they may have).
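To see why that matters, here is an illustrative transcript (my addition, not from the original answer), assuming a stale file named foo is sitting in the top-level directory:

$ touch foo
$ make binary     # without the .PHONY line: make reports 'foo' is up to date and never runs mk1
$ make binary     # with the .PHONY line: make always recurses into mk1, which decides for itself whether foo needs rebuilding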
What about using the include (or sinclude) directive to incorporate the inherited makefiles? This should work as long as your own targets have different names.
You can also concatenate makefiles by specifying multiple -f makefile options; they are read in the order given.
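As a sketch of the concatenation form (using the C:/mk1 and C:/mk2 paths from the question; binary.mk is a hypothetical name for the makefile that defines binary):

$ make -f C:/mk1 -f C:/mk2 -f binary.mk binary

All three files are read in order as if they were one makefile, which also means the identical-variable clash mentioned in the question bites here just as it would with include.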
Related
Imagine we have an existing (untouchable) Makefile with target "foo", and another included Makefile which I can modify. I would like to add a new target called "runafter" which shall be executed after "foo" was run. So the user keeps calling "foo" and some additional code shall be run afterwards.
The usual way to achieve this would be to rename the original ones and do something like:
foo_old:
...
foo: foo_old
# run some code or call another target explicitly
$(MAKE) runafter
But that only works if you can rename foo. If not, how could I extend the behavior of the existing target? Everything I tried with foo: ... apparently overrides the old foo target (with a warning). But I just want to run some code afterwards!
I do not see how to do this from the included makefile but if you use GNU make then you can add a makefile named makefile instead of Makefile:
$ cat makefile
foo:
$(MAKE) -f Makefile $@
$(MAKE) runafter
runafter:
...
From the GNU make man page:
If no -f option is present, make will look for the makefiles GNUmakefile, makefile, and Makefile, in that order.
So you can also name it GNUmakefile if you wish. With one or the other running make foo should do what you want.
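For illustration (not part of the original answer), the layout and invocation would look like this, with the wrapper saved next to the untouchable Makefile:

$ ls
Makefile  makefile
$ make foo    # reads makefile, which runs 'make -f Makefile foo' and then 'make runafter'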
Using make's "Remaking Makefiles" feature, I am generating parts of my makefile with include directives (see "Makefile: defining rules and prerequisites in recipes"). Now I'm stuck: I cannot see how to express dependencies between the included makefiles; they all seem to be evaluated at once.
Consider the following minimal makefile that illustrates my problem:
all:
-include foo.make
-include bar.make
foo.make: Makefile
echo FOO:=blub bla baz > foo.make
bar.make: Makefile foo.make
echo BAR:=$(FOO) > bar.make
If I now run make I will get:
$ cat foo.make
FOO:=blub bla baz
$ cat bar.make
BAR:=
Why? Since bar.make depends on foo.make, shouldn't the evaluation of bar.make wait until it successfully included foo.make?
And how do I fix this problem and make sure that bar.make is either re-evaluated later or only evaluated once foo.make exists, is included and can define the variable BAR?
The reason I cannot combine foo.make and bar.make into a single makefile and rule is two-fold:
Firstly, in my real setup, bar.make depends on more intermediate targets which in turn transitively depend on foo.make. So at the time foo.make can be created, the content of bar.make cannot yet be made.
Secondly, in my real setup, foo.make and bar.make do not just define variables but also contain define/endef blocks that are expanded with $(eval ...). So I have to write:
-include makefile_with_prerequisite_variables
define MYDEF
sometarget-$1: $(TARGET_$1_PREREQUISITES)
[...]
endef
-include makefile_with_eval_call_statements
The content of makefile_with_prerequisite_variables and makefile_with_eval_call_statements cannot go into a single makefile snippet:
If I put the contents of makefile_with_eval_call_statements above MYDEF, together with makefile_with_prerequisite_variables, then the $(eval $(call MYDEF,...)) statements in it would not work because MYDEF is only defined afterwards.
If I put the contents of makefile_with_prerequisite_variables below MYDEF, together with makefile_with_eval_call_statements, then the recipes defined by MYDEF would not have the proper prerequisites because the $(TARGET_$1_PREREQUISITES) variables would only be defined afterwards by makefile_with_prerequisite_variables.
In summary, I need to include two different makefiles where one depends upon the other. I do not know how I can express this relationship such that the content of one makefile would only be created after the other makefile is up-to-date and included into the main makefile.
First, your makefile creation in this simple example can easily be fixed by escaping the value of $(FOO) so that it's not expanded when bar.make is created but rather deferred until it's read in. So:
bar.make: Makefile foo.make
echo 'BAR:=$$(FOO)' > $@
However, that might not be sufficient in your more complex real-life makefiles.
GNU make works like this: first parse all the makefiles. Then for every included makefile, treat it as a goal and try to build it (e.g., act as if the user invoked make include1.mk include2.mk include3.mk ...). Then at the end of that, if any of the included makefiles was rebuilt, re-exec ourselves and start the entire process over from scratch.
GNU make does NOT work like this: parse makefiles, try to rebuild the first included makefile and if it's rebuilt, re-exec; if it's not rebuilt go on to the next included makefile, etc.
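You can watch the re-exec happen with the built-in MAKE_RESTARTS variable, which is empty on the first pass and holds the restart count afterwards. A minimal sketch (my illustration; generated.mk is a hypothetical name):

all:
	@echo GENERATED is $(GENERATED)

$(info restart count: [$(MAKE_RESTARTS)])

-include generated.mk

generated.mk:
	echo 'GENERATED := yes' > $@

Running make prints the restart count once for the first pass, once after the restart, and finally echoes the value defined in the generated file.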
A simple trick you can use if you have to have this type of order is to put the include of bar.make into foo.make:
all:
-include foo.make
foo.make: Makefile
printf -- '-include bar.make\n' > $@
echo FOO:=blub bla baz >> $@
bar.make: Makefile foo.make
echo 'BAR:=$$(FOO)' > $@
By doing this you ensure that if foo.make doesn't exist, make can't see the include of bar.make and so it won't try to build it. Only after the first re-exec will make see the include of bar.make and try to build it.
One thing: if you get the latest version of GNU make you no longer need to use the -include trick. You can just use include even with generated makefiles.
I have a makefile in directory foo and would like to use the same makefile in a subdirectory bar. I have been doing the following:
all:
<do work in foo>
cd bar; \
make -f ../Makefile <target to make in bar>
This gets very messy when I try to do target specific variable values as I need to pass them on the command line when calling make in bar. Is there a cleaner way to do this?
I cannot tell from the question whether the following solution suits your needs; it might or might not work for you.
If your situation is that you simply want the same Makefile features available, include could be a solution. You can create a Makefile in directory bar in which you do everything you need specific to bar, and besides that, you do:
include ../foo/Makefile
Caveat! This doesn't work straightforwardly: there cannot be two recipes for the same target. For example, if foo/Makefile runs recipeFoo for all, and you want bar/Makefile to additionally run recipeBar for all, the following does not work:
foo/Makefile:
.PHONY: all
all:
recipeFoo
bar/Makefile:
.PHONY: all
all:
recipeBar
include foo/Makefile
Instead, the recipes have to be attached to targets with unique names. However, dependency-only rules can appear multiple times, so it's not hard to work around this caveat. So, the following would work:
foo/Makefile:
.PHONY: all
all: allFoo
.PHONY: allFoo
allFoo:
recipeFoo
bar/Makefile:
.PHONY: all
all: allBar
.PHONY: allBar
allBar:
recipeBar
include foo/Makefile
Now, if you run make in bar, it would run recipeFoo and recipeBar.
If the sequence matters to you and recipeFoo must run before recipeBar, make allBar dependent on allFoo, like this:
bar/Makefile:
.PHONY: all
all: allBar
.PHONY: allBar
allBar: allFoo
recipeBar
include foo/Makefile
If you want your target-specific variables to be available when you call another make (for which I recommend using $(MAKE), not make), you can export your variables, with the corresponding consequences (e.g. the risk of overflowing the environment space on some Windows versions).
For example, if you have a target-specific variable FOO for target all in Makefile, and you want that variable to be known when Submake.mak is called, it works like this:
Makefile:
all: export FOO:=bar
.PHONY: all
all:
$(MAKE) -f Submake.mak
Submake.mak:
.PHONY: all
all:
echo $(FOO)
Create a link (hard or symbolic, your choice) in bar to ../Makefile. Then, as Carl points out in his comment, you can make -C bar and everything should work. (As of gmake 3.81, at least, make switches to the new directory first, then does its thing. I cannot speak for gmake 4.0.)
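For example (a sketch; the directory name bar is from the question):

$ ln -s ../Makefile bar/Makefile    # or 'ln' without -s for a hard link
$ make -C bar <target>

make -C bar changes into bar before reading the linked Makefile, so relative paths in the recipes are resolved from bar.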
This question is about parallel building with a GNU makefile.
Given the folder structure below, the goal is to deliver a makefile that supports make release/debug/clean in parallel.
project folder structure:
foo
+-foo1
+-foo2
+-foo3
The makefile might look something like this:
SUBDIR = foo1 foo2 foo3
.PHONY: $(SUBDIR) release debug clean
release: $(SUBDIR)
$(SUBDIR):
$(MAKE) -C $@ release
debug: $(SUBDIR)
# below is incorrect: the recipe for $(SUBDIR) above gets overridden
$(SUBDIR):
$(MAKE) -C $@ debug
..
The subdirectory list is declared as phony targets so the directories can be built in parallel, but the rule then loses the information about the original goal (release, debug, clean, etc.).
One method is to add a suffix to each directory name and strip it again in the commands, but that is awkward. Another method might be to use variables, but I'm not sure how to make that work.
The question is:
How to write the rules for directories, that supports parallel making w/ different targets (release/debug/clean)?
Any hints are greatly appreciated.
Setting variables on the command line certainly works. You can also use MAKECMDGOALS (see the GNU make manual):
$(SUBDIR):
$(MAKE) -C $@ $(MAKECMDGOALS)
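For completeness, here is a sketch of how this could fit into the makefile from the question (the surrounding lines are my reconstruction, not part of the original answer):

SUBDIR = foo1 foo2 foo3

.PHONY: $(SUBDIR) release debug clean

release debug clean: $(SUBDIR)

$(SUBDIR):
	$(MAKE) -C $@ $(MAKECMDGOALS)

Running, say, make --jobs=3 release then descends into the three directories in parallel, each with release as the goal.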
When I change a Makefile, its rules may have changed, so they should be reevaluated, but make doesn't seem to think so.
Is there any way to say, in a Makefile, that all of its targets, no matter which, depend on the Makefile itself?
(Regardless of its name.)
I'm using GNU make.
This looks like one more simple, useful, logical thing that Make should be able to do, but can't.
Here is a workaround. If the clean rule is set up correctly, Make can execute it whenever the makefile has been altered, using an empty dummy file as a marker.
-include dummy
dummy: Makefile
@touch $@
@$(MAKE) -s clean
This will work for most targets, that is, targets that are actual files removed by clean, and any targets that depend on them. Side-effect targets and some .PHONY targets will slip through the net.
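A sketch of the effect, with a hypothetical target prog built by the same Makefile:

$ make prog        # builds prog as usual
$ touch Makefile   # the makefile changes
$ make prog        # dummy is now older than Makefile, so make runs clean and then rebuilds prog from scratch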
Since GNU make version 4.3 this is possible using two special variables:
.EXTRA_PREREQS
To add new prerequisites to every target
MAKEFILE_LIST
To get the path of the current makefile
To have every target depend on the current make file:
Put the following line near the top of the file (before any include, since includes would affect MAKEFILE_LIST):
.EXTRA_PREREQS:= $(abspath $(lastword $(MAKEFILE_LIST)))
To have every target depend on the current makefile and also on the makefiles that were included,
Put the following line at the end of your file:
.EXTRA_PREREQS+=$(foreach mk, ${MAKEFILE_LIST},$(abspath ${mk}))
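A minimal sketch putting the first variant to use (GNU make 4.3+ assumed; prog and prog.c are hypothetical):

.EXTRA_PREREQS := $(abspath $(lastword $(MAKEFILE_LIST)))

prog: prog.c
	$(CC) -o $@ $<

Because the makefile is now an extra prerequisite of every target, touching it makes prog out of date and triggers a rebuild. Note that extra prerequisites do not appear in automatic variables such as $^ or $<, so the recipe still sees only prog.c.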
The only answer I know of is to add the makefile explicitly to the dependencies. For example:
%.o: %.c makefile
$(CC) $(CFLAGS) -c $< -o $@