Marking a makefile dependency as optional or otherwise unimportant - makefile

I have a Makefile with a pair of rules like the following:
file.c: filesvn
filesvn: .svn/entries
	action1
	action2
Within the svn repository, this of course works fine. The file is dependent upon living in a subversion repository. When exported from the repository, this does not work (No rule to make target...), and I would like to fix that. I have tried to simply bring a previously-generated version of filesvn out into the exported directory, but Make still insists on verifying filesvn's dependency.
Simply deleting the dependency of .svn/entries does work, of course, but then the spirit of the rule is broken since watching for a revision update is the goal.
Is there a way to get Make to not care that the .svn/entries file is not there?
The merits of such a technique are not really part of the question. I can't fundamentally change this, but if there is a change that maintains the spirit that might work. An answer of "You can't" is perfectly valid of course. :)

You can use the wildcard function so that the file is only listed as a prerequisite when it actually exists:
filesvn: $(wildcard .svn/entries)
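Applied to the rules from the question, a fuller sketch might look like this (action1 and action2 are the placeholders from the question):
file.c: filesvn

# $(wildcard .svn/entries) expands to .svn/entries when the file exists and
# to nothing in an exported tree, so the prerequisite simply disappears there.
filesvn: $(wildcard .svn/entries)
	action1
	action2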
Aside: Subversion 1.7 changes the format of the local working copy, so this rule would stop working even in working copies.

Pattern % rules
For pattern rules, you need .SECONDEXPANSION:
.SECONDEXPANSION:
%.a: %.b $$(wildcard $$*.c)
	cc $< -o $@
See also: How can I make a pattern rule dependency optional in a Makefile?
Tested in Make 4.1.

You can add a target that generates a copy of the actual makefile with the svn dependency removed, runs make against that copy, and then deletes it.
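A rough sketch of that workaround, with hypothetical names (Makefile.nosvn and the sed pattern are inventions for illustration, not part of the original build):
# Strip the .svn/entries prerequisite out of a copy of the makefile,
# run make against the copy, then delete it.
exported-build:
	sed 's| \.svn/entries||' Makefile > Makefile.nosvn
	$(MAKE) -f Makefile.nosvn
	rm -f Makefile.nosvn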

Related

How to trigger the rebuild of a Makefile prerequisite file ONLY when a specific target is called?

I haven't found an answer so far, so I think this is not a repeat question.
I have some Makefile along the lines of:
include prerequisite_2
all: prerequisite_1 prerequisite_2
clean:
	rm *.mod
prerequisite_1:
	mkdir somedir
prerequisite_2:
	re-write existing file
The issue is that I want prerequisite_2 to rebuild whenever the default goal (all) is invoked, or when prerequisite_2 itself is named on the command line, and I know I can use touch prerequisite_2, FORCE, or .PHONY to achieve this. However, I do NOT want it to run every time (the written file contains dependency information for the Fortran files involved); it makes no sense to also rebuild it when calling make clean.
Is it possible to emulate the effects of FORCE or .PHONY only when the depending targets are called?
You can see what the goal targets are by looking at the MAKECMDGOALS variable.
So you can do something like:
ifeq (,$(if $(MAKECMDGOALS),$(filter-out all prerequisite_2,$(MAKECMDGOALS))))
  include prerequisite_2
endif
The condition is true if MAKECMDGOALS is the empty string, or if it contains only all and/or prerequisite_2, but not if it contains any other target.
Usually, this is not what you want though. Usually you want to disable the include only if certain targets (clean is the classic example) are used.
This exact situation is even discussed in the GNU make manual.
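A minimal sketch of that more common arrangement, along the lines of the manual's clean example:
# Skip reading the dependency information when the goal is "clean", so that
# "make clean" never triggers a rebuild of prerequisite_2.
ifneq ($(MAKECMDGOALS),clean)
  include prerequisite_2
endif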

Include generated makefile without warning message

For a project of mine I am automatically generating makefiles and including them, like this:
all:
	@echo 'SUCCESS is $(SUCCESS)'
clean:
	rm depend.mk
depend.mk:
	@echo 'Creating $@'
	@echo 'SUCCESS := 1' > $@
.PHONY: all clean
include depend.mk
This works, but the include line generates a warning message:
$ make
Makefile:13: depend.mk: No such file or directory
Creating depend.mk
SUCCESS is 1
I would like to silence that first warning line saying that depend.mk doesn't exist. I know it doesn't exist since I have a rule written to generate it, so the warning is unnecessary (unless of course there isn't a rule for it). I do NOT want make to ignore the error where the included file doesn't exist and there is no rule for it, so prefixing include with a - to ignore the error will not work for me. I'd like something similar to bash's convention of piping stderr to /dev/null like some_cmd 2>/dev/null but for including in make.
The sample above is a very simplified example of this case. In my actual project there are a lot of automatically generated makefiles (via clang's automatic dependency generation) being included, meaning a fresh run of make will flood my screen with these warning messages.
Is anything like this possible, or am I just going to have to deal with the annoying warning messages?
I've encountered and (re-re-re-re-)solved this problem a number of times myself. Really, the problem is in the thinking surrounding when the dependency files are generated and used.
This link has the detailed description of the "resolution": http://make.mad-scientist.net/papers/advanced-auto-dependency-generation/
Basically it comes down to the fact that dependency files are really only necessary for rebuilding, not for the initial build of your library/executable. As a result, you don't need a rule that generates the dependency files up front (which is in fact less efficient); instead, generate them during the object-file step as intermediate files marked precious, so they're created and tracked as side-effect files that are never cleaned up automatically. Subsequent builds will then have the files available, which is exactly what you were trying to achieve. You can then use -include on the dependency files, with the foreknowledge that your object-file build step will fail if dependency-file generation fails, giving an immediate error (as you've mentioned is preferred) rather than an obscure and indirect one much later.
I've actually done a couple rather large build systems implementing this method, and it does work quite well, including ones that used non-GNU toolchains. To an outside user it appears identical, but internally it performs more efficiently and isn't hiding potentially important errors.
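A condensed sketch of that scheme, loosely following the linked article (the -MMD/-MP/-MF flags are GCC/Clang spellings, and DEPDIR and the pattern names are assumptions rather than the answerer's exact code):
DEPDIR := .deps
DEPFLAGS = -MT $@ -MMD -MP -MF $(DEPDIR)/$*.d

# The .d file is produced as a side effect of compiling the object,
# so no separate dependency-generation pass is needed.
%.o: %.c $(DEPDIR)/%.d | $(DEPDIR)
	$(CC) $(DEPFLAGS) $(CFLAGS) -c $< -o $@

$(DEPDIR):
	mkdir -p $@

# Empty rule and .PRECIOUS so a missing or deleted .d file neither aborts
# the build nor gets removed behind our back.
$(DEPDIR)/%.d: ;
.PRECIOUS: $(DEPDIR)/%.d

-include $(wildcard $(DEPDIR)/*.d)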
I tried many (many!) things to see if I could prevent or redirect the error message. No luck.
But when I tried -include (include with a leading dash), it didn't give an error, and make with clean, all, depend.mk and 'default' all worked properly and as expected.
Is there a particular reason you didn't want to use the -include variant? Seems to do exactly what you're looking for, and doesn't alter how the Makefile works in any way, just doesn't show the error during the first pass through the Makefile.

Avoid regenerating files that won't change

I have a Makefile with several rules of this form
protolist.c: $(PROTOCOLS) Makefile src/genmodtable.sh
	$(SHELL) $(srcdir)/src/genmodtable.sh \
	  $@ $(filter-out %Makefile %genmodtable.sh, $^)
As the name implies, protolist.c winds up containing a list of all the "protocols" defined by the .c files in $(PROTOCOLS). The contents of this file do formally depend on everything in $(PROTOCOLS), the Makefile, and the generator script, but it's very rare for the file to actually change when one of those .c files is edited. Therefore, genmodtable.sh is coded to not change the timestamp of protolist.c if it's not going to make any change to its contents. This causes Make to skip rebuilding protolist.o and its dependencies when it's not really necessary.
That all works fine; the problem is that, because protolist.c now appears to be out of date with respect to its dependencies, Make thinks it has to try to regenerate protolist.c on every run. This isn't a performance issue -- the script is very fast -- but it is confusing behavior. I dimly recall an idiom, involving timestamp files, that could be used to stop Make from doing this, but I have not been able to reconstruct it or find it described anywhere. Does anyone know what it is?
(Also if anyone can suggest how to get rid of that silly $(filter-out ...) construct, that would be helpful, as that is the only GNUmakeism in this Makefile.)
This appears similar to an issue with Fortran programming and make, involving the files generated when compiling a module. (Not relevant, other than that is where I picked up how to do this.)
What you want is to have make compare the timestamp of protolist.o to the timestamp of protolist.c, which remains 'old', and to base the decision to run the recipe for protolist.c on the timestamp of, well, a timestamp file, which gets updated each time the recipe is run.
In order to make this work, you have to link the two together with an empty rule.
protolist.o: protolist.c
	[...]
protolist.c: protolist.c.time ;
protolist.c.time: $(PROTOCOLS) Makefile src/genmodtable.sh
	$(SHELL) $(srcdir)/src/genmodtable.sh \
	  protolist.c $(filter-out %Makefile %genmodtable.sh, $^)
	touch protolist.c.time
In my own makefiles, I have to declare the timestamp files as prerequisites of the special target .PRECIOUS, to prevent make from deleting them, but I'm using pattern rules; I'm not 100% sure, but I think this isn't necessary when using explicit rules, like here.
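For the pattern-rule case, the declaration looks roughly like this (the %.c.time naming is illustrative, not taken from the original makefile):
# With pattern rules, the timestamp files become intermediate files, so list
# the pattern under .PRECIOUS to keep make from deleting them after the build.
.PRECIOUS: %.c.time

%.c: %.c.time ;

%.c.time: $(PROTOCOLS) Makefile src/genmodtable.sh
	$(SHELL) $(srcdir)/src/genmodtable.sh \
	  $*.c $(filter-out %Makefile %genmodtable.sh, $^)
	touch $@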
To avoid the $(filter-out ...) construct, can you not simply replace it with $(PROTOCOLS)?
(Although, personally, I would stick to Paul's First Rule of Makefiles: Don't hassle with writing portable makefiles, use a portable make instead.)

Makefile: need to do a target before including another makefile

Part of my Makefile:
CPUDEPS=./mydeps.cpu
(...)
deps: $(CPUDEPS)
$(CPUDEPS): $(CCFILES)
	@echo [DEPS] CPU
	$(CMDECHO)makedepend -Y -s'# CPU sources dependencies generated with "make deps"' \
	  -w4096 -f- -- $(CFLAGS) -- $^ 2> /dev/null > $(CPUDEPS)
(...)
sinclude $(CPUDEPS)
Problem 1: includes are done during the first phase of processing and targets during the second phase; so, if ./mydeps.cpu doesn't exist and I run make deps, I first get the error
Makefile:335: ./mydeps.cpu: No such file or directory
I can hide the error by using sinclude instead of include, but the problem is still there: the old file is included, not the just-generated one. I have to run make twice to include the updated file. This is because make does two-phase processing; is there any way to tell make to complete the target deps before parsing the includes?
Problem 2: even if the file ./mydeps.cpu doesn't exist and make deps actually creates it, I always get "make: Nothing to be done for 'deps'". This doesn't happen with other targets; I don't understand why, or how to avoid it.
Problem 1 is non-existent: before building a target, make automatically rebuilds makefiles (using implicit rules if no explicit rule is provided). So having a rule for the makefile ensures it will always be up to date, and there is no need to run deps twice. Additionally, since CPUDEPS is a makefile, it will be updated automatically before any other rule is run, so dependencies will always be updated when necessary and make deps is not needed. You can probably see this for yourself by observing the [DEPS] line being echoed whenever any of the CCFILES becomes more recent than the dependency file.
For Problem 2, adding anything to the recipe ensures that make doesn't complain about having nothing to do. If there is nothing else to run, you can use something like @echo OK to give feedback to the user, or a simple @true if you prefer totally silent makes.
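For example, a minimal sketch that keeps the deps target non-empty but quiet:
deps: $(CPUDEPS)
	@echo Dependencies are up to date.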
What you are trying to achieve is useless: you can use the dependencies file that was created during the previous build. That's enough.
The main reasoning behind that rule is:
if you haven't changed any of your files, then the dependencies file is up-to-date, and there's nothing to build.
if you have changed anything, even very deep into your #include chain, in an existing file that was used by the previous build, then the dependencies file has already caught it. You'll rebuild what is needed.
if you change something in a new file (you add that file!), then it was not used by the previous build and is not listed in the dependencies. But if you really want to use it, you have to modify at least one of the other files that was used before, and you're back in the previous case.
The solution is to create the dependencies file during the normal process of the compilation, and to optionally include it (with sinclude) if it is present.
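A compact sketch of that, assuming the sources in $(CCFILES) are .cpp files compiled one object at a time (the -MMD/-MF flags are GCC/Clang spellings, not part of the original makefile):
# Each object's dependency file is refreshed as a side effect of compiling it;
# on the very first build there is nothing to include yet, which is fine.
%.o: %.cpp
	$(CXX) $(CFLAGS) -MMD -MF $(@:.o=.d) -c $< -o $@

sinclude $(CCFILES:.cpp=.d)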

Depend on the make file itself

In the event that a Makefile itself is changed, a safe bet would be to consider all targets out of date.
Is there a clever way to add this dependency? Are there any alternatives?
Make sure the object files depend on the makefile:
$(OBJFILES) : Makefile
Where Makefile is the name of the make file.
A safe bet, but a terrible idea. Example: you're using automake and update Makefile.am to add a single source file. The correct response is to compile just the new file and link it in. In your scheme everything would be rebuilt.
Moreover, adding the dependency isn't going to do anything unless you also touch the file, with something like:
$(SRCS): Makefile
	touch $@
This will then trip up editors that use the mtime to detect concurrent modification (emacs is one example).
If you're doing something major, just run make clean all after doing the change.
Since GNU make version 4.3, this is possible with the use of two special variables:
.EXTRA_PREREQS, to add new prerequisites to every target
MAKEFILE_LIST, to get the path of the makefile
To have every target depend on the current make file:
Put the following line near the top of the file (before any include, since includes would affect MAKEFILE_LIST):
.EXTRA_PREREQS:= $(abspath $(lastword $(MAKEFILE_LIST)))
To have every target depend on the current makefile and also on any included makefiles:
Put the following line at the end of your file:
.EXTRA_PREREQS+=$(foreach mk, ${MAKEFILE_LIST},$(abspath ${mk}))
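A minimal sketch of the first variant in context (hello, hello.o, and hello.c are placeholder names):
# Every rule below now also has this makefile as a prerequisite, so editing
# the makefile forces a rebuild. The extra prerequisite does not appear in
# automatic variables such as $^.
.EXTRA_PREREQS := $(abspath $(lastword $(MAKEFILE_LIST)))

hello: hello.o
	$(CC) $^ -o $@

hello.o: hello.c
	$(CC) -c $< -o $@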