I'm trying to write a Makefile that automatically calls BibTeX on files that match a specific wildcard but don't exist when I first run Make. Specifically, I have the following:
.FORCE:
all: pdf
foo=something
lat: *.tex
    pdflatex $(foo).tex
pdf: lat
    open $(foo).pdf &
%.aux: .FORCE
    bibtex $@
bib: lat $(foo)?.aux
    pdflatex $(foo).tex
    pdflatex $(foo).tex
    open $(foo).pdf &
What I want to happen is that the following will occur when I run make bib:
pdflatex will be called on $(foo).tex, generating files $(foo)1.aux, $(foo)2.aux, etc.
bibtex will be called on $(foo)1.aux, then $(foo)2.aux, etc.
pdflatex will be called twice on $(foo).tex
open will be called on $(foo).pdf
However, this doesn't happen: in particular, it looks as if Make evaluates the prerequisites $(foo)?.aux up-front, at a point where the files $(foo)1.aux, $(foo)2.aux, etc. don't exist. As a result, BibTeX is never called on them. If I rerun make bib, however, things work, because the files now exist after being created on the previous run.
Question: Is forcing Make to re-evaluate prerequisites for a target the right way to fix this? If so, how can I get it to re-evaluate the prerequisites for bib after running pdflatex as part of lat? If not, how can I achieve what I want please?
What I do in my Makefile for LaTeX files is rename the targets.
That way, you can have different target names, depending on which phase has been used to create them. This is according to the spirit of make's pattern rules, which assume that files with different contents also have different extensions. So I have rules like this:
%.aux1 : %.tex
    rm -f $*.aux
    pdflatex -draftmode $*
    mv -f $*.aux $@
%.bbl : %.aux1
    cp -pf $< $*.aux
    bibtex $* || : > $@
%.aux2 : %.bbl
    cp -pf $*.aux1 $*.aux
    pdflatex -draftmode $*
    mv -f $*.aux $@
%-tex.pdf: %.aux2
    cp -pf $< $*.aux
    pdflatex -jobname $*-tex $*
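With these pattern rules in place, a driver target might look like this (a sketch; foo.tex is a hypothetical document name, and the chain runs foo.aux1 → foo.bbl → foo.aux2 → foo-tex.pdf):
# hypothetical default goal driving the pattern-rule chain above
all: foo-tex.pdf
.PHONY: all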
You can't do this in a completely straightforward way, since make fundamentally assumes that one run through a target's commands will update the target. That is, there's no way in principle that you can tell make that ‘you need to run these commands twice’.
You can try to get round this with (admirably clever) tricks such as @reinerpost suggests, but a problem with that general approach is that sometimes/often a single run of BibTeX (or makeindex, or whatever) is sufficient.
After having tried various types of tricks in the past, what I generally do here is to make a command list which explicitly includes two BibTeX calls where necessary:
%.bbl: %.aux
    bibtex $(@:.bbl=)
    if grep -q Rerun $(@:.bbl=.log) >/dev/null; then \
        bibtex $(@:.bbl=); \
    fi
That command list re-runs BibTeX if the log file includes the ‘Label(s) may have changed. Rerun to get cross-references right’ message.
To be honest, what I actually do is just the single line bibtex $(@:.bbl=). When I'm writing a document, I inevitably re-run make so many times that the list of references comes out correct very quickly. This means that this target doesn't work for the ‘recreate the final version from a clean directory’ case, but that's sufficiently rare that I tend not to obsess about it.
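Spelled out, that is just the pattern rule above without the rerun check:
%.bbl: %.aux
    bibtex $(@:.bbl=)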
Whenever I catch myself re-solving this problem, I now recognise that I'm trying to push water up-hill because I'm bored writing this document, so I go and do something else.
I just wanted to share an alternative solution: using sub-make processes.
If so, how can I get it to re-evaluate the prerequisites for bib after running pdflatex as part of lat?
You can somewhat achieve that by adding make lat to the recipe for bib. This will start a new make process with lat as its goal. The sub-make doesn't know anything about its parent's targets and prerequisites. (Such a scheme is usually used when some huge project is built from several smaller projects, each of which has its own makefile.)
This can be done in multiple layers (although it will be confusing):
bib: $(foo)?.aux lat check_for_bib
check_for_bib:
    if grep -q Rerun $(foo).log >/dev/null; then $(MAKE) bib; fi
pdf: lat check_for_bib
    open $(foo).pdf &
Note that I had to change the order of some prerequisites. The pseudo-code would be something like:
latex compilation
while log suggests update:
    update aux
    latex compilation
Each iteration of the while loop will take place in a separate make process.
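A minimal sketch of that loop as concrete rules (my illustration, not part of the answer's makefile; fixpoint is a hypothetical target name, and it assumes GNU make, the $(foo) variable from the question, and the log's Rerun hint as the loop condition):
# one iteration: compile, and if the log asks for a rerun,
# update the bibliography data and recurse into a fresh make process
fixpoint:
    pdflatex $(foo).tex
    if grep -q Rerun $(foo).log; then \
        bibtex $(foo); \
        $(MAKE) fixpoint; \
    fi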
Related
Consider the following setup:
$ touch 1.src 2.src 3.src
$ cat Makefile
%.dst: %.src
    @convert -o "$@" "$<"
We can compile our .src files into .dst files by running make 1.dst 2.dst 3.dst, which calls the convert tool (just a placeholder) three times.
This setup is fine if there is little overhead in calling convert. However, in my case, it has a startup penalty of a few seconds for every single call. Luckily, the tool can convert multiple files in a single call while paying the startup penalty only once, i.e. convert -o '{}.dst' 1.src 2.src 3.src.
Is there a way in GNU make to specify that multiple src files should be batched into a single call to convert?
Edit: To be more precise, what feature I am looking for: Say that 1.dst is already newer than 1.src so it doesn't need to be recompiled. If I run make 1.dst 2.dst 3.dst, I would like GNU make to execute convert -o '{}.dst' 2.src 3.src.
A quick and dirty way would be creating a .PHONY rule that simply converts all src files to dst files, but that way I would convert every src file each and every time. Furthermore, specifying dst files as prerequisites in other rules would no longer be possible.
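For reference, that quick and dirty version might look like this (my sketch of what's described above; convert-all is a hypothetical target name):
.PHONY: convert-all
convert-all:
    convert -o '{}.dst' *.src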
Thanks in advance!
If you have GNU make 4.3 or above, you can use grouped targets like this:
DST_FILES = 1.dst 2.dst 3.dst
SRC_FILES = $(DST_FILES:.dst=.src)
all: $(DST_FILES)
$(DST_FILES) &: $(SRC_FILES)
    convert -o '{}.dst' $?
    @touch $(DST_FILES)
If your convert is only updating some of the targets then you need the explicit touch to update the rest.
Here's a way of passing the goals on the command line that might work; change DST_FILES to:
DST_FILES := $(or $(filter %.dst,$(MAKECMDGOALS)),1.dst 2.dst 3.dst)
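With that definition, goals named on the command line narrow the batch. Assuming 2.dst and 3.dst are out of date, a run would look something like:
$ make 2.dst 3.dst
convert -o '{}.dst' 2.src 3.src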
Is there a way in GNU make to specify that multiple src files should be batched into a single call to convert?
It is possible, but messy, to write make rules for build steps that produce multiple targets with a single run of the recipe, such that the recipe is executed just once if any of the targets needs to be updated. However, you clarify that
[if] 1.dst is already newer than 1.src [, and] I run make 1.dst 2.dst 3.dst, I would like GNU make to execute convert -o '{}.dst' 2.src 3.src.
That's a slightly different problem. You can use the $? automatic variable in a recipe to get the prerequisites that are newer than the rule's target, but for that to serve the purpose, you need a rule with a single target.
Here's one slightly convoluted way to make it work:
SHELL := /bin/bash    # the ${x//src/dst} substitution below is bash-specific
DST_FILES = 1.dst 2.dst 3.dst
SRC_FILES = $(DST_FILES:.dst=.src)
$(DST_FILES): dst.a
    ar x $< $@
dst.a: $(SRC_FILES)
    convert -o '{}.dst' $?
    x='$?'; ar cr $@ $${x//src/dst}
The dst.a archive serves as the one target with all the .src files as prerequisites, so as to provide a basis for use of $?. Additionally, it provides a workaround for the problem that whenever that target is updated, it becomes newer than all the then-existing .dst files: .dst files that are out of date with respect to the archive but not with respect to the corresponding .src file are extracted from the archive instead of being rebuilt from scratch.
Is there a way to let make determine the number of files to be recompiled before actually compiling? The problem is this: consider a fairly big project with hundreds of source files. It would be very convenient to have a rough idea of how long compilation will take, but to know that, one needs to know the number of files to be compiled.
The general answer is no, because your build could generate files which themselves are inputs to other rules which generate more files. And so on. However if a rough answer is good enough you can try the --dry-run flag. From the GNU make documentation...
“No-op”. Causes make to print the recipes that are needed to make the targets up to date, but not actually execute them. Note that some recipes are still executed, even with this flag (see How the MAKE Variable Works). Also any recipes needed to update included makefiles are still executed (see How Makefiles Are Remade).
As you can see, despite its name even the --dry-run flag will change the state of your build.
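To turn the dry run into a rough count, you can count the compile lines it prints (a sketch; 'gcc' is an assumed pattern that must match your actual compile commands):
$ make --dry-run | grep -c 'gcc '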
"make -n" will do the dry run. But you can't get the list of files to be rebuilt. May be you can write shell script to store the last modified time of files and get the list of files.
I think I found a decent solution for Unix. Here SRC are your source files, HDR your headers, and DEP the dependency files (something like DEP := $(OBJ:.o=.d)):
isInDepFile+=$(shell grep -q $(modifiedFile) $(depFile) 1>&2 2> /dev/null && echo $(depFile))
filesToCompile=
checkDepFiles=$(foreach depFile,$(DEP), $(eval filesToCompile+=$(isInDepFile))) $(thinOutDepFiles)
thinOutDepFiles=$(foreach fileToCompile,$(filesToCompile),$(eval DEP=$(filter-out $(fileToCompile),$(DEP))))
countFilesToCompile: $(SRC) $(HDR)
    $(eval modifiedFiles=$?)
    $(foreach modifiedFile,$(modifiedFiles), $(call checkDepFiles))
    $(eval numOfFilesToCompile = $(words $(filesToCompile)))
    $(eval numDepFiles = $(words $(DEP)))
    $(eval NumSRCFiles = $(words $(SRC)))
    @echo $(NumSRCFiles) sources
    @echo $(numDepFiles) files to leave
    @echo $(numOfFilesToCompile) files to compile
    @touch $@
This first generates a list of modified files within your source and header file lists. Then, for each modified file, it checks all dependency files for that filename. If a dependency file contains the file we are looking at, it is added to the list of filesToCompile; it is also removed from the list of dependency files to avoid duplication.
This can be invoked in the main building rule of your project. The advantage of that over the dry run is that it gives you a simple number to work with.
It looks to me like Makefile rules can be roughly classified into "positive" and "negative" ones: "positive" rules create missing or update outdated files, while "negative" ones remove files.
Writing prerequisites for "positive" rules is quite easy: if the target and the prerequisite are file names, make by default runs the recipe if the target is missing or outdated (a missing file in this context may be viewed as an infinitely old file).
However, consider a "negative" rule, for example for target clean. A usual way to write it seems to be something like this:
clean:
    rm -f *.log *.synctex.gz *.aux *.out *.toc
This is clearly not the best way to do it:
rm is executed even when there is nothing to do,
its error messages and exit status need to be suppressed with the -f flag, which has other (possibly undesirable) effects, and
the fact that there was nothing to do for target clean is not reported to the user, unlike what is normal for "positive" targets.
My question is: how to write a Makefile rule that shall be processed by make only if certain files are present? (Like what would be useful for make clean.)
how to write a Makefile rule that shall be processed by make only if certain files are present? (Like what would be useful for make clean.)
You can do it like so:
filenames := a b c
files := $(strip $(foreach f,$(filenames),$(wildcard $(f))))
all: $(filenames)
$(filenames):
    touch $@
clean:
ifneq ($(files),)
    rm -f $(files)
endif
Example session:
$ make
touch a
touch b
touch c
$ make clean
rm -f a b c
$ make clean
make: Nothing to be done for 'clean'.
Useful perhaps for some purposes, but it strikes me as a strained refinement for make clean.
This can be easily remedied:
clean:
    for file in *.log *.synctex.gz *.aux *.out *.toc; do \
        if [ -e "$$file" ]; then \
            rm "$$file" || exit 1; \
        else \
            printf 'No such file: %s\n' "$$file"; \
        fi; \
    done
The if statement is necessary unless your shell supports and has enabled nullglob or something similar.
If your printf supports %q you should use that instead of %s to avoid possible corruptions of your terminal when printing weird filenames.
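For completeness, a nullglob variant of that loop (a sketch assuming bash is available as the shell; with nullglob enabled, unmatched patterns expand to nothing, so the existence test can be dropped):
SHELL := /bin/bash
clean:
    shopt -s nullglob; \
    for file in *.log *.synctex.gz *.aux *.out *.toc; do \
        rm "$$file" || exit 1; \
    done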
A meta-answer is: are you sure you want to do this?
The other answers suggest to me that the cure is worse than the disease, since one involves an extension to POSIX make (ifneq), and the other uses a compound command which spreads over seven lines. Both of these are sometimes necessary expedients – I'm not criticising either answer – but both are things I avoid in a Makefile if I can. If I found myself wanting to do this in a clean rule, perhaps for the reason you mention in your comment to @MikeKinghans' answer, I'd try quite hard to change the rest of the Makefile to avoid needing this.
Reflecting on your three original points in turn:
rm is executed even when there is nothing to do: so what? The alternatives still need to, for example, expand the *.log *.synctex.gz ... patterns, so there's only a minuscule efficiency gain in avoiding the rm. Make is a high-level tool which generally does not concern itself with efficiency.
its error messages and exit status need to be suppressed with -f flag: the -f flag doesn't generally suppress errors and the exit status, it merely indicates to rm that a non-existing or non-permissioned file is not to be regarded as an error.
the fact that there were nothing to do for target clean is not reported to the user: should the user really care?
The last point is the most interesting. People asking about make, on Stack Overflow and elsewhere, sometimes make things hard for themselves by trying to use it as a procedural language – make is not Python, or Fortran. Instead, it's a goal programming language (if we want to get fancy about it): you write snippets of rules to achieve sub-goals, so that the user (you, later) doesn't have to care about the details or the directory's current state, but can simply indicate a goal, and the program does whatever's necessary to get there. So whether there is or isn't anything to do, the user ‘shouldn't’ care.
I think the short version of this answer is: it's idiomatic to keep make rules as simple (and thus as readable and robust) as possible, even at the expense of a little crudity or repetition.
I'm trying to create a makefile which downloads some pre-requisite files to a path.
But the foreach documentation is sadly lacking in detail and examples.
I want something like:
image_files = a b
image_versions = 701.2 802.1
image_path = images
images = $(foreach ...) ??
I'd like that to result in an expansion to:
images/701.2/a
images/701.2/b
images/802.1/a
images/802.1/b
And have a phony target to download them from a URL like:
mytarget: $(images)
    wget somepath $<
How do I do that?
OK, I have gotten a little further with this, but I'm still a little perplexed as to how to get this to work.
tag = my-registry:8443/boot-server-data
versions = 557.0.0 607.0.0
images_a = $(foreach ver, $(versions), images/$(ver)/coreos_production_pxe_image.cpio.gz)
images_b = $(foreach ver, $(versions), images/$(ver)/coreos_production_pxe.vmlinuz)
all: build
.PHONY: build $(images_a) $(images_b)
build:
    ./make-profiles
    docker build -t $(tag) .
    docker push $(tag)
$(images_a):
    wget http://stable.release.core-os.net/amd64-usr/$(foreach version... but depends on each image)/coreos_production
How do you do this?
In fact I only want it to download the images if they aren't there, but for some reason it downloads them every time. It's literally been years since I used Make. I normally use another build tool, but that build tool needs to be modified to make it do what I want here, so I thought I'd just whip this up in the meantime. It's proving to be a little harder than expected.
You are pretty close, but the problem does not lie with foreach. Let's have a look at just the bit that does the downloading. When make reads the makefile it ends up with something like (after shortening the names a bit for clarity):
images/1/file.cpio.gz images/2/file.cpio.gz:
    <recipe>
If, for some reason, make decides to rebuild images/1/file.cpio.gz say, at this point it will expand the recipe, and pass each line of that expansion to a separate shell.
Your job is to write a recipe that does not care whether the target is images/1/file.cpio.gz or images/2/file.cpio.gz. That's another way of saying the recipe should use automatic variables like $@ (it expands to the target).
A sketch:
${images_a}:
    wget -O $@ http://stable.release.core-os.net/amd64-usr/$@
You may have to munge $@ so that wget gets the right URL. Just one example:
${images_a}:
    wget -O $@ http://stable.release.core-os.net/$(dir $@)deeper/$(notdir $@)
One complaint about your original makefile: the dependencies are wrong. build needs the downloads to have completed before it runs.
.PHONY: build
build: $(images_a) $(images_b)
    ...
The images themselves are not phony either (just make sure you don't lie to make about their filenames).
The massive advantage of writing your makefile in this way is that it's parallel safe (and that's the whole point of make). When -j is in force, both wgets can proceed at the same time, halving the download time.
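For completeness, the foreach expansion the first version of the question asked for might look like this (a sketch using the names from the question):
image_files    = a b
image_versions = 701.2 802.1
image_path     = images
# expands to: images/701.2/a images/701.2/b images/802.1/a images/802.1/b
images = $(foreach v,$(image_versions),$(foreach f,$(image_files),$(image_path)/$(v)/$(f)))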
I have a Makefile with the following type of rule:
%.html:
    ./generate-images.py > $@
    make $(patsubst %.png,%.gif,$(wildcard *.png))
The generate-images script writes not only the HTML file (to stdout) but several .png files to the current directory. The goal here is to convert them to .gif. (not really, but this is an example)
This works if I invoke it directly. The problem is: If I invoke it from another rule where foo.html is a dependency, the wildcard statement fails to find any files. In other words, it just called make with no arguments, which is not what I want here.
What's the deal with the wildcard? Or, is there a better way to do this?
While your problem may be something different, I clearly see one.
The whole text of all commands within a rule is expanded in one pass, so make's functions and variables are resolved before any command runs. Assume you have no .png files in the directory, and you invoke make so it should regenerate them: a.png and b.png. Then, after you invoke make, the text of the rule would effectively look like this:
file.html:
    ./generate-images.py > file.html
    make
because at the moment the recipe was expanded there were no .png files yet! After the first line is executed, the files will appear, but the next line has already been expanded to just "make".
And only when you invoke your makefile for the second time, will it expand to
file.html:
    ./generate-images.py > file.html
    make a.gif b.gif
This is not what you want. So I suggest doing it in The Right Way.
# If you have a batch conversion program, this may be helpful
images.stamp: *.png
    convert_all_images $?
    touch images.stamp

# OR, if you want to convert one-by-one by means of make
images.stamp: $(patsubst %.png,%.gif,$(wildcard *.png))
    touch images.stamp
%.gif: %.png
    convert_one --from=$^ --to=$@

# The HTML rule would look like
%.html:
    ./generate-images.py > $@
    make images.stamp
So when you invoke make all, it generates the HTML files and converts the newly generated images. Note that it will only convert the images that have been updated, which is what you want.
Thanks to Beta for pointing out the mess with gif/png extensions.
That sounds like it's evaluating all of the $() expressions as it's processing the Makefile, rather than as it executes each rule. You could add a rule to your makefile like so:
images: $(patsubst %.png,%.gif,$(wildcard *.png))
.PHONY: images
and then change your example snippet to
%.html:
    ./generate-images.py > $@
    make images
so that make evaluates the glob at the right time. It might be worth checking the manual on exactly when make expands variables and functions in recipes.