Unwanted rebuild in Makefile

What principles should be followed in order not to rebuild some object in Makefile every time?
I only know the most primitive case where we can split the compilation into several steps: creating object modules and linking them. But what to do in more difficult cases? Let's say I have a set of input files and expected output files to test. I want to make it so that only tests on files with wrong output or changed files are re-run.
TEST_INPUT_FILES = $(wildcard $(TEST_DIR)/*.in)
TEST_OUTPUT_FILES = $(wildcard $(TEST_DIR)/*.out)
The above shows how I create the lists for each group of files. More generally, how can I make sure that when a file changes, the tests for that file are re-run? Any advice or literature on this topic would be helpful; I couldn't find the answer myself, because everything I found only covers the trivial compile-and-link case.

Make creates files from other files, using the shell and any program available in the shell environment, whenever the target files do not exist or are out of date.
What you'd do is write Make rules that run the test, with the program under test and any input files (including the expected output) as prerequisites, and create a test report file on success. If you want to rerun a failed test, you remove (clean) the test report before running the test.
# Make report from test program and inputs
$(REPORT): $(TEST) $(TEST_INPUT) $(TEST_EXPECTED_OUTPUT)
	# Remove old report, if any
	rm -f $@
	# Run test creating report on success
	$^
You can also get the cleanup-on-failure part by declaring .DELETE_ON_ERROR: in the makefile, which makes make delete any target whose recipe fails: https://www.gnu.org/software/make/manual/make.html#Special-Targets
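Tied back to the question's .in/.out layout, a minimal sketch could look like the following; TEST_DIR, TEST_PROG and the way the program is invoked against an input file are assumptions, not taken from the question.
# Hedged sketch: one report per test, rebuilt only when its input, its expected
# output, or the program under test changes.
TEST_INPUTS  := $(wildcard $(TEST_DIR)/*.in)
TEST_REPORTS := $(TEST_INPUTS:.in=.report)

.PHONY: test
test: $(TEST_REPORTS)

# Delete the report again if the recipe fails, so the test is retried next time.
.DELETE_ON_ERROR:

$(TEST_DIR)/%.report: $(TEST_DIR)/%.in $(TEST_DIR)/%.out $(TEST_PROG)
	./$(TEST_PROG) $< | diff - $(word 2,$^) > $@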

Related

Makefile wildcard target expansion

I'm a fairly new user of GNU Make. I want to get a list of Go files and build each one of them using Make.
I want to create a target that receives the appropriate Go file as a parameter.
In the snippet I've written, control never reaches the %.go target.
Here is a snippet of the file.
EXECUTABLES := $(wildcard cmd/*/*/*.go)

%.go:
	echo "Build the go file"

build: $(EXECUTABLES)
	echo $<
Output:
echo cmd/abc/handler/main.go
cmd/abc/handler/main.go
I modified the script to this but I'm facing the same issue. Also tried replacing %.go with *.go and with cmd/abc/handler/main.go
Here is one of the variants mentioned above.
%.go:
	echo "Hello world"

build: $(wildcard cmd/*/*/*.go)
	echo $<
Anything I might be missing here?
You have a rule that tells make how to build a .go file. But, you already HAVE a .go file. Make doesn't have to build it, it's a source file that you wrote. That rule has no prerequisites, so as far as make is concerned if the file exists, it's up to date (it doesn't depend on any other file).
So, when you ask make to build that file make looks and says "this file exists already, and it doesn't depend on anything, so the thing you asked me to do is already done and I don't need to run any commands".
When you write a makefile you have to think of it from the end first back to the beginning. What is the thing you want to end up with? That is the first target you write. Then what files are used as inputs to that final thing? Those are the prerequisites of that target. And what commands are needed to create the target from the prerequisites? That is the recipe.
Then once you have that, you think about those prerequisites, considering them as targets. What prerequisites do they have? And what commands do you need to turn those prerequisites into that target?
And you keep going backwards like that until you come to a place where all the prerequisites are original source files that can't be built from anything, then you're done.
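As a concrete illustration of working backwards, here is a minimal sketch; the bin/ output layout and the per-directory go build invocation are assumptions, not something from the question.
# Final things we want: one binary per cmd/<app>/<variant>/main.go (layout assumed).
GO_MAINS := $(wildcard cmd/*/*/main.go)
BINARIES := $(patsubst cmd/%/main.go,bin/%,$(GO_MAINS))

.PHONY: build
build: $(BINARIES)

# Each binary is built from its main.go; main.go itself is a source file you
# wrote, so the chain of prerequisites ends there.
bin/%: cmd/%/main.go
	mkdir -p $(dir $@)
	go build -o $@ ./$(dir $<)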

Is it ok to have a GNU Make target that claims to generate / update a certain target file but actually doesn't?

At present, I have a makefile that has:
a target which links an executable image file from a bunch of object files
a pattern rule target that compiles the various object files the linker target depends on
I want to make the following changes.
1. Instead of compiling the object files outright, I want the pattern rule target mentioned above to create (for each object file that needs updating) an empty object_file_name.update file. Essentially, this target's job would be to take stock of all object files that actually need to be recompiled.
2. Write a new target that launches a Perl process which finds all these object_file_name.update files and, for each object file that must be recompiled, compiles it in this Perl process.
I know how to do 2) ... that part is not giving me any trouble. The part I'm worried about is 1). The reason is that that target would basically have to claim to update any needed object files while, in truth, merely creating an .update file for each such object file but not the object file itself.
I think I could trick GNU Make into not starting to try to link anything before all the object files have been built by declaring my dependencies accordingly (pseudo-code, not a valid GNU Make snippet):
# Phony target that reads the *.update files created by the pattern rule target below and then
# compiles each object file for which an *.update file exists.
COMPILE_OBJECTS :
	...

# Pattern rule target to take stock of all object files that need updating. Creates an *.update file for
# each object file that needs recompiling.
%.o : %.c
	...

$(EXE_FILE_TO_LINK) : $(LIST_OF_OBJECT_FILE_PATHS) COMPILE_OBJECTS
	...
but I still worry that this might result in undefined behavior because my pattern rule target would basically be lying to GNU Make about updating the needed object files. Is my worry justified?
Basically, I want to interject an intermediate layer between GNU Make and the compiler so that GNU Make doesn't compile each object file separately. Instead, the compiling would be done in a single Perl process that has access to the complete list of object files that need to be compiled, allowing me to do various fancy things that I couldn't do if GNU Make controlled compilation directly.
Yes, it's legal and I often use this pattern.
Consider the case where you only want to kick off a long build step if a file has changed.
target: config-file
	target-creator $< -o $@
Now let's say we can't give make the dependencies for config-file (because the config file creation step lacks a dependency listing ability (BAH!)).
.PHONY: FORCE
FORCE: ;

config-file: FORCE
	config-creator -o $@.tmp
	cmp $@.tmp $@ || mv $@.tmp $@
We ask make to build target
Make first has to build config-file
Make will always run the recipe for config-file, as its dependency FORCE is out of date (being phony)
CRUCIALLY, we only update config-file if its content has actually changed:
If cmp finds that config-file.tmp and config-file are the same, the mv is skipped, config-file keeps its old timestamp, and the last line of the recipe completes with no error.
OTOH if cmp detects a mis-compare, it fails, and the shell goes on to execute the mv.
After running the recipe for config-file, make does actually check config-file's modification time. IF config-file has become younger than target, only then will target-creator be run.
The subtlety here is that even though config-file's recipe runs every time, config-file itself is not phony.
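Applied to the question's setup, a rough sketch might look like this; the object names, the Perl driver script, and the reliance on serial prerequisite order are all illustrative assumptions, not a definitive implementation.
OBJS := main.o util.o            # stand-in for LIST_OF_OBJECT_FILE_PATHS

# The pattern rule only records that the object needs recompiling; it does not
# actually update the .o, exactly as described in the question.
%.o: %.c
	touch $@.update

# Phony driver: a hypothetical Perl script compiles every object that has a
# *.update marker in one process, then removes the markers.
.PHONY: COMPILE_OBJECTS
COMPILE_OBJECTS:
	perl compile_marked.pl

# Listing COMPILE_OBJECTS after the objects only orders things under a serial
# make; a parallel build would need .WAIT (GNU Make 4.4+) or similar.
prog: $(OBJS) COMPILE_OBJECTS
	$(CC) -o $@ $(OBJS)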

How to get make to fail if generation of a required include passes but the file is not actually created

I have a makefile whose rule for generating a required include file is buggy. The included file does not initially exist, but there is a rule to make it. The rule runs successfully, but because of the bug, the include file is not created where expected, so make is unable to include it. However, instead of make failing (since the file still can't be included), make completes successfully.
The following is my makefile1.mak file.
include myfile.mak

default:
	@echo hi

myfile.mak:
	@echo hello
When I execute 'make -f makefile1.mak', I get:
makefile1.mak:1: myfile.mak: No such file or directory
hello
hi
Of course, I finally figured out that my code to generate myfile.mak was not generating it correctly, but the actual makefiles that I'm using are 100s of lines long, so we didn't notice that the include wasn't happening for quite a while (it was a very subtle build issue that was introduced).
So, my question is - is there any way to get make to fail on the above example?
Add a line to the rule:
myfile.mak:
	do various things to build myfile.mak
	test -f $@
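Plugged into the example makefile1.mak above (a sketch; the real rule would run whatever actually generates myfile.mak), the whole file would read:
include myfile.mak

default:
	@echo hi

myfile.mak:
	@echo hello
	test -f $@    # fail the rule, and hence the whole run, if myfile.mak still does not exist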

Number of files to be rebuilt by Make

Is there a way to let make determine the number of files to be recompiled before actually compiling? The problem is this: consider a quite big project with hundreds of source files. It would be very convenient to have a rough idea of how long compilation will take, but to know that, one needs to know the number of files to be compiled.
The general answer is no, because your build could generate files which themselves are inputs to other rules which generate more files. And so on. However if a rough answer is good enough you can try the --dry-run flag. From the GNU make documentation...
“No-op”. Causes make to print the recipes that are needed to make the targets up to date, but not actually execute them. Note that some recipes are still executed, even with this flag (see How the MAKE Variable Works). Also any recipes needed to update included makefiles are still executed (see How Makefiles Are Remade).
As you can see, despite its name even the --dry-run flag will change the state of your build.
"make -n" will do the dry run. But you can't get the list of files to be rebuilt. May be you can write shell script to store the last modified time of files and get the list of files.
I think I found a decent solution for Unix. Here SRC are your source files, HDR your headers and DEP the dependency files (something like DEP := $(OBJ:.o=.d)).
isInDepFile += $(shell grep -q $(modifiedFile) $(depFile) 2> /dev/null && echo $(depFile))
filesToCompile =
checkDepFiles = $(foreach depFile,$(DEP),$(eval filesToCompile += $(isInDepFile))) $(thinOutDepFiles)
thinOutDepFiles = $(foreach fileToCompile,$(filesToCompile),$(eval DEP = $(filter-out $(fileToCompile),$(DEP))))

countFilesToCompile: $(SRC) $(HDR)
	$(eval modifiedFiles = $?)
	$(foreach modifiedFile,$(modifiedFiles),$(call checkDepFiles))
	$(eval numOfFilesToCompile = $(words $(filesToCompile)))
	$(eval numDepFiles = $(words $(DEP)))
	$(eval NumSRCFiles = $(words $(SRC)))
	@echo $(NumSRCFiles) sources
	@echo $(numDepFiles) files to leave
	@echo $(numOfFilesToCompile) files to compile
	@touch $@
This first generates a list of modified files within your source and header files lists. Then for each modified file it checks all dependency files for its filename. If a dependency file contains the current file we are looking at, it is added to the list of filesToCompile. It is also removed from the list of dependency files to avoid duplication.
This can be invoked in the main building rule of your project. The advantage of that over the dry run is that it gives you a simple number to work with.

Makefile putting child on chain

I'm working with a makefile and I'm currently running make in debug mode. I noticed messages like "Putting child 0x5435etc PID 2344 on the chain". Is this make's way of remembering what files are generated?
I ask because I'm using a tool that generates a bunch of different file types from a single source, like below.
%.v: %.rdl
(Generates .html, .v, .vh, .xml, .spirit.xml, etc into current directory)
The tool generates all the files as expected and desired.
Then the makefile runs a target
vpath %.spirit.xml ${list_of_directories}
%.ralf: %.spirit.xml
(Generates a .ralf and .spirit.ralf)
The very first time I run "$ make " in a clean directory it generates the list of .v files on the first target, but then fails on the first .ralf. If I run "$ make " again it correctly builds all of the .ralf files. Any possible easy answers? I noticed that when it puts children into the chain for the %.v target it only ever puts the .v files! I was thinking it might not know the others exist!
Yeah, you will need to tell Make that that command produces multiple outputs.
This should work:
%.html %.v %.vh %.xml %.spirit.xml: %.rdl
	# command
See Pattern examples in the manual.

Resources