Including an automatically generated makefile as a trick to enforce execution of a recipe

I'm slowly losing my mind here. First, let me describe what I'm trying to do. We have a compiler that spews out weirdly formatted dependency files. To get these dependency files into a format GNU Make can understand, they need to be processed by a Perl script first. Technically, the Perl script doesn't convert the input dependency files it gets passed; instead, it creates a new, properly formatted dependency file for each input dependency file.
Now, in order for GNU Make to know which translation units need recompiling and which don't, it obviously must have seen those dependency files before trying to make the translation unit targets, so we have the following line in our master makefile:
include $(PROCESSED_EXISTING_DEPENDENCY_FILES)
where $(PROCESSED_EXISTING_DEPENDENCY_FILES) is a list of all converted dependency files. My idea was to (ab)use an automatically generated makefile whose recipe not only builds that makefile itself but also triggers the creation of all dependency files listed in $(PROCESSED_EXISTING_DEPENDENCY_FILES), and to include that makefile just before the converted dependency files are included. To ensure that the conversion takes place, the parent process of our Make process (we have a Perl wrapper process controlling GNU Make) deletes the automatically created makefile first. The relevant part of the master makefile looks like this:
# Phony target that creates processed dependency files.
CONVERTED_EXISTING_DEPENDENCY_FILES :
	<recipe here>

$(PRE_CONVERTED_DEPENDENCY_FILE_INCLUSION_HOOK) : CONVERTED_EXISTING_DEPENDENCY_FILES
	$(info $(TARGET_BUILD_MESSAGE_PREFIX) Building $(notdir $@) ...)
	$(file >$@,# Automatically generated makefile that gets included before including the existing, converted dependency files.)
	$(file >>$@,$(DOLLAR)(info Including pre-converted-dependency-files-inclusion hook file ...))
	$(file >>$@,)

include $(PRE_CONVERTED_DEPENDENCY_FILE_INCLUSION_HOOK)
include $(PROCESSED_EXISTING_DEPENDENCY_FILES)
We're already using the same basic principle in several other cases, and so far it has worked perfectly fine, but for some reason when I try this, GNU Make gets lost in an infinite loop where it continuously re-evaluates the master makefile, includes all other makefiles, and then goes back to re-evaluating the master makefile again.
The $(PRE_CONVERTED_DEPENDENCY_FILE_INCLUSION_HOOK) does get created, and if there are any dependency files to be converted, they are processed, too, but I'm still at a loss as to what causes this infinite loop in Make. We are using GNU Make 4.2.1 for Windows on a Windows 10 (64 bit) system.

I recommend you rework your model completely to avoid any recipes that know how to build included files, and instead follow the model for auto-dependency generation described in this post (based on how automake handles dependency generation).
Then fold the postprocessing step directly into the same recipe that generates the dependency files, rather than having a separate rule that does it. Two separate rules aren't necessary because you don't actually want the intermediate step here: you just want to generate the make prerequisite definitions. This is similar to how we normally don't have separate rules for preprocessing, compiling, and assembling object files: one rule does all of that even though multiple steps are involved.
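For concreteness, here is a minimal sketch of such a combined rule. Everything in it is a placeholder rather than a name from the build above: COMPILER and CONVERT_DEPS_SCRIPT stand in for the actual tools, .rawd for whatever extension the compiler's raw dependency output uses, and OBJECTS for the object file list.

# One rule both compiles and produces a ready-to-include dependency file.
# Assumes the compiler drops its oddly formatted dependencies in $*.rawd.
%.o : %.c
	$(COMPILER) $(CFLAGS) -c $< -o $@
	perl $(CONVERT_DEPS_SCRIPT) $*.rawd > $*.d
	rm -f $*.rawd

# Missing .d files are silently ignored on a clean build; afterwards every
# rebuild sees up-to-date prerequisites without any extra hook makefile.
-include $(OBJECTS:.o=.d)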

Related

Is it ok to have a GNU Make target that claims to generate / update a certain target file but actually doesn't?

At present, I have a makefile that has:
- a target which links an executable image file from a bunch of object files
- a pattern rule target that compiles the various object files the linker target depends on
I want to make the following changes:
1. Instead of compiling the object files outright, the pattern rule target mentioned above should create (for each object file that needs updating) an empty object_file_name.update file. Essentially, this target's job would be to take stock of all object files that actually need to be recompiled.
2. Write a new target that launches a Perl process which finds all these object_file_name.update files and, for each object file that must be recompiled, compiles it in this Perl process.
I know how to do 2) ... that part is not giving me any trouble. The part I'm worried about is 1). The reason is that that target would basically have to claim to update any needed object files while, in truth, merely creating an .update file for each such object file but not the object file itself.
I think I could trick GNU Make into not starting to try to link anything before all the object files have been built by declaring my dependencies accordingly (pseudo-code, not a valid GNU Make snippet):
# Phony target that reads the *.update files created by the pattern rule target below and then
# compiles each object file for which an *.update file exists.
COMPILE_OBJECTS :
	...

# Pattern rule target to take stock of all object files that need updating. Creates an *.update file for
# each object file that needs recompiling.
%.o : %.c
	...

$(EXE_FILE_TO_LINK) : $(LIST_OF_OBJECT_FILE_PATHS) COMPILE_OBJECTS
	...
but I still worry that this might result in undefined behavior because my pattern rule target would basically be lying to GNU Make about updating the needed object files. Is my worry justified?
Basically, I want to interject an intermediate layer between GNU Make and the compiler so that GNU Make doesn't compile each object file separately. Instead, the compiling would be done in a single Perl process that has access to the complete list of object files that need to be compiled, allowing me to do various fancy things that I couldn't do if GNU Make controlled compilation directly.
Yes, it's legal and I often use this pattern.
Consider the case where you only want to kick off a long build step if a file has changed.
target: config-file
	target-creator $< -o $@
Now let's say we can't give make the dependencies for config-file (because the config file creation step lacks a dependency listing ability (BAH!)).
.PHONY: FORCE
FORCE: ;

config-file: FORCE
	config-creator -o $@.tmp
	cmp $@.tmp $@ || mv $@.tmp $@
- We ask make to build target.
- Make first has to build config-file.
- Make will always run the recipe for config-file, as its dependency FORCE is out of date (being phony).
- Crucially, we only update config-file if config-creator decides something has actually changed: if cmp finds config-file.tmp and config-file identical, the mv is skipped and the last line of the recipe completes with no error. If, on the other hand, cmp detects a mis-compare, it fails, and the shell goes on to execute the mv.
- After running the recipe for config-file, make does actually check config-file's modification time. Only if config-file has become younger than target will target-creator be run.
The subtlety here is that even though config-file's recipe runs every time, config-file itself is not phony.

Generate include files for the Makefile by the same Makefile

In my program, I have a somewhat complicated build process. Currently, in one directory I use include in Makefile.am with a file that does not exist yet but has to be built on its own. The reason is that this include file is quite long. Furthermore, in the real program it is not just one file but several, and the generation process for these files can change from time to time.
The Makefile.am looks something like this:
noinst_LIBRARIES = libtest.a
nodist_libtest_a_SOURCES = file.c
CLEANFILES = file.c Make_file.mk

$(builddir)/Make_file.mk: $(srcdir)/Perl/generate_mk_files.pl
	perl $(srcdir)/Perl/generate_mk_files.pl file

include $(builddir)/Make_file.mk
After creation, Make_file.mk looks something like this:
$(builddir)/file.c: $(srcdir)/file.template $(srcdir)/Perl/generate_c.pl
	perl $(srcdir)/Perl/generate_c.pl $(srcdir)/file.template
Automake works, and the final build process does as well. The output of make is something like this (I have shortened it somewhat):
Makefile:721: Make_file.mk: Datei oder Verzeichnis nicht gefunden (German for "No such file or directory")
perl ../../../../src/components/test/Perl/generate_mk_files.pl test
perl ../../../../src/components/test/Perl/generate_c.pl ../../../../src/components/test/file.template
Therefore, make first complains that the include file is not found, then creates it, and then also follows the rules of the included file.
Although I am happy that it works, I wonder why. At first I thought that make simply loads the Makefile, but during that step Make_file.mk does not exist yet. Therefore it seems the Makefile is loaded more than once.
Further, the manual of Automake for include states:
Note that these fragments are read and interpreted by automake, not by
make.
This is not what I see, since the included fragment does not exist while Automake runs.
My questions basically are:
Why does it work?
Is this the correct way to do this, or should I use another approach, e.g. starting new instances of make within the Makefile?
I don't really know Automake, but, from the GNU make manual:
If an included makefile cannot be found in any of these directories [the standard include directories], a
warning message is generated, but it is not an immediately fatal
error; processing of the makefile containing the include continues.
Once it has finished reading makefiles, make will try to remake any
that are out of date or don’t exist. See How Makefiles Are Remade.
Only after it has tried to find a way to remake a makefile and failed,
will make diagnose the missing makefile as a fatal error.
If you want make to simply ignore a makefile which does not exist or
cannot be remade, with no error message, use the -include directive
instead of include, like this:
-include filenames…
This acts like include in every way except that there is no error (not even a warning) if any of the filenames (or any
prerequisites of any of the filenames) do not exist or cannot be
remade.
So basically, make cannot execute the recipe for remaking the included file before it has finished parsing the main Makefile. It therefore raises a warning, continues reading the Makefile, finds the rule for remaking the included file, remakes it, and then restarts itself (this is explained in detail in the How Makefiles Are Remade section).
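A minimal self-contained illustration of this remake-and-restart behavior (gen.mk and GREETING are invented for the demo):

all:
	@echo $(GREETING)

# On the first run gen.mk does not exist: make warns, builds it with the
# rule below, re-executes itself, includes it, and only then runs "all".
include gen.mk

gen.mk:
	echo 'GREETING := hello' > $@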
Going back to the manual, it states there are two forms for automake's include mechanism:
include $(srcdir)/file
and
include $(top_srcdir)/file
neither of which matches your include. So I'd imagine the include is actually processed by the underlying make, whatever that might be (e.g. GNU Make, though of course other make programs have this functionality as well).
Now for the questions:
Why does it work?
As explained in another answer, GNU Make will attempt to make a missing included makefile, before failing.
Is this the correct way to do this or should I use another approach, e.g. starting new instances of make within Makefile
Generating makefiles is one of the tasks autotools do, either through autoconf or automake. Going through multiple stages of "making makefiles" seems prone to error (and hard to maintain). Recursive make has similar problems.
The reason is that this include file is quite long.
automake include statements will happily paste together a large makefile out of smaller components.
Further in the real program it is not just only one file but several and the generation process for this file can change from time to time.
It's hard to recommend a specific autotools approach without knowing how the "changes" are determined. Since it seems you're also using libtool, adding or removing sources to libraries (or entire libraries) there can be affected by conditionals, variables, etc.

How to delay effect of "include" directive in Makefile.am until make (avoid "include" being seen by Automake)?

My Makefile.am includes a file (with various defined variables), for example:
include make.config
...
The problem is that this file is in turn generated by a tool (i.e. config.generator.sh) based on some input file (i.e. input.dat). The straightforward and wrong idea would be to add a rule to generate make.config:
make.config : input.dat
	config.generator.sh input.dat > make.config

include make.config
...
Although this content is a perfectly working makefile on its own without automake, the idea is doomed with automake: the make.config file is included by automake before I even get a chance to run make (and it fails, as the file is not yet generated):
automake: cannot open < make.config: No such file or directory
Is there a way to postpone effect of include directive until make is run (possibly by using another directive)?
There is probably a way to simply run arbitrary commands before any makefile generation is done (e.g. AC_CONFIG_COMMANDS*). But the question is more complicated than that, because config.generator.sh is supposed to use executables which are themselves generated during the same build process (so there is a dependency chain which logically has to be managed by the makefiles of the same project). The documentation simply confirms this logic without providing alternatives.
The solution is described in this email of Automake's mailing list.
The idea is to use include directives inside a small regular "wrapper" makefile and to include the Automake-generated Makefile into it (note the upper case M). Because makefile is not an Automake template, the include works as expected, triggering builds for not-yet-existing files.
Note that:
- By default, the make utility searches for makefile before Makefile, which makes this approach work seamlessly.
- It is still recommended to specify all rules inside Makefile.am and keep the "wrapper" makefile simple. The rules for the not-yet-existing files will naturally come from the generated Makefile anyway.
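A minimal sketch of such a wrapper, assuming the fragment is named make.config as in the question:

# makefile (lower case): a plain wrapper that Automake never sees, so its
# includes behave exactly like ordinary GNU make includes.
include Makefile
include make.config

Including Makefile first keeps its default goal as the overall default; make then remakes the missing make.config using the rule provided by the generated Makefile and restarts itself.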
I've come across the same annoying problem today when moving my OCaml project to Autotools. My solution is to use autoconf's substitution to go around automake. For the above example, I'd add a substitution to configure.ac:
AC_SUBST([include_make_config], ["include make.config"])
and adjust Makefile.am, replacing the include directive with the autoconf variable reference:
make.config : input.dat
	config.generator.sh input.dat > make.config

@include_make_config@
...
automake doesn't touch the @include_make_config@ line, so it gets carried over into the generated Makefile.in. When autoconf takes over, it substitutes the variable with include make.config in the final Makefile.
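For illustration, the relevant part of the final Makefile then reads:

make.config : input.dat
	config.generator.sh input.dat > make.config

include make.config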
Note: I use this with OCaml's ocamldep dependency generator.

Makefile: need to do a target before including another makefile

Part of my Makefile:
CPUDEPS=./mydeps.cpu
(...)
deps: $(CPUDEPS)

$(CPUDEPS): $(CCFILES)
	@echo [DEPS] CPU
	$(CMDECHO)makedepend -Y -s'# CPU sources dependencies generated with "make deps"' \
		-w4096 -f- -- $(CFLAGS) -- $^ 2> /dev/null > $(CPUDEPS)
(...)
sinclude $(CPUDEPS)
Problem 1: includes are processed during the first phase of makefile parsing, targets during the second; so, if ./mydeps.cpu doesn't exist and I run "make deps", I first get the error
Makefile:335: ./mydeps.cpu: No such file or directory
I hide the error by using sinclude instead of include, but the problem is still there: the old file is included, not the just-generated one, so I have to run make twice to pick up the updated file. This is because make does two-phase processing; is there any way to tell make to complete the target deps before parsing the includes?
Problem 2: even if the file ./mydeps.cpu doesn't exist and make deps actually creates it, I always get a "make: Nothing to do for deps". This doesn't happen with other targets. I don't understand why and how to avoid it.
Problem 1 is non-existent: before building a target, make automatically rebuilds makefiles (using implicit rules if no explicit rule is provided). So having a rule for the makefile ensures it will always be up to date, and there is no need to run deps twice. Additionally, since CPUDEPS is a makefile, it will be updated automatically before any other rule is run, so dependencies will always be refreshed when necessary and make deps is not needed at all. You can see this for yourself by watching the [DEPS] line being echoed whenever any of the CCFILES becomes more recent than the dependency file.
For Problem 2, adding anything to the recipe ensures that make doesn't complain about having nothing to do. If there is nothing else, you can use something like @echo OK to give feedback to the user, or a simple @true if you prefer totally silent makes.
What you are trying to achieve is useless: you can use the dependencies file that was created during the previous build. That's enough.
The main reasoning behind that rule is:
- If you haven't changed any of your files, then the dependencies file is up to date, and there's nothing to build.
- If you have changed anything, even deep within your #include chain, in an existing file that was used by the previous build, then the dependencies file has already caught it. You'll rebuild what is needed.
- If you change something in a new file (you add that file!), then it was not used by the previous build and is not listed in the dependencies. But if you really want to use it, you have to modify at least one of the other files that was used before, and then you're back in the previous case.
The solution is to create the dependencies file during the normal process of the compilation, and to optionally include it (with sinclude) if it is present.
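As a hedged sketch of that approach, assuming a GCC-compatible compiler (whose -MMD and -MP options write a .d file per translation unit as a side effect of compilation) and reusing the question's CCFILES variable:

# Dependencies are produced while compiling, so no separate "deps" pass.
%.o : %.c
	$(CC) $(CFLAGS) -MMD -MP -c $< -o $@

# Include whatever .d files already exist; on the very first build there
# are none, which is fine because everything gets compiled anyway.
sinclude $(CCFILES:.c=.d)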

Using make to add m4 preprocessing to an arbitrary language

We have an ActionScript (Flex) project that we build using GNU make. We would like to add an M4 preprocessing step to the build process (e.g., so that we can create an ASSERT() macro that includes file and line numbers).
We are having remarkable difficulty.
Our current strategy is:
Create a directory "src/build" (assuming source code is in src/ and subdirectories).
Within src/build, create a Makefile.
Run make inside src/build.
The desired behavior is that make would then use the rules we write to send the *.as files in src/ and its subdirectories through m4, creating new *.as files under src/build. For example:
src/bar.as -> m4 -> src/build/bar.as
src/a/foo.as -> m4 -> src/build/a/foo.as
The obvious make rule would be:
%.as : ../%.as
	echo "m4 --args < $< > $@"
This works for bar.as but not a/foo.as, apparently because make is being "smart" about splitting and re-packing directories. make -d reveals:
Trying implicit prerequisite `a/../foo.as'.
Looking for a rule with intermediate file `a/../foo.as'.
but we want the prerequisite to be "../a/foo.as". This (what we don't want) is apparently documented behavior (http://www.gnu.org/software/make/manual/make.html#Pattern-Match).
Any suggestions? Is it possible to write a pattern rule that does what we want?
We've tried VPATH also and it does not work because the generated .as files are erroneously satisfying the dependency (because . is searched before the contents of VPATH).
Any help would be greatly appreciated.
One option is to use a different extension for files that haven't been preprocessed. Then you can have them in the same directory without conflict.
As Anon also said, your source code is no longer Flex - it is 'to be preprocessed Flex'. So, use an extension such as '.eas' (for Extended ActionScript) for the source code, and create a 'compiler' script that converts '.eas' into '.as' files, which can then be processed as before.
You may prefer to have the Extended ActionScript compiler do the whole compilation job - taking the '.eas' direct to the compiled form.
The main thing to be wary of is ensuring that '.eas' files are considered before the derived '.as' files. Otherwise, your changes in the '.eas' files will not be picked up, leading to hair-tearing and other undesirable behaviours (head banging, as in 'banging head against wall', for example) as you try to debug code that hasn't changed even though the source has.
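A sketch of that setup, with M4FLAGS standing in for whatever options your macros need:

# .eas ("Extended ActionScript") is the real source; .as is derived from it.
%.as : %.eas
	m4 $(M4FLAGS) $< > $@

Because each generated .as file sits next to its .eas source, the pattern rule no longer has to reach across directories, which sidesteps the prerequisite-splitting problem entirely.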
