Make inconsistent assumptions over interface - makefile

I want to optimise the compilation time of my makefile. One problem that wastes my time is that, after modifying a single file, make returns, for instance:
File "frontend/parser_e.ml", line 1:
Error: The files expression/rc.cmi and frontend/gen/lexer_ref.cmi
make inconsistent assumptions over interface Utility
make: *** [frontend/parser_e.cmx] Error 2
rm frontend/parser_name.ml
Note that the files involved may change, but this happens quite often. What I have to do then is make clean followed by make; as a consequence the build is not incremental and takes a long time.
So does anyone know what I should check in my makefile to reduce the chance of this kind of error?
Edit 1:
Actually, all my ML-related files are at depth 1, except frontend/gen/*, which are at depth 2. Following camlspotter's answer, I modified the ocamldep part of my makefile a little. Now it looks as follows:
DIRS= -I frontend -I frontend/gen -I lib ...
depend: $(AUTOGEN)
#   ocamldep -native $(DIRS) */*.ml */*.mli > depend    # this is what was written before, I don't think it is correct
    ocamldep -native $(DIRS) *.ml *.mli > depend
As a consequence, a make run right after another make immediately gives an inconsistency error.
One remark: I don't have AUTOGEN; is that normal?
Another remark: make depend generates a depend file with 0 characters; is that normal?
Edit 2:
I modified depend: by following the Makefile of the OCaml source code:
beforedepend:: */*.ml

depend: beforedepend
    (for d in \
        frontend frontend/gen lib ... ; \
     do ocamldep $(DIRS) $$d/*.mli $$d/*.ml; \
     done) > depend
I actually have around 20 folders, each with 1-5 ML files. This time, make hangs on for d in ... and does not want to stop. But if I remove 3-4 folders, it succeeds in creating a depend file after several seconds.

Your Makefile does not cover all the necessary dependencies between modules.
The meaning of
File "frontend/parser_e.ml", line 1:
Error: The files expression/rc.cmi and frontend/gen/lexer_ref.cmi
make inconsistent assumptions over interface Utility
is:
frontend/parser_e.ml depends on expression/rc.ml and frontend/gen/lexer_ref.ml
Both expression/rc.ml and frontend/gen/lexer_ref.ml use a module named Utility
expression/rc.ml and frontend/gen/lexer_ref.ml must agree on the type (interface) of Utility, but they did not.
I can think of two possibilities that cause this state:
There may be two different utility.ml files, for example dir_a/utility.ml and dir_b/utility.ml. OCaml does not allow linking modules with the same name. You can work around this using packed modules (see the -pack compiler option). This is not your case.
Both modules use the same utility.ml, but the dependencies may not be perfectly known to your Makefile. This is your case.
A possible scenario of the second case is:
You have modified utility.ml or utility.mli and its interface (.cmi file) has been changed.
One of expression/rc.ml and frontend/gen/lexer_ref.ml is recompiled against this new interface of Utility, but the other IS NOT, since the dependency is not known.
The compiler has found the inconsistency between the two modules when they are used together in frontend/parser_e.ml.
To fix this, you have to run ocamldep to capture all the necessary module dependencies and pass them to make. Note that:
Give it the proper options and arguments. Since you work with nested directories, you need the -I option several times.
Make sure that the auto-generated .ml and .mli files are really generated before ocamldep runs. Since you seem to have .mly and .mll files and you have the issue around them, I suspect you are missing something here.
A good example of dependency analysis of OCaml modules is the OCaml compiler source code itself. It is worth checking the lines around beforedepend, depend and include .depend.
General hints:
Add include .depend to your Makefile and capture all the module dependencies into this .depend file, using ocamldep
Note that all the .ml and .mli files of your project must be scanned by ocamldep. Do not forget to add the -I options properly, or it will miss some dependencies.
Before running ocamldep, make sure the auto-generated .ml and .mli files, such as the output of .mly and .mll, are generated, or it will miss some dependencies.
A typical Makefile looks like this:
beforedepend:: x.ml
x.ml: x.mly
    ocamlyacc x.mly

beforedepend:: y.ml
y.ml: y.mll
    ocamllex y.mll

depend: beforedepend
    ocamldep -I <dir1> -I <dir2> <all the ml and mli paths> > .depend

include .depend
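For a project with many source directories, the shell loop from Edit 2 can also be replaced by letting make assemble the file lists itself. The following is only a sketch under that assumption (the directory names are illustrative, and the wildcard is evaluated when the Makefile is read, so the auto-generated .ml files must already exist at that point):
SRCDIRS  := frontend frontend/gen lib        # extend with the other directories
INCLUDES := $(addprefix -I ,$(SRCDIRS))
ML       := $(wildcard $(addsuffix /*.ml,$(SRCDIRS)))
MLI      := $(wildcard $(addsuffix /*.mli,$(SRCDIRS)))

depend: beforedepend
    ocamldep -native $(INCLUDES) $(ML) $(MLI) > .depend

include .depend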

Related

Make: Avoiding issues from having same targets in multiple Makefiles

I need to know the best way of dealing with this. You could also answer this, after reading the sample below: look at the makelib target in package.make and tell me whether there is a way to force it to be treated as not updated if its recipe (make -C ../lib/ -f lib.make) reports that there is nothing to be made (without using order-only prerequisites)?
I need to explain this point using an example. I have inherited this and I need the best way to make this right.
A target which other targets will be depending on:
File lib.make
--------------
.DEFAULT_GOAL = thelib.dll

%.dll: file1.obj file2.obj
    makelib file1.obj file2.obj -o thelib.dll
This by itself is pretty solid. You run it once (make -f lib.make) and it creates the lib. If you run it subsequently, having no modified files, then it will tell you it has nothing to do.
Now we're going to use this in a special way somewhere else:
File: package.make
------------------
.DEFAULT_GOAL = all

all: package

makelib:
    @make -C ../lib/ -f lib.make

package: makelib file3 file4
    @package_files file3 file4 ../lib/out/*.dll -o package
This is how lib.make is referenced inside package.make.
The issue is that even though the package gets created when you call make -f package.make all, make assumes that the package target needs to be rebuilt every time, since one of its dependencies -- makelib -- had to be remade.
Make considers makelib out of date regardless of what happens after it enters lib.make.
To correct this I thought of a few choices:
moving makelib to the order-only prerequisites (after the |), but that's not quite right because in the case of a newly built library my package won't be updated
adding the dll (thelib.dll) as a prerequisite of the makelib target, but this would state it a second time, almost duplicating the logic and breaking the encapsulation.
removing the makelib target and moving the line @make -C ../lib/ -f lib.make inside the package recipe. The problem with this is that I would be removing the dependency between the package and the lib: if the lib requires an update, the package won't know about it and won't get updated.
using include lib.make and then rewriting the package rule to something like package: thelib.dll file3 file4. There are problems with this too, not least that for a makefile to be included it must be written as such; otherwise a lot of overlapping/conflicting targets and definitions will be introduced.
Are there any suggestions other than directly listing the dll as the dependency?
There are two main ways this works:
First, if you use recursive make (please remember to always invoke a sub-make using $(MAKE), never make directly) then you should make the target in the parent makefile be the actual file generated by the sub-make:
package: lib/thelib.dll ...
...
lib/thelib.dll: FORCE
    $(MAKE) -C lib -f lib.make

FORCE:
Second, you can use non-recursive make, which means you include the sub-makefile into the parent makefile and write it so that it expects that. You can play tricks with variables etc. to make this more generic, so it can be invoked from either the parent or the subdirectory, if you want.
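A minimal sketch of that second approach, assuming lib.make is rewritten so that it can be included cleanly (the names are taken from the example above; the conflicting .DEFAULT_GOAL settings would also have to be resolved):
# package.make (sketch): pull the library rules in directly instead of recursing
include lib.make

package: thelib.dll file3 file4
    @package_files file3 file4 thelib.dll -o package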

Generate include files for the Makefile by the same Makefile

In my program, I have a somewhat complicated build process. Currently, in one directory I use include in Makefile.am with a file that does not exist but has to be built on its own. The reason is that this include file is quite long. Furthermore, in the real program it is not just one file but several, and the generation process for these files can change from time to time.
The Makefile.am looks something like this:
noinst_LIBRARIES = libtest.a
nodist_libtest_a_SOURCES = file.c
CLEANFILES = file.c Make_file.mk

$(builddir)/Make_file.mk: $(srcdir)/Perl/generate_mk_files.pl
    perl $(srcdir)/Perl/generate_mk_files.pl file

include $(builddir)/Make_file.mk
After creation, Make_file.mk looks something like this:
$(builddir)/file.c: $(srcdir)/file.template $(srcdir)/Perl/generate_c.pl
    perl $(srcdir)/Perl/generate_c.pl $(srcdir)/file.template
Automake works, and so does the final build process. The output of make is something like this (I have shortened it somewhat):
Makefile:721: Make_file.mk: Datei oder Verzeichnis nicht gefunden (file not found)
perl ../../../../src/components/test/Perl/generate_mk_files.pl test
perl ../../../../src/components/test/Perl/generate_c.pl ../../../../src/components/test/file.template
Therefore, make first complains that the include file is not found, then creates it, and then also follows the rules of the included file.
Although I am happy that it works, I wonder why. First, I thought that make loads the Makefile; during this step, Make_file.mk does not exist. Therefore it seems the Makefile is loaded more than once.
Furthermore, the Automake manual says about include:
Note that these fragments are read and interpreted by automake, not by
make.
This is not what I see, since the included fragment does not exist when automake runs.
My questions basically are:
Why does it work?
Is this the correct way to do this, or should I use another approach, e.g. starting new instances of make from within the Makefile?
I don't really know Automake, but from the GNU make manual:
If an included makefile cannot be found in any of these directories (the standard include directories), a
warning message is generated, but it is not an immediately fatal
error; processing of the makefile containing the include continues.
Once it has finished reading makefiles, make will try to remake any
that are out of date or don’t exist. See How Makefiles Are Remade.
Only after it has tried to find a way to remake a makefile and failed,
will make diagnose the missing makefile as a fatal error.
If you want make to simply ignore a makefile which does not exist or
cannot be remade, with no error message, use the -include directive
instead of include, like this:
-include filenames…
This acts like include in every way except that there is no error (not even a warning) if any of the filenames (or any
prerequisites of any of the filenames) do not exist or cannot be
remade.
So basically, make cannot execute the recipe for remaking the included file before it has finished parsing the main Makefile. So it raises a warning, continues reading the Makefile, finds the rule for remaking the included file, remakes it, and then restarts itself (this is explained in detail in the How Makefiles Are Remade section).
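A tiny self-contained illustration of that behaviour (the file names are made up):
# make reads this file, warns that generated.mk is missing, finds the rule
# below, builds it, then restarts itself and re-reads everything.
include generated.mk

generated.mk:
    echo 'GREETING := hello' > $@

show:
    @echo $(GREETING)
Running make show once is enough: after the automatic restart, GREETING is defined and "hello" is printed.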
Going back to the manual, it states that there are two forms for automake's include mechanism:
include $(srcdir)/file
and
include $(top_srcdir)/file
neither of which matches your include. So I'd imagine the include is actually handled by the underlying make, whatever that might be (e.g. GNU Make, though of course other make programs have this functionality as well).
Now for the questions:
Why does it work?
As explained in another answer, GNU Make will attempt to make a missing included makefile, before failing.
Is this the correct way to do this or should I use another approach, e.g. starting new instances of make within Makefile
Generating makefiles is one of the tasks autotools do, either through autoconf or automake. Going through multiple stages of "making makefiles" seems prone to error (and hard to maintain). Recursive make has similar problems.
The reason is that this include file is quite long.
automake include statements will happily paste together a large makefile out of smaller components.
Further in the real program it is not just only one file but several and the generation process for this file can change from time to time.
It's hard to recommend what to do in autotools without knowing how the "changes" are determined. Since it seems you're also building libraries, adding or removing sources to libs (or entire libs) there can be affected by conditionals, variables, etc.
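For comparison, a fragment that automake itself pastes in at automake time could look like the sketch below (the file and variable names are made up; the included fragment must already exist when automake runs):
# Makefile.am (sketch)
include $(srcdir)/sources.mk            # read and expanded by automake, not by make
noinst_LIBRARIES = libtest.a
libtest_a_SOURCES = $(COMMON_SOURCES)   # variable defined in sources.mk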

Makefile pattern rules not working

I am learning makefiles and can't wrap my head around this problem I am having, and would like to understand how/why it fails.
I have half a dozen Erlang files in a src directory. I want to compile these into an ebin directory, without having to define a rule for each and every one of them. According to the GNU make documentation, pattern rules should be right up my alley.
However, with the following makefile, all I get from make is make: *** No targets. Stop. Why is that?
ebin/%.beam: src/%.erl
    mkdir -p ebin
    erlc -o ebin $<
Edit: Based on this answer, I now understand that I would have to explicitly declare the targets, for instance by using make ebin/cmplx.beam. However, I still do not understand how I should write my makefile to get my desired behaviour - since I have half a dozen targets (and in other projects even more), this seems like an unnecessary hassle. Is there not a way to define targets based on the source file names?
The pattern rule tells make that whenever it needs to produce a beam file in the ebin directory and there is a corresponding erl file in the src directory, it can use erlc.
However, this doesn't tell make that this is what it needs to do. You could explicitly tell make what it needs to do by giving it a target on the command line:
make ebin/foo.beam
If you don't give a target on the command line, make will pick the first non-pattern rule in the makefile as its target. However, your makefile doesn't have any non-pattern rules, so there is no target.
What you probably want is that for each existing erl file in src, make should consider the corresponding beam file in ebin to be a target. You can achieve that by calling wildcard and patsubst:
erl_files=$(wildcard src/*.erl)
beam_files=$(patsubst src/%.erl,ebin/%.beam,$(erl_files))

ebin/%.beam: src/%.erl
    mkdir -p ebin
    erlc -o ebin $<

all: $(beam_files)
(The indented lines need to be actual physical tabs, not spaces.)
That way, running make will rebuild all beam files that are out of date. all gets chosen as the default target, and it in turn depends on all beam files, existing or potential, each of which in turn depends on the corresponding erl file.
This trick is described in the GNU make manual.
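If you would rather not rely on rule ordering at all, you can also name the default goal explicitly and mark all as phony (an optional addition; both directives appear elsewhere on this page):
.DEFAULT_GOAL := all        # make "all" the default goal no matter where it appears
.PHONY: all                 # "all" does not name a real file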

I Want my makefile to be more order independent!

This is related to my previous question: Why does .PHONY not work in this situation?.
I have a makefile system that I wrote to make it easy for developers who are not familiar with make, to do their tasks. In short, there is a generic portion which would be the same for all projects, and a set of makefiles that are specific for a given project. The project specific ones include the generic ones. It worked great on make 3.80 for some reason, but when I tried it out on make 3.81 I ran into a few problems. That forced me to make changes that are mentioned in the above post. Now I have some new problems, so I decided to make another post. Like in that post, I made a much smaller and simpler set of makefiles that show the problem. Unfortunatly, the "simple" case consists of 6 files. Sorry about that. First I'll start with the "project specific" ones (these are meant to be simple):
makefile:
TARGETS:=\
  Lib1.mk \
  Lib2.mk \
  my_prog.mk \

include generic/top.mk
Lib1.mk:
BINARY:=Lib1
TYPE:=LIB
LOCATION:=a/location
include generic/rules.mk
Lib2.mk:
BINARY:=Lib2
TYPE:=LIB
LOCATION:=another/location
LIBS:=Lib1
include generic/rules.mk
my_prog.mk:
BINARY:=my_prog
TYPE:=EXE
LOCATION:=some/location
LIBS:=Lib1 Lib2
include generic/rules.mk
A quick description: makefile simply lists the names of all the targets. A target is either an executable or a library. BINARY is the name of the library or executable (extensions are added by the generic part). TYPE is either EXE or LIB. LOCATION is where the binary should go. LIBS is whatever libraries this binary depends on. The real ones handle creating all the -L, rpath, etc. stuff for the user (as well as their equivalents for Visual Studio). Now for the generic ones (these do the REAL work):
generic/top.mk:
ALL_BINS:=

.PHONY: all
all:

include $(TARGETS)

all: $(ALL_BINS)

%.so %.exe:
    mkdir -p $(dir $@)
    touch $@

clean:
    rm -rf out
and finally..
generic/rules.mk:
ifeq (EXE,$(TYPE))
$(BINARY).FULL_FILE_NAME:=out/$(LOCATION)/$(BINARY).exe
else
$(BINARY).FULL_FILE_NAME:=out/$(LOCATION)/lib$(BINARY).so
endif
$(BINARY).DEP_LIBS:=$(foreach a,$(LIBS),$($(a).FULL_FILE_NAME))
ALL_BINS+=$(BINARY)
$(BINARY): $($(BINARY).FULL_FILE_NAME)
$($(BINARY).FULL_FILE_NAME): $($(BINARY).DEP_LIBS)
BINARY:=
LOCATION:=
LIBS:=
OK, in this state things work fine. Make handles all the dependencies correctly, and if I touch any of the files, it will correctly "build" only the ones that it has to, and nothing more. The problem happens when you take the my_prog.mk line from makefile and move it to the top of the list, like so:
TARGETS:=\
my_prog.mk \
Lib1.mk \
Lib2.mk \
The problem seems to be that while make is going through rules.mk for my_prog.mk, it does not yet know the full library paths for Lib1 and Lib2 (they are empty strings). So in the end it considers my_prog to depend on nothing, and it tries to build it out of order. In this example you just see it "touch" my_prog first and then the other two. Of course, when I have real compiler and linker commands in there, it throws an error.
Back when I simply had the .PHONY targets depend on each other (so my_prog depended on Lib1 and Lib2), life was easy and harmonious. Now that I can't do that, life has become more difficult.
You may say, "heck, just put them in the right order!" Well, up to now this has been handled automatically by make for the end users. In fact, most customers have been putting things in alphabetical order; they don't know or care in what order things depend on each other. It would stink to have to tell them to re-order all of that now. Sorry for the length of this post. I'd appreciate any answers!
If you set variables using the := assignment operator, the assignment is evaluated immediately.
If you set variables using just = as the assignment operator, they're evaluated lazily, as late as possible (at the time of actual use).
See http://www.gnu.org/software/automake/manual/make/Flavors.html.
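A tiny illustration of the difference (the variable names are made up):
IMMEDIATE := $(VALUE)       # expanded right now, while VALUE is still unset
LAZY       = $(VALUE)       # expanded each time LAZY is used

VALUE := hello

show:                       # prints: immediate='' lazy='hello'
    @echo "immediate='$(IMMEDIATE)' lazy='$(LAZY)'"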
There are several ways to do what you want. The cleanest is probably by using vpath. Just modify rules.mk:
$(BINARY).DEP_LIBS:=$(foreach a,$(LIBS),$(a).so)
ALL_BINS+=$(BINARY)
vpath %.so out/$(LOCATION)

Makefile: need to do a target before including another makefile

Part of my Makefile:
CPUDEPS=./mydeps.cpu
(...)
deps: $(CPUDEPS)
$(CPUDEPS): $(CCFILES)
    @echo [DEPS] CPU
    $(CMDECHO)makedepend -Y -s'# CPU sources dependencies generated with "make deps"' \
        -w4096 -f- -- $(CFLAGS) -- $^ 2> /dev/null > $(CPUDEPS)
(...)
sinclude $(CPUDEPS)
Problem 1: includes are processed during the first phase and targets during the second phase; so, if ./mydeps.cpu doesn't exist and I run "make deps", I first get the error
Makefile:335: ./mydeps.cpu: No such file or directory
I hide the error by using sinclude instead of include, but the problem is still there: the old file is included, not the just-generated one. I have to run it twice to include the updated file. This is because make does two-phase processing; is there any way to tell make to complete the deps target before parsing the includes?
Problem 2: even if the file ./mydeps.cpu doesn't exist and make deps actually creates it, I always get a "make: Nothing to do for deps". This doesn't happen with other targets. I don't understand why, or how to avoid it.
Problem 1 is non-existent: before building a target, make automatically rebuilds the makefiles it includes (using implicit rules if no explicit rule is provided). So having a rule for the included file ensures that it will always be up to date, and there is no need to run deps twice. Additionally, since CPUDEPS is a makefile, it will be updated automatically before any other rule is run, so dependencies will always be updated when necessary and make deps is not needed. You can see this for yourself by observing the [DEPS] line being echoed whenever any of the CCFILES becomes more recent than the dependency file.
For Problem 2, adding anything to the recipe ensures that make doesn't complain about having nothing to do. If there is nothing else to run, you can use something like @echo OK to give feedback to the user, or a simple @true if you prefer a totally silent make.
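In terms of the Makefile in the question, that would be just (a sketch):
deps: $(CPUDEPS)
    @echo OK                # any recipe line here suppresses the "nothing to do" message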
What you are trying to achieve is useless: you can use the dependencies file that was created during the previous build. That's enough.
The main reasoning behind that rule is:
if you haven't changed any of your files, then the dependencies file is up-to-date, and there's nothing to build.
if you have changed anything, even very deep in your #include chain, in an existing file that was used by the previous build, then the dependencies file has already caught it. You'll rebuild what is needed.
if you change something in a new file (i.e. you add that file), then it was not used by the previous build and is not listed in the dependencies. But if you really want to use it, you have to modify at least one of the other files that was used before, and you're back to the previous case.
The solution is to create the dependencies file during the normal process of the compilation, and to optionally include it (with sinclude) if it is present.
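A common way to do that with gcc or clang (a sketch, not taken from the Makefile above; it assumes $(CCFILES) lists .c files and relies on the compilers' standard -MMD/-MP dependency options):
# Each compilation also writes foo.d next to foo.o; the next run includes
# those per-file fragments if they exist, so no separate "make deps" pass is needed.
%.o: %.c
    $(CC) $(CFLAGS) -MMD -MP -c $< -o $@

sinclude $(CCFILES:.c=.d)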
