Makefile.am: process SUBDIRS in parallel

We have unit tests in our project, and they run very slowly. The main reason for this, as far as I can tell, is that each subdir runs serially. There is no reason for this, and I'd like to modify things so each subdirectory is processed in parallel.
I found this question, but it seems the accepted answer shows how to specify this in your Makefile, not in the Makefile.am. I tried just adding the solution to my Makefile.am and it didn't seem to make a difference. Is this the correct way to do it at the Makefile.am level? If so, any advice for what I could be doing wrong? If not, please show me the path of truth :-)

In answer to my own question: things from Makefile.am are translated fairly directly into the generated Makefile, so the changes in the original question can be made in Makefile.am. The only part I'm not 100% confident about is whether SUBDIRS (as it has special meaning) can get mangled in the autotools process. At any rate, processing the SUBDIRS in parallel is perhaps not typically the answer.
The way I solved this was to use a separate target for the directories I wanted processed in parallel, and I suspect this is usually the correct answer. There may well be some way to get the SUBDIRS to be processed this way, but using a separate target was pretty easy to get working, and at least for what I was trying to do, a separate target was more appropriate.
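For illustration, here is a minimal sketch of what that separate target can look like in Makefile.am; the directory names and the check-parallel target name are placeholders, not the real ones from my project. Each directory is an independent prerequisite of a phony target, so running make -jN check-parallel lets the per-directory sub-makes overlap (automake copies rules it doesn't recognize into the generated Makefile more or less verbatim, which is what I mean by "translated fairly directly"):

# Hypothetical sketch; directory and target names are placeholders.
PARALLEL_TEST_DIRS = tests/unit_a tests/unit_b tests/unit_c

.PHONY: check-parallel $(PARALLEL_TEST_DIRS)

# With make -jN check-parallel, the per-directory sub-makes run concurrently.
check-parallel: $(PARALLEL_TEST_DIRS)

$(PARALLEL_TEST_DIRS):
	$(MAKE) -C $@ check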

Related

How does my makefile automatically change a .o prerequisite into a .c target?

Here is my makefile.
Look at the target olmenu-proto1: it depends on olmenu-proto1_yacc.o.
But I haven't defined any target called olmenu-proto1_yacc.o.
Interestingly, when I invoke make olmenu-proto1, it works!
Strangely enough!
I want to know why it does this, thank you!
Please include the relevant bits of your makefile in your question, rather than asking people to follow a link to another site. Especially one where it's impossible to view unless you enable a lot of javascript, which many people leave mostly disabled.
In any event, most likely the reason is because make can envision how to create targets by chaining together rules, even if you don't list the prerequisites explicitly. For more information see Chains of Implicit Rules in the GNU make manual.
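As a small illustration (using the filenames from the question and assuming the stock GNU make built-in rules): even if only olmenu-proto1_yacc.y exists on disk, make can synthesize the missing object file.

# The only explicit rule; nothing mentions olmenu-proto1_yacc.o itself.
olmenu-proto1: olmenu-proto1_yacc.o
	$(CC) -o $@ $^

# Given olmenu-proto1_yacc.y, make chains two built-in rules,
#   %.c from %.y (yacc)  and then  %.o from %.c (the C compiler),
# so the prerequisite gets built even though no rule names it explicitly.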

Make - Parameter for not recompiling existing .o-Files?

Yes, that is the question: how can I prevent make, or rather the compiler, from recompiling already existing libraries (.o files)?
I call the makefile simply via make [-parameters], and if possible I'd rather not rewrite the makefile itself (the reason is, I have to work with a lot of them).
Edit: Whoops, sorry, yes, I meant the object files.
The problem is, I have a directory with several hundred .mk files for several hundred sub-programs/libraries/whatever. I call the .mk files in a loop:
foreach i (`ls *.mk`)
	make CFLAGS='[some parameters]' -f $i -B
end
Now when I do it like this, every time a makefile gets called it recompiles not only the programs but, for each .mk file, the needed libraries and objects as well, which slows down the whole process a lot. So I don't want it to recompile these.
These makefiles were not written by me, and I really don't want to edit several hundred of them, so I'm asking whether there is a parameter that specifically prohibits the recompiling.

Can MinGW Make be sped up without disabling implicit rules?

GNU Make under MinGW is known to be very slow under certain conditions due to how it executes implicit rules and how Windows exposes file information (per "MinGW 'make' starts very slowly").
That previous question and all other resources on the issue that I've found on the internet suggest working around the problem by disabling implicit rules entirely with the -r flag. But is there another way?
I have a "portable" Makefile that relies on them, and I'd like to make it so that it does not take around a minute to start it up each time, rather than having to get the Makefile owner to alter it just for me.
You should use make -d to see all the things make is doing and try to see where the time is going. One common reason for lengthy make times is match-anything rules, which are used to determine whether or not a makefile needs to be rebuilt. Most of the match-anything rules CAN be removed; they're rarely needed anymore.
You can add this to your makefile and see if it helps:
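# Built-in RCS/SCCS checkout rules; redefining them with no recipe cancels them.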
%:: %,v
%:: RCS/%,v
%:: RCS/%
%:: s.%
%:: SCCS/s.%
And, if you don't need to auto-create your makefile you can add:
Makefile: ;
(also put any included makefiles there that you don't need to auto-create).
ETA
It seems your real question can be summed up as, "why does make take so much longer to start on Windows than on Linux, and what can I do to fix that without changing makefiles?"
The answer is, nothing. Make does exactly the same amount of work on both Windows and Linux: there are no extra rules or procedures happening on Windows that could be removed. The problem is that Windows NTFS is slower than typical Linux filesystems for these lookups. I know of no system setting, etc. that will fix this problem. Your only choice is to get make to do less work so that it's faster, and the only way to do that is by removing built-in rules you don't need.
If the problem is you really don't want to edit the actual makefiles, that's simple enough to solve: just write the rules above into a small separate makefile, maybe something like speedup.mk, then set the environment variable MAKEFILES=speedup.mk before invoking make. Make will parse that makefile as well without you having to change any makefiles.
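Concretely, speedup.mk (the name is arbitrary) would just contain the rules from above:

# speedup.mk -- cancels the built-in RCS/SCCS rules; no existing makefile is touched.
%:: %,v
%:: RCS/%,v
%:: RCS/%
%:: s.%
%:: SCCS/s.%
Makefile: ;

Then export MAKEFILES=speedup.mk (or set MAKEFILES=speedup.mk in a Windows cmd shell) before running make, and make will read it ahead of every makefile it parses.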

Makefile include makefile from different directory

I have two makefiles, directoryA/Makefile and directoryB/Makefile.
directoryA/Makefile depends on targets in a rather large and complex directoryB/Makefile.
I could do a recursive make
$(MAKE) -C directoryB
But that is undesirable for several reasons. Two big ones: I may have to execute the makefile several times, and make can't correctly know when rebuilding a target is necessary.
I would like to use the include directive. The problem is twofold:
The targets in directoryB/Makefile are all defined relative to that Makefile.
Many commands depend on the working directory being directoryB.
Recursive make solves both of these problems, but with big disadvantages (mentioned earlier). Is there a way to solve both problems when using include?
It's hard to say without seeing directoryA/Makefile, but another alternative is to have it include directoryB/Makefile, then
cd directoryB
make -f ../directoryA/Makefile
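For example, directoryA/Makefile might then look roughly like this (a sketch; the extra target and its prerequisite are placeholders, since the real contents of either makefile aren't shown):

# directoryA/Makefile (hypothetical sketch)
# Pull in all of directoryB's rules; the relative paths inside it stay
# valid because this file is invoked from within directoryB, as above.
include ../directoryB/Makefile

# directoryA's own rules, also written relative to directoryB:
extra_target: some_target_from_directoryB
	echo "building on top of directoryB's rules"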

Building hierarchical Makefile with GNU Make

I have a project divided in modules, each hosted in a directory, say:
root
|_ module_A
|  |_ module.cpp
|  |_ Makefile
|_ module_B
|  |_ Makefile
|_ main.c
|_ Makefile
main.c depends on targets defined in Makefiles related to module_A and module_B.
I want to write my root/Makefile with respect to targets defined in Makefiles of both modules.
Now, I know that I could use the include directive, but the problem here is that targets and filenames in module_A and module_B aren't prepended with their directory, so I get something like this:
make: *** No rule to make target `module.o', needed by `main.c'. Stop.
Is there a good way to solve this?
Thanks.
There are a couple of ways to do this, none of them perfect. The basic problem is that Make is good at using things there to make things here, but not the other way around.
You haven't said what the targets in module_B are; I'll be pessimistic and suppose that module_A and module_B both have targets called module (different source files, different recipes), so you really can't use include.
The biggest choice you have to make is whether to use recursive Make:
If you don't, then root/Makefile must know how to build module_A/module and module_B/module, so you'll simply have to put those rules in. Then you must either leave the redundant rules in the subdir makefiles (and run the risk that they'll drift out of agreement with the master makefile), or eliminate them, or have them call the master makefile recursively (which you wouldn't have to do very often, but it sure would look silly).
If you do, then root/Makefile will look something like this:
main: main.o module_A/module.o module_B/module.o
	...
main.o: main.c
	...
%/module.o:
	$(MAKE) -C $(@D) $(@F)
This will work well enough, but it will know nothing about dependencies within the subdirectories, so it will sometimes fail to rebuild an object that is out of date. You can make clean (recursively) beforehand every time, just to be on the safe side, crude but effective. Or force the %/module.o rule, which is less wasteful but a little more complicated. Or duplicate the dependency information in root/Makefile, which is tedious and untidy.
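A sketch of the "force the %/module.o rule" option, using the common FORCE idiom (which isn't in the question itself):

%/module.o: FORCE
	$(MAKE) -C $(@D) $(@F)

FORCE:

Every build then hands the up-to-date decision to the sub-make, so stale objects inside the modules do get rebuilt, at the cost of always spawning the sub-makes.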
It's just a question of your priorities.
Can't you write the makefile in a non-recursive way?
Recursive Make Considered Harmful
