Optional flavours of a target - makefile

I want to manage some features and setup-specific quirks at compile time. Thus, I must be able to modify the list of objects that are compiled into the final target, as well as the compile and link commands. So far I have only needed two flavours of the program, where the second one (say, special) just adds some CXXFLAGS and one object file:
specialclient: CXXFLAGS+=-DSPECIALBUILD
specialclient: LDFLAGS+=-lanotherlib
specialclient: libanotherlib client
where client is the normal target. But now that I need another selectable feature (say, peculiar), things start to get complicated, because I want to be able to select any combination of the features; yet I don't want to spell out separate targets for peculiarclient and specialpeculiarclient. I would like to be able to specify each feature on the command line, like make [special] [peculiar] client.
How can I solve this problem?

Some conditionals should do the trick:
ifdef SPECIAL
CXXFLAGS+=-DSPECIALBUILD
LDFLAGS+=-lanotherlib
client: libanotherlib
endif
ifdef PECULIAR
CXXFLAGS+=-DPECULIARBUILD
LDFLAGS+=-lyetanotherlib
client: libyetanotherlib
endif
Now you can make client, make client SPECIAL=1, make client PECULIAR=yes, make client SPECIAL=TRUE PECULIAR=very, or whatever.
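For context, here is a minimal sketch of how these conditionals might sit in a complete Makefile; the object list, extra object name, and link rule are illustrative placeholders, not taken from the question, and the link recipe line is TAB-indented:

OBJS := main.o client.o

ifdef SPECIAL
CXXFLAGS += -DSPECIALBUILD
LDFLAGS  += -lanotherlib
OBJS     += special.o   # extra object only built for the special flavour
endif

ifdef PECULIAR
CXXFLAGS += -DPECULIARBUILD
LDFLAGS  += -lyetanotherlib
endif

client: $(OBJS)
	$(CXX) $(CXXFLAGS) -o $@ $^ $(LDFLAGS)

Any combination of SPECIAL=... and PECULIAR=... on the command line then selects the corresponding flavour without defining any extra targets.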

Related

Linking against an external object file (.o) with autoconf

For work purposes I need to link against an object file generated by another program and found in its folder; I did not find any information about this kind of linkage. I think that if I hardcode the path and put name-of-obj.o in front of the package_LDADD variable it should work, but I don't want to do it that way.
If the object is not found I want the configure to fail and tell the user that the name-of-obj.o is missing.
I tried using AC_LIBOBJ([name-of-obj.o]), but that tries to find a name-of-obj.c in the root directory and compile it.
Any tip or solution around this issue?
Thank you!
I need to link against an object file generated by another program and
found in its folder
What you describe is a very unusual requirement, not among those that the Autotools are designed to handle cleanly or easily. In particular, Autoconf has no mechanisms specifically applicable to searching for bare object files, as opposed to libraries, and Automake has no particular automation around including such objects when it links. Nevertheless, these tools do have enough general purpose functionality to do what you want; it just won't be as tidy as you might like.
I think that if I hardcode the paths and put the
name-of-obj.o in front of the package_LDADD variable should work, but
the case is that I don't want to do it that way.
I take it that it is the "hardcode the paths" part that you want to avoid. Adding an item to an appropriate LDADD variable is not negotiable; it is the right way to get your object included in the link.
If the object is not found I want the configure to fail and tell the
user that the name-of-obj.o is missing.
Well, then, the key thing appears to be to get configure to perform a search for your object file. Autoconf does not have a built-in mechanism to perform such a search, but it's just a macro-based shell-script generator, so you can write such a search in shell script + Autoconf, maybe something like this:
AC_MSG_CHECKING([for name-of-obj.o])
OTHER_LOCATION=
for my_dir in \
    /some/location/other_program/src \
    /another/location/other_program.12345/src \
    $srcdir/../relative/location/other_program/src; do
  AS_IF([test -r "${my_dir}/name-of-obj.o"], [
    # optionally, perform any desired test to check that the object is usable
    # ... perhaps one using AC_LINK_IFELSE ...
    # if it passes, then
    OTHER_LOCATION=${my_dir}
    break
  ])
done
# Check whether the object was in fact discovered, and act appropriately
AS_IF([test "x${OTHER_LOCATION}" = x], [
  # Not found
  AC_MSG_RESULT([not found])
  AC_MSG_ERROR([Cannot configure without name-of-obj.o])
], [
  AC_MSG_RESULT([${OTHER_LOCATION}/name-of-obj.o])
  AC_SUBST([OTHER_LOCATION])
])
That's functional, but of course you could embellish, such as by providing for the package builder to specify a location to use via a command-line argument (AC_ARG_WITH(...)). And if you want to do this for multiple objects, then you would probably want to wrap up at least some of that into a custom macro.
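For instance, such an option might look roughly like this; the option name --with-other-objdir is invented for illustration:

AC_ARG_WITH([other-objdir],
  [AS_HELP_STRING([--with-other-objdir=DIR],
    [directory containing name-of-obj.o from the other program])],
  [OTHER_LOCATION=$withval])

When the option is given, configure could use the supplied directory to seed or skip the search loop above.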
The Automake side is much less involved. To get the object linked, you just need to add it to the appropriate LDADD variable, using the output variable created by the above, such as:
foo_LDADD = $(OTHER_LOCATION)/name-of-obj.o
Note that if you're building just one program target then you can use the general LDADD instead of foo_LDADD, but be aware that by default these are alternatives, not complements.
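For completeness, the surrounding Makefile.am might then look roughly like this (foo and its source list are placeholders):

bin_PROGRAMS = foo
foo_SOURCES  = main.c util.c
foo_LDADD    = $(OTHER_LOCATION)/name-of-obj.o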
With that said, this is a bad idea overall. If you want to link something that is not part of your project, then you should get it from an installed library. That can be a local, custom-built library, of course, so long as it is a library, not a bare object file, and it is installed. It can be a static library if you don't want to rely on or distribute a separate shared library.
On the other hand, if your project is part of a larger build, then the best approach is probably to integrate it into that build, maybe as a subproject. It would still be best to link a library instead of a bare object file, but in a subproject context it can make sense to use a library that is not installed on the build system. In conjunction with a command-line argument that tells configure where to find the wanted library, this could make the needed Autoconf code much cleaner and clearer.
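To sketch that subproject variant (the names are invented, and it also needs AC_PROG_RANLIB in configure.ac), the other component could be built as an uninstalled convenience library and linked by path:

# In the other component's Makefile.am:
noinst_LIBRARIES = libother.a
libother_a_SOURCES = name-of-obj.c

# In this project's Makefile.am:
foo_LDADD = $(top_builddir)/other/libother.a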

Make overflow¹, or “How to override a target?”

I have a series (dozens) of projects that consist of large amounts of content in git repositories. Each repository has a git submodule of a common toolkit. The toolkit contains libraries and scripts needed to process the content repositories and build a publishable result. All the repositories are pushed to a host that runs CI and publishes the results. The idea is to keep the repeated code to an absolute minimum, keep mostly content in the repositories, and rely on the toolkit to put it all together the same way for every project.
Each project has a top level Makefile that typically only has a couple lines, for example:
STAGE = stage
include toolkit/Makefile
The STAGE variable carries some info about what stage this particular project is in, which determines which formats get built. Pretty much everything else is handled by the 600-line Makefile in the toolkit. Building some of the output formats can require a long chain of dependencies: processing a source might trigger a target rule, but to get to that target there might be 8–10 intermediate dependencies where various files get generated before the final target can be made.
I've run across a couple situations where I want to completely replace (not just extend) a specific target rule in just one project. The target gets triggered in the middle of a chain of dependencies but I want to do something completely different for that one step.
I've tried just replacing the target in the top level Makefile:
STAGE = stage
%-target.fmt:
commands
include toolkit/Makefile
This is specifically documented as unsupported, but tantalizingly it works some of the time. I've tried changing the order of the custom target and the include, but that doesn't seem to affect it significantly. In case it matters: yes, the use of patterns in targets is important.
Sometimes it is useful to have a makefile that is mostly just like another makefile. You can often use the ‘include’ directive to include one in the other, and add more targets or variable definitions. However, it is invalid for two makefiles to give different recipes for the same target.
Interestingly, if I put custom function definitions in the top-level Makefile below the include, I can override the functions from the toolkit such that $(call do_thing) will use my override:
STAGE = stage
include toolkit/Makefile
define do_thing
commands
endef
However, the same does not seem to be true for targets. I am aware of the double-colon syntax, but I do not want to just extend an existing target with more dependencies; I want to replace the target entirely with a different way of generating the same file.
I've thought about using recursive calls to make, as suggested in the documentation, but then the environment, including the helper functions that are extensively set up in the toolkit Makefile, would not be available to any targets in the top-level Makefile. That would be a show stopper.
Is there any way to make make make an exception for me? Can it be coerced into overriding targets? I'm using exclusively recent versions of GNU Make and am not too concerned about portability. Failing that, is there another conceptual way to accomplish the same ends?
¹ My brain hasn't had enough coffee today. In trying to open Stack Overflow to ask this question I typed makeoverflow.com into my browser and was confused why auto-completion wasn't kicking in.
Updated answer:
If your rule has dependencies, these cannot be overridden by default. In that case $(eval) might save you, like this:
In the toolkit, have a macro definition with your generic rule:
ifndef TARGET_FMT_COMMANDS
define TARGET_FMT_COMMANDS
command1 # note: these commands must be prefixed with a TAB character
command2
endef
endif
define RULE_TEMPLATE
%-target.fmt: $(1)
$$(call TARGET_FMT_COMMANDS)
endef
# define the default dependencies, notice the ?= assignment
TARGET_DEPS?=list dependencies here
# instantiate the template for the default case
$(eval $(call RULE_TEMPLATE,$(TARGET_DEPS)))
Then, in the calling code, just define TARGET_FMT_COMMANDS and TARGET_DEPS before including the toolkit, and this should do the trick.
(please forgive the names of the defines/variables, they are only an example)
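For illustration, the project-level Makefile would then be little more than this (the command and dependency names are placeholders, and the command line inside the define is TAB-indented):

STAGE = stage

define TARGET_FMT_COMMANDS
	my-project-specific-command --output $@
endef

TARGET_DEPS = my-own-prerequisites

include toolkit/Makefile

Because both variables are set before the include, the toolkit's ifndef guard and ?= assignment leave them alone, and the eval'd template picks up the project-specific recipe and dependencies.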
Initial answer:
Well, I'd write this in the toolkit:
define TARGET_FMT_COMMANDS
command1 # note: these commands must be prefixed with a TAB character
command2
endef
%-target.fmt:
$(call TARGET_FMT_COMMANDS)
Then you can simply redefine TARGET_FMT_COMMANDS after include toolkit/Makefile.
The trick is to systematically have the TAB character precede the commands inside the definition; otherwise you get weird errors.
You can also give parameters to the definition, just as usual.
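For example (the replacement command is a placeholder; note the TAB before it):

STAGE = stage
include toolkit/Makefile

define TARGET_FMT_COMMANDS
	my-replacement-command --output $@
endef

This works because the recipe only expands $(call TARGET_FMT_COMMANDS) when it runs, so the later definition wins, which matches what the question observed for $(call do_thing).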
I ran into the same issue. The way I worked around the problem that overriding a target does not override its prerequisites was to override the prerequisites' rules as well, forcing some of them to have empty recipes.
in toolkit/Makefile:
test: depend depend1 depend2
#echo test toolkit
...
in Makefile:
include toolkit/Makefile
depend1 depend2: ;
test: depend
#echo test
Notice how depend1 and depend2 now have empty recipes, so the test target's command is overridden and the dependencies are effectively overridden as well.

How to check for a macro defined in a C file in a Makefile?

I have a #define ONB in a C file which (with several #ifndef...#endif blocks) changes many aspects of a program's behavior. Now I want to change the project Makefile (or, even better, Makefile.am) so that if ONB is defined and some other options are set accordingly, it runs some special commands.
I searched the web, but all I found was checking for environment variables... So is there a way to do this? Or must I change the C code to check environment variables instead? (I would prefer not to change the code, because it is a really big project and I do not know everything about it.)
Questions: My level is insufficient to ask in comments so I will have to ask here:
How and when is the define added to the target in the first place?
Do you essentially want a way to query the binaries after compilation to determine whether a particular define was used?
It would be helpful if you could give a concrete example, i.e. what are the special commands you want run, and what are the .c .h files involved?
Possible solution: Depending on what you need, you could use LLVM tools to generate and examine the AST of your code to see whether a define is used. But this seems a little like over-engineering.
Possible solution: You could also use #include to pull in the .c or header files and have a conditional error generated, or compile them (to a .o); then, if the compile fails, you know whether the macro is defined or not. But this has its own issues depending on how things are set up in your makefile.
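Another, cruder possibility is a purely textual check from the Makefile itself. A rough GNU Make sketch follows; src/onb.c and the extra step name are invented for illustration, and this only sees a literal #define in the file, not one produced conditionally:

# The backslash before '#' stops make from treating the rest of the line as a comment.
ONB_DEFINED := $(shell grep -q '\#define[[:space:]]*ONB' src/onb.c && echo yes)

ifeq ($(ONB_DEFINED),yes)
    # hook the special commands in here, e.g. add an extra target or extra flags
    EXTRA_STEPS += run-onb-special-step
endif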

Binaries for different compilers inside the same GNU Make session, or alternatives

I am having a design problem when using GNU Make.
My problem is the following:
I have 2 executables to compile.
These binaries need to be compiled for each compiler I list in a variable, let us call it COMPILERS.
After compiling the binaries, I need to run all binaries (all of them) several times and generate, for each of them, the times in a text file.
I must put all these files together, and generate a plot out of all that data.
So, for example, if I have 3 compilers to test, I would have 6 binaries, 6 * n_of_times_to_run_benchmark benchmark runs, and a final output with all that data in a single plot file.
The problem with this is that my usual way to approach binary compilation is to use the CXX variable, CXXFLAGS, and so on. But then the CXX variable would have to change within the same make session, which looks inconsistent to me. An example invocation would be:
make plot COMPILERS="clang++ g++"
What I did was to just compile the binaries separately every time I invoke make, once per compiler, making use of the CXX variable.
From a script I create a folder build-clang++ and compile, then create another folder build-g++ and compile, and run all the benchmarks, per folder, for each pair of executables built with the same compiler. But for this I need an external script, and that is what I want to avoid, so that I can later port to Windows more easily without duplicating scripts or installing more dependencies.
What is the best way to handle this:
Use another Makefile that calls this Makefile with different configurations and use it as my "script" for generating the plot? This way the Makefile looks much more traditional to me, with its separate flags, etc.
Or just create all the targets directly inside the same make session?
To me the script solution looks cleaner, because a Makefile is usually written in a way that the compiler is a fixed variable that does not change in the whole session.
Thank you.
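As a rough sketch of the first option, an outer driver makefile could recurse once per compiler; project.mk, the benchmark target and times.txt are invented names, and the recipe lines are TAB-indented:

COMPILERS ?= g++ clang++

.PHONY: plot
plot:
	# build and benchmark once per compiler, each in its own directory
	for cxx in $(COMPILERS); do \
	    mkdir -p build-$$cxx && \
	    $(MAKE) -C build-$$cxx -f $(CURDIR)/project.mk CXX=$$cxx benchmark; \
	done
	# gather the per-compiler timing files into one input for the plotting step
	cat $(foreach c,$(COMPILERS),build-$(c)/times.txt) > all-times.txt

This keeps everything in one top-level make invocation while still giving each compiler its own fixed CXX, at the cost of the recursive calls the question was weighing against a single session.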

How can I build many variants of one project using makefile?

I work on a project which is often built and run on several operating systems and in multiple configurations. I use two compilers, icc and gcc, and multiple sets of arguments for those compilers. That can give me many build variants of one project.
What I would like to do is:
compile project using icc or gcc compiler with one set of arguments
test the performance of the application before and after the new build
compare obtained results
compile project for another set of arguments and repeat previous steps
Does anyone have an idea how to do this nicely using a makefile?
You just need to cascade your make-targets according to your needs:
E.g.:
# Assume that $(CONFIGURATION_FILES) is a list of files, all named *.cfg,
# where you store your set of arguments per step.

# The main target
all: $(CONFIGURATION_FILES)

# Procedure for each configuration file: the ";" gives the pattern rule an
# empty recipe so it stays in effect and pulls in the per-step phony targets
%.cfg: compile_icc compile_gcc test compare ;

.PHONY: compile_icc compile_gcc test compare

compile_icc:
	# DO whatever is necessary

compile_gcc:
	# DO whatever is necessary

test:
	# DO whatever is necessary

compare:
	# DO whatever is necessary
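One way to make that cascade concrete is a sketch like the following, where each .cfg file holds one set of arguments and drives a recursive make per variant; the configs/ layout, project.mk and the benchmark target are assumptions, and the recipe lines are TAB-indented:

CONFIGURATION_FILES := $(wildcard configs/*.cfg)
RESULTS := $(CONFIGURATION_FILES:configs/%.cfg=results/%.txt)

all: $(RESULTS)
	# compare/plot the collected results here

results/%.txt: configs/%.cfg
	mkdir -p build-$* results
	# the benchmark target is assumed to print its timings on stdout
	$(MAKE) -C build-$* -f $(CURDIR)/project.mk \
	    CXXFLAGS="$$(cat $(CURDIR)/$<)" benchmark > $@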
However, for this kind of job I would rather use a build-automation tool... I only know Maven, but for Makefile-based builds other tools may fit better... take a look at the different options, e.g. here:
https://en.wikipedia.org/wiki/List_of_build_automation_software

Resources