cmake, keeping link flags of internal libs - gcc

In src/foo I create a library foo linked with -lwhatever:
add_library(foo foo.cpp)
target_link_libraries(foo -lwhatever)
In src/bar I use foo:
add_executable(bar bar.cpp)
target_link_libraries(bar foo)
It might be due to LTO, but I need to add -lwhatever to the target_link_libraries of bar as well, otherwise I get missing symbols at link time.
target_link_libraries(bar foo -lwhatever)
Can't it be done transparently?

As of CMake 3.0, CMake is missing the INTERFACE_LINK_FLAGS target property, which would make this possible.
The problem here is really that using target_link_libraries to specify linker flags was not the smartest design decision in the first place. A separate command in the spirit of target_compile_options would have been desirable.
I consider this an oversight in CMake. If you have a relevant use case, you might be able to argue for the inclusion of a target_link_flags command in a future CMake version. Feel free to contact the developers' mailing list if this is a major concern for you.
An alternative for now would be to use the LINK_FLAGS target property of foo to specify the link flags. You can then also inspect that property at the point where you specify bar to avoid having to hardcode the options twice, effectively turning it into an INTERFACE_* property manually. But that's about as good as it gets.
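For illustration, here is a minimal sketch of that workaround, reusing the -lwhatever flag from the question; the property read-back is what avoids hardcoding the options twice:

# src/foo: stash the flags on foo as the single point of definition
add_library(foo foo.cpp)
set_target_properties(foo PROPERTIES LINK_FLAGS "-lwhatever")

# src/bar: read the property back instead of repeating -lwhatever
add_executable(bar bar.cpp)
target_link_libraries(bar foo)
get_target_property(foo_link_flags foo LINK_FLAGS)
set_target_properties(bar PROPERTIES LINK_FLAGS "${foo_link_flags}")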

Are there any practical reasons to use `-include` in a Makefile?

I was recently debugging a vague problem that turned out to be caused by a misplaced sub-Makefile which was conditionally included into the main Makefile by the -include directive. Mind the starting minus sign. According to the GNU Make manual:
If you want make to simply ignore a makefile which does not exist or
cannot be remade, with no error message, use the -include directive
instead of include, like this:
-include filenames…
This acts like include in every way except that there is no error (not
even a warning) if any of the filenames (or any prerequisites of any
of the filenames) do not exist or cannot be remade.
For compatibility with some other make implementations, sinclude is
another name for -include.
The nastiest problem with this directive is that no diagnostics whatsoever are given when the sub-Makefile cannot be found. Needless to say, this complicates debugging a lot.
In fact, there was no real need to use it there; a regular include worked just fine and is much more robust. I understand the original author's intention for using -include. That sub-Makefile contained some "secret" stuff that was not meant to be shared with third-party engineers. But this functionality was never used in the end, and it could have been implemented in a more transparent way.
I wonder if there are other practical cases when -include is useful. Maybe some cases when one or several makefiles are dynamically generated during the build process?
Surely, the most useful application of -include is when the include file is auto-generated by make itself.
Remember that all include files also become make targets automatically. So -include generated_file does not make make fail prematurely; instead it implies that generated_file will be (re-)built using other rules in the current Makefile. This can be exploited for automatic dependency generation, for example, as sketched below.
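A minimal sketch of that auto-dependency pattern (file names are assumptions; recipe lines must begin with a tab):

SRCS := foo.c bar.c
OBJS := $(SRCS:.c=.o)

prog: $(OBJS)
	$(CC) -o $@ $(OBJS)

# -MMD writes a .d dependency file next to each object; -MP adds phony
# targets so that deleted headers do not break the build.
%.o: %.c
	$(CC) -MMD -MP -c -o $@ $<

# On the first run the .d files do not exist yet; -include skips them
# silently, and they are regenerated as a side effect of compilation.
-include $(OBJS:.o=.d)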
BTW, another trick with include is that include $(empty_var) also works without errors (i.e. it is a no-op).

how does ld deal with code that is supplied twice (in a source file and in a library)?

Suppose we call
gcc -Dmyflag -lmylib mycode.c
where mylib contains all of mycode but is compiled without -Dmyflag. So all functions and other entities implemented in mycode are available to the linker in two versions. Empirically, I find that the version from mycode is taken. Can I rely on that? Will mycode always take precedence over mylib?
Empirically, I find that the version from mycode is taken.
Read this explanation of how the linker works with archive libraries, and possibly this one.
Can I rely on that?
You should rely on understanding how this works.
If you understood the material in the referenced links, you'll observe that adding main to libmylib.a will invert the answer (and if mycode.c also contains main, you'll get a duplicate symbol definition error).
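A quick way to see the archive rule in action (file and symbol names are assumptions):

gcc -c mycode.c                    # mycode.o defines, say, f()
ar t libmylib.a                    # the archive also holds a member defining f()
gcc -o prog mycode.o -L. -lmylib
# The linker pulls a member out of libmylib.a only to resolve a symbol that
# is still undefined at that point. f() is already defined by mycode.o, so
# the archive's copy is never even loaded.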
If you are using a dynamic library libmylib.so, the rules are different, and the library will always lose to the main binary, although there are many complications, such as LD_PRELOAD, linking the library with -Bsymbolic, and others.
In short, you should prefer to not do this at all.

G++/LD fails: can't find library when library isn't actually needed

I have a program foo I'm trying to compile and link, and I'm running into a chicken-and-egg dilemma.
For reasons I'll explain below, within a given directory I'm forced to link against several libraries we build (let's call them libA and libB) regardless of my target. I know I only actually need libA for my program; so after all the libs and this binary were built, I verified with ldd -u -r foo that libB is an unused direct dependency.
Being unused, I altered the makefiles and flags so that libB is wrapped in -Wl,--as-needed and -Wl,--no-as-needed. I rebuild, use ldd again, and this time it doesn't show any unused deps. So far so good.
Now the fun part: since it's unused, I would expect that if libB is not found/available/built, I should still be able to compile and link foo as long as libA is available (for example, if I did a fresh checkout and only built libA before trying to compile this specific test). But ld errors out with /usr/bin/ld: cannot find -lB.
This suggests that ld needs to locate libB even though it won't use any of the symbols it provides? That doesn't seem to make sense. If all symbolic dependencies are already met, why does it even need to look at this other library? (If it does, that would explain the problem ld has and why this is not possible.)
Is there a way I can say "Hey don't complain if you can't find this library and we shouldn't need to link with it?"
The promised reasons below
For various reasons beyond my control, I have to share makeflags with many other tests in this directory due to the project's makefile hierarchy. There is a two-level makefile for all these tests that says foo is a phony target whose recipe is make -f generictest.mk target=foo, and generictest.mk just says that the source file is $(target).C, that this binary needs to use each library we build, specifies the relative path to our root directory, and then includes the root's generic makefile. The root directory's generic makefile expands all the other stuff out (flags, options, compiler, auto-generation of dependencies through g++, etc.), and, most importantly, for each statement that said "use libX" in generictest.mk, it adds -lX to the flags (or, in my case, wrapped in the as-needed options).
While I'm well aware there are lots of things here that are far from ideal and/or horribly incorrect in terms of makefile best practices, I don't have the authority or the practical ability to change them. And compared to the alternative employed in other folders, where people make individual concrete copies of this makefile for each target, I greatly prefer it: that alternative forces me to edit all of the copies whenever I want to revise our whole make pattern, and yields lots of other typos and problems.
I could certainly create another generictest.mk-like file for some tests and group the tests by actual library needs, but it would be kind of neat if I didn't have to, and could instead say "you don't need all of them; link each one only if you actually use it".
There's no way that the linker can know that the library is not needed. Even after all your "normal" libraries are linked there are still lots and lots of unresolved symbols: symbols for the C runtime library (printf, etc.). The linker has no idea where those are going to come from.
Personally I'd be surprised if the linker didn't complain, even if every single symbol was already resolved. After all, there may be fancy things at work here: weak bindings, etc., which may mean that symbols found later on the link line would be preferred over symbols found earlier (I'm not 100% sure this is possible, but I wouldn't be surprised).
As for your situation, if you know that the library is not needed, can't you just use $(filter-out ...) on the link command line to get rid of it? You'd have to write your own explicit rule for this with your own recipe, rather than using a default one, but at least you could use all the same variables.
Alternatively it MIGHT be possible to play some tricks with target-specific variables. Declare a target-specific variable for that target that resets the variable containing the "bad library" with a value that doesn't contain it (maybe by using $(filter-out ...) as above), and it will override that value for that target only. There are some subtle gotchas with target-specific variables overriding "more general" variables but I think it would work.
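A sketch of both ideas, assuming the link line takes its libraries from a variable like LDLIBS (the variable and library names are assumptions; recipe lines must begin with a tab):

# 1) Explicit rule that filters the unwanted library out of the link line:
foo: foo.o
	$(CC) $(LDFLAGS) -o $@ $^ $(filter-out -lB,$(LDLIBS))

# 2) Target-specific variable: override LDLIBS for this one target only.
foo: LDLIBS := $(filter-out -lB,$(LDLIBS))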

Using -Lpath -lname options vs. providing the library file directly

Recently, I ran into a problem with linking and VPATH, which led to this question as a side effect.
Assume you are implementing a library and you want to link your tests with it. You have two options (that I know of):
Using -L and -l options:
gcc main.o -Lpath/to/lib -lname
Giving the library file directly:
gcc main.o path/to/lib/libname.a
My question is, given the fact that I am linking to my own library under implementation (and not one that is installed, and therefore placed in /usr/lib for example), is there any advantage in choosing either method?
Currently I use the first method, but the second method allows me to solve the problem I had with VPATH. I am particularly interested in knowing whether there are certain caveats with the second method, before making the switch.
The only difference I can think of is that the first method allows the linker to choose a shared library if it exists. The second doesn't, because you explicitly name a .a static archive.
If you were using a shared library there is a difference: if the second form names a shared library without a soname the application would have a DT_NEEDED tag of path/to/lib/libname.so, but for the first form would have the tag with the value libname.so. (If the shared library has a soname that would be used for the DT_NEEDED tag however the path was specified.)
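You can check what each form records by inspecting the binary with readelf from GNU binutils; a sketch, assuming libname.so was built without a soname, as in the first case above:

gcc main.o path/to/lib/libname.so -o app
readelf -d app | grep NEEDED       # records path/to/lib/libname.so

gcc main.o -Lpath/to/lib -lname -o app
readelf -d app | grep NEEDED       # records just libname.so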

Is there a way to strip all functions from an object file that I am not using?

I am trying to save space in my executable and I noticed that several functions are being added into my object files, even though I never call them (the code is from a library).
Is there a way to tell gcc to remove these functions automatically or do I need to remove them manually?
If you are compiling into object files (not executables), then a compiler will never remove any non-static functions, since it's always possible you will link the object file against another object file that will call that function. So your first step should be declaring as many functions as possible static.
Secondly, the only way for a compiler to remove any unused functions would be to statically link your executable. In that case, there is at least the possibility that a program might come along and figure out what functions are used and which ones are not used.
The catch is, I don't believe that gcc actually does this type of cross-module optimization. Your best bet is the -Os flag to optimize for code size, but even then, if you have an object file abc.o which has some unused non-static functions and you link statically against some executable def.exe, I don't believe that gcc will go and strip out the code for the unused functions.
If you truly desperately need this to be done, I think you might have to actually #include the files together so that after the preprocessor pass, it results in a single .c file being compiled. With gcc compiling a single monstrous jumbo source file, you stand the best chance of unused functions being eliminated.
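For instance, a hypothetical jumbo translation unit (file names assumed) would look like:

/* jumbo.c: compile only this file; foo.c and bar.c are not built separately */
#include "foo.c"
#include "bar.c"

Compiling with gcc -Os -c jumbo.c then gives the optimizer visibility into all of the functions at once; this works best when the functions that were previously external can now be declared static.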
Have you looked into calling gcc with -Os (optimize for size)? I'm not sure if it strips unreached code, but it would be simple enough to test. You could also run 'strip' on the executable after building it. I'm sure there's a gcc command-line arg to do the same thing; is it --dead_strip?
In addition to -Os to optimize for size, this link may be of help.
Since I asked this question, GCC 4.5 has been released, which includes an option to combine all files so that it looks like one gigantic source file. Using that option, it is possible to strip out the unused functions easily.
More details here
IIRC the linker by default does what you want in some specific cases. The short of it is that library files contain a bunch of object files, and only referenced object files are linked in. If you can figure out how to get GCC to emit each function into its own object file and then build those into a library, you should get what you are looking for.
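With the GNU toolchain, a comparable effect can be had without splitting the source, using section-based garbage collection; a sketch, with file names assumed:

# Put each function (and data item) into its own section:
gcc -ffunction-sections -fdata-sections -c main.c lib.c

# Ask the linker to discard sections that nothing references:
gcc -Wl,--gc-sections -o prog main.o lib.o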
I only know of one compiler that can actually do this: here (look at the -lib flag)
