There is a problem in the library I am using: all include paths are relative.
For example, the file path1/path2/file.h contains #include "interface.h", but interface.h is actually located at anotherpath/anotherpath2/interface.h.
Is there any way to force the linker to look for includes in different directories?
The linker couldn't care less about header files. It's the compiler you are looking for. (To be really nitpicky, it's the preprocessor. ;-) )
CMake has the include_directories() command:
include_directories( "anotherpath/anotherpath2" )
This, in ./CMakeLists.txt, would make #include "interface.h" possible.
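For context, a minimal CMakeLists.txt sketch (the project and target names are invented; the paths are the ones from the question):

cmake_minimum_required(VERSION 3.5)
project(example C)

# Make headers in anotherpath/anotherpath2 visible to plain #include "..."
include_directories("anotherpath/anotherpath2")

add_executable(app path1/path2/file.c)

On newer CMake versions, target_include_directories(app PRIVATE "anotherpath/anotherpath2") is generally preferred, since it scopes the search path to a single target rather than the whole directory.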
But is that what you really want? Usually, directories are used to segregate modules. #include "anotherpath/anotherpath2/interface.h" would send a much clearer message as to what is actually included here, and where a human could find that header file to look up its declarations. Perhaps a refactoring of your include statements would be better than adding lots of include directories to the CMake configuration...
Generally speaking, your question gives very little context, so it's hard to give advice.
I have a #define ONB in a C file which (with several #ifndef...#endif blocks) changes many aspects of a program's behavior. Now I want to change the project makefile (or even better, Makefile.am) so that if ONB is defined and some other options are set accordingly, it runs some special commands.
I searched the web, but all I found was checking for environment variables... So is there a way to do this? Or must I change the C code to check environment variables? (I would prefer not changing the code, because it is a really big project and I do not know everything about it.)
Questions: My reputation level is insufficient to ask in comments, so I will have to ask here:
How and when is the define added to the target in the first place?
Do you essentially want a way to query the binaries after compilation, to determine whether a particular define was used?
It would be helpful if you could give a concrete example, i.e. what are the special commands you want run, and what are the .c .h files involved?
Possible solution: Depending on what you need, you could use LLVM tools to generate and examine the AST of your code to see if a define is used. But this seems a little like over-engineering.
Possible solution: You could also use #include to pull in .c or header files and have a conditional #error be generated, or compile (to a .o); then, if the compile fails, you know whether it is defined. But this has its own issues, depending on how things are set up in your makefile.
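As a sketch of that compile-probe idea (the included file name here is hypothetical; ONB is the macro from the question):

/* onb_probe.c: compiling this fails if, and only if, ONB is defined. */
#include "file_that_may_define_onb.h"   /* hypothetical: whatever pulls in the define */

#ifdef ONB
#error ONB is defined
#endif

A makefile rule can then branch on the probe's exit status: if $(CC) -c onb_probe.c succeeds, ONB was not defined and the special commands can be skipped.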
I am investigating using precompiled headers to reduce our compile times.
I have read the documentation on the subject here: https://gcc.gnu.org/onlinedocs/gcc/Precompiled-Headers.html, where I read the following:
Only one precompiled header can be used in a particular compilation.
On the project whose build time I would like to improve, there are often very long lists of includes. The above leads me to think that to get the most performance improvements, I would have to make a collection of common includes, put them into a single header file, compile and include that header file.
On the other hand, I prefer to list the dependencies of a particular file explicitly, so I would be inclined to include the precompiled header first, followed by the manual list of actual header files.
I have two questions related to this:
Is my analysis and approach correct? Have I interpreted the statement correctly?
Doing this, I will use this file (say stdafx.h) in many places, thereby including files I don't need. However, I would like to explicitly list my dependencies for code-documentation purposes.
Were I to do something like the following:
#ifdef USE_PRECOMPILED_HEADERS
#include "stdafx.h"
#else
#include "dep1.h"
#include "dep2.h"
#endif
I could periodically run a build without precompiled headers to check that all my dependencies are listed. This is a bit clunky, however. Does anyone have a better solution?
If anyone has information to help us obtain better results in our investigation, I am happy to hear it.
Yes, your observation is absolutely fine!
You "would have to make a collection of common includes, put them into a single Header file, compile and include that Header file". This common header file is generally named as stdafx.h (although you can name it anything you want!)
I am afraid I don't really understand this part of the question.
EDIT:
Do you also want the standard headers (like iostream, map, vector, etc.) to be included as dependencies in the code documentation?
Generally, the answer must be NO. Hence, you should include in stdafx.h only those header files which are not under your control (i.e., [1] standard language includes, [2] includes from dependent modules (mostly exposed interface headers)). All remaining includes (whose source is in the current project/module) should be explicitly included in each header file wherever required, and not put in the precompiled stdafx.h.
The above leads me to think that to get the most performance
improvements, I would have to make a collection of common
includes, put them into a single header file, compile and
include that header file.
Yes, this observation is correct: You put most (all?) includes in one single header file, which is then precompiled.
Which, in turn, means that...
any compilation without the aid of that header being precompiled will take ages;
you are relying on naming conventions or other means (documentation?) to make the information link between things referenced in your individual translation unit and their declaration.
I don't much like precompiled headers for those reasons...
I'm a Java developer learning C++. I'm using Eclipse as my IDE and MinGW as my toolset. Is it considered best practice to list every single object in a makefile? Also, is it just as acceptable to use wildcards to include all the files?
The use of wildcards is common, and accepted, but not really good practice.
If extra source files get into your source directories, they could wind up causing conflicts or -- worse -- riding silently in your libraries as useless baggage (introns?). Also, if a needed source file goes missing, your linker will complain about a missing {function|typename|whatever} and it might not be obvious what file has been lost (not really a problem if you have good source control, but still annoying). Finally, if your build system is expected to produce different targets using different subsets of the source files, wildcards will require you to either divide your source directories Venn-diagram-style, or resort to filename conventions that do the same thing (gah!).
Maintaining explicit lists of object files in a makefile really isn't that hard to do, and it keeps things simple.
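A small sketch of the explicit style (file and target names invented):

# Every object is listed deliberately; a new source file only
# enters the build when someone adds it here.
OBJS = main.o parser.o util.o

app: $(OBJS)
	$(CXX) $(CXXFLAGS) -o $@ $(OBJS)

%.o: %.cpp
	$(CXX) $(CXXFLAGS) -c $< -o $@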
In what order should the include statements in a header file and source file come in C++? Should #include <> come before #include "", or the other way around?
Also, should the header file corresponding to a source file precede all other include statements in that source file?
I prefer to include in this order:
Standard libraries first.
Then third-party libraries.
Lastly, headers that I have written myself.
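In code, that order would look like this (file names invented):

// widget.cpp
#include <map>                     // standard libraries first
#include <string>

#include <boost/lexical_cast.hpp>  // then third-party libraries

#include "widget.h"                // finally my own headers
#include "util.h"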
A general rule of thumb is to include headers in an order that maximizes the chance of detecting that one of your own headers fails to itself include all that it needs, i.e. include your own headers first. But since it's impossible to do that for all headers that you include, this is just a kind of vague guideline that doesn't hurt and might do some good.
When you have many headers, try to be a bit more systematic.
Like, group them by what they achieve (like [windows.h] followed by some MS header that requires [windows.h]), and/or alphabetically.
In the end, just don't use too much time on this. :-)
Cheers & hth.,
There is no better or worse; they serve different purposes. #include "" is supposed to be used for files in your project or direct dependencies that are not installed system-wide, whereas #include <> is for includes that (e.g. under Linux) are located in your /usr/include or similar folder, also called system libraries.
Just follow the project's existing conventions, if it has any for #include directives. If it doesn't, it doesn't really matter what you do as long as you're consistent.
This matters about as much as whether you put opening curly braces on their own line. I would suggest that you pick whichever one you like better, and be consistent.
I need to override certain macro definitions with my own header file. And I am not allowed to change the source code. And I have to use gcc, but if anyone is aware of something similar in any other compiler, that would also help.
Here is what I exactly need:
Let's say I have a code base with a lot of .c files. These .c files include .h files. After all the .h files have been included for each file, I want the compiler to behave as if there were another extra.h file, which I want to specify when invoking the compiler. What I do in that .h file is #undef some macros and redefine them the way I want.
Note: I am aware of the --preinclude option in gcc, but with --preinclude my extra.h gets overridden by the .h files of the original source code. What I need is some kind of post-include option.
Unless you uniformly have one specific header that is always included last in the source files, this is going to be tricky.
I think the way I'd approach it, if I had to, would be:
Create a new directory, call it headers.
Put in there suitable dummy headers with the same name as the regular headers, which would contain #include "extra.h" at the end (or possibly #include <extra.h>, but I would try to avoid that).
The dummy headers would also include the original files by some mechanism, possibly even using #include "/usr/include/header.h" but preferably some other technique - such as #include "include/header.h".
The extra.h header would always redefine all its macros - it would not have the normal #ifndef EXTRA_H_INCLUDED / #define EXTRA_H_INCLUDED / #endif multiple inclusion guards, so that each time it is included, it would redefine the relevant macros.
Consequently, extra.h cannot define any types. (Or, more precisely, if it does, those must be protected against multiple definition by multiple include guards; the key point is that the macros must be defined each time the file is included - a bit like <assert.h>.)
Each redefined macro would be explicitly protected by #undef REDEFINED_MACRO and then #define REDEFINED_MACRO ....
There is no point in testing whether the macro is defined before undefining it.
The build process would be modified to look in the headers directory before looking anywhere else. The compiler option would be -I./headers or something similar, depending on exactly where you locate the headers directory.
Depending on how you have decided to locate the normal versions of the headers, you might need another -I option (such as -I/usr if you've used #include "include/header.h" notation) to locate the standard headers again.
The upshot is that your private headers get used directly by the compiler, but they include the standard headers and then your extra.h header - thus achieving what you wanted without modifying the C source or the normal headers.
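A tiny sketch of the mechanism (all file names and macros here are invented):

/* headers/config.h: shadows the real config.h thanks to -I./headers */
#include "include/config.h"   /* the original, found via the second -I option */
#include "extra.h"            /* the overrides, applied last */

/* extra.h: deliberately has no include guard, so the
   redefinition takes effect on every inclusion. */
#undef BUFFER_SIZE
#define BUFFER_SIZE 4096      /* replaces the original definition */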
But there is something misguided about the whole attempt... you would be better off not trying this.
A makefile could be used to redefine the macros through the -U and -D compiler (gcc) options. But why redefine them after the originals are evaluated? I cannot think of a need for such a thing. Can you tell us what you are hoping to achieve through this?
The requirement is to insert extra.h after all the other .h files in a .c file. So adding it at the end of each .h file would insert it between two .h files included in sequence inside a .c file, which is not the intention.
You can use sed/awk inside makefile(s) to:
- first create duplicate .c files, inserting '#include "extra.h"' after the other #include lines inside each of the .c files (it will be tedious/tricky to resolve #ifdef blocks inside the .c files)
- then achieve your target by compiling those duplicate .c files
- finally delete the duplicate .c files
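For the first step, a two-pass awk sketch that appends #include "extra.h" after the last #include line of a source file (file names invented; as noted above, it does not account for #ifdef blocks):

awk 'FNR==NR { if (/^#include/) last = FNR; next }
     { print; if (FNR == last) print "#include \"extra.h\"" }' \
    file.c file.c > file_dup.c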
You can use the -include file option of GCC, because of this feature:
If multiple -include options are given, the files are included in the order they appear on the command line.
So, as I understand it, you must include ALL the *.h files from the command line - just keep your extra.h as the last header in the -include option list and you should get what you want.
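For example (header names invented):

gcc -include first.h -include second.h -include extra.h -c source.c

Each -include behaves as if #include "file" were the first line of source.c, processed in command-line order, so extra.h is the last of the three to be seen.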
There are two ways I can think of doing this according to your requirements, and both should be relatively simple, I hope.
The first way does not touch the source code at all, however it requires that each header file you are #undef'ing things from has a header guard. You can copy and concatenate every header file that you need to "change" things in into one monolithic file, your "extra.h" file. Then at the end of that file, go ahead and redefine all the macros you need. Then include this file when you compile. The header guards will prevent the original headers from being included. Obviously, there are a number of potential problems with this approach, and it certainly wouldn't work in general.
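A sketch of that monolithic file (guard names and macros invented):

/* extra.h: verbatim copies of the original headers, guards intact */
#ifndef ORIGINAL_H          /* guard copied from original.h */
#define ORIGINAL_H
/* ... copied contents of original.h ... */
#endif

/* the redefinitions come last */
#undef MAX_CLIENTS
#define MAX_CLIENTS 64      /* hypothetical override */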
The second way is a lot cleaner and more reliable, but it requires you to edit the code directly, albeit non-intrusively. For each header you need to redefine things in, make a copy of that header with an ".orig" suffix or something, then edit the actual header file directly. After you are all done doing whatever you are doing, then just copy all the ".orig" files back into the actual headers before your customers obtain the code. I assume your requirements aren't so draconian that you can't change the code even temporarily.
If none of that works, then I doubt you are going to find an effective answer from anybody short of hacking GCC directly and adding a "-postinclude" option yourself.