Meaning of -DHAVE_CONFIG_H in makefiles

I am starting to learn about makefiles. Looking at the output I see a lot of occurrences of:
g++ -DHAVE_CONFIG_H -I ...
What is -DHAVE_CONFIG_H exactly? What is the function of this compilation option?

All that -DHAVE_CONFIG_H does is define the preprocessor token HAVE_CONFIG_H, exactly as if you had #define HAVE_CONFIG_H at the start of each of your source files.
As to what it's used for, that depends entirely on the rest of your source file (and everything that it includes as well). That's where you should look to work out its effect.
It looks like it means that a header file config.h is available and should be included, in which case you'll probably find the following sequence somewhere in your source files:
#ifdef HAVE_CONFIG_H
#include "config.h"
#endif
which will include the header file when you say it's available. However, that's supposition on my part and by no means guaranteed to be the exact effect; it's just what I would use such a preprocessor symbol for.
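For illustration, a config.h generated by a configure script typically contains feature-test macros along these lines (a hypothetical fragment, not from any particular project):
/* config.h - hypothetical output of a configure script */
#define HAVE_STDLIB_H 1   /* <stdlib.h> exists on this platform */
#define HAVE_STRDUP 1     /* strdup() is available */
A source file can then adapt itself to whatever the configure script found:
#ifdef HAVE_CONFIG_H
#include "config.h"
#endif

#ifndef HAVE_STRDUP
/* fall back to a local replacement when the platform lacks strdup() */
char *my_strdup(const char *s);
#define strdup my_strdup
#endif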


gcc not respecting my include directory order

I'm attempting to build some code using a temporary version of an include file in my local ../include/records directory. The original lives in /home/apps/include/records. I have my gcc command set to search ../include before /home/apps/include, but it's still picking up the original module from /home/apps/include and reporting errors. If I rename the original in /home/apps/include, then gcc picks up my local edited copy and it compiles cleanly. So, what's up with the include order...? This 'local include first' logic has always worked for me in the past, but this may be the first time I've used it since migrating from AIX to Linux.
Is there something beyond the order of the -I command-line options that could be overriding the requested include order?
The source module's include statement is:
#include "records/novarec.h"
and the gcc command line looks like this:
$ make
gcc -DLINUX64 -c -g -I. -I../include -I/home/apps/include -I/home/apps/include/em -I/home/apps/include/odbc -Wno-implicit-function-declaration -Wno-implicit-int -Wno-format-security -Wno-format-truncation -Wno-discarded-qualifiers novaget.c
The compiler complains about an undefined value that's in my local copy of novarec.h, but not in the production /home/apps/include/records/novarec.h:
novaget.c: In function ‘calcComscoreDemoV1’:
novaget.c:2651:15: error: ‘CSCD_W21_49’ undeclared (first use in this function); did you mean ‘CSCD_W25_49’?
fval = *(dm+CSCD_W21_49);
^~~~~~~~~~~
It seems like the answer is this:
My module pulled in two include files, and the first one both lives in /home/apps/include and itself includes the second one. That makes gcc search /home/apps/include for the second include file, even though it is not the first include directory in my path: for a quoted #include, gcc searches the directory of the file containing the directive before any -I directories.
When I reverse the two include statements in my .c file, the correct path is followed for novarec.h. I.e. when I code:
#include "spottvdemos.h" (this module has a #include "records/novarec.h")
#include "records/novarec.h"
novarec.h gets picked up from /home/apps/include, but when I code:
#include "records/novarec.h"
#include "spottvdemos.h"
novarec.h gets picked up from ../include, which is what I wanted.
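That matches gcc's documented search order for quoted includes: the directory of the file containing the #include directive is searched before any -I directories. A sketch with hypothetical paths:
/* Layout (hypothetical):
 *   ../include/records/novarec.h          - edited local copy
 *   /home/apps/include/records/novarec.h  - production copy
 *   /home/apps/include/spottvdemos.h      - itself does #include "records/novarec.h"
 */
#include "spottvdemos.h"   /* found via -I/home/apps/include */
/* While gcc is processing spottvdemos.h, its quoted include of
 * "records/novarec.h" is searched first in spottvdemos.h's own
 * directory, so /home/apps/include/records/novarec.h wins; and
 * assuming novarec.h has the usual include guard, whichever copy
 * is included first is the one whose contents take effect. */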

What does gcc -DPIC do?

What exactly does -DPIC do when compiling using GCC, and when is it really necessary?
I found that -fpic and -fPIC are used to generate Position Independent Code, but I could not find anything about -DPIC.
This is just a preprocessor macro definition. The GCC manual says:
-D NAME
Predefine NAME as a macro, with definition 1.
-D NAME=DEFINITION
The contents of DEFINITION are tokenized and processed as if they
appeared during translation phase three in a #define directive.
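For instance, compiling with gcc -DFOO -DBAR=42 file.c (FOO and BAR being arbitrary illustrative names) behaves as if file.c began with:
#define FOO 1
#define BAR 42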
This might be useful if your source code cares whether it's being compiled as position-independent code. For example:
#ifdef PIC
/* ... */
#endif
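Note that PIC is not something the compiler predefines; build systems (traditionally libtool) pass -DPIC alongside -fPIC. GCC itself does predefine __PIC__ when -fpic/-fPIC is in effect, so a source file may check either; a sketch:
#if defined(PIC) || defined(__PIC__)
/* position-independent build: e.g. avoid absolute-address tricks
 * that would break inside a shared library */
#endif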

Configure clang-check for c++ standard libraries

I am trying to run ALE as my linter, which in turn uses clang-check to lint my code.
$ clang-check FeatureManager.h
Error while trying to load a compilation database:
Could not auto-detect compilation database for file "FeatureManager.h"
No compilation database found in /home/babbleshack/ or any parent directory
json-compilation-database: Error while opening JSON database: No such file or directory
Running without flags.
/home/babbleshack/FeatureManager.h:6:10: fatal error: 'unordered_map' file not found
#include <unordered_map>
^~~~~~~~~~~~~~~
1 error generated.
Error while processing /home/babbleshack/FeatureManager.h.
Whereas compiling with clang++ returns only a warning.
$ clang++ -std=c++11 -Wall FeatureManager.cxx FeatureManager.h
clang-5.0: warning: treating 'c-header' input as 'c++-header' when in C++ mode, this behavior is deprecated [-Wdeprecated]
There are no flags to clang-check allowing me to set compilation flags.
Took a while to figure this out, but you can do
clang-check file.cxx -- -Wall -std=c++11 -x c++
or if you are using clang-tidy
clang-tidy file.cxx -- -Wall -std=c++11 -x c++
To get both working with ALE, I added the following to my vimrc
let g:ale_cpp_clangtidy_options = '-Wall -std=c++11 -x c++'
let g:ale_cpp_clangcheck_options = '-- -Wall -std=c++11 -x c++'
If you want ALE to work for C as well, you will have to do the same for g:ale_c_clangtidy_options and g:ale_c_clangcheck_options.
I was getting stumped by a similar error message for far too long:
/my/project/src/util.h:4:10: error: 'string' file not found [clang-diagnostic-error]
#include <string>
^
I saw other questions suggesting that I was missing some critical package, but everything already seemed to be installed (and my code built just fine; it was only clang-tidy that was getting upset).
Passing -v showed that my .h file was being handled differently:
$ clang-tidy ... src/*.{h,cc} -- ... -v
...
clang-tool ... -main-file-name util.cc ... -internal-isystem /usr/lib/gcc/x86_64-linux-gnu/9/../../../../include/c++/9 ... -x c++ ... /tmp/copy/src/util_test.cc
...
clang-tool ... -main-file-name util.h ... -x c-header /my/project/src/util.h
...
As Kris notes, the key distinction is the -x c-header flag, which appears because clang assumes a .h file contains C, not C++, and this in turn means the system C++ includes weren't being used to process util.h.
But the -main-file-name flag also stood out to me as odd; why would a header file ever be the main file? While digging around I also came across this short but insightful answer pointing out that header files shouldn't be directly compiled in the first place! Using src/*.cc instead of src/*.{h,cc} avoids the problem entirely by never asking Clang to process a .h on its own.
This does introduce one more wrinkle, though. Errors in these header files won't be reported by default, since they're not the files you asked clang-tidy to look at. This is where the "Use -header-filter=.* to display errors from all non-system headers." message clang-tidy prints comes in. If I pass -header-filter=src/.* (to include only my src headers and not any other headers I'm pulling in with -I), I see the expected errors in my header files. Phew!
I'm not sure whether to prefer -x c++ or -header-filter=.* in general. A downside of -header-filter is that you have to tune the filter regex rather than just passing in the files you want to check. On the other hand, processing header files in isolation is essentially wasted work (which I expect would add up quickly in a larger project).

Pre-processing C code with GCC

I have some C source files that need to be pre-processed so that I can use another application to add Code Coverage instrumentation code in my file.
To do so, I use GCC (I'm using this on a LEON2 processor so it's a bit modified but it's essentially GCC 3.4.4) with the following command line:
sparc-elf-gcc -DUNIT_TEST -I. ../Tested_Code/0_BSW/PKG_CMD/MEMORY.c -E > MEMORY.i
With a standard file this works perfectly, but in this one the programmer used an #ifndef UNIT_TEST clause, and no matter what I do that code won't be pre-processed the way I expect. I don't understand why, since I'm passing -DUNIT_TEST to GCC, explicitly defining it.
Does anyone have any clue what could cause this? I checked the resulting .i file and as expected my UNIT_TEST code is not present in it so I get an error when instrumenting it.
Anything wrapped in an #ifndef is only kept if the symbol is NOT defined, so you need to remove that definition to get the code that is inside that block. GCC cannot keep the #ifdef and #ifndef blocks in its preprocessed output: the preprocessor resolves every conditional at preprocessing time, according to which symbols are or aren't defined, and only the surviving branches appear in the .i file.
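A minimal illustration (hypothetical file contents):
/* demo.c */
#ifndef UNIT_TEST
int production_code;   /* survives plain: gcc -E demo.c */
#else
int unit_test_code;    /* survives: gcc -DUNIT_TEST -E demo.c */
#endif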

compiler directive for compiling on different platforms

I am compiling a demo project.
The project is written for Windows and Linux. I have written a Makefile; however, I am not sure how to specify the platform the compiler will be compiling on.
I will be compiling on Linux.
In my source file I have this:
#if defined(WIN32)
#include ...
#include ...
#elif defined(LINUX)
#include ...
#include ..
#else
#error "OS not supported"
#endif
My simple Makefile is below, and when I compile I get the error "OS not supported".
How can I add the directive so that it will compile the #elif defined(LINUX) branch?
LIBS_PATH = -L/usr/norton/lib
INC_PATH = -I/usr/norton/include
LIBS = -lntr
app: *.cpp *.h Makefile
	g++ $(LIBS_PATH) $(INC_PATH) *.cpp -o app
Many thanks for any suggestions,
Decide which is going to be your default platform - say LINUX.
LIBS_PATH = -L/usr/norton/lib
INC_PATH = -I/usr/norton/include
LIBS = -lntr
PLATFORM = -DLINUX
CXX = g++
app: *.cpp *.h Makefile
	${CXX} ${CFLAGS} ${PLATFORM} ${INC_PATH} *.cpp -o $@ ${LIBS_PATH} ${LIBS}
You can use round brackets in place of braces. This uses a macro for the C++ compiler, allows you to add other flags via CFLAGS (though that is also usually set by make), and adds a platform define, the include path, the library path and the actual library to the compile line.
Note that your rule forces a complete recompilation of everything every time anything changes. This is 'safe' but not necessarily efficient. Note too that wild-cards are dangerous - more so for the sources than the headers. You may end up including that backup copy of a file in the build (old-reader.cpp when you only wanted reader.cpp in there). More conventionally, you list the object files needed for the program, so that each object file can be rebuilt individually when needed and the results linked together. If you get your dependencies correct (a moderately big 'if'), then there's no problem. If you get them wrong, you can end up with inconsistent programs.
However, if the difference is between a 5 second recompile and a 5 minute recompile, you should probably take the 5 minute recompilation (as shown) and answer another SO question while waiting.
To compile on Linux (64-bit):
make CFLAGS="-m64"
To compile on Linux (32-bit):
make CFLAGS="-m32"
To compile on Windows 64:
make PLATFORM=-DWIN64
To compile on Windows 32:
make PLATFORM=-DWIN32
Etc.
You can add -DLINUX=1 when compiling.
Also, if you run:
echo "" | cpp -dD
You can see the list of macros the preprocessor defines by default. On Linux, there will always be a:
#define __linux__ 1
in the output. So if you test for that predefined macro instead of your own LINUX, you don't need to pass anything special. I.e.:
...
#elif defined(__linux__)
...
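So a portable version of the block from the question might look like this, relying only on predefined macros (_WIN32 is predefined by compilers targeting Windows, __linux__ by gcc on Linux):
#if defined(_WIN32)
#include <windows.h>
#elif defined(__linux__)
#include <unistd.h>
#else
#error "OS not supported"
#endif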
As for the Makefile itself, I would do something like:
CXX=g++
CPPFLAGS = -I/usr/norton/include
LDFLAGS = -L/usr/norton/lib -lntr
OBJ_FILES = one.o two.o
app: $(OBJ_FILES) Makefile
	$(CXX) $(OBJ_FILES) $(LDFLAGS) -o app
one.o: one.cpp one.h
two.o: two.cpp two.h
So the implicit rules are used to compile each .cpp into its .o, and only the link step needs an explicit command.
