GNU autotools include-file dependency tracking (.deps directories) for CUDA

I am using GNU autotools to build a CUDA project. CUDA files are regular C++ files as far as the preprocessor is concerned; however, they use the .cu extension and must be compiled with nvcc, NVIDIA's g++-based compiler. This breaks the regular dependency tracking: the .deps directories are not populated, which means that if a .cu file includes another file, changes to the included file do not trigger recompilation of the .cu file.
How can I modify my Makefile.am/configure.ac to enable dependency tracking for .cu files?
Thanks

Try writing an implicit ".cu.cc" rule for generating C++ files from the CUDA files. Automake should then be able to track the dependencies of the .cc files, which should reflect back on the CUDA files.
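A minimal sketch of what such a rule might look like in Makefile.am, assuming configure.ac defines an NVCC variable (e.g. via AC_PATH_PROG([NVCC], [nvcc])); the exact nvcc flags may need adjusting for your setup:
SUFFIXES = .cu .cc
# Translate each CUDA source to C++ with nvcc; automake's normal C++
# rules (including dependency tracking) then apply to the generated .cc file.
.cu.cc:
	$(NVCC) --cuda -o $@ $<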

Related

Is there any way for LLVM to generate a .bc file while generating the .o file?

I am trying to build the Linux kernel using clang/LLVM, and I want to save the .bc file while generating the .o file. I found that LLVM has the API WriteBitcodeToFile, which can save the bitcode to a given file, but I am not sure how to use it.
There are a number of flags that can do that for you:
-flto enables link-time optimization, which uses LLVM bitcode. In this case (almost) all the .o files will in fact contain bitcode.
-save-temps tells clang to put the results of each intermediate phase into a separate file. A simple clang -save-temps main.c may output main.o, main.bc, main.i, and main.s, that is, the object file, bitcode file, preprocessed file, and assembly file, respectively.
-fembed-bitcode tells clang to include the bitcode representation of a file into the resulting object file. You can learn more about this here: https://jonasdevlieghere.com/libebc-ebcutil/
Note, however, that you won't get bitcode for assembly files.
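For example (a hedged sketch; the exact set of output files depends on your clang version and target):
clang -c -save-temps main.c      # keeps main.i, main.bc and main.s alongside main.o
clang -c -fembed-bitcode main.c  # embeds the bitcode in a section of main.o
clang -flto -c main.c            # main.o is itself an LLVM bitcode file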

Why does the OMNeT++ compiler get errors for a precompiled package?

I have included an external package called SoPlex (a folder of .cpp and .h files plus the library files) in my OMNeT++ project. I have already tested the package in the Code::Blocks IDE and it works fine apart from some warnings: warning: explicit conversion operators only available with -std=c++11 or -std=gnu++11.
It was certainly working in the Code::Blocks IDE, but when I use it in my OMNeT++ project it produces a lot of errors for the SoPlex package.
The errors are only in the SoPlex code, not in my OMNeT++ project code.
Any idea what may cause the problem?
I used MinGW to compile the SoPlex package in the Code::Blocks IDE. When I select MinGW GCC as the current toolchain in OMNeT++ instead of GCC for OMNeT++, I get this error: fatal error: omnetpp.h: No such file or directory.
Regarding the errors with the third-party library: depending on where you put the library inside the src folder, at least that directory must be added as an include dir, otherwise the header files will not be found by the compiler.
As for the problem with omnetpp.h: OMNeT++ has its own makefile generator, which automatically adds the required include folder (omnetpp_root/include). The generic MinGW GCC toolchain does not. If you want to avoid extra work, always use the OMNeT++ toolchain to build your models.
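If you generate the makefiles from the command line with opp_makemake, one hedged way to add that include directory (the SoPlex path below is just a placeholder) is:
opp_makemake -f --deep -I./src/soplex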

cmake + qt + visual studio: moc objects on build

I am using CMake + Qt + Visual Studio to work on a project. The problem I am having is that I would like Visual Studio to create new moc objects when I modify the Qt .ui files. If I do a full build everything works fine, but if I just modify something in a .ui file it does not "auto moc" and I have to rebuild the whole project.
The cmake file I have is pretty simple:
cmake_minimum_required(VERSION 3.2)
set(CMAKE_VERBOSE_MAKEFILE ON)
project(main)
set(CMAKE_INCLUDE_CURRENT_DIR ON)
set(CMAKE_AUTOMOC ON)
set(CMAKE_AUTOUIC ON)
find_package(Qt5Widgets)
file(GLOB CPP_FILES *.cpp)
add_executable(main ${CPP_FILES})
target_link_libraries(main Qt5::Widgets)
target_compile_features(main PUBLIC cxx_nullptr)
Does anyone know a way to get this to work (having Visual Studio detect .ui file modifications and "auto moc" the modified .ui file)?
Start by replacing your file(GLOB ...) with explicitly listing out the files you want to include if you want proper dependency handling. This will also ensure the build is creating dependencies for the set of files you are expecting it to. This answer has more details about why you probably want to do this, aside from the reasons below.
The CMake documentation for AUTOUIC includes this statement:
If a preprocessor #include directive is found which matches ui_<basename>.h, and a <basename>.ui file exists, then uic will be executed to generate the appropriate file.
Can you confirm that your .cpp sources have #include directives that follow this pattern? In your file(GLOB ...) you are only capturing the .cpp files and not the .h files, so if you've only got the #include directives in the headers, AUTOUIC may not pick them up properly. It's been a while since I've used this and I can't recall if AUTOUIC would still find them if you only list the .cpp files and not the headers too in your add_executable() call, but it's something for you to try. You also may be facing a similar situation with AUTOMOC if you have headers which use the Q_OBJECT and Q_GADGET macros. So just explicitly list out your .cpp and .h files you give to add_executable() and see if that addresses your problem.
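A minimal sketch of that change (the extra file names are placeholders for your actual sources and headers):
add_executable(main
    main.cpp
    mainwindow.cpp
    mainwindow.h    # listing headers explicitly lets AUTOMOC/AUTOUIC scan them too
)
target_link_libraries(main Qt5::Widgets)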

Assembler used by golang when building with and without cgo

Let's say I have a golang package, which contains some assembly code:
demopkg/
source1.go
source2.go
asm_amd64.s
If I try to build it using go build, the toolchain will use go tool asm to assemble the *.s files.
But if I add cgo to the mixture, by putting a single import "C" into any of the sources, go will switch to the gcc assembler.
I can see this by executing go build -n: the calls to /usr/local/go/pkg/tool/linux_amd64/asm from the first case are replaced by calls to gcc. Besides that, it starts complaining about broken syntax.
Is this behaviour documented, so that I can rely on it when maintaining my package? Can I force go build to use one specific assembler?
Yes, it's in the cgo documentation:
When the Go tool sees that one or more Go files use the special import "C", it will look for other non-Go files in the directory and compile them as part of the Go package. Any .c, .s, or .S files will be compiled with the C compiler. Any .cc, .cpp, or .cxx files will be compiled with the C++ compiler. Any .h, .hh, .hpp, or .hxx files will not be compiled separately, but, if these header files are changed, the C and C++ files will be recompiled. The default C and C++ compilers may be changed by the CC and CXX environment variables, respectively; those environment variables may include command line options.

What is the difference between make and gcc?

The last sentence in the article caught my eye
[F]or C/C++ developers and students interested in learning to program in C/C++ rather than users of Linux. This is because the compiling of source code is made simple in GNU/Linux by the use of the 'make' command.
I have always used gcc to compile my C/C++ programs and javac to compile my Java programs. I have only used make to install programs to my computer via configure / make / make install.
It seems that you can apparently compile all your programs with the make command.
What is the difference between make and gcc?
Well ... gcc is a compiler, make is a tool to help build programs. The difference is huge. You can never build a program purely using make; it's not a compiler. What make does is introduce a separate file of "rules" that describes how to go from source code to a finished program. It then interprets this file, figures out what needs to be compiled, and calls gcc for you. This is very useful for larger projects, with hundreds or thousands of source files, and for keeping track of things like compiler options, include paths, and so on.
gcc compiles and/or links a single file. It supports multiple languages, but does not know how to combine several source files into a non-trivial, running program; you will usually need at least two invocations of gcc (compile and link) to create even the simplest of programs.
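For instance, even a two-file program needs separate compile steps plus a link step when driving gcc by hand (the file names here are just for illustration):
gcc -c main.c -o main.o      # compile main.c to an object file
gcc -c util.c -o util.o      # compile util.c to an object file
gcc main.o util.o -o myprog  # link the objects into an executable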
The Wikipedia page on GCC describes it as a "compiler system":
The GNU Compiler Collection (usually shortened to GCC) is a compiler system produced by the GNU Project supporting various programming languages.
make is a "build tool" that invokes the compiler (which could be gcc) in a particular sequence to compile multiple sources and link them together. It also tracks dependencies between the source files and the object files that result from compiling them, and only rebuilds the components that have changed since the last build.
GNU make is one popular implementation of make. Its description reads as follows:
Make is a tool which controls the generation of executables and other non-source files of a program from the program's source files.
Make gets its knowledge of how to build your program from a file called the makefile, which lists each of the non-source files and how to compute it from other files.
gcc is a C compiler: it takes a C source file and creates machine code, either in the form of unlinked object files or as an actual executable program, which has been linked to all object modules and libraries.
make is useful for controlling the build process of a project. A typical C program consists of several modules (.c) and header files (.h). It would be time-consuming to always compile everything after you change anything, so make is designed to only compile the parts that need to be re-compiled after a change.
It does this by following rules created by the programmer. For example:
foo.o: foo.c foo.h
	cc -c foo.c
This rule tells make that the file foo.o depends on the files foo.c and foo.h, and that if either of them changes, foo.o can be rebuilt by running the command on the second line. (Note that make requires the command lines in a rule to be indented with a TAB character.)
make reads its rules from a file that is usually called a Makefile. Since these files are (traditionally) written by hand, make has a lot of magic to let you shorten the rules. For example, it knows that a foo.o can be built from a foo.c, and it knows what the command to do so is. Thus, the above rule could be shortened to this:
foo.o: foo.h
A small program consisting of three modules might have a Makefile like this:
mycmd: main.o foo.o bar.o
	$(CC) $(LDFLAGS) -o mycmd main.o foo.o bar.o
foo.o: foo.h bar.h
bar.o: bar.h
make can do more than just compile programs. A typical Makefile will have a rule to clean out unwanted files:
clean:
	rm -f *.o core myapp
Another rule might run tests:
check: myapp
	./myapp < test.input > test.output
	diff -u test.correct test.output
A Makefile might "build" documentation: run a tool to convert documentation from some markup language to HTML and PDF, for example.
A Makefile might have an install rule to copy the binary program it builds to wherever the user or system administrator wants it installed.
And so on. Since make is generic and powerful, it is typically used to automate the whole process from unpacking a source tarball to the point where the software is ready to be used by the user.
There is a whole lot to learn about make if you want to learn it fully. The GNU version of make has particularly good documentation: http://www.gnu.org/software/make/manual/ has it in various forms.
Make often uses gcc to compile a multitude of C or C++ files.
Make is a tool for building any complex system where there are dependencies between the various system components, by doing the minimal amount of work necessary.
If you want to find out all the things make can be used for, the GNU make manual is excellent.
make uses a Makefile in the current directory to apply a set of rules to its input arguments. make also knows some default rules, so it can run even if it doesn't find a Makefile (or similar) in the current directory. The rule executed for .cpp files happens to call gcc on many systems.
Notice that you don't call make with the input file names but rather with rule names, which reflect the output. So calling make xyz will try to execute rule xyz, which by default builds a file xyz (for example, from a source file xyz.cpp).
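As a hedged illustration of those built-in rules (details vary with your make implementation and variables such as CXX): with only xyz.cpp in the directory and no Makefile at all, GNU make can usually build the program from its implicit rules alone.
# no Makefile present, only xyz.cpp
make xyz    # GNU make's implicit rule runs roughly: g++ $(CXXFLAGS) $(LDFLAGS) xyz.cpp -o xyz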
gcc is a compiler like javac. You give it source files, it gives you a program.
make is a build tool. It takes a file that describes how to build the files in your project based on dependencies between files, so when you change one source file, you don't have to rebuild everything (like if you used a build script). make usually uses gcc to actually compile source files.
make is essentially an expert system for building code. You set up rules for how things are built, and what they depend on. Make can then look at the timestamps on all your files and figure out exactly what needs to be rebuilt at any time.
gcc is the "GNU Compiler Collection". It supports many languages (C, C++, Ada, etc., depending on your setup), but still it is just one tool out of many that make may use to build your system.
You can use make to compile your C and C++ programs by calling gcc or g++ in your makefile to do all the compilation and linking steps, allowing you to do all these steps with one simple command. It is not a replacement for the compiler.
'gcc' is the compiler - the program that actually turns the source code into an executable. You have to tell it where the source code is, what to output, and various other things like libraries and options.
'make' is more like a scripting language for compiling programs. It's a way to hide all the details of compiling your source (all those arguments you have to pass the compiler). You script all of the above details once in the Makefile, so you don't have to type them every time for every file. It will also do nifty things like only recompiling source files that have been updated, and handling dependencies (if I recompile this file, I will then need to recompile THAT file).
The biggest difference is that make is Turing complete (Are makefiles Turing complete?) while gcc is not.
Let's take the gcc compiler, for example.
It only knows how to compile a given .cpp file into a .o file, given the files needed for compilation to succeed (i.e. dependencies such as .h files).
However, those dependencies form a graph: for example, b.o might require a.o to have been built first, which means a.o needs to be compiled beforehand.
Do you, as a programmer, want to keep track of all those dependencies and run them in order for your target .o file to build?
Of course not. You want something to do that task for you.
Those are build tools: tools that make the build process (i.e. building artifacts like .o files) easier. One such tool is make.
I hope that clarifies the difference :)
