Assume the directory of a project contains subdirectories such as src, bin, lib, doc, etc. Where do I put the makefile of a project?
For example, some projects put their makefile in the src/ subdirectory of the project root, while others put it in the root directory of the project itself.
The second way feels more logically organized to me. Can you provide cases where it is better to put the makefile in the root directory, in src/, or in some other directory, and for what reasons?
Most of the question is opinion-based, but the last part much less so:
Can you provide cases where it is better to put the makefile in the root directory, in src/, or in some other directory, and for what reasons?
First, that directory might not be called src/; other names are common.
Sometimes the Makefile is itself generated (e.g. by cmake or autoconf) and its generator requires some specific file tree organization.
A common reason to put all sources in src/ is cross-compilation, or compilation for different styles of targets (perhaps a debug build and an optimized one). Then you would put all source code in src/ and ensure that make (and gcc) put object files not in that src/ directory but in some other one (e.g. obj-x86 for x86 object files, obj-arm for ARM object files, etc.), perhaps specific not only to the ISA but also to the ABI. So you could end up putting object files and executables in a long-named directory such as obj-x86_64-linux-optimized or obj-PowerPC-AIX-debug. BTW, the src/ directory could even be shared between several machines (e.g. NFS-mounted) and read-only (e.g. shared by several users).
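For instance, here is a minimal sketch of such a Makefile (the OBJDIR naming scheme and the prog target are hypothetical, and recipe lines must start with a tab):

# Derive a per-target object directory; sources stay in src/
OBJDIR := obj-$(shell uname -m)-$(shell uname -s)-debug
SRCS   := $(wildcard src/*.c)
OBJS   := $(patsubst src/%.c,$(OBJDIR)/%.o,$(SRCS))

all: $(OBJDIR)/prog
.PHONY: all

# Link the executable inside the object directory, not in src/
$(OBJDIR)/prog: $(OBJS)
	$(CC) -o $@ $(OBJS)

# Compile each src/%.c into $(OBJDIR)/%.o; the order-only
# prerequisite ensures the object directory exists first
$(OBJDIR)/%.o: src/%.c | $(OBJDIR)
	$(CC) $(CFLAGS) -c $< -o $@

$(OBJDIR):
	mkdir -p $@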
Notice that GCC itself is required to be built outside of its source tree.
Also, source code means somewhat different things to make (actually, to compilers like GCC) and to developers.
The social definition (used by developers) of source code is: the preferred form of a program on which human developers work (you'll find that definition expressed clearly by the free software movement).
For a compiler like GCC or Clang, the source code is simply the translation unit given as input to the compiler (that is, the .c and .h files processed as input by the gcc compiler). In many (but not all) cases, such C files are genuine source code, because human developers write them.
In some cases, C or C++ "source" files (as seen by gcc, g++, clang, or clang++) are generated; for developers they are then no longer source code, but for gcc they are still an input source. A classical example is, of course, C files generated by bison. See this answer of mine for a general discussion of that idea.
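For instance, here is a sketch of such a rule (with a hypothetical grammar file parser.y):

# parser.tab.c is an input "source" for gcc, but the real source
# a developer edits is the grammar parser.y
parser.tab.c parser.tab.h: parser.y
	bison -d parser.y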
Another example (where C files are placed in some common directory) is given by implementations of domain-specific (or high-level) languages compiled to C (e.g. Bigloo, Chicken Scheme, my old GCC MELT, or even the old C with Classes, the precursor of C++, in the 1980s).
When such implementations are more or less (or even entirely) bootstrapped, you really want to keep the generated C files together, perhaps in some src/ directory (or a generated/ one). You could even put them under your version control system (e.g. git), notably for languages having a single implementation: otherwise you would not be able to build such a compiler, since you need the generated C code to build it before recompiling it with itself. And you'll certainly distribute such generated C files in a source tarball.
Finally, very large projects of thousands of C or C++ (or OCaml) files, even ones entirely written by humans, tend to group these into subdirectories, in particular because a single directory of many thousands of *.c files is not "readable" by (or friendly to) humans.
On the contrary, for a small project of at most a hundred thousand lines of code in a few dozen C source files with a manually written Makefile, you are better off putting all the *.c and *.h files in the same directory as that Makefile.
But whether to have a src/ directory, and whether to put the object files produced by the compiler in the same directory as the source files or the Makefile, remains largely a matter of taste, opinion, convention, and habit. I recommend studying the existing practice in free-software projects similar to yours, on GitHub or elsewhere.
Related
What is the correct way to install (system-wide) a D library (on a GNU system, at least) using Makefile.am?
Here is my code to install the static and shared libraries:
install-data-local:
	install librdf_dlang.a librdf_dlang.so $(libdir)
The remaining question is how to install the .d files so developers can use my library.
In particular, what should the installation directory for the .d files be?
If you are doing a system-wide installation of D libraries and source (interface files I presume), then the most common places are /usr/include/<project name> or /usr/local/include/<project name> as long as it does not clash with some existing C/C++ project that stores header files there. Some D programmers prefer /usr/include/d/ or /usr/local/include/d/ as well...
I, for example, use /usr/di (D imports) for this purpose, and my library projects have all their interface files there. I will explain why I do not like having separate project directories there.
No matter what directory you pick, you need to update your compiler search paths.
Here is a part of my dmd.conf:
[Environment64]
DFLAGS=-I/usr/include/dmd/phobos -I/usr/include/dmd/druntime/import -I/usr/di -L-L/usr/lib64 -L--export-dynamic -fPIC
and ldc2.conf looks like:
// default switches appended after all explicit command-line switches
post-switches = [
    "-I/usr/include/d/ldc",
    "-I/usr/include/d",
    "-I/usr/di",
    "-L-L/usr/lib64",
];
If you prefer to have a separate directory for every project, you end up with a -I<path> for each of them; I really do not like this approach. However, it is very popular among developers, so it is really up to you how to organise the D import files. I know how much developers dislike the Java approach with domain.product.packages, but it fits nicely into a single place holding all the D interface files, and, most importantly, there are no clashes, thanks to the domain/product part...
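On the Makefile.am side, a minimal sketch of installing the interface files (the rdfimports name and the file list are hypothetical, using the $(includedir)/d/<project> convention mentioned above) could be:

# Install the D interface files under $(includedir)/d/rdf_dlang
rdfimportsdir = $(includedir)/d/rdf_dlang
rdfimports_DATA = rdf_dlang.d

Automake then generates the install/uninstall rules for them, honouring $(DESTDIR).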
According to the Filesystem Hierarchy Standard (and e.g. this SO question),
/usr/local/include
looks like a strong candidate on a "linux/unix-like system". See especially note 9:
Historically and strictly according to the standard, /usr/local is for data that must be stored on the local host (as opposed to /usr, which may be mounted across a network). Most of the time /usr/local is used for installing software/data that are not part of the standard operating system distribution (in such case, /usr would only contain software/data that are part of the standard operating system distribution). It is possible that the FHS standard may in the future be changed to reflect this de facto convention.
I have no idea about Windows.
I'm working on different Android projects and need to set up a project in Source Insight for each kernel source tree.
There are many unused files in the kernel; I want a way to pick out all the .c, .h, and .S files that actually get compiled into the kernel. Picking the source files out manually was driving me crazy.
I wrote a script that picks up the files corresponding to the .o files, but some .o files are built from multiple .c files, which makes things more complicated.
Is there an easier way to know which files are used during the compilation process?
Any information would be greatly appreciated.
It's my first question on Stack Overflow; I love it here.
Thanks.
I always need to search the kernel source without looking at powerpc, ia86, sparc, alpha, infiniband, etc. Assuming you can compile the kernel, there are several ways of doing this:
1) $K/scripts/basic/fixdep.c is called from Makefile.build to create a .cmd file for each source file, containing the compile options, the compile source/target, and the dependency list. Modify this to write a separate file with just the source file, or the source plus its dependencies.
2) Hack $K/scripts/Makefile.build to log the currently compiled file. See the cmd_as_o_S and rule_cc_o_c areas.
Option #1 is the best but requires a little coding. Option #2 is easiest but a true hack, and doesn't pick up the dependencies.
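As a rough sketch of the data option #1 relies on (run from the root of a built kernel tree; the exact .cmd layout can vary between kernel versions), you can harvest the source_ lines directly:

# List every file recorded as the compile source in the hidden
# .*.o.cmd files that a kernel build leaves next to each object
find . -name '.*.o.cmd' -exec sed -n 's/^source_[^:=]*:= *//p' {} + | sort -u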
I'm new here and I need some help. I searched the web for answers but didn't find anything useful. What I'm looking for is a shell program that, when you execute it, creates a Makefile with the binary name given as an argument, like:
./automakefile.sh hello .
This would build you a Makefile with a binary named hello.
I hope you guys can help me; I'm counting on you <3
There is, unfortunately, no such magic command. If there were, we wouldn't need Makefiles to begin with, because the magic would most likely have been incorporated into the compiler.
There are several reasons why there isn't a command like that.
Given a random binary file, you can't generally say what programming language it was written in.
You also can't tell which source files were used to compile the binary file, or where in the file hierarchy they are located (not just where they were located when the binary file was last compiled, possibly on another system).
You don't know the dependencies between the source code files. Makefiles are primarily useful for keeping track of these (and compiler flags etc.), so that changing one single source file in a big project does not trigger a recompilation of everything.
You don't know what compiler to use, or what flags to pass to it. This is another thing a Makefile contains.
There are build tools available for making the creation of Makefiles easier, and for making them portable between systems with different architectures (the Makefiles, that is; not necessarily the programs, which is down to the programmer). One such set of tools is GNU autotools; another is CMake. I'm sure there are others as well, but those are the ones I use.
Now you're facing another, similar problem: you still need to learn the syntax of, and write, your Makefile.am and configure.ac files (for the GNU tools), or your CMakeLists.txt files (for CMake).
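For scale: a minimal hand-written Makefile for a binary called hello, assuming a single hypothetical source file hello.c, is only a few lines (recipe lines must start with a tab):

CC = cc
CFLAGS = -Wall -O2

# Relink only when the object file has changed
hello: hello.o
	$(CC) $(CFLAGS) -o $@ hello.o

# Recompile only when the source has changed
hello.o: hello.c
	$(CC) $(CFLAGS) -c hello.c

clean:
	rm -f hello hello.o
.PHONY: clean

Everything beyond this (multiple sources, header dependencies, portability) is exactly what the tools above help manage.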
So, I understand that CLion currently only fully supports CMake projects. I don't care if I can't compile or run anything with CLion, as I don't currently do that with Eclipse anyway. I am just looking for editor support, with nice click-to-follow, autocomplete, etc.
What I am wondering is whether or not indexing can still work for non-CMake projects. I can create my project just fine, and indexing completes just fine, but after that is done it can't find my include files. It creates a default CMakeLists.txt file, in which the appropriate sources and include_directories have been added. It doesn't seem to make a difference though, as after indexing completes I still can't click-to-follow #include lines, and any references to things in other files don't work correctly.
Is there something else I can do to make indexing work so I can use CLion as an editor, or is this a pipe dream until Makefile support is someday added?
After some research, I found that your best options are:
1) Once the project is created, edit CMakeLists.txt (for example, see "How to find libraries"). One example:
set(Library "../Library")
include_directories(${Library})
set(SOURCES main.cpp)
add_executable(project_name ${SOURCES})
Note that ../ goes up one folder; in main.cpp you can then use #include "header_to_add.h" (header_to_add.h must be in the ../Library folder).
2) Edit the source code of your .cpp, .h, or other files to add the full path of the header you want to #include, keeping in mind that relative paths start from the directory containing the file.
For example: #include "../Library/header_to_add.h" (the ../ goes one level up from the current folder).
3) (Maybe not possible, or hard) Modify the makefile to prepare CMake to get the necessary inputs (for example, see this).
I recommend the first one, mainly because it keeps this structure outside the source files.
Edit: it's also possible to prepare CMake to use a makefile (source).