Makefile for installing an OCaml library

What are the things I need in my install and uninstall targets in a Makefile for an OCaml library in order to make it play nicely with the rest of the installation, work seamlessly with ocamlfind and so on? Basically to be a "good citizen". I am not interested in GODI at the present time. Thanks!

META files for ocamlfind are easy to write (basically, look at a META from another OCaml project you know¹, copy it and make the corresponding changes), and they will give you ocamlfind integration, in particular easy support for post-build installation and uninstallation (using ocamlfind install and ocamlfind remove). You should begin with that.
¹: for example, I take inspiration from the META of Batteries.
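For example, a minimal META for a hypothetical library mylib (the package name, version and archive file names are all placeholders here) looks like this:

description = "My example library"
version = "0.1"
requires = ""
archive(byte) = "mylib.cma"
archive(native) = "mylib.cmxa"

The install and uninstall targets of the Makefile can then simply delegate to ocamlfind (recipe lines must be indented with tabs):

install:
	ocamlfind install mylib META mylib.cma mylib.cmxa mylib.a *.cmi *.mli

uninstall:
	ocamlfind remove mylib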
The building part of the Makefile is trickier: there are numerous solutions (OCamlMakefile, OMake, ocamlbuild, plain Makefile, etc.) with varying strengths and weaknesses. If your project is simple enough, I would recommend ocamlbuild, which takes care of a lot of the dependency tracking by itself.
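For instance, assuming your library's modules are listed in a mylib.mllib file (the name is hypothetical), building the bytecode and native archives is a one-liner:

ocamlbuild mylib.cma mylib.cmxa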
You may also use Oasis, a relatively new tool that builds on ocamlbuild and ocamlfind and seeks to provide a unified configuration file for pre-build configuration and the various build and deployment steps (of your program, your libraries if any, accompanying data or documentation...). It's not yet a mature project (and its little brother Oasis-DB isn't released yet), but I encourage you to give it a try if you have time. It's a bit more complex than META, as it does more in the end, so writing the META first is still a good first step.
Finally, you said you weren't interested in GODI (GODI is a very good system, and in some cases (e.g. on the BSDs) it's a prime choice for getting a good OCaml installation), but you may still be interested in Godiva, a tool that helps with building GODI packages. I have never used it myself, though.

I don't use makefiles but ocamlbuild and a shell script to install the software I distribute. The Debian people made packages for my software from these scripts without problems, so you may want to check them out, since they satisfy some of the Debian requirements (e.g. separate targets for byte and native code).
You may also want to have a look at their packaging policy, though I don't know whether that document is still up to date.
Don't forget to add a META file for ocamlfind. And you may also want to include an _oasis file for the upcoming oasis-db project (not yet done in the software I distribute).

Related

Why does FetchContent prefer subdirectory-subsumption vs installation of dependencies?

Consider two software projects, proj_a and proj_b, with the latter depending on the former; and with both using CMake.
When reading about modern CMake, one gets the message that the "appropriate" way to express dependencies is via target dependencies; and one should arrange it so that dependent projects are represented as (imported) targets you can depend on. More specifically, in our example, proj_b will idiomatically have:
find_package(proj_a)
# etc etc.
target_link_libraries(bar proj_a::foo)
and proj_a will need to have been installed, utilizing the CMake installation-and-export-related commands, someplace where proj_b's CMake invocation will search for proj_a-config.cmake.
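For concreteness, here is a minimal sketch of the exporting side in proj_a (target, file and destination names are illustrative, not prescriptive):

add_library(foo foo.cpp)
add_library(proj_a::foo ALIAS foo)
install(TARGETS foo EXPORT proj_a-targets)
install(EXPORT proj_a-targets
        NAMESPACE proj_a::
        DESTINATION lib/cmake/proj_a)
# a hand-written (or configure_package_config_file()-generated)
# proj_a-config.cmake then include()s the installed proj_a-targets.cmake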
I like this approach and encourage others to adopt it. It offers flexibility in the choice between your own version of proj_a and the system version, and it also allows for non-CMake proj_a's via a Findproj_a.cmake script (which, again, can be system-level or part of proj_b).
So far so good, right? However, there are people who want to "take matters into their own hands" regarding dependencies - and CMake officially condones this, with modules such as ExternalProject and, more recently, FetchContent: these allow proj_b's configuration stage to actually download a (built, or in our case source-form) version of proj_a.
The puzzling part to me is that, after proj_a is downloaded, say to an external/proj_a directory, CMake's default behavior will be to
add_subdirectory(external/proj_a)
that is, to use proj_a as a subproject of proj_b and build them together. This, while the idiomatic use above allows the maintainer of proj_a (me, in this case) to "do their own thing" in their own CMakeLists.txt, and only keep things neat and tidy for others via what they export/install.
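For reference, the usual FetchContent pattern looks like this (the repository URL and tag are illustrative):

include(FetchContent)
FetchContent_Declare(proj_a
  GIT_REPOSITORY https://example.com/proj_a.git
  GIT_TAG        v1.0
)
FetchContent_MakeAvailable(proj_a)  # effectively add_subdirectory()'s proj_a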
My questions:
Why does it make sense to add_subdirectory(), rather than to build, install, and perform the equivalent of find_package() to meet the dependency? Or rather, why should the former, rather than the latter, be the default?
Should I really have to write my project-level CMakeLists.txt to be compatible with being add_subdirectory()'ed?
Note: Just to give some concrete examples of how this use constrains proj_a:
Must use unique option names which can't possibly clash with super-project names. So no more WITH_TESTS or BUILD_STATIC_LIB; they have to be WITH_PROJ_A_TESTS and BUILD_PROJ_A_STATIC_LIB (see the sketch after this list).
You have to account for the parent project having searched for other dependencies already, and perhaps differently than how you would like to search for them.
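For instance, the prefixed options in proj_a's CMakeLists.txt end up looking like this (option names taken from the examples above):

option(WITH_PROJ_A_TESTS "Build proj_a's tests" OFF)
option(BUILD_PROJ_A_STATIC_LIB "Build proj_a as a static library" OFF)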
Following the discussion in comments, I decided to post a bug report about this:
#22904: Support FetchContent_MakeAvailable performing build+install+find_package rather than add_subdirectory
So maybe this will change and the question becomes moot.
Why does it make sense to add_subdirectory(), rather than to build, install, and perform the equivalent of find_package() to meet the dependency? Or rather, why should the former, rather than the latter, be the default?
FetchContent doesn't have to be just for project() dependencies. It can be used for fetching utility scripts too. I'm guessing it was designed with that kind of consideration in mind. If your utility script is just one file, you can file(DOWNLOAD) and include() it directly, but the utilities could be multiple files, as is the case with aminaya/project_options. FetchContent uses a lot of the same machinery as ExternalProject, so it can do a lot of the useful things that ExternalProject does. For example, you can use FetchContent to fetch aminaya/project_options as a remote git repo, or as one of its archive artifacts, e.g. v0.20.0.zip.
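For a single-file utility, the direct route mentioned above might look like this (the URL and destination are hypothetical):

file(DOWNLOAD https://example.com/utils.cmake ${CMAKE_BINARY_DIR}/utils.cmake)
include(${CMAKE_BINARY_DIR}/utils.cmake)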
Should I really have to write my project-level CMakeLists.txt to be compatible with being add_subdirectory()'ed?
It's your choice! The reasoning here can be highly objective or subjective; it's up to you. Some people just like to put in a lot of effort to support whatever their users might want. Some people have a lot of historical configuration baggage and are still catching up with newer CMake. And as you mentioned at the end of your question, certain adjustments need to be made to cleanly accommodate people who add_subdirectory() your project as a dependency. One example of a project which chose "no" is glew (see issue #314 for the explanation).
Just to give another reference to some related work mentioned in responses to the KitWare/CMake ticket you raised, here's the ticket which tracked work on "FetchContent and find_package() integration".

Can a project support both Autotools and CMake at the same time?

I happen to think (though it may be a myth) that CMake is better than Autotools at making it easy to support Microsoft platforms.
At the same time, I'm fairly sure that Autotools is even more straightforward than CMake when it comes to important UNIX derivatives such as macOS and the most popular Linux distros.
What if I can't choose?
Can a project support both Autotools and CMake at the same time?
Bonus: can a project support Autotools, CMake and even plain bare Make at the same time?
By "at the same time" I mean that, ideally, one should not have to run a clean script when switching from one build system to another. But I guess that would be a reasonable requirement, if necessary.
Finally, do you know an example project that uses both Autotools and CMake? One that uses Autotools, CMake and plain bare Make?
Yes, you can very easily support both CMake and Autotools at the same time, since they don't overlap (that is, the files you use to create those environments are different, so you can have both types of files in your project at the same time). One example of this is the GNU uCommon C++ framework.
No, you can't (easily) support bare make and either of the above systems at the same time. Neither Autotools nor CMake is actually a build tool itself; they're "build tool generators". So you don't run autotools or cmake and get your built project as the result: instead you run autotools or cmake, and they generate control files for a build tool. Then you run the build tool, and the result is your built project.
Autotools generates makefiles, and cmake generates many different types of control files, where makefiles are one of the most common.
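For instance, with cmake the same project can be configured for different build tools just by picking a generator (directory names are illustrative; -S/-B require CMake 3.13+):

cmake -G "Unix Makefiles" -S . -B build
cmake -G "Visual Studio 17 2022" -S . -B build-vs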
So you can't have your OWN makefile in your project, because it will conflict with the makefile generated by autotools or cmake.
Of course, you can do things like put your own makefiles in a subdirectory then invoke make with an argument like make -f rawmake/makefile or something like that. But there's no convenient way to support them all.
Realistically, I would never choose to support more than one of the above options. You will spend a lot of time getting it right, and every time you need to change your build environment it's two or three times as much work. People will find issues with whichever one of them you tend to use less often. It's a huge hassle for not that much benefit.
Which you choose depends a lot on your project. If your project runs only (or almost exclusively) on POSIX-type systems, you want it to be maximally portable even to much older systems even though it uses a lot of special OS features, or you want its installation and build options to be extremely flexible (straightforward support for cross-compilation, etc.) then autotools is a good choice. If your project runs on lots of different OS types (Windows in particular) and you want people to be able to develop with their choice of IDE (Visual Studio, Xcode, etc.) easily, then cmake is a good choice.
If your program is straightforward to build and needs hardly any configuration or customization, or you are already familiar with makefiles and don't feel like learning a whole new language just for builds, then raw makefiles may be a good choice.

How to install and use an open source library on Windows?

I'd like to use an open source library on Windows (e.g. Aquila, following http://aquila-dsp.org/articles/iteration-over-wave-file-data-revisited/). But I don't understand anything about "build systems"... Everyone just says things like "unzip the tar, run configure, make, make install" for Linux, but I want to use the library on Windows. I have several questions.
i) Why do I have to "install" what is just source code? Why can't I just use the header files by copying them to the working directory and writing #include ".\aquila\global.h"?
ii) What are configure and make/make install? I can't understand them. I just know that configuring open source software on Windows needs CMake, and that it is a configuration tool... but what does it actually do?
iii) Even though I've run cmake, mingw32-make and mingw32-make install, my compiler says "undefined references to ...". What does this mean, and what should I do about it?
You don't need to install for sources. You do need to install for the libraries that get built from that source code and that your code is going to use.
configure is the standard name for the script that does build configuration for the software about to be built. The usual way it is run (and how you will see it mentioned) is ./configure.
make is a build management tool (as the tag here on SO will tell you). One of the most common mechanisms for building code on Linux (etc.) is the autotools suite, which uses the aforementioned configure script to generate build configuration information for the makefiles it generates, which make then uses to build the software. make is also the way to run the default build target defined in a makefile (which is often the all target, and which usually builds the appropriate library/binary/etc.).
make install is a specific, secondary invocation of the make tool on the install target, which (generally) installs the previously built code into an appropriate location (in the autotools/configure universe, the default location is generally under /usr/local).
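Putting the three together, the canonical sequence on a Unix-like system is (the prefix shown is the conventional default):

./configure --prefix=/usr/local
make
make install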
cmake is, again as the SO tag says, a build system that generates configuration files for other build tools (make, VS, etc.). This allows the developers to create the build configuration once and build on multiple platforms/etc. (at least in theory).
If running cmake worked correctly, then it should have generated the correct information for whatever target system you told it to use (make or VS or whatever). Assuming that was make, it should have allowed mingw32-make to build the software correctly (assuming additionally that the generator cmake used actually targets mingw32-make rather than plain make). If that is not working correctly then something is still missing from your system (and cmake probably should have caught that).
But to give any more detail you will need to give more detail about what errors you are actually getting and from what command.
(Oh, and on Windows, especially if you plan on building your software with VS (or some other non-mingw32-make tool), the chances of your needing to run mingw32-make install are incredibly small.)
For Windows, use CMake (with make or the latest Ninja as the build tool).
The process is not simple or straightforward, but it is achievable. You need to write a CMake configuration.
The build process is not simple or straightforward; that is one reason languages like Java exist (but that's another topic).
Rely on CMake to build the library, and you will get the open source library for Windows.
You can distribute it as a library for Windows systems, or distribute and integrate it with your own software; in either case, you have to build it for Windows first.
Writing the CMake configuration helps; it makes it easier to build for other platforms as well.
Now the question arises: is there any way other than CMake to build for Windows?
Would you enjoy writing everything directly in assembly?
If the answer is obviously no, then you will have to write the CMake configuration and generate .sln files for MSVC and other compilers.
Read the FAQ and the documentation before building an open source library, and fix the errors as they come up.
It is like handling hot iron, but it pays off if you're working on something meaningful. Most server libraries are open source (e.g. the age-old Apache httpd). So think about what you're doing.
Not every open source library will be useful in your project, but this is the way to use the ones that are.

What are the major differences between a Makefile and CMakeLists.txt?

I've searched for the major differences between Makefiles and CMakeLists, but found only weak ones, such as: CMake automates dependency resolution, whereas with Make it is manual.
I'm seeking major differences, what are some pros and cons of me migrating to CMake?
You should compare CMake with Autotools instead. It makes more sense! Doing so reveals the shortcomings of make that motivated the creation of Autotools, as well as the obvious advantages of CMake over make.
Autoconf solves an important problem—reliable discovery of system-specific build and runtime information—but this is only one piece of the puzzle for the development of portable software. To this end, the GNU project has developed a suite of integrated utilities to finish the job Autoconf started: the GNU build system, whose most important components are Autoconf, Automake, and Libtool.
Make can't do that. Not out of the box anyway. You can make it do it but it would take a lot of time maintaining it across platforms.
CMake solves the same problem (and more) but has a few advantages over GNU Build System.
The language used to write CMakeLists.txt files is readable and easy to understand (see the short example after the next list).
It doesn't only rely on make to build the project. It supports multiple generators like Visual Studio, Xcode, Eclipse etc.
When comparing CMake with make there are several more advantages of using CMake:
Cross platform discovery of system libraries.
Automatic discovery and configuration of the toolchain.
Easier to compile your files into a shared library in a platform-agnostic way, and in general easier to use than make.
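To illustrate the readability point above, a complete CMakeLists.txt for a trivial project (project and file names are hypothetical) can be as short as:

cmake_minimum_required(VERSION 3.10)
project(hello C)
add_executable(hello main.c)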
Overall CMake is clearly the choice when compared to make but you should be aware of a few things.
CMake does more than just make so it can be more complex. In the long run it pays to learn how to use it but if you have just a small project on only one platform, then maybe make can do a better job.
The documentation of CMake can seem terse at first. There are tutorials out there, but there are a lot of aspects to cover and they don't do a really good job of covering it all, so you'll mostly find only introductory material. You'll have to figure out the rest from the documentation and real-life examples: there are a lot of open source projects using CMake, so you can study them.

Perfect makefile

I'd like to use make to get a modular build in combination with continuous integration, automatic unit testing and multi-platform builds. Similar setups are common in Java and .NET, but I'm having a hard time putting this together for make and C/C++. How can it be achieved?
My requirements:
fast build; non-recursive make (Stack Overflow question What is your experience with non-recursive make?)
modular system (that is, minimal dependencies, makefile in subdirectory with components)
multiplatform (typically PC for unit testing, embedded target for system integration/release)
complete dependency checking
ability to perform (automatic) unit tests (Agile engineering)
hook into continuous integration system
easy to use
I've started with non-recursive make. I still find it a great place to start.
Limitations so far:
no integration of unit tests
incompatibility of Windows-based ARM compilers with Cygwin paths
incompatibility of makefile with Windows \ paths
forward dependencies
My structure looks like:
project_root
/algorithm
/src
/algo1.c
/algo2.c
/unit_test
/algo1_test.c
/algo2_test.c
/out
algo1_test.exe
algo1_test.xml
algo2_test.exe
algo2_test.xml
headers.h
/embunit
/harnass
makefile
Rules.top
I'd like to keep things simple; here the unit tests (algo1_test.exe) depend on both the 'algorithm' component (fine) and the unit test framework (which may or may not be known at the time this component is built). However, moving the build rules into the top-level makefile does not appeal to me, as this would distribute local knowledge about components throughout the system.
As for the Cygwin paths: I'm working on making the build using relative paths. This resolves the /cygdrive/c issue (as compilers can generally handle / paths) without bringing in C: (which make dislikes). Any other ideas?
CMake, together with the related tools CTest and CDash, seems to answer your requirements. Worth giving it a look.
Bill Hoffman (a lead CMake developer) refers to the Recursive Make Considered Harmful paper in a post on the CMake mailing list:
... since cmake is creating the makefiles for you, many of the disadvantages of recursive make are avoided; for example, you should not have to debug the makefiles or even think about how they work. There are other examples of things in that paper that cmake fixes for you as well.
See also this answer to "Recursive Make - friend or foe?" here on Stack Overflow.
Recursive Make - friend or foe?
OK, here is what I do:
I use one Makefile at the root and wildcard patterns to collect all files in a directory. Note that I assume that foo/*.c will make up foo.so, for example. This keeps Makefile maintenance minimal, since just adding a file to the directory automatically adds it to the build.
Since it is make you are using, I assume (as I do for my own projects) a compiler with gcc (cc) compatible command-line syntax. So MSVC is out of the picture; but don't get frustrated: I do most of my development (unfortunately) on Windows and use MinGW with MSYS; it works like a charm. It produces native binaries, built in a POSIX-compliant build environment.
Dependency checking is done with the fairly standard -MD switch. I then include all the *.d files into the Makefile, building the patterns out of the automatically collected source files.
Finally, unit tests are implemented with the "standard" check target. The check target is like the all target, except it depends on the unit tests and executes them once everything is built. I do it this way so that you can build just the project, or build the unit tests (and the rest of the project) separately. When I am not developing the project I want to just build it and be done with it.
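Condensed, the scheme described above looks roughly like this (directory and target names are illustrative; recipe lines must be indented with tabs):

CFLAGS += -MD -fPIC          # -MD makes the compiler emit .d dependency files
SRC    := $(wildcard foo/*.c)
OBJ    := $(SRC:.c=.o)

all: foo.so

foo.so: $(OBJ)
	$(CC) -shared -o $@ $^

check: all foo-test
	LD_LIBRARY_PATH=. ./foo-test

foo-test: $(wildcard test/*.c) foo.so
	$(CC) $(CFLAGS) -o $@ $^

# pull in the generated dependency files, if present
-include $(OBJ:.o=.d)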
Here is an example of how I do it: https://github.com/rioki/c9y/blob/master/Makefile
It also has the install, uninstall and dist targets.
As you can see, everything is plain make: no recursive make calls, and everything is relatively simple. I used automake and autoconf, and I will never do that again; other build tools are also out of the question. If I need to install foojam or barmake to build something, I normally ditch that project immediately.
