Is it worth learning GNU Make? - makefile

I'm lately feeling the need to learn a build tool. I'm looking through StackOverflow for recommendations, and GNU Make barely gets mentioned. Instead I see Ant, Maven, CMake, SCons and many others. However, the occasional "rogue sources" (as in not-in-the-repo) that I sometimes have to compile all require the make && make install steps.
Is learning Make a worse investment of my time than learning another tool?
If so why is Make still so popular?

Make is the standard build tool for everything C/C++. Many others have stepped up to the plate, but even when they were useful and successful, they never achieved the ubiquity of make.
Make is installed on virtually every Unix-like machine out there. No matter if you're working with AIX, Solaris, Irix, BSD, or Linux, if there's a compiler installed, there's also make.
Some of the "replacements" (like Automake, CMake) even create Makefiles, which are in turn executed by make.
I would definitely recommend becoming familiar with make. In the hands of someone who has taken the time to learn it, make is a powerful tool, which can be used in a number of ways not even necessarily related to software development.
Even if you end up using a different build tool in the end, you will be able to "recycle" the lessons learned with make, as the underlying concepts are quite similar. And the sheer number of make-built projects means that there will always be the chance that you have to figure out an existing Makefile.
One thing, though. Get it right from the beginning.

I think the reason you don't see (GNU) make mentioned is that it's often the default; if you have a GNU toolchain, you will have make already. Thus, most people that start talking about build tools, talk about something else.
In my experience, make is fine, but it can be kind of tricky to get it to do exactly what you want. It's maybe slightly arcane, but it's proven and it works.

Make is popular because it's used (mainly) for C/C++ sources in Linux/*nix projects, and is far older than any of the other tools you've mentioned, thus it has stood the test of time and is mature. Kinda like tar.
To be honest with you, I only know make. Those other tools above may be better, but so many projects just use a basic Makefile that you're best off knowing at least a little bit of it - not only for your own projects at work, but also for most of the open-source ones you find on the net.

It really depends how much you will use it.
If you work a lot with C/C++ make projects, then yes, I would recommend learning more about it, as a large makefile has a steeper learning curve than the other build tools you mention.
If you don't work with make, or work in other languages such as C#, Java or PHP then you'd be better off learning build tools relevant to those languages.

Like all tools, if you use it at all, you should put some time into becoming reasonably adept at it. Also, some tools (like CMake, for example) generate makefiles, and you may one day need to mess with those generated files.
GNU make has an excellent manual - it's certainly worth spending an hour or two reading it.

Make is the de facto standard on Linux systems, for example. It is a complex tool, but also a very powerful one.
It is well worth learning if you are developing in C or C++, particularly if targeting Linux/*nix.
One of the features of make is that you can set up dependencies that determine when to rebuild a file. E.g. each C or C++ file is built into an object file, and in the end all the object files are linked into an executable. Or maybe the object files are first combined into a statically linked library, which is then linked into another executable together with other object files.
Make keeps your compilation time as short as possible, because you can declare that a C file should only be compiled if it, or any header file it depends on, is newer than the object file. So any compilation or linking step is only executed if the source files for that step are newer than the target file.
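For illustration, here is a minimal sketch of such dependency rules in GNU make syntax (main.c, util.c and util.h are placeholder names, and .o is used as the object suffix):

# The program is re-linked only if an object file is newer than it.
program: main.o util.o
	$(CC) -o program main.o util.o

# Each object is recompiled only if its source file, or a header it
# depends on, is newer than the existing object file.
main.o: main.c util.h
	$(CC) -c main.c

util.o: util.c util.h
	$(CC) -c util.c

(Note that in a real makefile the command lines must be indented with a tab character, not spaces.)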
If you are developing in, for example, C#, you don't need this kind of dependency checking, because all the .cs files are compiled at once into a single executable.
So the conclusion is that you should use a build tool that is well suited for your choice of programming language.

Even if you end up preferring another build tool (personally I'm fond of VS... I know...), knowing make will probably still prove useful.
Make has many applications, and whilst it is not always ideal for any single task, it is stalwart and flexible when dealing with new technologies.

I guess where you work is probably different, but I know that everywhere I've worked, I would have been a far less valuable employee if I hadn't at least learned how to read Makefiles. Even in all-Windows Visual Studio environments, it comes up every now and then.
For instance, we just got a job that involves porting a bunch of old CX/UX code to Windows. The old code was built with makefiles. There's no way to understand their old system without knowing how to read those old makefiles.

Related

$(shell [foo]) in Windows

I've got a makefile (a file called 'Makefile', which is run by cmake on Linux, but which I believe works on Windows via nmake and needs to be run in a VS command prompt).
Most of the 'sample' ones I can see are just one line (the rest appear to be stuff I don't 'yet' understand, followed by this same one line):
include $(shell rospack find mk)/cmake.mk
(In the terminal, rospack find [package] returns the path to said package, and cmake.mk is obviously the file it wants to include.)
My problem is that this appears (to me at least) to be written for use on a Linux system (which basically the entirety of ros, the program I'm working with, was), and on Windows this appears to just evaluate to
include /cmake.mk
(which unsurprisingly doesn't work)
Basically I need to know how to do the same thing on Windows, ideally in a 'dynamic' way, as it will only cause more problems down the line if I get this working by hard-coding the directory path and it then breaks because the path isn't set properly some time in the future.
So I guess if that isn't possible, or is particularly hard, a way of hard-coding it would be a stopgap.
I tried:
include C:\[directory]\cmake.mk
but it seems to have issues with the ':'
I'm trying to work with Windows, because later in my project I'll be needing to use another program (for i90 robot) for which we only have Windows support.
OK, so apparently it acts differently if the file is actually in the folder.
as in
include C:\[directory]\cmake.mk
Errors with
C:\[directory]\cmake.mk not found
if the file isn't there, and
fatal error U1034: syntax error : separator missing
if it is
While this doesn't really seem to impact on the original problem, I guess it indicates I'm trying to do something funky windows doesn't like.
The short answer is, you'll never get a single makefile that does much of anything complicated that will work both with standard UNIX-style make (such as GNU make from GNU/Linux) and also work with nmake. Nmake is a completely different beast.
As an aside, it's confusing that your makefiles here are called "cmake", because cmake is an actual program, distinct from make (and nmake). I'm assuming, though, from the context that the use of the term "cmake" here doesn't refer to the actual cmake utility. Which is too bad, because if it did use cmake things would be simpler for you. Maybe.
It's not clear exactly what your requirement to use nmake is, though. If you laid out your real requirements, it would be a lot easier for us to advise you. For example, you say you need to use "another program" which runs only on Windows. What does this program do, exactly, and how will you need to use it? Does it provide libraries that need to be linked with the ros code?
Basically, your simplest way forward is to obtain a UNIX-like environment, including tools like GNU make, for your Windows system. There are two main choices: Cygwin, which provides a complete POSIX infrastructure (shell, compiler, etc.) consisting of ports of the GNU environment to Windows that require a POSIX emulation layer, and MinGW, which provides various GNU tools that run more or less natively on Windows.
However, if you MUST use Visual Studio as your compiler, for example, then these will be much more difficult to integrate.
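That said, if you do end up running GNU make on Windows (via MinGW or Cygwin), one hedged sketch for working around the failing $(shell rospack find mk) call is to branch on the operating system and fall back to a user-settable variable (ROS_MK_DIR is a made-up name, and C:/ros/mk a placeholder path):

ifeq ($(OS),Windows_NT)
# rospack may not be available here, so let the user point at the
# directory containing cmake.mk; forward slashes are safer than
# backslashes in GNU make.
ROS_MK_DIR ?= C:/ros/mk
else
ROS_MK_DIR := $(shell rospack find mk)
endif

include $(ROS_MK_DIR)/cmake.mk

This is GNU make syntax, so it still won't help under nmake; it only makes one makefile usable from both Linux and a MinGW/Cygwin shell.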

Can `Jam` be used as an alternative to GNU Make with autoconf?

As it is stated on the FT Jam page at freetype.org:
Jam is a small open-source build tool that can be used as a replacement for Make. Even though Jam is a lot simpler to use than Make, it is far more powerful and easy to master. It already works on a large variety of platforms (Unix, Windows, OS/2, VMS, MacOS, BeOS, etc.), it is trivial to port, and its design is sufficiently clear to allow any average programmer to extend it with advanced features at will.
Are there any ways to substitute jam for make so that autoconf-based projects in MinGW / MSYS build faster?
No. Autoconf relies on a Unix shell and associated utilities. It might be hacked to generate Jam files, but the configure step is still autotools. If you just have a plain makefile, you might as well use mingw32-make and ditch autotools completely, if possible.

Set up a development environment on Linux targeting Linux and Windows

For a university course I have to write an HTTP server which is supposed to run on both Linux and Windows.
I have got a humble Linux machine which I don't think can handle any kind of heavy virtual environment, nor am I willing to go through the hassle of installing one.
This is the first project of mine complex enough (I estimate ~1.5 months to develop) to require an environment comfortable enough to alternate rapidly between short coding and testing sessions (the latter on both platforms, of course).
So, I was wondering what could be the best set up for this situation. I think testing it on Wine would be ok (it is not a real-world thing, after all), and I installed MinGW for the Windows-targeting part.
Basically, a simple well-written makefile could solve my problem... It should build both the Linux and Windows binaries and place them in their respective folders (the Windows one in the Wine sub-tree), and I'm all done! But I feel very inexperienced at this and I really don't know where to start. Maybe the make manual, ahah! :)
Thoughts, suggestions, anything I didn't think of or know about!
Thank you!
(PS. I'm planning to use emacs as my editor, or maybe learn vim. Unless Eclipse provides some kind of skynet-like plugin that entirely solves this problem... :)
You're on the right track. It's not that complicated, really, thanks to MinGW. You basically need two things:
The code has to be portable across the OSes. MinGW has some POSIX support, but you'll probably need to either use Cygwin (in order to be able to code against the POSIX interface) or write your own compatibility layer for interfacing with the OS. I'd probably go for Cygwin, as then you can code only against POSIX and won't have to test and debug your own compatibility layer. Also, make sure you don't use any external libraries that are OS-specific. Non-portable code often results in a compile error, but make sure you test the application thoroughly anyway.
The toolchains for targeting Linux and Windows. You already have them; you just need to use them correctly. Normally you'd use a variable like $(CROSS_COMPILE) as a prefix when calling the toolchain during cross-compilation. So when compiling for Linux you call gcc, ld, etc. (with the CROSS_COMPILE variable empty), and when compiling for Windows you call e.g. i486-mingw32-gcc, i486-mingw32-ld, etc., i.e. CROSS_COMPILE=i486-mingw32-. Or just define CC, LD, etc. depending on the target.
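A minimal sketch of such a makefile (the i486-mingw32- prefix and the source file names are just examples):

# Native build by default; cross-build for Windows with e.g.:
#   make CROSS_COMPILE=i486-mingw32- EXEEXT=.exe
CROSS_COMPILE ?=
CC     := $(CROSS_COMPILE)gcc
EXEEXT ?=

OBJS := server.o http.o

httpd$(EXEEXT): $(OBJS)
	$(CC) -o $@ $(OBJS)

%.o: %.c
	$(CC) -c $< -o $@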
I wrote a small game on Linux and made it run on Windows as well. If you browse the code, you can see it has next to no #ifdef jungle (basically just some extra debugging features enabled for Linux), and the Makefile is simple as well, with no complicated handling for cross-compilation - just the possibility to override CC etc., like it should be. As lots of important open-source software is written this way (especially software that targets both desktop and embedded devices), you should be able to find lots of other examples of how to set up the build environment correctly.
As for testing the application on Windows, I think the best option is if you can find a real Windows machine somehow. If you do everything correctly, it should run the same as on Linux and you won't need to continuously test your application on both OSes. If testing on a Windows machine is not possible, a VM would be the next best choice, though it would probably be more difficult to set it up. Wine is a good backup plan, but I don't think you can be sure your application works well on Windows if you only tested it on Wine.

What are some compiled programming languages that compile fast?

I think I finally know what I want in a compiled programming language: a fast compiler. I get the feeling that this is a really superficial thing to care about, but after switching from Java to Scala for a while I realized that being able to make a small change in the code and immediately run the program is actually quite important to me. Besides Java and Go, I don't know of any languages that really value compile speed.
Delphi/Object Pascal. Make a change, press F9 and it runs - you don't even notice the compile time. A full rebuild of a fairly substantial project of ours takes on the order of 10-20 seconds, even on a fairly wimpy machine.
There's an open-source variant available at www.freepascal.org. I've not messed with it, but it is reportedly just as fast - it's the design of the Pascal language that allows this.
Java isn't fast at compiling. The feature you are looking for is probably hot replacement/redeployment while coding: Eclipse recompiles just the files you changed.
You could try some interpreted languages. They usually don't require compiling at all.
I wouldn't choose a language based on compilation speed...
Java does not have the fastest compiler out there.
Pascal (and its close relatives) is designed to be fast - it can be compiled in a single pass. Objective Caml is known for its compilation speed (and there is a REPL too).
On the other hand, what you really need is a REPL, not fast recompilation and re-linking of everything. So you may want to try a language which supports incremental compilation. Clojure fits well (and it is built on top of the same JVM you're used to). Common Lisp is another option.
I'd like to add that languages have official compilers as well as unofficial ones made by different people, so obviously the performance varies per compiler, not just per language.
If we talk just about official compilers, I'd say it's probably Fortran. It's very old, but it's still used in most science and engineering projects because it is one of the fastest languages. C and C++ probably come tied in second, because they're also used in science and engineering.

Is there a Linux equivalent of Windows' "resource files"?

I have a C library, which I build as a shared object for Linux and a DLL for Windows with MinGW32. The API depends on a couple of data files (statistical models) which I'd really like to roll in with the SO/DLL so that deployment is just one file.
It looks like I can achieve this for Windows with a "resource file" compiled with windres, but then I've got to write a bunch of resource-handling code for Windows, and I'm still stuck with the files on Linux.
Is there a way to achieve the same functionality on Linux?
Even better, is there a portable solution?
It's actually quite simple on Linux and other ELF systems: http://www.linuxjournal.com/content/embedding-file-executable-aka-hello-world-version-5967
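The gist of the linked approach, as a rough sketch (GNU binutils assumed; model.dat, api.c and libfoo.so are placeholder names):

# Wrap the raw data file in a relocatable object file.
model.o: model.dat
	ld -r -b binary -o model.o model.dat

api.o: api.c
	$(CC) -fPIC -c api.c

# Link the data in like any other object.
libfoo.so: api.o model.o
	$(CC) -shared -o libfoo.so api.o model.o

# From C, the embedded bytes are then reachable through symbols that
# ld derives from the input file name:
#   extern const char _binary_model_dat_start[];
#   extern const char _binary_model_dat_end[];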
OS X has bundles, so you just build your library as a framework and put the file in the bundle.
Two potential solutions:
Phong Vo's sfio library, which is part of the AT&T Advanced Software Technology toolset, is a wonderful replacement for C stdio.h, and it will allow you to open either files or memory blocks using a single API. So you can easily convert your existing files to C initialized data to include in your DLL or SO file (a sketch of that conversion step appears at the end of this answer).
This is a good cross-platform solution, but the penalty is that the learning curve to get started is pretty high. They don't make it easy to figure out how stuff works or to take one part of their toolset and split it out for use independently of the other parts. But the good news is that if you want to adopt their U/Win system for running Unix code on Windows (all part of the same toolset), you can create DLLs and SOs using the same system.
For this kind of problem I often fall back on Lua; I can store Lua data either in external files or within C as initialized data. This is great for distributing everything in one .so file; I do this for my students.
Again the downside is that you have to master and incorporate a new technology.
In my own work I use Lua over the AT&T stuff for these reasons:
Lua has a much smaller footprint and is designed to play well with others; with AST you really have to adopt their way of doing things.
The learning curve with Lua is much less steep; you can be productive very quickly.
Lua is dead easy to install, and it's easy to get information about it. AST has its own quirky installation process, shared by nobody else in the world; it's often hard to make the installation work, and it's harder to get information about it.
Using Lua has a lot of other payoffs, so the effort spent learning Lua and learning how to incorporate Lua into C codes is easy to amortize over multiple projects.
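Coming back to the first option: the "files to C initialized data" conversion step on its own can be sketched with a small generated header, independent of sfio (xxd ships with vim; model.dat is again a placeholder name):

# Produces: unsigned char model_dat[] = { ... };
#           unsigned int model_dat_len = ...;
model_data.h: model.dat
	xxd -i model.dat > model_data.h

The generated header can then be #included and the array handed to whatever in-memory API you use.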
