I am new to make. I am trying to debug an issue. For that I added some debug statements to print all the pre-reqs of a particular target.
For example:
$(TARGET_BUILD)/%.o: $(TARGET_BUILD)/%.c
    $(info pre-reqs for this target $^)
    $(c_dependency)
After the build completes, when I check the build log I can see several '.h' files listed in the prerequisites. From the above rule we mentioned only '.c' files ($(TARGET_BUILD)/%.c) as pre-reqs, right? Then how come it is showing header files in the pre-reqs? Are they auto-generated? If yes, can you please help me understand how they are created automatically?
FYI, $(c_dependency) is a define directive with the lines below:
    mkdir -p $(@D)
    $(CC64) -o $@ -c $(CFLAGS64) $<
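One common way this happens (an assumption on my part, since the snippet above does not show it) is compiler-generated dependency files: a GCC-style flag such as -MMD makes the compiler write a small .d makefile fragment listing every header the source pulled in, and the build system then includes those fragments, which adds the headers as extra prerequisites of the .o target and therefore into $^. A minimal sketch of that setup; the DEPS variable and the extra flags are illustrative, not taken from your makefile:
# Hypothetical: one .d fragment per source, produced as a side effect of compiling
DEPS := $(patsubst %.c,%.d,$(wildcard $(TARGET_BUILD)/*.c))
$(TARGET_BUILD)/%.o: $(TARGET_BUILD)/%.c
    mkdir -p $(@D)
    # GCC-style flags: -MMD emits $(TARGET_BUILD)/foo.d while compiling,
    # and -MT $@ records the full object path as the target inside it
    $(CC64) -MMD -MP -MT $@ -o $@ -c $(CFLAGS64) $<
# Pulling the fragments in is what makes the headers show up as pre-reqs
-include $(DEPS)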
I tried to use the .DELETE_ON_ERROR target in a makefile in order to delete both the $(OBJ) files
and the executable if the recipe fails, but it doesn't work. If I put an error into any source file, then while compiling via the pattern rule an error occurs and make stops. The old object file is still in its place, but I expect .DELETE_ON_ERROR to remove it.
Can anyone test the code? Could -include $(DEP) or the -DDBG flag have an influence? The goal is to delete both the .o file that failed and the executable.
OUTPUT = executable
CPP := $(shell find $(SRC) -type f -name "*.cpp")
OBJ := $(CPP:.cpp=.o)
DEP := $(OBJ:.o=.d)
CXX := g++
CXXFLAGS =-MMD -MP -DDBG
INCLUDES = -I.
.DELETE_ON_ERROR :
$(OUTPUT): $(OBJ)
    $(CXX) $^ -o $@
%.o: %.cpp
    $(CXX) $(CXXFLAGS) $(INCLUDES) -c $< -o $@
-include $(DEP)
.PHONY : clean
clean:
    rm -rf $(OBJ) $(DEP)
EDIT: According to Ondrej K.'s solution, to fix this problem you need to add a @touch command before the compiler call in order to mark the object file as changed (the docs read "delete the target of a rule if it has changed"). So the code should look like this:
%.o: %.cpp
    @touch $@
    $(CXX) $(CXXFLAGS) $(INCLUDES) -c $< -o $@
Not sure what failure you're seeing, but I am afraid there really isn't a good way for you to do that. The .o files and the executable ($(OUTPUT)) are built by separate rules. If the latter fails, the former are already out of consideration. See the documentation:
.DELETE_ON_ERROR:
If .DELETE_ON_ERROR is mentioned as a target anywhere in the makefile, then make will delete the target of a rule if it has changed and its recipe exits with a nonzero exit status, just as it does when it receives a signal. See Errors in Recipes.
In other words, if the rule producing a binary object failed after the .o target itself got updated, make would prune the changed file. But if your executable did not link, make won't go back and delete the object files.
Not sure it'd be nice, but if you really needed to, you could probably achieve this by refactoring your makefile so that a single rule with a single recipe builds the executable plus the objects directly from the source prerequisites. The obvious downside: such a rule would mean a single .c file change causes all files to be recompiled (basically negating a substantial benefit of using make).
EDIT: I'll expand on the comment a bit to clarify. What you seem to want is: in case there is a broken .c file and compilation fails, remove the old .o file. That is quite clearly not how .DELETE_ON_ERROR works, though. If the .o file already got updated and then the rule failed, it would be removed ("delete the target of a rule if it has changed"), but in the case of the mentioned syntax problem, the compiler fails before it ever produces an .o file.
So, if for instance you updated your (pattern) rule for compilation so that it first touches (effectively updates the timestamp of) the .o file and then tries to compile, then after the compiler call and the rule failed, make would consider the target of the failed rule to have been updated and remove it. Alternatively, you could change the rule to first rm the expected .o file, in which case you actually wouldn't need .DELETE_ON_ERROR at all (and if there is no change in the relevant sources, the rule does not get used, so it's not as terrible as it sounds). Either way is not exactly clean, but leads toward the behavior I understand you're describing.
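For the second alternative, a minimal sketch reusing the pattern rule from the question; removing the stale object up front means a failed compile can never leave an outdated .o behind:
%.o: %.cpp
    # drop the previous object first; if the compile below fails,
    # no stale .o is left to confuse later runs
    rm -f $@
    $(CXX) $(CXXFLAGS) $(INCLUDES) -c $< -o $@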
It is possible that the compiler crashes while writing the output file. In this case there is a corrupt output file that is newer than its sources. Make will stop due to the error, but on the next run it won't recompile the output file, as it is newer than its sources, and so the build step will fail again and again.
With the .DELETE_ON_ERROR rule, make will delete the output file if the compiler (or whatever build step failed) exits with an error after touching (and corrupting) the output file, so it will be recompiled on the next run. (If the compiler failed without touching the old output file, it will always be recompiled on the next run anyway.)
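A tiny self-contained makefile (with made-up file names) that makes this observable: the recipe produces output and then fails, simulating a tool that dies after writing a partial file. With .DELETE_ON_ERROR the half-written target is removed so the next run retries; without it, the broken file is newer than its source and blocks further rebuilds:
.DELETE_ON_ERROR:
out.bin: in.txt
    # pretend the tool wrote a partial/corrupt output...
    cp $< $@
    # ...and then crashed with a nonzero exit status
    false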
Most makefiles have a structure such as this:
.PHONY: prebuild
all: $(TARGET)
prebuild: Makefile
    $(shell DEPDIR=$(DEPDIR) mkdir -p $(DEPDIR)/../common >/dev/null)
    # do other work related to preparing for the object files to be built such as run a script to modify a header file included by $(TARGET).c
$(TARGET): $(TARGET).c prebuild
    $(CC) $(CFLAGS) -o $(TARGET) $(TARGET).c
The implicit rules know how to build $(TARGET).o from $(TARGET).c, and don't do any work if $(TARGET).o is already newer than $(TARGET).c. This happens when make is run multiple times without changing the source file.
However, building the all target above seemingly always reruns the $(CC) $(CFLAGS) -o $(TARGET) $(TARGET).c command to link the application and create the application binary. This happens even if the binary already exists and doesn't need to be recreated. In some larger projects this step can take a long time (tens of seconds), which is sometimes not desirable.
Edit #1: The issue has something to do with an extra phony target that I do want to run ONCE before the object files are built. In my case, I'm running a script which takes Makefile variables and possibly updates a header file that is included in the C file. But if the Makefile doesn't change, the prebuild target isn't run. However, the $(TARGET) target is still rebuilt even if prebuild doesn't do anything (for instance, because the Makefile wasn't changed). FYI: because of the structure of my build system, I have prebuild run always, because my build system is used for a variety of applications that can dynamically redefine prebuild.
How can this Makefile be restructured to not do the linking again when not necessary?
Edit #2:
Here's a simplified example that seems to illustrate my issue:
Before running, create a new directory and touch a b
.PHONY: prebuild main all
all: main
prebuild: a Makefile
    @echo prebuild ran
main: prebuild
    @echo main ran
When I run, I get this output:
prebuild ran
main ran
This is what happens no matter how many times I run make, even though neither the prerequisite a nor the Makefile changes. What I expect to happen is that prebuild doesn't run (because a and Makefile don't change) and main also doesn't run because prebuild doesn't run. Clearly, I'm misunderstanding something.
The problem is that extra dependency, which is what triggers your rebuild.
Try this:
.PHONY: all
OUTPUTDIR=common/
TARGET=finalexe
all: $(OUTPUTDIR)/$(TARGET)
$(OUTPUTDIR)/$(TARGET): $(TARGET).c | $(OUTPUTDIR)
    $(CC) $(CFLAGS) -o $@ $(TARGET).c
$(OUTPUTDIR):
    mkdir -p $@
In the above example, 'finalexe' will be created if (a) it doesn't yet exist or (b) finalexe.c was modified. The timestamp on $(OUTPUTDIR) is not checked.
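For the prebuild part of the question, one option (a sketch only; the stamp file and script name are hypothetical) is to turn prebuild into a rule that produces a real sentinel file. The sentinel only gets a new timestamp when the Makefile actually changes, so the link step is no longer forced on every run:
# Hypothetical marker recording that the prepare step ran
PREBUILD_STAMP = prebuild.stamp
all: $(TARGET)
$(PREBUILD_STAMP): Makefile
    # run the header-updating script here (hypothetical name)
    ./update-headers.sh
    touch $@
$(TARGET): $(TARGET).c $(PREBUILD_STAMP)
    $(CC) $(CFLAGS) -o $@ $(TARGET).c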
I have searched for hours for an answer to this. I am new to gcc and Makefiles.
I have a Makefile in some source code that looks like this:
CC=gcc
SRCDIR=src
BINDIR=../bin
CFLAGS= -flag
LIBS= -lthing
...
$(BINDIR)/program_name: $(SRCDIR)/program_name.c
    $(CC) $(CFLAGS) $(SRCDIR)/program_name.c -o $(BINDIR)/program_name $(LIBS)
I understand what all of this means except what the ../ in BINDIR is meant to do. When I run make, I get the error message:
/usr/bin/ld: cannot open output file ../bin/program_name: No such file or directory
collect2: error: ld returned 1 exit status
Makefile:20: recipe for target '../bin/program_name' failed
make: *** [../bin/program_name] Error 1
My guess is that the original author of this Makefile meant that the bin folder should go in the parent directory of where the Makefile is located. I know when using the Linux CLI command cd that the dot dot means go up a directory. Is that what this is trying to achieve?
To automatically create the $(BINDIR) directory before it is actually needed, you must declare it as a prerequisite (dependency) of any target that uses it. But each time its content changes, its timestamp also changes. So declaring it as a regular prerequisite is not the best thing to do, because the targets depending on it would be rebuilt for no real reason, just because the content of $(BINDIR) changed.
This is why make also supports order-only prerequisites (OOPs):
$(BINDIR)/program_name: $(SRCDIR)/program_name.c | $(BINDIR)
    $(CC) $(CFLAGS) $< -o $@ $(LIBS)
$(BINDIR):
    mkdir -p $@
Note the | that introduces the list of OOPs. An OOP is built if it does not exist, which causes the targets depending on it to be (re-)built too. But if it exists make does not even consider its last modification time. Even if some target depending on it is older, it is not rebuilt just because of that.
Note: I also used the $< and $@ automatic variables. In the rule's recipe they expand to the first prerequisite ($(SRCDIR)/program_name.c) and the target ($(BINDIR)/program_name), respectively. They are highly recommended: less typing, less error-prone, more generic rules... they have many good properties.
Your makefile is missing a rule to create the BINDIR directory - if it doesn't exist, your link line won't be able to put the resulting binary there! A rule like this one should do it:
$(BINDIR):
    mkdir -p $(BINDIR)
Just make sure that any other rules (like the one in your question) also depend on this directory!
The question was edited after MadScientist's answer. See history for the original makefile, but the problem stays the same.
I have a small makefile:
DEPFLAGS=-MD -Mo $(OUTDIR)/$*.Td
POSTCOMPILE=@mv -f $(OUTDIR)/$*.Td $(OUTDIR)/$*.d && touch $@
VPATH=../src
OUTDIR=../out
SOURCES:=$(notdir $(wildcard ../src/*.c))
OBJECTS:=$(SOURCES:%.c=$(OUTDIR)/%.o)
all: $(OBJECTS) $(OBJECTS:%.o=%.d)
$(OUTDIR)/%.o : %.c
$(OUTDIR)/%.o : %.c $(OUTDIR)/%.d
    @$(CC) $(DEPFLAGS) -c $< -o $@
    @$(POSTCOMPILE)
$(OUTDIR)/%.d : ;
.PRECIOUS: $(OUTDIR)/%.d
Directory structure looks like:
src: contains file.c
out: empty; after make it contains file.o and file.d
make: contains the makefile
When I run make, everything works fine and two files are generated: file.o and file.d.
However, when I delete file.d nothing happens. I would expect that make finds a missing dependency for file.c and starts a rebuild. Why doesn't it happen?
Make version is 3.81 built for i386-pc-mingw32 under Windows 7.
Marking a file as .PRECIOUS does not remove all aspects of its "intermediateness". All it does is prevent it from being deleted, but this feature of intermediate files is still in effect:
If an ordinary file b does not exist, and make considers a target that depends on b, it invariably creates b and then updates the target from b. But if b is an intermediate file, then make can leave well enough alone. It won’t bother updating b, or the ultimate target, unless some prerequisite of b is newer than that target or there is some other reason to update that target.
This is why your .d file is not recreated. In order for it to be recreated you need to ensure it's not an intermediate file. Fortunately this is trivial to do: you just need to mention the files explicitly somewhere as a target or prerequisite. You can do it like this:
all: $(OBJECTS) $(SOURCES:%.c=$(OUTDIR)/%.d)
Or if you prefer like this:
depends: $(SOURCES:%.c=$(OUTDIR)/%.d)
which would allow you to run make depends to update the dependency files, if you wanted to.
I'll just point out in passing that this method of managing dependencies is considered outdated. There's a better, more advanced way it can be done described here among other places.
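For reference, a sketch of such an approach under the same directory layout, assuming GCC-compatible flags (this is an illustration, not necessarily the exact scheme the linked article describes): the compiler drops the .d file as a side effect of compiling, and whatever .d files already exist are simply included:
VPATH   := ../src
OUTDIR  := ../out
SOURCES := $(notdir $(wildcard ../src/*.c))
OBJECTS := $(SOURCES:%.c=$(OUTDIR)/%.o)
all: $(OBJECTS)
$(OUTDIR)/%.o: %.c
    # -MMD/-MP generate $(OUTDIR)/foo.d while compiling;
    # -MT $@ makes the recorded target match this pattern rule
    $(CC) -MMD -MP -MT $@ -MF $(OUTDIR)/$*.d -c $< -o $@
# include whatever dependency files exist so far; missing ones are not an error
-include $(wildcard $(OUTDIR)/*.d)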
(I'll be a horrific necromancer here, but I've run into the same problem and found that the actual issue isn't the one mentioned in the answer or comments here.)
The dependency rule generated by the compiler by default uses a target name with all suffixes replaced by a single .o suffix and the path removed, which doesn't match the pattern of the rule in the makefile.
For gcc 4.x and later, the correct options would be:
$(OUTDIR)/%.o : %.c $(OUTDIR)/%.d
    @$(CC) -MD -MF $(OUTDIR)/$*.Td -MT $@ -c $< -o $@
The -Mo flag no longer exists; you have to use the -MF flag (alongside -MD) to specify the dependency file name. The -MT flag allows you to provide a literal target name.
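As an illustration with made-up file names, the rule the compiler writes into the .d file differs like this; only the second form names the object make is actually building under $(OUTDIR):
# default target name: bare basename, no path -- never matches ../out/file.o
file.o: ../src/file.c ../src/file.h
# with -MT $@: the recorded target carries the output path
../out/file.o: ../src/file.c ../src/file.h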
I have a Makefile which is generated by a configure script with an option
In configure.ac:
AC_ARG_ENABLE([mmi],
AS_HELP_STRING([--enable-mmi], [Add the mmi function]),
[
AC_MSG_NOTICE([ * MMI: enabled])
AC_DEFINE([WITH_MMI], [1])
AC_SUBST(with_mmi, 1)
],
[
AC_MSG_NOTICE([ * MMI: disabled])
AC_SUBST(with_mmi, 0)
])
This option is also defined in a kconfig file (so we can change the config with a menuconfig type command instead of having to use the configure script directly)
The Makefile detects when the kconfig file is modified; in that case the configure script is run and the Makefile is regenerated.
The problem is that make keeps going and does not use the parameter modified by the configure script.
If the make command is run a second time, it works correctly (the parameter is updated).
A workaround currently used is to force make to exit directly after the configure script has completed.
In Makefile.in:
%.o: %.c $(HEADERS) $(SELF_MAKEFILE) $(PTXDIST_PROJECT)/platform-myproject/state/myproject.prepare
    gcc $(CFLAGS) -c -o $@ $<
$(PTXDIST_PROJECT)/platform-myproject/state/myproject.prepare: $(PTXCONFIG_FULL_PATH)
    cd $(PTXDIST_PROJECT); ptxdist prepare myproject
    @echo ============================================
    @echo Makefile has been modified. Please run again
    @echo ============================================
    exit 1
Note: above, the ptxdist prepare myproject command runs the configure script and then touches the $(PTXDIST_PROJECT)/platform-myproject/state/myproject.prepare file.
It would be much better if it were possible to make the Makefile re-read itself when it has been modified, so that the build could run in one step without error.
Any idea how I could accomplish this?
Makefiles generated by automake know to re-run autoconf and configure when the makefiles etc. change. It seems to me that if you move the invocation of the ptxdist prepare myproject command into the autoconf file, rather than the makefile, so that it's always done every time configure is invoked, then you won't have this problem.
If you don't want to do that, then note that make will automatically re-invoke itself if any of its included makefiles changes. When you replied to Etan above, you didn't say what the contents of myproject.prepare are, but if it's just an empty file that is touched to tell make the preparation is up to date, you can include it:
include $(PTXDIST_PROJECT)/platform-myproject/state/myproject.prepare
and it will happen. If this file is not empty and contains content you can't include as a makefile, then you can change things so that it DOES just touch an empty file:
PREPARE_SENTINEL = $(PTXDIST_PROJECT)/platform-myproject/state/prepare.sentinel
%.o: %.c $(HEADERS) $(SELF_MAKEFILE) $(PREPARE_SENTINEL)
    gcc $(CFLAGS) -c -o $@ $<
$(PREPARE_SENTINEL): $(PTXCONFIG_FULL_PATH)
    cd $(PTXDIST_PROJECT); ptxdist prepare myproject
    @touch $@
include $(PREPARE_SENTINEL)
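The mechanism behind this is GNU make's makefile-remaking behavior: after reading all makefiles, make tries to bring every included file up to date, and if any of them was rebuilt it re-executes itself, so the regenerated values are picked up within the same invocation. A minimal self-contained illustration (file names are made up):
# config.mk is generated; when its rule updates it, make restarts itself,
# so the freshly written OPTION value is visible on this very run
include config.mk
config.mk: settings.in
    # stand-in for "run configure": derive the included makefile from its source
    echo "OPTION := $$(cat $<)" > $@
build:
    @echo building with OPTION=$(OPTION)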