This is for setting up the application bundle of a Mac OS X app. I have a script which copies a few files and does some other things, so I want to execute the script after the build (i.e. after the linking step). I just want it to be executed every time, because it is not possible to specify its dependencies.
I know there is QMAKE_POST_LINK (e.g. described here or here), but it runs only when the target does not exist, i.e. when linking needs to be done. However, I want the script to run every time, even when the target already exists.
There are also QMAKE_EXTRA_TARGETS and POST_TARGETDEPS (e.g. described here), but that forces a relink every time when I actually only want the script to rerun, and it runs the script before the linking step. (Currently, that's what I'm using anyway, because I don't see a better way. Here is my QMake source.)
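Roughly, that setup looks like this (copy_files.sh stands in for my actual script):
# Extra target that runs the script; adding it to POST_TARGETDEPS makes the
# link target depend on it, so it runs before linking and forces a relink.
copyfiles.commands = ./copy_files.sh
QMAKE_EXTRA_TARGETS += copyfiles
POST_TARGETDEPS += copyfiles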
There are related questions there and there. I quote my answer to the first of them:
Another way to make things run in a given order is to use an empty "super" target:
super.depends = target_pre first target_post
QMAKE_EXTRA_TARGETS += super
Here first is the default qmake target, and target_pre and target_post are custom targets. Now running make super just does the thing.
EDIT: it looks like in recent versions of Qt the dependencies are built in parallel, so this solution no longer works.
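For completeness, a sketch of that quoted setup (target names and commands are illustrative; as noted above, it relies on a non-parallel build):
target_pre.commands  = @echo running before the build    # placeholder command
target_post.commands = ./post_build.sh                   # placeholder for the real script
super.depends = target_pre first target_post
QMAKE_EXTRA_TARGETS += target_pre target_post super
You would then invoke make super instead of plain make.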
I've scratched my head about this for a few man-days over the past several months, and I haven't yet found a "pure" solution. However, FWIW, if you don't mind the hack of forcing relink every time, here's how to do that:
(This implementation of "post-build-events" is similar to this implementation of "pre-build-events".)
Caveats:
Forces relink every time
Works only for projects that have a linking step, so not TEMPLATE=aux or TEMPLATE=subdirs.
FORCELINK_CPP_FILE = force_link.cpp
#This batch of statements causes the dummy file to be touched each build.
forcelink.target = $$FORCELINK_CPP_FILE
#FORCE is a nonexistent target, which will cause Make to always re-execute the recipe.
forcelink.depends = FORCE
forcelink.commands = touch $$FORCELINK_CPP_FILE
QMAKE_EXTRA_TARGETS += forcelink
#This statement ensures that touching the above file at Make time will force relinking.
SOURCES += $$FORCELINK_CPP_FILE
#QMake will complain unless the file actually exists at QMake time,
# too, so we make sure it does.
#I used to touch this on QMake build_pass runs, too,
# but it caused transient access-denied errors.
# I guess the release and debug makefiles are generated in parallel.
!build_pass : write_file($$FORCELINK_CPP_FILE)
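With relinking now forced on every build, the actual script can be hooked up via QMAKE_POST_LINK; a minimal sketch (the script name and argument are placeholders):
# Runs after every link, and the force_link.cpp trick above makes every build relink.
QMAKE_POST_LINK += $$PWD/post_build.sh $$OUT_PWD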
Related
I have a makefile that is erroneously generating a required makefile include file. The included file does not initially exist, but there is a rule to make it. The rule runs successfully, but because of a bug, the required include file is not created where expected (and thus make is unable to include it). However, instead of make failing (due to the fact that the file still can't be included), make completes successfully.
The following is my makefile1.mak file.
include myfile.mak
default:
	@echo hi

myfile.mak:
	@echo hello
When I execute 'make -f makefile1.mak', I get:
makefile1.mak:1: myfile.mak: No such file or directory
hello
hi
Of course, I finally figured out that my code to generate myfile.mak was not generating it correctly, but the actual makefiles I'm using are hundreds of lines long, so we didn't notice for quite a while that the include wasn't happening (it introduced a very subtle build issue).
So, my question is - is there any way to get make to fail on the above example?
Add a line to the rule:
myfile.mak:
	do various things to build myfile.mak
	test -f $@
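Applied to the example makefile above, it would look like this (the echo stands in for whatever actually generates the file):
myfile.mak:
	@echo hello    # stands in for the real generation step
	test -f $@     # make now stops with an error if the file was not actually produced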
I'm slowly losing my mind here. First, let me describe what it is I'm trying to do. We have a compiler that spews out weirdly formatted dependency files. To get these makefiles into a format GNU Make can understand, they need to be processed by a Perl script first. Technically, the Perl script doesn't convert the input dependency files it gets passed; instead it creates a new, properly formatted dependency file for each input dependency file.
Now, in order for GNU Make to know which translation units need recompiling and which don't, it obviously must have seen those dependency files before trying to make the translation unit targets, so we have the following line in our master makefile:
include $(PROCESSED_EXISTING_DEPENDENCY_FILES)
where $(PROCESSED_EXISTING_DEPENDENCY_FILES) is a list of all converted dependency files. My idea was to (ab-)use an automatically generated makefile whose recipe not only builds that makefile but also triggers the creation of all dependency files mentioned in the $(PROCESSED_EXISTING_DEPENDENCY_FILES) list and include that makefile just before including the converted dependency files. To ensure that the conversion takes place, the parent process of our Make process will delete the automatically created makefile first (we have a Perl wrapper process controlling GNU Make). The relevant part in the master makefile would look like this:
# Phony target that creates processed dependency files.
CONVERTED_EXISTING_DEPENDENCY_FILES :
	<recipe here>

$(PRE_CONVERTED_DEPENDENCY_FILE_INCLUSION_HOOK) : CONVERTED_EXISTING_DEPENDENCY_FILES
	$(info $(TARGET_BUILD_MESSAGE_PREFIX) Building $(notdir $@) ...)
	$(file >$@,# Automatically generated makefile that gets included before including the existing, converted dependency files.)
	$(file >>$@,$(DOLLAR)(info Including pre-converted-dependency-files-inclusion hook file ...))
	$(file >>$@,)

include $(PRE_CONVERTED_DEPENDENCY_FILE_INCLUSION_HOOK)
include $(PROCESSED_EXISTING_DEPENDENCY_FILES)
We're already using the same basic principle in several other cases, and so far this has worked perfectly fine, but for some reason when I try this, GNU Make gets lost in an infinite loop where it continuously re-evaluates the master makefile, includes all other makefiles, and then goes back to re-evaluating the master makefile again.
The $(PRE_CONVERTED_DEPENDENCY_FILE_INCLUSION_HOOK) does get created, and if there are any dependency files to be converted, they are processed, too, but I'm still at a loss as to what causes this infinite loop in Make. We are using GNU Make 4.2.1 for Windows on a Windows 10 (64 bit) system.
I recommend you rework your model completely to avoid any recipes that know how to build included files, and instead follow the model for auto-dependency generation described in this post (based on how automake handles dependency generation).
Then add the postprocessing step directly into the same recipe that generates the dependency files, rather than having a separate rule that does it. I don't think it's necessary to have two separate rules, because you really don't want the intermediate step here: you just want to generate the make prerequisite definitions... similar to how we normally wouldn't have separate rules for preprocessing, compiling, and assembling object files: one rule does all of that even though there are multiple steps involved.
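A minimal sketch of such a combined rule, assuming the compiler drops its odd-format dependency file next to the object as $*.dep.raw, that the Perl converter is called convert_deps.pl, and that SOURCES lists the .c files (all three names are placeholders):
%.o: %.c
	$(CC) $(CFLAGS) -c -o $@ $<               # also emits the odd-format $*.dep.raw (assumption)
	perl convert_deps.pl $*.dep.raw > $*.d    # convert it in the same recipe

-include $(SOURCES:.c=.d)
Missing .d files on a clean build are harmless: the corresponding objects don't exist yet either, so they get built anyway.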
I am trying to port existing code into a larger project. The larger project has a main Makefile with Makefiles in each sub-directory. I am sure the path below tells you all about how it is set up. I want to port my code to
/WORKING_DIRECTORY/Drivers/Char/example
And here is the content:
sansari#ubuntu:~/WORKING_DIRECTORY/drivers/char/examples$ ls
hello1.c Makefile
My first question is: 1- Should I modify this local Makefile or the main one? I am planning to modify this one, but I am not sure.
2- My other question is: if I modify this local file, can I just run make from here to validate my configuration, instead of running make for the entire project? I know that make only updates the files that have changed; however, I feel better when I clean the build environment before each make. I have run into situations where that alone fixed my issue.
Just as background, I did try to include the makefile of the project I am trying to import here with the -f option. What I did was: make -f Makefile -f ../mytarget/core/Makefile
But I ran into some issues with make not doing some of the normal things it does in the primary project. For instance, there was an include statement with a relative path to a header file, and make gave me an error about not finding it. So I am abandoning that strategy for the time being.
@Ahmad Masoud - Hey man, thanks. Here is the Makefile. The link is exactly what I needed; I think it will address my other questions also. You see, I cross-compiled this code, and when I flashed my phone, I get the following for uname -r:
1|root@hltespr:/lib/modules # uname -r
uname -r
3.4.0-g7e6fbf7-dirty
And I have been wondering what "dirty" means and where it comes from. If you know, please tell me. The link you sent states that perhaps make inserts the Linux kernel version there? I ask because modprobe does not work when I try to load my module. insmod works instead, and I can verify that my module is loaded. My main issue now is that I don't know how to execute the file to make sure it runs. I only know how to run it using modprobe, and I cannot use it; it gives me the following error:
1|root@hltespr:/ # modprobe /lib/modules/hello1.ko
modprobe /lib/modules/hello1.ko
modprobe: can't change directory to '3.4.0-g7e6fbf7-dirty': No such file or directory
Update as of 06/20/15 - I put include /home/sansari/mytree2/tbt/makefile in my module's makefile. I get the following error: makefile:3: *** missing separator. Stop.
@Ahmad - This is an update as of 06/24/15. Thanks for the info. My goal is to get make to look into this external directory, collect all the source files, and build them for me. What would you suggest? I am stuck because, as it stands, I know make looks into my examples directory, but none of the changes I make to the local makefile in the examples directory show up in make. For instance, I tried adding $(warning ...) and @echo messages, but even they do not show up.
Update on 07/02/15 - Thanks for the previous comments and support. I feel I really should reopen this thread since I did not explain the goal in detail; now I can describe it better, and hopefully the resolution will help others. I issue the command:
TARGET=msm8974 PLATFORM=msm8974 make drivers/char/examples
But I get a message stating Nothing to be done, even though I have added a number of tasks. Below is my makefile, and I'll elaborate on what I have added right after:
lib_tbt := ../../../m/shahin/tbt
lib_daemon := ../../../m/shahin/daemon
lib_lib := ../../../m/shahin/lib
lib_tasks := ../../../m/shahin/tasks
lib_tbt_driver := ../../../m/shahin/tbt_driver
lib_tbt_make := ../../../m/shahin/tbt/make
lib_tbt_msm_common := ../../../m/shahin/tbt/platform/msm8974/common
lib_tbt_msm8974 := ../../../m/shahin/tbt/platform/msm8974
lib_asm_generic = ../../../m/shahin/tbt/platform/msm8974/include/asm-generic
$(warning This is what is in lib_asm_generic $(lib_asm_generic))
#include $(lib_tbt_make)/macros.mk
.PHONY: all $(lib_tbt)
$(lib_tbt) $(lib_daemon) $(lib_lib) $(lib_tasks) $(lib_tbt_driver) $(lib_tbt_make) $(lib_tbt_msm_common) $(lib_tbt_msm8974) $(lib_asmgeneric) :
	$(MAKE) --directory=$@

$(lib_*): $(MAKE) --directory=$@
obj-$(CONFIG_EXAMPLES) += hello1.o
Initially I only had the obj-$(CONFIG_EXAMPLES) += hello1.o statement in my makefile. I then added the directory variables at the top of the makefile and the $(lib_*): $(MAKE) --directory=$@ line, directing make to compile what is in those directories. I believe that is what it does; please let me know if I am mistaken. And although this same makefile proceeds to create object files when I put it in a different directory within my project, it won't do so when it is in a device driver directory, and I do not understand why. The other directory is the /external directory and it is at the top of the tree, but that should not matter, right? What I did first was make sure I can compile a hello program in my device driver directory called /examples. I now want to add more source code to this section; I believe the correct term is module? I also want to know whether I should copy the source files to the /examples directory or whether referencing them via the path is okay. That is, should I move the source code directory under the /examples directory or not?
It is a LOT simpler than that if you are using a kernel that uses Kbuild.
Highly recommend reading
http://www.tldp.org/LDP/lkmpg/2.6/html/x181.html
Situation A - Your source is a sub-tree of the kernel source
You would NOT modify the top-level Makefile, just ensure that ~/WORKING_DIRECTORY/drivers/char/examples/Makefile and ~/WORKING_DIRECTORY/drivers/char/examples/Kbuild are set up correctly/normally. THEN at the top-level of the kernel build directory (assuming you have a separate build directory) you would type:
foo@bar:~/build-dir$ make drivers/char/examples
The kernel top-level makefile then builds just that sub-tree. You can try it out on any part of the kernel, for example:
foo@bar:~/build-dir$ make fs
NOTE: build-dir can be the same as the kernel source directory
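For reference, the in-tree files can be as small as this; CONFIG_EXAMPLES is whatever Kconfig symbol you wire the driver to (an assumption here):
# drivers/char/examples/Makefile (Kbuild syntax)
obj-$(CONFIG_EXAMPLES) += hello1.o

# drivers/char/Makefile also needs a line like this so the build descends into examples/:
obj-$(CONFIG_EXAMPLES) += examples/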
Situation B - You are building an external module
Then use the normal module Kbuild / Makefile process.
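A minimal sketch of such an external-module Makefile; it builds against the running kernel's build tree, and for cross-compiling you would point -C at your kernel build directory and pass ARCH/CROSS_COMPILE instead:
obj-m += hello1.o

all:
	$(MAKE) -C /lib/modules/$(shell uname -r)/build M=$(PWD) modules

clean:
	$(MAKE) -C /lib/modules/$(shell uname -r)/build M=$(PWD) clean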
P.S.
If you post your makefile / Kbuild then I may be able to help with the actual build processing.
I have a somewhat complicated Makefile which runs perl scripts and other tools and generates some 1000 files. I would like to edit/modify some of those generated files after all files are generated. So I thought I can simply add a new rule to do so like this:
(phony new rule): $LIST_OF_FILES_TO_EDIT
	file_modifier ...
however, the point here is that some of those generated files which I'd like to edit ($LIST_OF_FILES_TO_EDIT) are used in the same make process to generate a long list of other files. So I have to wait, to make sure those files are no longer needed in the make process, before I can go ahead and edit them. But I don't know how to do that. Not to mention that it is really hard to find out which files are generated with the help of $LIST_OF_FILES_TO_EDIT.
If it were possible to state in the Makefile that this rule should only run as the last rule, my problem would be solved, but as far as I know this is not possible. So, does anyone have an idea?
Some points:
List of files to edit ($LIST_OF_FILES_TO_EDIT) is determined dynamically (not known before make process)
I am not sure I have picked a good title for this question. :)
1) If you're going to modify the files like that, it might behoove you to give the targets different names, like foo_unmodified and foo_modified, so that Make's dependency handling will take care of this (see the sketch after point 3).
2) If your phony new rule is the one you invoke on the command line ("make phonyNewRule"), then Make will build whatever else it's going to build before executing the file_modifier command. If you want to build targets not on that list, you could do it this way:
(phony new rule): $(LIST_OF_FILES_TO_EDIT) $(OTHER_TARGETS)
	file_modifier ...
3) If your dependencies are set up correctly, you can find out which targets depend on $(LIST_OF_FILES_TO_EDIT), but it's not very tidy. You could just touch one of the files, run make, see which targets it built, and repeat for all files. You could save a little time by using Make arguments: "make -n -W foo1 -W foo2 -W foo3 ... -W foo99 all". This will print the commands Make would run; I don't know of any way to get it to tell you which targets it would rebuild.
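A sketch of the renaming idea from point 1, with hypothetical file names; it assumes the generator writes foo.unmodified and that file_modifier edits a file in place:
# The generator produces the raw file; a separate rule derives the edited copy.
foo.modified: foo.unmodified
	cp foo.unmodified foo.modified
	file_modifier foo.modified

# Rules that need the raw contents depend on foo.unmodified;
# rules that need the edited contents depend on foo.modified.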
Part of my Makefile:
CPUDEPS=./mydeps.cpu
(...)
deps: $(CPUDEPS)

$(CPUDEPS): $(CCFILES)
	@echo [DEPS] CPU
	$(CMDECHO)makedepend -Y -s'# CPU sources dependencies generated with "make deps"' \
		-w4096 -f- -- $(CFLAGS) -- $^ 2> /dev/null > $(CPUDEPS)
(...)
sinclude $(CPUDEPS)
Problem 1: includes are done during the first phase of processing and targets during the second phase; so, if ./mydeps.cpu doesn't exist and I run "make deps", I first get the error
Makefile:335: ./mydeps.cpu: No such file or directory
I hide the error by using sinclude instead of include, but the problem is still there: the old file is included, not the just-generated one. I have to run it twice to include the updated file. This is because make does two-phase processing; is there any way to tell make to complete the target deps before parsing the includes?
Problem 2: even if the file ./mydeps.cpu doesn't exist and make deps actually creates it, I always get "make: Nothing to be done for 'deps'". This doesn't happen with other targets. I don't understand why, or how to avoid it.
Problem 1 is non-existent: before building a target, make automatically rebuilds makefiles (using implicit rules if no explicit rule is provided). So having a rule for the makefile ensures that it will always be up to date, and there is no need to run deps twice. Additionally, since CPUDEPS is a makefile, it will be updated automatically before any other rule is run, so dependencies will always be updated when necessary and make deps is not needed. You can probably notice this yourself by observing the [DEPS] line being echoed whenever any of the CCFILES becomes more recent than the dependency file.
For Problem 2, adding anything to the recipe ensures that make doesn't complain about having nothing to do. If there is nothing else, you can use something like @echo OK to give feedback to the user, or a simple @true if you prefer totally silent makes.
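Concretely, following that suggestion the deps rule becomes:
deps: $(CPUDEPS)
	@echo OK    # or @true; any recipe silences "Nothing to be done for 'deps'"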
What you are trying to achieve is useless: you can use the dependencies file that was created during the previous build. That's enough.
The main reasoning behind that rule is:
if you haven't changed any of your files, then the dependencies file is up-to-date, and there's nothing to build.
if you have changed anything, even deep in your #include chain, in an existing file that was used by the previous build, then the dependencies file has already caught it. You'll rebuild what is needed.
if you change something in a new file (i.e. you add that file!), then it was not used by the previous build and is not listed in the dependencies. But if you really want to use it, you have to modify at least one of the other files that was used before, and you're back to the previous case.
The solution is to create the dependencies file during the normal process of the compilation, and to optionally include it (with sinclude) if it is present.
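A minimal sketch of that scheme, assuming a GCC-compatible compiler (-MMD -MP writes a .d file as a side effect of each compilation) and that CCFILES lists the .cpp sources:
%.o: %.cpp
	$(CXX) $(CFLAGS) -MMD -MP -c -o $@ $<

# Include whichever .d files already exist; on a clean build there are none, which is fine.
sinclude $(CCFILES:.cpp=.d)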