I am using CMake with the Unix Makefiles generator. When I use add_custom_target, the corresponding make target is generated only in CMAKE_CURRENT_BINARY_DIR and in CMAKE_BINARY_DIR.
I found this out by testing with different sub-directory layouts.
Is this documented anywhere? And is there a way to create a custom target that is available in every build directory, similar to the built-in make clean?
The rationale behind the question: I have a bunch of unit tests spread over several unittest folders. I don't build them as part of the all target, because compiling the tests takes much longer than building the actual library; instead I build them with a custom unittest target. I would like to be able to invoke this target from every unittest subfolder, and preferably it would build only the unit tests located in the current directory and, recursively, in its sub-directories.
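A minimal sketch of what I have in mind (file and target names are placeholders; the test executable is kept out of "all" with EXCLUDE_FROM_ALL):

# unittests/CMakeLists.txt  (sketch)
add_executable(foo_test EXCLUDE_FROM_ALL foo_test.cpp)

add_custom_target(unittest)          # aggregate target for the tests
add_dependencies(unittest foo_test)  # building "unittest" builds the test binaries

With the Unix Makefiles generator, "make unittest" then works in CMAKE_BINARY_DIR and in the binary dir of the directory that defines it, but not in the other sub-directories.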
Related
I would like to ensure that a number of PDE target platform definitions are sound, and I would like to run these tests in a CI loop. I am aware that the director can be used to perform installations, but I would like to use the target files directly. What is the best way to do this?
I have a ZF2 application that I've set up to be built using a Makefile with various options. The issue at hand is that the /vendors/ directory can contain an assortment of dependencies that are installed and updated via Composer. Each dependency may or may not contain unit tests, and the location of the tests is arbitrary.
When make test is run, I would like the Makefile to search through the /vendors/ directory for any folders named /tests/ and perform unit testing accordingly.
What would be the best way to iterate through the vendors and locate any /tests/ directories so that unit testing can be performed?
Use the wildcard function:
LIST_OF_TESTS:=$(wildcard vendors/*/tests)
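This only matches tests directories one level below each vendor. If the test folders can be nested at arbitrary depth, a sketch using $(shell find ...) instead, with a test target that visits each directory (GNU make is assumed, and the phpunit invocation is a placeholder for however each suite is actually run):

# find every tests/ directory under vendors/, at any depth
LIST_OF_TESTS := $(shell find vendors -type d -name tests)

.PHONY: test
test:
	@for dir in $(LIST_OF_TESTS); do \
		echo "Running tests in $$dir"; \
		(cd $$dir && phpunit .) || exit 1; \
	done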
After I make a build I want to copy some files into my Xcode project.
I discovered that I could do this in "Build Phases" with a custom build step.
I can also execute scripts before and after the different "tasks" in the Scheme editor:
Build (This is where I could add my script)
Run
Test
Profile
Analyze
Archive
I don't completely understand the differences and possible implications of the two approaches, and I am wondering when to choose one over the other. Thanks for any clarification.
After I make a build I want to copy some files into my Xcode project.
I assume you want to copy files to your build product, not the Xcode project.
There are several subtle differences between scheme and build phase scripts. Here are some of them:
Scheme scripts are part of the scheme, so sharing them with other developers is more configurable. Build phase scripts, on the other hand, are part of the target and cannot be skipped simply by choosing another scheme.
Scheme scripts can run before dependency checking, so you can use them to modify source files and still get up-to-date results. This is not possible with build phase scripts.
The information passed to the script in environment variables differs slightly. Depending on what information you need, you sometimes have to choose the right kind of script.
Build phase scripts only run if the build has succeeded up to their place in the target's build phase sequence.
Build phase scripts can be configured to only run when input files changed.
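For the copying itself, a minimal sketch of a Run Script build phase (the paths are placeholders; SRCROOT and BUILT_PRODUCTS_DIR are standard build settings that Xcode exposes to the script's environment):

# Run Script build phase (/bin/sh) -- copy extra files next to the built product
cp -R "${SRCROOT}/Extras" "${BUILT_PRODUCTS_DIR}/"

The same script would also work as a scheme post-action, provided the action is set to receive its build settings from the target.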
There isn't much difference between the two; however, you have more control over where in the build sequence build phase scripts run, which makes them preferable (for example, you could modify files that have already been copied by the standard Xcode build phases).
I always use build phase scripts myself and have never used scheme scripts; they are more visible and more manageable.
I am currently configuring CMake/CTest for CI. Everything works fine except for the following:
We have several projects that depend on each other. In our top-level build script, they are simply built in the right order. During CI, I just run "make Continuous" in the build directory of each project. However, when, say, a header file is updated in one project, only that project gets built by "make Continuous". A dependent project that uses the same include files is not rebuilt by its own "make Continuous", because no updates occurred in that project.
So my question: is there any way to force the build step to run during "make Continuous", independent of the result of the svn update?
Any other ideas how to solve this?
add_dependencies will work for your case.
add_dependencies(target-name depend-target1 depend-target2 ...)
See also
http://cmake.org/cmake/help/v2.8.10/cmake.html#command:add_dependencies
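A minimal sketch, assuming both projects are built from the same top-level CMake tree (the target names core and dependent are placeholders):

add_library(core core.cpp)            # placeholder: the project providing the headers
add_library(dependent dependent.cpp)  # placeholder: the project using those headers
add_dependencies(dependent core)      # core is always brought up to date before dependent builds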
Currently, we build a group of static libraries prior to building our app. The issue is that for each library there is some variation of the ./configure, make, test sequence. I would like to be able to cache the results of the configure step to speed up the build, since it is common to build on the same platform multiple times. We are thinking about wrapping each step of the build process in an SCons process, but we're not sure that this would work. Any ideas?
You could use SCons to wrap your configure and make scripts. As long as you enumerate all of your dependencies and generated targets, SCons will be able to determine whether each build step needs to run. This sounds pretty complicated, though. Why not convert your configure and build flow to SCons, or write a simple makefile for this configure/make dependency? There are also ways to add MD5 hashing to make (if you're drawn to SCons because of its hashing of dependencies).
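A minimal sketch of an SConstruct that wraps one library's configure/make steps (the libfoo/ paths, the configure and make command lines, and the source glob are all assumptions; every real input has to be enumerated for SCons's content-hash up-to-date checks to be meaningful):

# SConstruct (sketch) -- wrap an autotools-style library build in SCons
import os

env = Environment(ENV=os.environ)

# Re-run ./configure only when the configure script itself changes.
config_status = env.Command('libfoo/config.status', 'libfoo/configure',
                            'cd libfoo && ./configure')

# Re-run make only when config.status or the enumerated sources change.
lib = env.Command('libfoo/libfoo.a',
                  ['libfoo/config.status'] + Glob('libfoo/src/*.c'),
                  'cd libfoo && make')

Default(lib)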