How does Makefile know that a file changed and then recompile it? - makefile

Just out of curiosity, how does Makefile know that a file changed (and then recompile it)? Is it up to make? Is it up to the compiler? If so, is it language dependent?

It looks at the file time-stamp - simple as that. If a dependency is newer than the target, the target is rebuilt.

Make works by inspecting information about files, not their contents.
Make works out the dependencies between targets and their prerequisites, and then looks to see whether those files exist. If they do, it asks the operating system for the time and date the file was last modified. This is the 'timestamp' for this purpose, although the term can have other meanings.
If a target file either does not exist, or exists but is older than one of its prerequisites, then Make rebuilds the target from the prerequisites by applying a rule.
If a prerequisite does not exist and Make has no rule to build it, Make signals an error.
A consequence of this is that you can force a rebuild by deleting the target, or by 'touching' a prerequisite to make it newer than the target. You can avoid a rebuild by 'touching' the target. Touching simply updates the timestamp to now.
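A minimal example makes the timestamp rule concrete (file names are illustrative; note that the recipe line must be indented with a tab):

    # hello is rebuilt only when hello.c is newer than hello,
    # or when hello does not exist yet.
    hello: hello.c
    	cc -o hello hello.c

Running make twice in a row shows the behaviour, and touch shows how to force or skip a rebuild:

    $ make            # compiles hello
    $ make            # "hello is up to date" - nothing happens
    $ touch hello.c   # prerequisite is now newer than the target
    $ make            # recompiles hello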

Related

Trick gmake build / Copying files to preserve time stamps and avoid complete rebuild

Short background info:
For some rather convoluted reasons I am trying to trick my project's build system. We are working with Code Composer Studio on Windows from Texas Instruments and the gmake implementation that was included. We use the standard Debug build option pretty much unmodified, as far as I am aware (my understanding of make and our implementation of it is unfortunately limited). When developing in Code Composer the build works as one would expect: only the files that have been changed, and their dependencies, are recompiled. Sometimes, however, I need to regenerate all code from a code generator that we use, which triggers a complete rebuild.
To trick my way around this complete rebuild I wrote a python script that moves all the original files out of the project, waits for the code generator, then compares the files and replaces the new identical files with the old ones. Thus the creation timestamp and the last-modified timestamp should be preserved, and this also seems to work when examining these properties. However, a complete rebuild of the project is always triggered.
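For what it's worth, the core of that compare-and-restore idea can be sketched in shell; cmp -s tests byte-for-byte equality, and mv preserves the old file's modification time (the directory names here are just placeholders):

    # After the code generator runs: for every regenerated file that is
    # byte-identical to the saved original, put the original (and its
    # old timestamp) back.
    for f in generated/*.c; do
        old="backup/$(basename "$f")"
        if cmp -s "$f" "$old"; then
            mv "$old" "$f"    # mv keeps the original modification time
        fi
    done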
Core of the Problem:
Having already built the project in Code Composer, building again concludes that nothing has changed and that there is nothing to do, as one would expect.
I move a file out of the folder and back in again, preserving both the creation time stamp and the last-modified time stamp as far as I can see (in Windows Explorer -> Properties).
When I now rebuild the project, it rebuilds said file and its dependencies, despite the file being identical to before and having the same timestamps.
What is gmake looking at? Does it detect that the folder has been changed? Is it looking at some other hidden timestamp? Is it a Windows problem? Shouldn't it be possible to do this, i.e. insert a file with an old timestamp?
What am I missing? Anyone have a clue?
Having examined this, I concluded that the problem does not lie with gmake or the makefile but with Code Composer Studio's/Eclipse's makefile generation.
We use the built-in makefile generation in Code Composer Studio (which is an Eclipse derivative). It seems to keep track of whether files have been removed from or moved out of the project's workspace. If they have, it removes the dependencies of those files during makefile generation, triggering a rebuild.
Examples:
Case 1, works as I expect:
Rebuilding an unchanged project in Code Composer Studio. Nothing is recompiled since nothing is changed.
Case 2, works as I expect:
Changing something in a single file and then rebuilding results in recompilation of only that single file, as one expects.
Case 3, does not work as I expect:
Having an unchanged project, then moving one source code file (xyz.cpp) out of the project, not changing it in any way, and then moving it back results in a recompilation of xyz.obj.
Code Composer/Eclipse seems to keep track of the file structure. Despite the file being put back into the project exactly the way it was, Code Composer/Eclipse deletes xyz.obj during makefile generation, triggering a rebuild.
On the other hand: closing Code Composer, moving the file back and forth, reopening Code Composer and regenerating the makefile/rebuilding works as one expects. Nothing is rebuilt and no .obj files are removed.
Since this didn't have anything to do with gmake and makefiles, I guess this question was invalid/irrelevant to begin with. I will collect my thoughts and formulate a new question strictly as a Code Composer/Eclipse issue.
If anyone has found themselves in the same situation or has relevant information at hand, please do share!

I stopped the make command midway; will the previous build be affected?

I built a very big project, which had a number of sub-projects, using the make command. It took three hours. Then by mistake (without cleaning the previous build) I re-executed the make command for a few minutes and then stopped it.
Have I ruined my previous build? How does make actually work behind the scenes? Is building the object files done in an atomic, safe manner?
Note: I cannot really run any of my binary files to see if they are broken since that's another lengthy process. I just want to know if I am fine or I have to re-run the make and let it finish.
If you want to publish this binary as a production version of your commercial product, then I would not rely on it; always be 100% sure that you are using a successfully built version of a fully saved and committed code base.
On the other hand, if you need this for debugging purposes, then you could use it! Why? Because make overwrites the output binary only once it finishes compiling all the object files, and only if it detects changes that require a relink of the binary:
After recompiling whichever object files need it, make decides whether to relink edit. This must be done if the file edit does not exist, or if any of the object files are newer than it. If an object file was just recompiled, it is now newer than edit, so edit is relinked.
From GNU make: How Make Works
So if you haven't changed your code base, the linker will not relink the binary, leaving it as it was created by the successful build.
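Two details support this. The edit in the quoted passage is the example target from that chapter of the manual; trimmed down, the rule in question looks like:

    edit : main.o kbd.o command.o display.o
    	cc -o edit main.o kbd.o command.o display.o

Also note that GNU make guards the very step you interrupted: if it receives a fatal signal (e.g. Ctrl-C) while a recipe is running and the target file has already been modified, it deletes that target, so a half-written object file is not mistaken for an up-to-date one on the next run.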

Building different targets in different folders - Xcode

I have two targets in the same Xcode project, as bundle plugins, and I want the executable files within the bundles to have the same name for both targets.
Is there a way I can either:
A) Define the executable file for each binary without affecting the .bundle name (otherwise the first target's output overwrites the second's as they build).
B) Build both files in their own folder.
They are in the same project, so the build end-results are automatically placed in the same folder, and one overwrites the other because the .bundle name always ends up the same (since I want the same executable name). They share a lot of code, which is why they live in the same project: everything can be rebuilt at once, ensuring every version always has the latest code.
Would anybody know a way of doing this? I tried various options in the build settings. Or would anybody maybe have any "build phase" workaround ideas? Please don't ignore that the executable name needs to be the same for all binaries.
Thanks in advance!
I created a project for each slightly different build, with flags in the build settings making each target distinct (with the use of macros in the actual code).
As for ensuring each build/project always uses the up-to-date, partially shared code: they share the source code by adding it to the project without selecting the "copy to project folder" option. An annoying workaround, but it'll do until I work something better out...
Why don't you use targets with different names? Or a post-build script that copies the target to a different name?
This should copy each target to a unique name after building, without each build overwriting the other (assuming the builds run sequentially).
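A minimal sketch of such a post-build step, as a Run Script build phase (the per-target output folder is just an illustration; TARGET_BUILD_DIR, WRAPPER_NAME and TARGET_NAME are standard Xcode build settings):

    # Copy the freshly built bundle into a folder named after the target,
    # so sibling targets with identical bundle names can't overwrite it.
    mkdir -p "${TARGET_BUILD_DIR}/${TARGET_NAME}-output"
    cp -R "${TARGET_BUILD_DIR}/${WRAPPER_NAME}" "${TARGET_BUILD_DIR}/${TARGET_NAME}-output/"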

make target, unmake target?

Since make knows all dependencies to make a target, it knows (almost) all intermediate artefacts. Is there a make version that can 'rollback', or 'unmake' a target?
(I know make removes intermediate files created by implicit rules.)
Most Makefiles provide the clean target, which removes all generated files. Additionally, autoconf-automake systems offer distclean, which also removes the files generated by configure.
These are not automatic targets, however; the makefile (or the makefile generation system) has to supply the list of generated files. In general, make doesn't keep track of which files it generated, so it can't distinguish between genuine source files (which are dependencies) and intermediate files (which are also dependencies). Even the leaf nodes of the dependency graph can't be used to tell them apart, because stamp files often have no dependencies either.
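In other words, clean works only because the makefile's author spells out what to delete. A typical hand-written example (file names are illustrative):

    OBJS = main.o util.o

    edit: $(OBJS)
    	cc -o edit $(OBJS)

    .PHONY: clean
    clean:
    	rm -f edit $(OBJS)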

Xcode dependencies across different build directories?

I am trying to set up Xcode for a project which contains multiple executables and static libraries. I have created multiple targets and set up the linking and dependencies, and initially everything works great. The catch...
This is an existing project which already has Visual Studio and Makefile builds. Those builds put the libraries in a lib/Debug directory and the executables in bin/Debug. So in Xcode I changed the Build Products Path to "lib" and "bin" respectively (so we can use one set of documentation for all of the platforms). This puts the compiled targets in the right place, but completely breaks both the linking (Library not found) and the dependencies.
I can fix the linking by adding $(SRCROOT)/lib/Debug to the Library Search Paths for each executable (but it feels like Xcode should be able to figure this out on its own, which makes me think I'm doing something wrong).
But I can't figure out how to get the dependencies working again. If I change a library source file, the library will rebuild but not the dependent executables. If I force a build of the executable, Xcode returns success without doing anything; it thinks the target is up to date. If I clean the target and then rebuild, it works.
So, any ideas here? Is Xcode being fundamentally stupid in this regard, or is it me (I'm leaning toward the latter)?
Update: I've posted a sample project to demonstrate the issue at http://share.industriousone.com/XcodeDepsIssue.zip. Build it once, then modify MyStaticLib.c and build it again. The executable will not relink (and it should). Many thanks for any help on this one.
starkos, thanks for publishing your conclusion. It validated my experience as well. This situation really screwed me, so it was nice to know I wasn't just missing something.
I did however discover a workaround that avoids creating multiple projects or keeping the library and its dependent in the same directory. It is a hack, but it does work here.
I know it's a bit late, but better late than never.
For the dependency library, add a "Copy Files Build Phase", with Absolute Path as the destination, and the path text field should be the directory where the DEPENDENT target lives. Then click on Products, find the dependency library (will end with .a), and drag it into the "Copy Files Build Phase." If you now build, this will put the library into its own directory like before and THEN also copy it into the dependent's target directory.
For the dependent, you can now remove the dependency's output directory from the Library Search Paths. This will cause it to find the library copy. If you do this, the dependent will indeed be relinked each time the dependency .a is relinked.
The negatives are, of course, the extra time for the copy, and the necessity to specify (in the Copy phase) the target directory for each dependent of your library. Beats the hell out of the alternatives though....
Xcode doesn't automatically set up dependencies based on use of build products; you have to set up explicit target dependencies yourself.
Project > Edit Target Settings, General tab, + button, add any targets that are prerequisites to building the selected target. That should get you going again.
I've researched this some more and the answer is no, Xcode 3.x doesn't track dependencies between targets that live in different directories. You can work around it by giving each library its own project, and adding each of those to a master project. Or you can keep all of your targets in one directory. Pick your poison.
Here is my solution for this weird behavior in Xcode 4.3.1. You have to add a build pre-action in the scheme:
rm -f ${BUILT_PRODUCTS_DIR}/${EXECUTABLE_PATH}
and choose which build settings to use for this script. Each time before a build, the target executable will be removed and rebuilt completely. It worked for me, and I hope it helps you.
NOTE: I tried putting this script in a project build phase, but the result was negative: the debugger could not attach to the process to start debugging.
Good luck!
OK, it would help to have the text of the Linking... build line that's failing. But a couple of things:
1) You shouldn't be linking to anything in $(SRCROOT). That's your project sources. The two places to find things to link are $(SYMROOT) (the Build Products directory) or $(DSTROOT) (the Installed Products directory).
One thing you could do is to have a common Build Directory, then use 'xcodebuild install' action to install the products in the Installation Directory. The other is to use a Copy Files build phase to copy them after building, so you can link against them in $(SYMROOT) but still have them where your Windows compatriots expect them.
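As a rough sketch of the first suggestion, the install action can be driven from the command line (the target name here is taken from the sample project above, and the DSTROOT path is an assumption for illustration):

    # Build and install the library's products where the Makefile and
    # Visual Studio builds expect to find them.
    xcodebuild install -target MyStaticLib -configuration Debug DSTROOT="$PWD"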
There is probably a way to set up the per-target build products directories correctly, but I'd really have to see the project itself to figure it out.
