Run different code in Makefile depending on the age of a file

I am attempting to create an auto-update mechanism for shared, inherited base-makefile code, and I only want it to check for updates once per day. Having fixed my misunderstanding of how ifeq works in Makefiles, I think the last thing I need to figure out is why the values I'm comparing don't behave as expected.
I'm trying to use the last-modified timestamp on a file, .makefile-update-ts, to record the last time an update was run, and I'm checking its age with find . -mtime +24h -name '.makefile-update-ts'. I'm not getting any syntax errors, and the values I get back from my various find commands seem correct, so I don't understand why my logic isn't working...
So let's start at the most basic and work our way up:
Starting with a pretty vanilla Makefile:
work:
ifeq (a,a)
	@echo "A"
else
	@echo "not A"
endif
$ make work
A
✅ works as expected
If I then graduate to use a variable to store the "a" value:
thing=a
compare=a
work:
ifeq ($(thing),$(compare))
	@echo "A"
else
	@echo "not A"
endif
$ make work
A
✅ works as expected
So let's switch to using find to get some indications of whether the file exists, and whether it's >= 24h old.
ts-filename=.makefile-update-ts
exists=`find . -name '$(ts-filename)'`
old=`find . -mtime +24h -name '$(ts-filename)'`
match=./$(ts-filename)
work:
	@echo "$(match) (match)"
	@echo "$(exists) (exists)"
	@echo "$(old) (old)"
$ touch .makefile-update-ts
$ make work
./.makefile-update-ts (match)
./.makefile-update-ts (exists)
(old)
✅ Seems like the find commands are getting the values I want and I'm able to echo them out. Since I just created the file, the "old" search finds nothing, which is expected.
Time to graduate to the final script... or so I thought. In the makefile below I've added some extra echoes to illustrate the state of things at the time of the compare; taken alongside the output of the commands that actually get executed, it doesn't make sense.
ts-filename=.makefile-update-ts
exists=`find . -name '$(ts-filename)'`
old=`find . -mtime +24h -name '$(ts-filename)'`
match=./$(ts-filename)
work: .check-for-update
	@echo "working..."
.check-for-update:
# is the file > 24 hours old?
ifeq ($(old),$(match))
	@echo checking for Makefile updates...
	@make .update
else
	@echo $(match) - "match"
	@echo $(old) - "old"
# if the file doesn't exist at all yet, pull the update
ifeq ($(exists),$(match))
	@echo "last update was recent, not updating..."
else
	@echo $(exists) - "exists"
	@echo Getting base Makefile...
	@make .update
endif
endif
.update:
	touch $(ts-filename)
$ make work
./.makefile-update-ts - match
- old
./.makefile-update-ts - exists
Getting base Makefile...
touch .makefile-update-ts
working...
❌ Why, if the file exists, and it's not "old" ($(old) is empty), does it print "Getting base Makefile..."??? I would expect it to instead print, "last update was recent, not updating..."

The problem is back to the original misunderstanding. Makefiles are not shell scripts. Consider this:
exists=`find . -name '$(ts-filename)'`
old=`find . -mtime +24h -name '$(ts-filename)'`
backticks are a shell feature. Make doesn't know anything about them. Here you've declared two variables whose values are the literal strings, backticks and all -- not, as in the shell, the output of running those commands.
So for this:
ifeq ($(old),$(match))
it expands to compare the literal strings:
ifeq (`find . -mtime +24h -name '.makefile-update-ts'`,./.makefile-update-ts)
Obviously those strings are not equal.
If you want to run a shell script and assign its output to a make variable, you have to use make's shell function, like this:
exists := $(shell find . -name '$(ts-filename)')
old := $(shell find . -mtime +24h -name '$(ts-filename)')
A few points here:
I use := instead of = so the $(shell ...) command runs once, when the variable is assigned, rather than every time the variable is expanded.
I strongly recommend that you use whitespace around your assignment operators when dealing with make variables, rather than run them all together with no whitespace as you would have to with shell variables. Maybe this cognitive dissonance will help reinforce the difference. Plus, it's just better.
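To see why the := matters: with recursive = assignment, the $(shell ...) text is stored verbatim and re-run at every expansion; with := it runs exactly once, when the makefile is parsed. A minimal sketch (assuming GNU make and GNU date, which supports nanosecond output):

```makefile
# ':=' runs the shell command once, at parse time.
stamp := $(shell date +%s%N)

# '=' stores the text '$(shell date +%s%N)' and re-runs it
# at every expansion, giving a potentially different value each time.
lazy = $(shell date +%s%N)

show:
	@echo "stamp: $(stamp) / $(stamp)"
	@echo "lazy:  $(lazy) / $(lazy)"
```

Running make show prints the same stamp value twice, while the two lazy values can differ, because each expansion of $(lazy) forks a fresh shell.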
Oh, you may ask, why does your third example work? Simple: because you are passing the literal strings to the shell, and the shell is doing the expansion for you. You can easily see this if you remove the @ prefixes on your shell script lines. Another tip: NEVER add @ to any makefile recipe until you are 100% sure it's working perfectly (and even then you might consider not doing it).
If you remove them you'll see:
work:
	echo "$(old) (old)"
This will print:
echo "`find . -mtime +24h -name '.makefile-update-ts'` (old)"
./.makefile-update-ts (old)
Instead of what you'd expect IF the make variable contained the already-expanded content, which would be:
echo "./.makefile-update-ts (old)"
./.makefile-update-ts (old)
Another tip: when you are using make variables from within a shell script, you should always (unless the value might contain single quotes) use single quotes, so that the shell won't mess around with the value:
work:
	echo '$(old) (old)'
would print:
echo '`find . -mtime +24h -name '.makefile-update-ts'` (old)'
`find . -mtime +24h -name '.makefile-update-ts'` (old)
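Putting the pieces together, here is a sketch of the OP's update check rewritten with $(shell ...). It is untested, and note one assumption: GNU find does not accept an h suffix on -mtime (it counts days), so this uses -mmin +1440 to mean "older than 24 hours"; on BSD/macOS find the original -mtime +24h would also work.

```makefile
ts-filename := .makefile-update-ts

# Evaluated once, at parse time, thanks to ':='.
# GNU find counts -mtime in days; -mmin +1440 means "older than 24h".
exists := $(shell find . -maxdepth 1 -name '$(ts-filename)')
old    := $(shell find . -maxdepth 1 -mmin +1440 -name '$(ts-filename)')
match  := ./$(ts-filename)

work: .check-for-update
	@echo "working..."

.check-for-update:
ifeq ($(old),$(match))
	@echo checking for Makefile updates...
	@$(MAKE) .update
else
ifeq ($(exists),$(match))
	@echo "last update was recent, not updating..."
else
	@echo Getting base Makefile...
	@$(MAKE) .update
endif
endif

.update:
	touch $(ts-filename)
```

Because the conditionals now compare the find output (not the literal backtick strings), the "file exists and is fresh" branch finally behaves as intended.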

Related

Makefile with find command results in error "paths must precede expression"

I have the following Makefile which should find all .tex files starting with prefix "slides" and then compile all these latex files:
TSLIDES = $(shell find . -maxdepth 1 -iname 'slides*.tex' -printf '%f\n')
TPDFS = $(TSLIDES:%.tex=%.pdf)
all: $(TPDFS)
$(TPDFS): %.pdf: %.tex
	latexmk -pdf $<
However, I keep getting the error messages (I am pretty sure it used to work and am very confused why I am getting this error now...)
/usr/bin/find: paths must precede expression: `slides01-intro.tex'
/usr/bin/find: possible unquoted pattern after predicate `-iname'?
In the manual, I found this
NON-BUGS
Operator precedence surprises
The command find . -name afile -o -name bfile -print will never print
afile because this is actually equivalent to find . -name afile -o \(
-name bfile -a -print \). Remember that the precedence of -a is
higher than that of -o and when there is no operator specified
between tests, -a is assumed.
“paths must precede expression” error message
$ find . -name *.c -print
find: paths must precede expression
Usage: find [-H] [-L] [-P] [-Olevel] [-D ... [path...] [expression]
This happens because *.c has been expanded by the shell resulting in
find actually receiving a command line like this:
find . -name frcode.c locate.c word_io.c -print
That command is of course not going to work. Instead of doing things
this way, you should enclose the pattern in quotes or escape the
wildcard:
$ find . -name '*.c' -print
$ find . -name \*.c -print
But this does not help in my case, as I have used quotes to avoid shell expansion. Any idea how I can fix this? (I have also tried TSLIDES = $(shell find . -maxdepth 1 -iname 'slides*.tex') in the first line of my Makefile, but it exits with the same error.)
EDIT: I am on Windows and use the git bash (which is based on mingw-w64).
You should always make very clear up-front in questions using Windows, that you're using Windows. Running POSIX-based tools like make on Windows always requires a bit of extra work. But I'm assuming based on the mingw-w64 label that you are, in fact, on Windows.
I tried your example on my GNU/Linux system and it worked perfectly. My suspicion is that your version of GNU make is invoking Windows cmd.exe instead of a POSIX shell like bash. In Windows cmd.exe, the single-quote character ' is not treated like a quote character.
Try replacing your single quotes with double-quotes " and see if it works:
TSLIDES = $(shell find . -maxdepth 1 -iname "slides*.tex" -printf "%f\n")
I'm also not sure if the \n will be handled properly. But you don't really need it, you can just use -print (or even, in GNU find, leave it out completely as it's the default action).
I'm not a Windows person so the above might not help but it's my best guess. If not please edit your question and provide more details about the environment you're using: where you got your version of make, where you're running it from, etc.
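The underlying quoting issue is easy to reproduce in a POSIX shell with GNU find (the file names below are made up for the demo):

```shell
#!/bin/sh
tmp=$(mktemp -d) && cd "$tmp"
touch slides01-intro.tex slides02-outro.tex

# Unquoted: the shell expands slides*.tex to both file names before find
# runs, so find sees a stray extra path after the -iname pattern and
# errors out with "paths must precede expression".
find . -maxdepth 1 -iname slides*.tex 2>&1 || true

# Quoted: find receives the literal pattern and does the matching itself.
find . -maxdepth 1 -iname 'slides*.tex' | sort
```

The same logic explains the Windows case: if cmd.exe strips nothing and passes the single quotes through literally, the pattern 'slides*.tex' matches no file names, which is why switching to double quotes can help there.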

Go into every subdirectory and mass rename files by stripping leading characters

From the current directory I have multiple sub directories:
subdir1/
001myfile001A.txt
002myfile002A.txt
subdir2/
001myfile001B.txt
002myfile002B.txt
where I want to strip every character from the filenames before myfile so I end up with
subdir1/
myfile001A.txt
myfile002A.txt
subdir2/
myfile001B.txt
myfile002B.txt
I have some code to do this...
#!/bin/bash
for d in `find . -type d -maxdepth 1`; do
cd "$d"
for f in `find . "*.txt"`; do
mv "$f" "$(echo "$f" | sed -r 's/^.*myfile/myfile/')"
done
done
however the newly renamed files end up in the parent directory
i.e.
myfile001A.txt
myfile002A.txt
myfile001B.txt
myfile002B.txt
subdir1/
subdir2/
In which the sub-directories are now empty.
How do I alter my script to rename the files and keep them in their respective sub-directories? As you can see the first loop changes directory to the sub directory so not sure why the files end up getting sent up a directory...
Your script has multiple problems. In the first place, your outer find command doesn't do quite what you expect: it outputs not only each of the subdirectories, but also the search root, ., which is itself a directory. You could have discovered this by running the command manually, among other ways. You don't really need to use find for this, but supposing that you do use it, this would be better:
for d in $(find * -maxdepth 0 -type d); do
Moreover, . is the first result of your original find command, and your problems continue there. Your initial cd is without meaningful effect, because you're just changing to the same directory you're already in. The find command in the inner loop is rooted there, and descends into both subdirectories. The path information for each file you choose to rename is therefore stripped by sed, which is why the results end up in the initial working directory (./subdir1/001myfile001A.txt --> myfile001A.txt). By the time you process the subdirectories, there are no files left in them to rename.
But that's not all: the find command in your inner loop is incorrect. Because you do not specify an option before it, find interprets "*.txt" as designating a second search root, in addition to .. You presumably wanted to use -name "*.txt" to filter the find results; without it, find outputs the name of every file in the tree. Presumably you're suppressing or ignoring the error messages that result.
But supposing that your subdirectories have no subdirectories of their own, as shown, and that you aren't concerned with dotfiles, even this corrected version ...
for f in `find . -name "*.txt"`;
... is an awfully heavyweight way of saying this ...
for f in *.txt;
... or even this ...
for f in *?myfile*.txt;
... the latter of which will avoid attempts to rename any files whose names do not, in fact, change.
Furthermore, launching a sed process for each file name is pretty wasteful and expensive when you could just use bash's built-in substitution feature:
mv "$f" "${f/#*myfile/myfile}"
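A quick check of that expansion, run under bash (the file name is just an example from the question):

```shell
#!/bin/bash
f=001myfile001A.txt
# ${f/#*myfile/myfile}: the leading '#' anchors the pattern at the start
# of the value, and '*myfile' matches everything up to 'myfile', so the
# numeric prefix is replaced by a single builtin expansion -- no sed, no fork.
echo "${f/#*myfile/myfile}"   # -> myfile001A.txt
```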
And you will find also that your working directory gets messed up. The working directory is a characteristic of the overall shell environment, so it does not automatically reset on each loop iteration. You'll need to handle that manually in some way. pushd / popd would do that, as would running the outer loop's body in a subshell.
Overall, this will do the trick:
#!/bin/bash
for d in $(find * -maxdepth 0 -type d); do
pushd "$d"
for f in *.txt; do
mv "$f" "${f/#*myfile/myfile}"
done
popd
done
You can do it without find and sed:
$ for f in */*.txt; do echo mv "$f" "${f/\/*myfile/\/myfile}"; done
mv subdir1/001myfile001A.txt subdir1/myfile001A.txt
mv subdir1/002myfile002A.txt subdir1/myfile002A.txt
mv subdir2/001myfile001B.txt subdir2/myfile001B.txt
mv subdir2/002myfile002B.txt subdir2/myfile002B.txt
If you remove the echo, it'll actually rename the files.
This uses shell parameter expansion to replace a slash and anything up to myfile with just a slash and myfile.
Notice that this breaks if there is more than one level of subdirectories. In that case, you could use extended pattern matching (enabled with shopt -s extglob) and the globstar shell option (shopt -s globstar):
$ for f in **/*.txt; do echo mv "$f" "${f/\/*([!\/])myfile/\/myfile}"; done
mv subdir1/001myfile001A.txt subdir1/myfile001A.txt
mv subdir1/002myfile002A.txt subdir1/myfile002A.txt
mv subdir1/subdir3/001myfile001A.txt subdir1/subdir3/myfile001A.txt
mv subdir1/subdir3/002myfile002A.txt subdir1/subdir3/myfile002A.txt
mv subdir2/001myfile001B.txt subdir2/myfile001B.txt
mv subdir2/002myfile002B.txt subdir2/myfile002B.txt
This uses the *([!\/]) pattern ("zero or more characters that are not a forward slash"). The slash has to be escaped in the bracket expression because we're still inside of the pattern part of the ${parameter/pattern/string} expansion.
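A self-contained check of that expansion (run under bash; the path is an example from the question):

```shell
#!/bin/bash
shopt -s extglob
f=subdir1/001myfile001A.txt
# *([!\/]) matches zero or more non-slash characters, so only the last
# path component's prefix is stripped -- the directory part is untouched.
echo "${f/\/*([!\/])myfile/\/myfile}"   # -> subdir1/myfile001A.txt
```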
Maybe you want to use the following command instead:
rename 's#(.*/).*(myfile.*)#$1$2#' subdir*/*
You can use rename -n ... to check the outcome without actually renaming anything.
Regarding your actual question:
The find command from the outer loop returns 3 (!) directories:
.
./subdir1
./subdir2
The unwanted . is the reason why all files end up in the parent directory (that is .). You can exclude . by using the option -mindepth 1.
Unfortunately, this was only the reason for the files landing in the wrong place; it was not the only problem. Since you already accepted one of the answers, there is no need to list them all.
a slight modification should fix your problem:
#!/bin/bash
for f in `find . -maxdepth 2 -name "*.txt"`; do
mv "$f" "$(echo "$f" | sed -r 's,[^/]+(myfile),\1,')"
done
note: this sed uses , instead of / as the delimiter.
however, there are much faster ways.
here is with the rename utility, available or easily installed wherever there is bash and perl:
find . -maxdepth 2 -name "*.txt" | rename 's,[^/]+(myfile),/$1,'
here are tests on 1000 files:
for-loop over `find` with mv:  9.176s
rename:                        0.099s
that's roughly 100x as fast.
John Bollinger's accepted answer is twice as fast as the OP's, but still about 50x slower than this rename solution:
for d / for f / mv "$f" "${f/#*myfile/myfile}":  4.316s
also, it won't work if there is a directory with too many items for a shell glob; likewise any answer that uses for f in *.txt, for f in */*.txt, find *, or rename ... subdir*/*. Answers that begin with find ., on the other hand, will also work on directories with any number of items.

Bash: How to display regular files that have been changed at least x minutes ago, but no earlier than y minutes ago?

I want to write a script that will find and display regular files from a given directory that have been changed at least 1 minute ago, but no earlier than 10 minutes ago.
What I have done doesn't work.
#!/bin/bash
for i in $(find $1 -type f -mmin +1 -mmin -10)
do
echo "$i"
done
You are not explaining how it "doesn't work" but what you have will break if you pass in arguments with spaces in them.
find "$1" -type f -mmin +1 -mmin -10
This should work for one argument with spaces or other problematic characters in it.
find "$@" -type f -mmin +1 -mmin -10
This allows you to pass in multiple paths on the command line, with proper quoting.
Try mkdir "foo bar" "ick poo" and run the script on those.
./myscript foo bar ick poo # won't work
This obviously receives four string arguments, not two directory names with spaces in them. Not good.
./myscript "foo bar" "ick poo"
This works provided your script properly quotes the command-line arguments internally. An unquoted $1 turns into two arguments foo and bar whereas a properly quoted "$1" correctly refers to the directory named foo bar.
There is a wealth of guides to shell quoting but they are not getting the attention they deserve. Many introductory texts fail to explain these issues properly, or even contain examples with improper quoting.
If you don't want to escape spaces when calling the script you can do
#!/bin/bash
dir="$@"
find "$dir" -type f -mmin +1 -mmin -10
e.g. output
$ ./script test dir
test dir/test.txt
Otherwise just do
#!/bin/bash
find "$1" -type f -mmin +1 -mmin -10
e.g. output
$ ./script "test dir"
test dir/test.txt
Also make sure your permissions are locked down so the script is executable only by you, to prevent injection attacks.
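The -mmin window itself can be verified with a throwaway directory. This sketch assumes GNU touch and find (the -d '... ago' timestamp syntax is a GNU extension):

```shell
#!/bin/sh
dir=$(mktemp -d)
touch "$dir/too-new.txt"                        # modified just now
touch -d '5 minutes ago' "$dir/in-window.txt"   # 1 < age < 10 minutes
touch -d '30 minutes ago' "$dir/too-old.txt"    # older than 10 minutes

# -mmin +1  -> modified more than 1 minute ago
# -mmin -10 -> modified less than 10 minutes ago
find "$dir" -type f -mmin +1 -mmin -10
```

Only in-window.txt satisfies both tests, so it is the only file printed.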

Building up a command string for find

I'm trying to parse the Android source directory, and I need to extract all the directory names excluding certain patterns. For now I've included only one directory in the exclude list, but I will be adding more.
The find command doesn't exclude the directory with name 'docs'.
The commented-out line works, but the other one doesn't. For easy debugging, I included the min and max depth, which I will remove later.
Any comments or hints on why it doesn't work?
#! /bin/bash
ANDROID_PATH=$1
root=/
EXCLUDES=( doc )
cd ${root}
for dir in "${EXCLUDES[@]}"; do
exclude_name_cmd_string=${exclude_name_cmd_string}$(echo \
"-not -name \"${dir}*\" -prune")
done
echo -e ${exclude_name_cmd_string}
custom_find_cmd=$(find ${ANDROID_PATH} -mindepth 1 -maxdepth 1 \
${exclude_name_cmd_string} -type d)
#custom_find_cmd=$(find ${ANDROID_PATH} -mindepth 1 -maxdepth 1 \
# -not -name "doc*" -prune -type d)
echo ${custom_find_cmd}
Building up a command string with possibly-quoted arguments is a bad idea. You get into nested quoting levels and eval and a bunch of other dangerous/confusing syntactic stuff.
Use an array to build the find; you've already got the EXCLUDES in one.
Also, the repeated -not and -prune seems weird to me. I would write your command as something like this:
excludes=()
for dir in "${EXCLUDES[@]}"; do
excludes+=(-name "${dir}*" -prune -o)
done
find "${ANDROID_PATH}" -mindepth 1 -maxdepth 1 "${excludes[#]}" -type d -print
The upshot is, you want the argument to -name to be passed to find as a literal wildcard that find will expand, not a list of files returned by the shell's expansion, nor a string containing literal quotation marks. This is very hard to do if you try to build the command as a string, but trivial if you use an array.
Friends don't let friends build shell commands as strings.
When I run your script (named fin.sh) as:
bash -x fin.sh $HOME/tmp
one of the lines of trace output is:
find /Users/jleffler/tmp -mindepth 1 -maxdepth 1 -not -name '"doc*"' -prune -type d
Do you see the single quotes around the double quotes? That's bash trying to be helpful. I'm guessing that your "doesn't work" problem is that you still get directories under doc* included in the output; other than that, it seems to work for me.
How to fix that?
...it seems you've found a way to fix that. I'm not sure I'd trust it with a Bourne shell (though the Korn shell seems to agree with Bash), but it looks like it works with Bash. I'm pretty sure this is something that changed during the last 30 years or so, but that is hard to prove; getting hands on the old code is not easy.
I also wonder whether you need repeated -prune options if you have repeated excluded directories; I'm not sufficiently familiar with -prune to be sure.
Found the problem. It's in the escape sequence in the exclude_name_cmd_string.
Correct syntax should have been
exclude_name_cmd_string=${exclude_name_cmd_string}$(echo \
"-not -name ${dir}* -prune")

How do I get the files in a directory and all of its subdirectories in bash?

I'm working on a C kernel and I want to make it easier to compile all of the sources by using a bash script file. I need to know how to do a foreach loop and only get the files with a .c extension, and then get the filename of each file I find so I can make gcc compile each one.
Use find to walk through your tree
and then read the list it generates using while read:
find . -name \*.c | while read -r file
do
echo process $file
done
If the action that you want to perform on each file is not complex
and can be expressed with one or two commands, you can avoid the while
loop and do everything with find itself. For that you use -exec:
find . -name \*.c -exec command {} \;
Here you write your command instead of command.
You can also use -execdir:
find . -name \*.c -execdir command {} \;
In this case the command will be executed in the directory of each found file.
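The difference between -exec and -execdir is easiest to see side by side (a small sketch; -execdir is a GNU/BSD extension, and the paths below are made up):

```shell
#!/bin/sh
tmp=$(mktemp -d)
mkdir "$tmp/sub"
touch "$tmp/a.c" "$tmp/sub/b.c"

# -exec runs from the starting directory: {} is the full relative path.
find "$tmp" -name '*.c' -exec sh -c 'echo "exec sees: $1"' _ {} \;

# -execdir first cds into each file's directory: {} becomes ./basename,
# which is safer when file paths could be hostile.
find "$tmp" -name '*.c' -execdir sh -c 'echo "execdir sees: $1"' _ {} \;
```

Passing {} as a positional parameter to sh -c (rather than splicing it into the command string) avoids shell-injection problems with unusual file names.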
If you're using GNU make, you can do this using only make's built-in functions, which has the advantage of making it independent of the shell (but the disadvantage of being slower than find for large trees):
# Usage: $(call find-recursive,DIRECTORY,PATTERN)
find-recursive = \
$(foreach f,$(wildcard $(1)/*),\
$(if $(wildcard $(f)/.),\
$(call find-recursive,$(f),$(2)),\
$(filter $(2),$(f))))
all:
	@echo $(call find-recursive,.,%.c)
