bash: How to delimit strings to find files

What syntax should I use in a bash script to list files based on 3 dynamic values:
- older than X days
- in a specified directory
- whose name contains a specified string?
FILEAGE=7
FILEDIR='"/home/ecom/tmp"'
FILESTRING='"search-results-*"'
FILES_FOR_REMOVAL=$("/usr/bin/find "${FILEDIR}" -maxdepth 1 -type f -mtime +${FILEAGE} -name "${FILESTRING}" -exec ls -lth {} \;")
echo ${FILES_FOR_REMOVAL}
If I try the above I get:
-bash: /usr/bin/find "/home/ecom/tmp" -maxdepth 1 -type f -mtime +7 -name "search-results-*" -exec ls -lth {} \;: No such file or directory

Remove superfluous quotes:
FILEAGE=7
FILEDIR='/home/ecom/tmp'
FILESTRING='search-results-*'
FILES_FOR_REMOVAL=$(/usr/bin/find "${FILEDIR}" -maxdepth 1 -type f -mtime +${FILEAGE} -name "${FILESTRING}" -exec ls -lth {} \;)

Your syntax for 'find' looks ok. Try removing the quotes around the command string, i.e.
FILES_FOR_REMOVAL=$(/usr/bin/find "${FILEDIR}" -maxdepth 1 -type f -mtime +${FILEAGE} -name "${FILESTRING}" -exec ls -lth {} \;)

FILEAGE=7
FILEDIR='/home/ecom/tmp'
FILESTRING='search-results-*'
/usr/bin/find "${FILEDIR}" -maxdepth 1 -type f -mtime +${FILEAGE} -name "${FILESTRING}" -exec /bin/ls -lth '{}' \;
There were some extra quotes that caused the error. Also, specify the full path /bin/ls to avoid problems with potential aliasing of ls(1). And to get each filename on a separate line, I dropped the $FILES_FOR_REMOVAL variable. You can also use
/usr/bin/find "${FILEDIR}" -maxdepth 1 -type f -mtime +${FILEAGE} -name "${FILESTRING}" -ls
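If the end goal is actually to delete the matched files rather than just list them, a minimal sketch (assuming your find supports -delete, as GNU and BSD find do; the variable values are the ones from the question) is to review the list first and then delete the same matches:
FILEAGE=7
FILEDIR='/home/ecom/tmp'
FILESTRING='search-results-*'
# review what would be removed
/usr/bin/find "${FILEDIR}" -maxdepth 1 -type f -mtime +${FILEAGE} -name "${FILESTRING}" -ls
# then delete the same matches
/usr/bin/find "${FILEDIR}" -maxdepth 1 -type f -mtime +${FILEAGE} -name "${FILESTRING}" -delete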

To reliably handle file names with spaces, consider storing the file list in a temporary text file instead of a variable and looping through it with a while construct (instead of a for).
For example:
FILEAGE=7
FILEDIR='/home/ecom/tmp'
FILESTRING='search-results-*'
TEMPFILE=".temp${RANDOM}"
find "${FILEDIR}" -maxdepth 1 -type f -mtime +${FILEAGE} -name "${FILESTRING}" > "$TEMPFILE" # write the list of matching paths to the temp file
while IFS= read -r thefile; do
    do_something_to "$thefile"
done < "$TEMPFILE"
rm "$TEMPFILE" # clean up after
Or, if you're only going to use the list once, pipe the output of find directly into the while construct:
find "${FILEDIR}" -maxdepth 1 -type f -mtime +${FILEAGE} -name "${FILESTRING}" | while IFS= read -r thefile; do
    do_something_to "$thefile"
done
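If the file names may also contain newlines, a NUL-delimited sketch of the same idea (assuming a find with -print0 and Bash's read -d ''; do_something_to stands in for your own command) avoids the temp file entirely:
find "${FILEDIR}" -maxdepth 1 -type f -mtime +${FILEAGE} -name "${FILESTRING}" -print0 |
while IFS= read -r -d '' thefile; do
    do_something_to "$thefile"   # placeholder for your own processing
done
Note that the while loop runs in a subshell here, so variables set inside it won't survive past the pipeline.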

Related

Rename file if it is the only one with the extension in directory

This works; however, I would like to do it only if it is the only .jpg in the given directory. The command below just renames them all to folder.jpg, overwriting the other files:
find . -type f -name '*.jpg' -execdir mv {} 'folder.jpg' \;
I guess find cannot filter by the number of matches, but you can always exec a shell which does more elaborate checks for you:
find . -type f -name '*.jpg' -execdir sh -c '[ $# = 1 ] && mv "$1" folder.jpg' sh {} +
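If there is only a single directory to handle, a pure-bash sketch of the same check (the /path/to/album directory is hypothetical) avoids find altogether:
shopt -s nullglob                # make the glob expand to nothing when there are no matches
jpgs=( /path/to/album/*.jpg )    # hypothetical directory
if [ "${#jpgs[@]}" -eq 1 ]; then
    mv -- "${jpgs[0]}" /path/to/album/folder.jpg
fi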

Using command substitution in find -exec

How can I use command substitution in find … -exec … to avoid using xargs in the following command?
find -L -- /path/to/directory -mindepth 2 -maxdepth 2 -type d -exec dirname '{}' \; | xargs basename -a
I tried the following using command substitution, but it output . for each result instead of the desired output:
find -L -- /path/to/directory -mindepth 2 -maxdepth 2 -type d -exec basename "$(dirname '{}')" \;
Your first command will return strange results if a path contains whitespace.
Use a small shell script:
find -L -- . -mindepth 2 -maxdepth 2 -type d -exec sh -c 'basename "$(dirname "{}")"' \;
Alternative syntax to pass one path argument to the script:
find -L -- . -mindepth 2 -maxdepth 2 -type d -exec sh -c 'basename "$(dirname "$1")"' sh {} \;
Or pass as many arguments to the script as possible:
find -L -- . -mindepth 2 -maxdepth 2 -type d -exec sh -c '
for path do
basename "$(dirname "$path")"
done
' sh {} +
With GNU utilities it's possible to have dirname output NUL-terminated strings (-z) and pass them to xargs -0. With -r, xargs does not run basename at all if there are no arguments:
find -L -- . -mindepth 2 -maxdepth 2 -type d -exec dirname -z {} + | xargs -r0 basename -a
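For comparison, a pure-shell sketch of the same "basename of the parent directory" idea (assuming the two-level layout under /path/to/directory; symlink handling may differ from find -L):
for d in /path/to/directory/*/*/; do
    [ -d "$d" ] || continue          # skip if the glob matched nothing
    parent=${d%/*/}                  # strip the last path component
    printf '%s\n' "${parent##*/}"    # basename of the parent directory
done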

find -exec basename {} vs find -exec echo $(basename {})

I'm sure I'm missing something but I can't figure it out. Given:
$ find -type f
./hello.txt
./wow.txt
./yay.txt
how come the next two commands render different results?
$ find -type f -exec basename {} \;
hello.txt
wow.txt
yay.txt
$ find -type f -exec echo $(basename {}) \;
./hello.txt
./wow.txt
./yay.txt
$(basename {}) is evaluated by the shell before find runs. The result is {}, so the command echo $(basename {}) becomes echo {} and basename is never run on the individual files.
A quick debug using bash -x demonstrates this.
[The example is my own, just for demonstration purposes]
bash -xc 'find -type f -name "*.sh" -exec echo $(basename {}) \;'
++ basename '{}'
+ find -type f -name '*.sh' -exec echo '{}' ';'
./1.sh
./abcd/another_file_1_not_ok.sh
./abcd/another_file_2_not_ok.sh
./abcd/another_file_3_not_ok.sh
And for just basename {}
bash -xc 'find -type f -name "*.sh" -exec basename {} \;'
+ find -type f -name '*.sh' -exec basename '{}' ';'
1.sh
another_file_1_not_ok.sh
another_file_2_not_ok.sh
another_file_3_not_ok.sh
As you can see in the first example, echo $(basename {}) is resolved in two steps: the shell first runs basename on the literal string {} (which simply outputs {}), so the command find executes becomes echo {}. That just mimics the behaviour you get when you use find with -exec to echo the files directly, as in
bash -xc 'find -type f -name "*.sh" -exec echo {} \;'
+ find -type f -name '*.sh' -exec echo '{}' ';'
./1.sh
./abcd/another_file_1_not_ok.sh
./abcd/another_file_2_not_ok.sh
./abcd/another_file_3_not_ok.sh
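If the goal was simply to print the bare file names, a sketch that runs basename inside a small shell script (so nothing gets expanded before find starts) would be:
find . -type f -name "*.sh" -exec sh -c '
    for f do
        basename "$f"
    done
' sh {} +
With GNU find you can also skip the extra shell and use -printf '%f\n', which prints just the last component of each path.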

find: How to use found paths in the -exec directive?

I have a dozen files named
~/DOMAIN1.de/bin/dbdeploy.php
~/DOMAIN2.de/bin/dbdeploy.php
~/DOMAIN3.de/bin/dbdeploy.php
I want to run them all with the same arguments.
My bash script reads:
cd ~
find . -maxdepth 1 -type d -name "*\.de" -exec php56 bin/dbdeploy.php "$1" "$2" \;
However, the path given to exec seems not to be relative to the found subdirectory but rather to my PWD:
$ bash -x ./.dbpush "some argument"
+ cd ~
+ find . -maxdepth 1 -type d -name '*\.de' -exec php56 bin/dbdeploy.php 'some argument' ';'
Could not open input file: bin/dbdeploy.php
Could not open input file: bin/dbdeploy.php
Could not open input file: bin/dbdeploy.php
How can I use the found path in the -exec directive?
OK, actually I found the answer myself:
The find results are substituted for {}, so the line reads
find . -maxdepth 1 -type d -name "*\.de" -exec php56 {}/bin/dbdeploy.php "$1" "$2" \;
Alternatively:
find . -type f -wholename "*\.de/bin/dbdeploy.php" -exec php56 {} "$1" "$2" \;
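If dbdeploy.php has to be started from inside each domain directory (for example because it uses paths relative to its own location), a sketch that changes into the found directory first could look like this; the outer "$1" and "$2" are still the arguments of the calling script, as in the question:
find . -maxdepth 1 -type d -name "*.de" -exec sh -c '
    cd "$1" && php56 bin/dbdeploy.php "$2" "$3"
' sh {} "$1" "$2" \;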

ls command and size of files in shell script

count=0; #count for counting
IFS='
'
for x in `ls -l $input`; #for loop using ls command
do
a=$(ls -ls | awk '{print $6}') #print[6] is sizes of file
echo $a
b=`echo $a | awk '{split($0,numbers," "); print numbers[1]}'`
echo $b
if [ $b -eq 0 ] # b is only size of a file
then
count=`expr $count + 1` #if b is zero , the count will increase one by one
fi
echo $count
done
I want to find zero-size files; I can do that with the find command. I also want to count the number of zero-size files using ls and awk, but my code doesn't work. What is my mistake?
The -s test is true if a file has non-zero size. If that test fails for a file, increment your empty-file count.
empty_files=0
for f in "$input"/*; do
[ -s "$f" ] || : $(( empty_files++ ))
done
Your main mistake is that you're parsing ls!
If you want to find (regular) files that are empty, and if you have a version of find that supports the -empty predicate, use it:
find . -type f -empty
Note that this will recurse in subfolders too; if you don't want that, use:
find . -maxdepth 1 -type f -empty
(assuming that your find also supports -maxdepth).
If you only want to count how many empty (regular) files you have:
find . -maxdepth 1 -type f -empty -printf x | wc -m
and if you want to perform both operations at the same time, i.e., print out the names (or save them in an array for future use) and count them:
empty_files=()
while IFS= read -r -d '' f; do
empty_files+=( "$f" )
done < <(find . -maxdepth 1 -type f -empty -print0)
printf 'There are %d empty files:\n' "${#empty_files[@]}"
printf ' %s\n' "${empty_files[@]}"
With Bash≥4.4, you could use mapfile instead of the while-read loop:
mapfile -t -d '' empty_files < <(find . -maxdepth 1 -type f -empty -print0)
printf 'There are %d empty files:\n' "${#empty_files[@]}"
printf ' %s\n' "${empty_files[@]}"
For a POSIX-compliant way, use test with the -s option:
find . -type f \! -exec test -s {} \; -print
and if you don't want to recurse into subdirectories, you'll have to -prune them:
find . \! -name . -prune -type f \! -exec test -s {} \; -print
and if you want to count them:
find . \! -name . -prune -type f \! -exec test -s {} \; -exec printf x \; | wc -m
and here, if you want to perform both operations (count them and save them in an array for later use), use the previous while-read loop (or mapfile if you live in the future) with this find:
find . \! -name . -prune -type f \! -exec test -s {} \; -exec printf '%s\0' {} \;
Also see chepner's answer for a pure shell solution (needs minor tweaking to be POSIX compliant).
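For reference, a minimal sketch of that loop with the tweak applied (POSIX arithmetic instead of ++, plus an explicit regular-file test; "$input" is the directory variable from the question):
empty_files=0
for f in "$input"/*; do
    if [ -f "$f" ] && ! [ -s "$f" ]; then
        empty_files=$((empty_files + 1))   # count regular files with zero size
    fi
done
echo "$empty_files"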
Regarding your comment
I want to count and delete [empty files]. How can I do that at the same time?
If you have GNU find (or a find that supports all the goodies):
find . -maxdepth 1 -type f -empty -printf x -delete | wc -m
if not,
find . \! -name . -prune -type f \! -exec test -s {} \; -exec printf x \; -exec rm {} \; | wc -m
Make sure that the -delete (or -exec rm {} \;) predicate is at the end! Do not exchange the order of the predicates!
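If you also want a record of what was deleted, a sketch with GNU find (it assumes file names without embedded newlines, since the count is line-based):
find . -maxdepth 1 -type f -empty -printf '%p\n' -delete | tee empty-files.log | wc -l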
