Test if the find command found files or not - bash

I have a simple script that checks for a file with today's date. In the example below, the else branch is never taken when the file is not found. Why is that?
if filedir=$(find . -type f -name "*.sql.gz" -mtime -1 -printf "%f\n"); then
    echo $filedir
else
    echo "oops"
fi

find returns an exit code of 0 if all arguments were processed successfully, and a non-zero exit code only if an error occurred. The exit code of find does not indicate whether or not files were found.
This is unlike, for example, grep. You can see the difference in behaviour when you use
if filedir=$(find . -type f -name "*.sql.gz" -mtime -1 -printf "%f\n" | grep '.'); then
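A self-contained sketch of that test, using a scratch directory and a made-up file name:

```shell
dir=$(mktemp -d)
touch "$dir/backup.sql.gz"

# grep '.' succeeds only if find printed at least one line,
# so the if condition now reflects whether anything was found.
if filedir=$(find "$dir" -type f -name '*.sql.gz' -mtime -1 | grep '.'); then
    echo "found: $filedir"
else
    echo "oops"
fi
```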

As Ljm Dullaart explained earlier, the find command does not return a distinctive exit code when no file matches the patterns or rules.
You can, however, test whether it found a match by checking that the variable filedir is not empty:
[ -n "${filedir}" ]
if filedir="$(find . -type f -name '*.sql.gz' -mtime -1)" && [ -n "${filedir}" ]; then
    printf 'Found: %s\n' "${filedir}"
else
    printf 'Oops! Found no file matching *.sql.gz!\n' >&2
fi

Related

How to stop Bash expansion of "*.h" in a function?

When I try to run the following function, Bash expands my variable in an unexpected way, which prevents me from getting my expected result.
It comes down to the way Bash deals with the "*.h" that I am passing in to the function.
Here is the function I call:
link_files_of_type_from_directory "*.h" ./..
I would expect this variable to stay that way all the way through, but by the time it hits the echo $command_to_run; part of my Bash script, the variable has expanded to...
MyHeader1.h MyHeader2.h MyHeader3.h
and so on.
What I want is for Bash to not expand my files so that my code runs the following:
find ./.. -type f -name '*.h'
Instead of
find ./.. -type f -name MyHeader1.h MyHeader2.h MyHeader3.h
This is the code:
function link_files_of_type_from_directory {
    local file_type=$1;
    local directory_to_link=$2;
    echo "File type $file_type";
    echo "Directory to link $directory_to_link";
    command="find $directory_to_link -type f -name $file_type";
    echo $command;
    #for i in $(find $directory_to_link -type f -name $file_type);
    for i in $command;
    do
        echo $i;
        if test -e $(basename $i); then
            echo $i exists;
        else
            echo Linking: $i;
            ln -s $i;
        fi
    done;
}
How can I prevent the expansion so that Bash searches for files that end in .h in the directory I pass in?
UPDATE 1:
So I've updated the call to be
link_files_of_type_from_directory "'*.h'" ..
And the function now assembles the string of the command to be evaluated like so:
mmd="find $directory_to_link -type f -name $file_type";
When I echo it out—it's correct :)
find .. -type f -name '*.h'
But I can't seem to get the find command to actually run. Here are the errors / mistakes I'm getting while trying to correctly assemble the for loop:
# for i in $mmd; # LOOPS THROUGH STRINGS IN COMMAND
# for i in '$(mdd)'; # RUNS MMD LITERALLY
# for i in ${!mmd}; # Errors out with: INVALID VARIABLE NAME — find .. -type f -name '*.h':
Would love help on this part—even though it is a different question :)
With your variables quoted, the semicolons removed, and your loop wrapped into an -exec action to prevent problems with spaces, tabs and newlines in filenames, your function looks like this:
function link_files_of_type_from_directory {
    local file_type=$1
    local directory_to_link=$2
    echo "File type $file_type"
    echo "Directory to link $directory_to_link"
    find "$directory_to_link" -type f -name "$file_type" -exec sh -c '
        for i do
            echo "$i"
            if test -e "$(basename "$i")"; then
                echo "$i exists"
            else
                echo "Linking: $i"
                ln -s "$i"
            fi
        done
    ' sh {} +
}
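As an aside on the UPDATE: rather than assembling the command as a string and trying to re-evaluate it, a bash array keeps each argument, including the quoted pattern, intact. A sketch using a scratch directory and made-up file names:

```shell
dir=$(mktemp -d)
touch "$dir/MyHeader1.h" "$dir/MyHeader2.h"

# Each array element remains exactly one argument when expanded
# with "${cmd[@]}", so the '*.h' pattern reaches find unexpanded.
cmd=(find "$dir" -type f -name '*.h')
echo "running: ${cmd[*]}"
"${cmd[@]}"
```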

Count files in a directory with a blank in its name

If you want a breakdown of how many files are in each dir under your current dir:
for i in $(find . -maxdepth 1 -type d) ; do
    echo -n $i": " ;
    (find $i -type f | wc -l) ;
done
It does not work when the directory name contains a blank. Can anyone tell me how to edit this shell script so that such directory names are also accepted when counting their file contents?
Thanks
Your code suffers from a common issue described in http://mywiki.wooledge.org/BashPitfalls#for_i_in_.24.28ls_.2A.mp3.29.
In your case you could do this instead:
for i in */; do
    echo -n "${i%/}: "
    find "$i" -type f | wc -l
done
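The pitfall itself is easy to reproduce in a scratch directory (the directory name "my dir" is made up for the demo):

```shell
dir=$(mktemp -d)
mkdir "$dir/my dir"

# Unquoted $(find ...) undergoes word splitting, so the single
# directory name comes out as two separate words:
for i in $(find "$dir" -mindepth 1 -maxdepth 1 -type d); do
    echo "got: $i"
done
```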
This will work with all types of file names:
find . -maxdepth 1 -type d -exec sh -c 'printf "%s: %i\n" "$1" "$(find "$1" -type f | wc -l)"' Counter {} \;
How it works
find . -maxdepth 1 -type d
This finds the directories just as you were doing
-exec sh -c 'printf "%s: %i\n" "$1" "$(find "$1" -type f | wc -l)"' Counter {} \;
This feeds each directory name to a shell script which counts the files, similarly to what you were doing.
There are some tricks here: Counter {} are passed as arguments to the shell script. Counter becomes $0 (which is only used if the shell script generates an error). find replaces {} with the name of a directory it found, and this is available to the shell script as $1. This is done in a way that is safe for all types of file names.
Note that, wherever $1 is used in the script, it is inside double quotes. This protects it from word splitting and other unwanted shell expansions.
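The way sh -c fills its positional parameters can be seen in isolation; a minimal sketch (the names Counter and /some/dir are just placeholders):

```shell
# Everything after the script string fills $0, $1, $2, ...
# $0 serves as the script's name in error messages.
sh -c 'echo "name=$0 first-arg=$1"' Counter /some/dir
# prints: name=Counter first-arg=/some/dir
```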
I found a solution; here is what I had to consider:
#!/bin/bash
SAVEIFS=$IFS
IFS=$(echo -en "\n\b")
for i in $(find . -maxdepth 1 -type d); do
    echo -n " $i: ";
    (find $i -type f | wc -l) ;
done
IFS=$SAVEIFS

bash: find with grep in if always true

OK, so this code works:
if grep -lq something file.txt ; then
So why doesn't something like this work? What am I doing wrong?
if find . -name file.txt -exec grep -lq something {} \;
It's always true as long as the directory exists.
From the find man page:
Exit Status
find exits with status 0 if all files are processed successfully, greater than 0 if errors occur. This is deliberately a very broad description, but if the return value is non-zero, you should not rely on the correctness of the results of find.
What you get back from your command is the exit value of find, not of grep. find almost always returns an exit value of zero as long as the query itself is valid.
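You can see this directly; a quick sketch (the file and directory names are made up and assumed not to exist):

```shell
# No match: find still exits 0, because it processed its arguments fine.
find . -maxdepth 1 -name 'no-such-file-xyz'
echo "after no match: $?"

# Actual error (nonexistent starting point): exit status is nonzero.
find ./no-such-directory-xyz 2>/dev/null
echo "after an error: $?"
```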
I was thinking this might work:
find . -name file.txt -print0 | xargs -0 grep -lq something
But that will reflect only the overall exit status that xargs reports (GNU xargs exits with 123 if any invocation of grep fails), not each intermediate grep. However, this probably won't be an issue with your command.
A very simple way is to check if find's output is empty:
output=$( find . -name file.txt -exec grep -l something {} \; )
if [ -n "$output" ]
then
    # found
else
    # not found
fi
One approach which will short-circuit as soon as a file containing the desired contents is found (presuming that your intent is to look for whether any file matches, as opposed to whether every file matches):
check_for_content() {
    target=$1; shift
    while IFS= read -r -d '' filename; do
        if grep -lq -e "$target" "$filename"; then
            return 0
        fi
    done < <(find "$@" -print0)
    return 1
}
Usage:
check_for_content thing-to-look-for -type f -name file.txt
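A quick self-contained check of the function (the scratch directory and file contents are made up for the demo; note that the args after the target are passed straight through to find via "$@"):

```shell
check_for_content() {
    target=$1; shift
    while IFS= read -r -d '' filename; do
        # Exit status only; return as soon as one file matches.
        if grep -lq -e "$target" "$filename"; then
            return 0
        fi
    done < <(find "$@" -print0)
    return 1
}

dir=$(mktemp -d)
printf 'nothing here\n' > "$dir/other.txt"
printf 'thing-to-look-for\n' > "$dir/file.txt"

if check_for_content thing-to-look-for "$dir" -type f -name file.txt; then
    echo "match found"
else
    echo "no match"
fi
```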

Variable from conditional statement

I have a few scripts I am taking ownership of that use the Bash shell; there is a find statement inside a conditional statement.
Something like this:
if [ -z $(find / -type f -perm -002) ] ; then echo "no world writable found"
whereas in the else branch I would like to display what was found, instead of just a message that world-write permissions were found.
I can do:
echo $(find / -type f -perm -002) has world write permissions
or set a variable to $(find / -type f -perm -002).
But I was wondering if there is a better way to do this. Is there another way to retrieve the output of the find statement as a variable?
You just take the output and store it in a variable. If it is not empty you can print its contents. This way you only need to run the command once.
RESULT=$(find / -type f -perm -002)
if [ -z "$RESULT" ]
then
    echo "no world writable found"
else
    echo "$RESULT has world write permissions"
fi
You can use sed to insert a headline, if you like.
REPORT=$(find /tmp -type f -perm -002 | sed '1s/^/Found world write permissions:\n/')
echo "${REPORT:-No world writable found.}"
Notice: your example seems to be broken, because find can return more than one line.
And awk can do both at once:
find /tmp -type f -perm -002 |
awk 'NR==1 { print "Found world write permissions:" } { print } END { if (NR == 0) print "No world writable found." }'
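The ${var:-default} expansion used in the sed variant above can be checked in isolation (the variable contents are made up):

```shell
report=""
echo "${report:-No world writable found.}"    # empty: prints the fallback

report="/tmp/demo has world write permissions"
echo "${report:-No world writable found.}"    # set: prints the report itself
```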
If you don't mind not having the message no world writable found, you can use a single find statement, and that's all:
find / -type f -perm -002 -printf '%p has world write permissions\n'
If you need to store the returned files for future use, store them in an array (assuming Bash):
#!/bin/bash
files=()
while IFS= read -r -d '' f; do
    files+=( "$f" )
    # You may also print the message:
    printf '%s has world write permissions\n' "$f"
done < <(find / -type f -perm -002 -print0)

# At this point, you have all the found files.
# You may print a message if no files were found:
if (( ${#files[@]} == 0 )); then
    printf 'No world writable files found\n'
    exit 0
fi
# Here you can do some processing with the files found...

Why does this conditional return "No such file or directory"

My conditional works properly when the dirs exist, but if they don't, it seems to execute both the then and the else branches (is that the correct term?).
script.sh
#!/bin/bash
#!/bin/bash
if [[ $(find path/to/dir/*[^thisdir] -type d -maxdepth 0) ]]
then
    find path/to/dir/*[^thisdir] -type d -maxdepth 0 -exec mv {} new/location \;
    echo "Huzzah!"
else
    echo "hey hey hey"
fi
prompt
For the first call, the dirs are there; in the second, they've already been moved by the first call.
$ sh script.sh
Huzzah!
$ sh script.sh
find: path/to/dir/*[^thisdir]: No such file or directory
hey hey hey
How can I fix this?
tried suggestion(s)
if [[ -d $(path/to/dir/*[^thisdir]) ]]
then
    find path/to/dir/*[^thisdir] -type d -maxdepth 0 -exec mv {} statamic-1.3-personal/admin/themes \;
    echo "Huzzah!"
else
    echo "hey hey hey"
fi
result
$ sh script.sh
script.sh: line 1: path/to/dir/one_of_the_dirs_to_be_moved: is a directory
hey hey hey
There seem to be some errors:
First, the pattern path/to/dir/*[^thisdir] is interpreted by bash exactly like path/to/dir/*[^dihstr]: it matches every filename whose last character is not t, h, i, s, d or r.
Then, if you are searching for something in this dir (path/to/dir), but not in path/to/dir/thisdir and not in any subdirectory, you could banish find and write:
Edit: there was an error in my sample too: [ -e $var ] was wrong.
declare -a files=( path/to/dir/!(thisdir) )
if [ -e "${files[0]}" ]; then
    mv -t newlocation "${files[@]}"
    echo "Huzzah!"
else
    echo "hey hey hey"
fi
If you need find for searching in subdirectories, please give us samples and/or more details.
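One detail worth noting: !(pattern) is an extended glob, which is off by default in scripts, so enable it with shopt -s extglob first. A self-contained sketch in a scratch directory (the entry names are made up):

```shell
shopt -s extglob    # required for the !(pattern) form

dir=$(mktemp -d)
mkdir "$dir/thisdir" "$dir/otherdir"
touch "$dir/afile"

# Everything directly under $dir except thisdir
files=( "$dir"/!(thisdir) )
printf '%s\n' "${files[@]}"
```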
Your error is probably occurring at if [[ $(find path/to/dir/*[^thisdir] -type d -maxdepth 0) ]] and then it goes to else because find errors out.
find wants its directory parameter to exist. Based on what you are trying to do, you should probably consider
$(find path/to/dir/ -maxdepth 1 -type d -name "appropriate name pattern")
Also, I'd consider using an actual file test in the if. See this reference for file conditionals.
Try adding a #!/bin/bash on the first line to ensure that it is bash that is executing your script, as recommended by this post:
Why is both the if and else executed?
The OP wishes to move all directories excluding thisdir to a new location.
A solution using find would be to exclude thisdir using find's functionality, rather than by using bash's shell expansion:
#!/bin/bash
if [[ $(find path/to/directory/* -maxdepth 0 -type d -not -name 'thisdir') ]]
then
    find path/to/directory/* -maxdepth 0 -type d -not -name 'thisdir' -exec mv {} new/location \;
    echo "Huzzah!"
else
    echo "hey hey hey"
fi
This has been tested, and works under bash version 4.2.39, and GNU findutils v4.5.10.
