bash: Check if a file in a sequence is missing

I have a bunch of files in a directory whose names contain numbers.
/mnt/exp-data/6/instrument/caen2018/stage0/S0Test_26060_500ns_CW_0ns_CBT_0ns_DEBT.root
/mnt/exp-data/6/instrument/caen2018/stage0/S0Run_26061_500ns_CW_0ns_CBT_0ns_DEBT.root
/mnt/exp-data/6/instrument/caen2018/stage0/S0Test_26063_500ns_CW_0ns_CBT_0ns_DEBT.root
/mnt/exp-data/6/instrument/caen2018/stage0/S0Run_26065_500ns_CW_0ns_CBT_0ns_DEBT.root
What I'd like to do is find which files are missing and then do something with those. In the above case, the files containing the numbers 26062 and 26064 are missing.
So far I'm doing the following:
#___________________________________________________________________________________________________________________________
#-3-Find the missing runs
REPLAYED_RUNS_DIR=/mnt/exp-data/6/instrument/caen2018/stage0
echo "..........Looking for non replayed runs in the range $smallest_run-$biggest_run"
for (( i=$smallest_run; i<=$biggest_run; ++i )); do
    filename="$REPLAYED_RUNS_DIR/*$i*.root"
    #echo $filename
    if [ ! -f $filename ]; then
        echo "Run $i does not exist."
        ./produce_file $i
    fi
done
This snippet manages to find files that are missing; however, I have a few issues:
In some cases I get the following error for files that do exist and I have no idea why.
./check_missing.sh: line 53: [: /mnt/exp-data/6/instrument/caen2018/stage0/S0Run_25829_500ns_CW_0ns_CBT_0ns_DEBT.root: binary operator expected
If I uncomment echo $filename, then I get as output the full names and directories of the files, as if I were running ls instead of echo. Is this to be expected?
Is there a better way (i.e., faster, more efficient) to do what I'm trying to do?
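Both symptoms have the same cause: $filename holds a glob pattern, and because the expansions in [ ! -f $filename ] and echo $filename are unquoted, the shell expands the pattern before the command runs. If it matches more than one file, [ receives several file names and complains, and echo prints all the matched names, just as ls would. A minimal demonstration (the demo_*.root files are hypothetical, created only to trigger the behaviour):
touch /tmp/demo_1.root /tmp/demo_2.root
pattern="/tmp/demo_*.root"
echo $pattern       # unquoted: expands to both names, like ls
[ ! -f $pattern ]   # [ sees: ! -f /tmp/demo_1.root /tmp/demo_2.root
                    # -> [: binary operator expected
Quoting "$filename" would silence the error but would make -f test the literal pattern, so a glob-aware test (a loop over the glob, compgen -G, or find, as in the answers below) is the usual fix.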

Here is a script that does this in bulk.
#!/bin/bash
d="path/to/directory"
start=$1
end=$2
join -v1 <(
    seq "$start" "$end"
) <(
    find "$d" -type f -printf "%f\0" |
    awk -F"/" -v RS="\0" '{split($NF,a,"_"); print a[2]}' | sort
) | xargs -r -n1 echo ./produce_file
join -v1 file1 file2 outputs all lines of file1 that are not in file2. In place of those two files, using process substitution, we supply the sequence to be tested and the filenames found by find, filtered by awk to extract the number in each and finally sorted, because join wants sorted input.
Finally, you can pipe the result to your script produce_file. -r stands for --no-run-if-empty, a GNU extension that avoids one execution with empty input if the previous command produced no output.
Remove echo after testing. If your script can process multiple number arguments, remove -n1 also, to process all of them at once.
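To see what join -v1 does in isolation, here is a minimal sketch with the numbers typed in directly (fixed-width numbers, so lexicographic order matches numeric order):
join -v1 <(seq 26060 26065) <(printf '%s\n' 26060 26061 26063 26065)
# prints the unpaired lines of the first input:
# 26062
# 26064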
Testing:
> mkdir -p path/to/directory
touch path/to/directory/S0Test_26060_0ns_CBT_0ns_DEBT.root
touch path/to/directory/S0Run_26061_500ns.root
touch path/to/directory/S0Test_26063_500ns_CW.root
touch path/to/directory/S0Test_26065_500ns.root
touch path/to/directory/S0Test_30000_500ns.root
> bash test.sh 26060 26065
./produce_file 26062
./produce_file 26064

I found another way to do it, using find and [ -z "$filename" ] to check whether find returned an empty result.
for (( i=$start; i<=$end; ++i )); do
    filename=$(find "$DIR" -type f -name "*$i*")
    if [ -z "$filename" ]; then
        echo "File $i does not exist."
    fi
done
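If speed matters, one find pass plus an associative array avoids running find once per run number. A sketch, assuming bash 4+, GNU find, and that the run number is the second underscore-separated field of each name (as in S0Run_26061_...); $DIR, $start and $end are the variables from the snippet above.
#!/usr/bin/env bash
declare -A present                       # run numbers seen on disk
while IFS= read -r -d '' name; do
    num=${name#*_}                       # drop the S0Run_/S0Test_ prefix
    num=${num%%_*}                       # keep the digits before the next _
    present[$num]=1
done < <(find "$DIR" -type f -name '*.root' -printf '%f\0')
for (( i=start; i<=end; ++i )); do
    [ -z "${present[$i]:-}" ] && echo "Run $i is missing."
done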

Related

Testing for existing file with an extension fails with globbing [duplicate]

This question already has answers here:
Test whether a glob has any matches in Bash
(22 answers)
Closed 4 years ago.
I'm trying to check if a file exists, but with a wildcard. Here is my example:
if [ -f "xorg-x11-fonts*" ]; then
printf "BLAH"
fi
I have also tried it without the double quotes.
For Bash scripts, the most direct and performant approach is:
if compgen -G "${PROJECT_DIR}/*.png" > /dev/null; then
echo "pattern exists!"
fi
This will work very speedily even in directories with millions of files and does not involve a new subshell.
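Since compgen -G also prints the matching names, the same test can capture them for later use. A small sketch, reusing the ${PROJECT_DIR} placeholder from above:
if matches=$(compgen -G "${PROJECT_DIR}/*.png"); then
    echo "pattern exists:"
    printf '%s\n' "$matches"
fi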
Source
The simplest should be to rely on ls return value (it returns non-zero when the files do not exist):
if ls /path/to/your/files* 1> /dev/null 2>&1; then
echo "files do exist"
else
echo "files do not exist"
fi
I redirected the ls output to make it completely silent.
Here is an optimization that also relies on glob expansion, but avoids the use of ls:
for f in /path/to/your/files*; do
## Check if the glob gets expanded to existing files.
## If not, f here will be exactly the pattern above
## and the exists test will evaluate to false.
[ -e "$f" ] && echo "files do exist" || echo "files do not exist"
## This is all we needed to know, so we can break after the first iteration
break
done
This is very similar to grok12's answer, but it avoids the unnecessary iteration through the whole list.
If your shell has a nullglob option and it's turned on, a wildcard pattern that matches no files will be removed from the command line altogether. This will make ls see no pathname arguments, list the contents of the current directory and succeed, which is wrong. GNU stat, which always fails if given no arguments or an argument naming a nonexistent file, would be more robust. Also, the &> redirection operator is a bashism.
if stat --printf='' /path/to/your/files* 2>/dev/null
then
echo found
else
echo not found
fi
Better still is GNU find, which can handle a wildcard search internally and exit as soon as it finds one matching file, rather than waste time processing a potentially huge list of them expanded by the shell; this also avoids the risk that the shell might overflow its command-line buffer.
if test -n "$(find /dir/to/search -maxdepth 1 -name 'files*' -print -quit)"
then
echo found
else
echo not found
fi
Non-GNU versions of find might not have the -maxdepth option used here to make find search only /dir/to/search itself instead of the entire directory tree rooted there.
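Where -maxdepth is unavailable, the classic prune idiom limits the search to one level. A sketch under that assumption (the trailing /. is what makes the idiom work; -quit is also GNU-only, hence the head):
if test -n "$(find /dir/to/search/. ! -name . -prune -name 'files*' -print | head -n 1)"
then
    echo found
else
    echo not found
fi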
Use:
files=(xorg-x11-fonts*)
if [ -e "${files[0]}" ];
then
printf "BLAH"
fi
You can do the following:
set -- xorg-x11-fonts*
if [ -f "$1" ]; then
printf "BLAH"
fi
This works with sh and derivatives: KornShell and Bash. It doesn't create any sub-shell. $(..) and `...` commands used in other solutions create a sub-shell: they fork a process, and they are inefficient. Of course it works with several files, and this solution can be the fastest, or second to the fastest one.
It also works when there aren't any matches. There is no need to use nullglob, as one of the commenters says. $1 will contain the original pattern, and therefore the test -f "$1" won't succeed, because no file by that name exists.
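The no-match case, demonstrated (assuming nullglob is off, which is the default, and no xorg-x11-fonts* file is present):
set -- xorg-x11-fonts*
echo "$1"                      # prints the literal pattern: xorg-x11-fonts*
[ -f "$1" ] && printf "BLAH"   # prints nothing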
for i in xorg-x11-fonts*; do
if [ -f "$i" ]; then printf "BLAH"; fi
done
This will work with multiple files and with white space in file names.
The solution:
files=$(ls xorg-x11-fonts* 2> /dev/null | wc -l)
if [ "$files" != "0" ]
then
echo "Exists"
else
echo "None found."
fi
> Exists
Use:
if [ "`echo xorg-x11-fonts*`" != "xorg-x11-fonts*" ]; then
printf "BLAH"
fi
The PowerShell way, which treats wildcards differently: you put the pattern in quotes, like below:
If (Test-Path "./output/test-pdf-docx/Text-Book-Part-I*"){
Remove-Item -force -v -path ./output/test-pdf-docx/*.pdf
Remove-Item -force -v -path ./output/test-pdf-docx/*.docx
}
I think this is helpful because the concept of the original question covers "shells" in general not just Bash or Linux, and would apply to PowerShell users with the same question too.
The Bash code I use:
if ls /syslog/*.log > /dev/null 2>&1; then
echo "Log files are present in /syslog/;
fi
Strictly speaking, if you only want to print "Blah", here is the solution:
find . -maxdepth 1 -name 'xorg-x11-fonts*' -printf 'BLAH' -quit
Here is another way:
doesFirstFileExist(){
test -e "$1"
}
if doesFirstFileExist xorg-x11-fonts*
then printf "BLAH"
fi
But I think the most optimal is as follows, because it won't try to sort file names:
if [ -n "$(find . -maxdepth 1 -name 'xorg-x11-fonts*' -printf 1 -quit)" ]
then
    printf "BLAH"
fi
Here's a solution for your specific problem that doesn't require for loops or external commands like ls, find and the like.
if [ "$(echo xorg-x11-fonts*)" != "xorg-x11-fonts*" ]; then
printf "BLAH"
fi
As you can see, it's just a tad more complicated than what you were hoping for, and relies on the fact that if the shell is not able to expand the glob, it means no files with that glob exist and echo will output the glob as is, which allows us to do a mere string comparison to check whether any of those files exist at all.
If we were to generalize the procedure, though, we should take into account the fact that files might contain spaces within their names and/or paths and that the glob char could rightfully expand to nothing (in your example, that would be the case of a file whose name is exactly xorg-x11-fonts).
This could be achieved by the following function, in bash.
function doesAnyFileExist {
local arg="$*"
local files=($arg)
[ ${#files[@]} -gt 1 ] || [ ${#files[@]} -eq 1 ] && [ -e "${files[0]}" ]
}
Going back to your example, it could be invoked like this.
if doesAnyFileExist "xorg-x11-fonts*"; then
printf "BLAH"
fi
Glob expansion should happen within the function itself for it to work properly, that's why I put the argument in quotes and that's what the first line in the function body is there for: so that any multiple arguments (which could be the result of a glob expansion outside the function, as well as a spurious parameter) would be coalesced into one. Another approach could be to raise an error if there's more than one argument, yet another could be to ignore all but the 1st argument.
The second line in the function body sets the files var to an array constituted by all the file names that the glob expanded to, one for each array element. It's fine if the file names contain spaces, each array element will contain the names as is, including the spaces.
The third line in the function body does two things:
It first checks whether there's more than one element in the array. If so, it means the glob surely got expanded to something (due to what we did on the 1st line), which in turn implies that at least one file matching the glob exists, which is all we wanted to know.
If at step 1 we discovered that we got fewer than 2 elements in the array, then we check whether we got one, and if so we check whether that one exists, the usual way. We need to do this extra check in order to account for function arguments without glob chars, in which case the array contains only one, unexpanded, element.
I found a couple of neat solutions worth sharing. The first still suffers from the "this will break if there are too many matches" problem:
pat="yourpattern*" matches=($pat) ; [[ "$matches" != "$pat" ]] && echo "found"
(Recall that if you use an array without an index, you get the first element of the array.)
If you have "shopt -s nullglob" in your script, you could simply do:
matches=(yourpattern*) ; [[ "$matches" ]] && echo "found"
Now, if it's possible to have a ton of files in a directory, you're pretty much stuck with using find:
find /path/to/dir -maxdepth 1 -type f -name 'yourpattern*' | grep -q '.' && echo 'found'
I use this:
filescount=`ls xorg-x11-fonts* | awk 'END { print NR }'`
if [ $filescount -gt 0 ]; then
blah
fi
Using new fancy shmancy features in KornShell, Bash, and Z shell (this example doesn't handle spaces in filenames):
# Declare a regular array (-A will declare an associative array. Kewl!)
declare -a myarray=( /mydir/tmp*.txt )
array_length=${#myarray[@]}
# Not found if the first element of the array is the unexpanded string
# (ie, if it contains a "*")
if [[ ${myarray[0]} =~ [*] ]] ; then
echo "No files not found"
elif [ $array_length -eq 1 ] ; then
echo "File was found"
else
echo "Files were found"
fi
for myfile in "${myarray[@]}"
do
echo "$myfile"
done
Yes, this does smell like Perl. I am glad I didn't step in it ;)
IMHO it's better to always use find when testing for files, globs or directories. The stumbling block in doing so is find's exit status: 0 if all paths were traversed successfully, >0 otherwise. The expression you pass to find is not reflected in its exit status.
The following example tests if a directory has entries:
$ mkdir A
$ touch A/b
$ find A -maxdepth 0 -not -empty -print | head -n1 | grep -q . && echo 'not empty'
not empty
When A has no files grep fails:
$ rm A/b
$ find A -maxdepth 0 -not -empty -print | head -n1 | grep -q . || echo 'empty'
empty
When A does not exist grep fails again because find only prints to stderr:
$ rmdir A
$ find A -maxdepth 0 -not -empty -print | head -n1 | grep -q . && echo 'not empty' || echo 'empty'
find: 'A': No such file or directory
empty
Replace -not -empty by any other find expression, but be careful if you -exec a command that prints to stdout. You may want to grep for a more specific expression in such cases.
This approach works nicely in shell scripts. The original question was to look for the glob xorg-x11-fonts*:
if find . -maxdepth 1 -name 'xorg-x11-fonts*' -print | head -n1 | grep -q .
then
: the glob matched
else
: ...not
fi
Note that the else branch is reached if xorg-x11-fonts* did not match, or if find encountered an error. To distinguish the cases, use $?.
If there is a huge number of files in a network folder, using the wildcard is questionable (speed, or command-line argument overflow).
I ended up with:
if [ -n "$(find somedir/that_may_not_exist_yet -maxdepth 1 -name \*.ext -print -quit)" ] ; then
echo Such file exists
fi
if [ `ls path1/* path2/* 2> /dev/null | wc -l` -ne 0 ]; then echo ok; else echo no; fi
Try this
fileTarget="xorg-x11-fonts*"
filesFound=$(ls $fileTarget)
case ${filesFound} in
"" ) printf "NO files found for target=${fileTarget}\n" ;;
* ) printf "FileTarget Files found=${filesFound}\n" ;;
esac
Test
fileTarget="*.html" # Where I have some HTML documents in the current directory
FileTarget Files found=Baby21.html
baby22.html
charlie 22.html
charlie21.html
charlie22.html
charlie23.html
fileTarget="xorg-x11-fonts*"
NO files found for target=xorg-x11-fonts*
Note that this only works in the current directory, or where the variable fileTarget includes the path you want to inspect.
You can also cut the other files out:
if [ -e $( echo $1 | cut -d" " -f1 ) ] ; then
...
fi
Use:
if ls -l | grep -q 'xorg-x11-fonts.*' # grep needs a regex, not a shell glob
then
# do something
else
# do something else
fi
man test.
if [ -e file ]; then
...
fi
will work for directory and file.

Shell script to loop over all files in a folder and pick them in numerical order

I have the following code to loop through the files of a folder. Files are named 1.txt, 2.txt all the way to 15.txt
for file in .solutions/*; do
if [ -f "$file" ]; then
echo "test case ${file##*/}:"
cat ./testcases/${file##*/}
echo
echo "result:"
cat "$file"
echo
echo
fi
done
My issue: I get 1.txt, then 10.txt to 15.txt displayed.
I would like them to be displayed in numerical order instead of lexicographical order; in other words, I want the loop to iterate through the files in numerical order. Is there any way to achieve this?
ls *.txt | sort -n
This would solve the problem, provided .solutions is a directory and no directory is named with an extension .txt.
and if you want complete accuracy,
ls -al *.txt | awk '$0 ~ /^-/ {print $9}' | sort -n
Update:
As per your edits,
you can simply do this,
ls | sort -n |
while read file
do
#do whatever you want here
:
done
Looping through ls is usually a bad idea since file names can have newlines in them. Redirecting using process substitution instead of piping the results will keep the scope the same (variables you set will stay after the loop).
#!/usr/bin/env bash
while IFS= read -r -d '' file; do
echo "test case ${file##*/}:"
cat "./testcases/${file##*/}"
echo
echo "result:"
cat "$file"
echo
echo
done < <(find '.solutions/' -name '*.txt' -type f -print0 | sort -nz)
Setting IFS to "" keeps leading/trailing spaces, -r stops backslashes from messing things up, and -d '' uses NUL instead of newline as the delimiter.
The find command looks for normal files (-type f), so the if [ -f "$file" ] check isn't needed. It finds -name '*.txt' files in .solutions/ and prints them NUL-terminated (-print0).
The sort command accepts NUL-terminated strings with the -z option, and sorts them numerically with -n.
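If the names really are 1.txt through 15.txt, a plain numeric loop sidesteps sorting entirely. A sketch; adjust the upper bound to your file count:
for (( i=1; i<=15; i++ )); do
    file=".solutions/$i.txt"
    [ -f "$file" ] || continue
    echo "test case $i.txt:"
    cat "./testcases/$i.txt"
    echo
    echo "result:"
    cat "$file"
    echo
done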

For this statement, why could I get the error "[: too many arguments"? I am sure I use the "[ ]" correctly in the shell [duplicate]

This question already has answers here:
Test whether a glob has any matches in Bash
(22 answers)
Closed 4 years ago.

Shell how to use a command on every file in a directory

So what I have to do is find all regular files within and below the directory. For each of these regular files, I have to egrep the output of the file command for the pattern ($ARG) and find out if it matches; if it does, add one to the counter.
What I have so far is the file command:
$count = 0
file *
However, I am having trouble getting egrep $ARG > /dev/null ; echo $? to run through each file that appears from (file *).
I understand that file * | egrep directory > /dev/null ; echo $? will output 0 because it finds the pattern 'directory' in the output, but I am having trouble getting it to loop through each regular file so I can add one to the counter every time the pattern is matched.
The question is not clear, but if you're looking for the number of files containing a pattern,
grep -l "pattern" * 2>/dev/null | wc -l
will give you that. Errors coming from directories are ignored.
If you want to recursively do the complete tree, including dot files:
grep -r -l "pattern" | wc -l
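To capture the count in a variable, in the spirit of the original counter ($ARG stands for the pattern from the question):
count=$(grep -r -l -e "$ARG" . 2>/dev/null | wc -l)
echo "$count files contain the pattern"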
You can try this:
counter=0
# Process substitution (instead of piping find into while) keeps the loop
# in the current shell, so the counter survives past the loop.
while IFS= read -r file
do
    if grep -i -q -e "$pattern" "$file"
    then counter=$((counter+1))
    fi
done < <(find /path/to/directory/ -type f)
echo "$counter"
See http://mywiki.wooledge.org/BashFAQ/020
counter=0
shopt -s globstar nullglob
for file in **; do
grep -qiE "$pattern" "$file" && ((counter++))
done
echo "$counter"
If you want to include hidden files, add shopt -s dotglob

How to use grep in a for loop

Could someone please help with this script? I need to use grep to loop through the filenames that need to be changed.
#!/bin/bash
file=
for file in $(ls $1)
do
grep "^.old" | mv "$1/$file" "$1/$file.old"
done
bash can handle regular expressions without using grep.
for f in "$1"/*; do
[[ $f =~ \.old ]] && continue
# Or a pattern instead
# [[ $f == *.old* ]] && continue
mv "$f" "$f.old"
done
You can also move the name checking into the pattern itself:
shopt -s extglob
for f in "$1/"!(*.old*); do
mv "$f" "$f.old"
done
If I understand your question correctly, you want to rename a file (i.e., dir/file.txt ==> dir/file.old) only if it has not been renamed before. The solution is as follows.
#!/bin/bash
for file in "$1/"*
do
backup_file="${file%.*}.old"
if [ ! -e "$backup_file" ]
then
echo mv "$file" "$backup_file"
fi
done
Discussion
The script currently does not actually rename anything; it only displays the action. Run the script once and examine the output. If this is what you want, then remove the echo from the script and run it again.
Update
Here is the no if solution:
ls "$1/"* | grep -v ".old" | while read file
do
echo mv "$file" "${file}.old"
done
Discussion
The ls command displays all files.
The grep command filters out the files that have the .old extension, so they won't be displayed.
The while loop reads the file names that do not have the .old extension, one by one, and renames them.
