I'm trying to write a bash script to search for files. If a file is found, the script should copy it; otherwise the script should print a message to notify me that it wasn't found.
#!/bin/bash
result=/home/images/newfolder/
while read -r $FILE
do
    FOUND="($find $(pwd) -name "$FILE"* -type f print -quit)"
    if [ "x$FOUND" != "x" ]
    then
        echo "copying file: $FOUND"
        cp "$FOUND" $result
    else
        echo "NOT FOUND: $FILE"
    fi
done </root/filelist.txt
FOUND 11234567890.jpeg
NOT FOUND 1890.jpeg
FOUND 183290.jpeg
This is a working script, assuming that you have a /tmp/filelist.txt containing what you are looking for and that the files are going to be copied to /tmp. Feel free to edit and use it.
#!/bin/bash
result=/tmp
while read -r FILE
do
    FOUND=$(find "$(pwd)" -name "$FILE" -type f)
    if [ -z "$FOUND" ]; then
        echo "NOT FOUND: $FILE"
    else
        echo "copying file: $FOUND"
        cp "$FOUND" "$result"
    fi
done < /tmp/filelist.txt
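For example, you could drive it with the file names from the question's sample output; copy-found.sh here is just a hypothetical name for the script above, run from the directory you want searched:

$ printf '%s\n' 11234567890.jpeg 1890.jpeg 183290.jpeg > /tmp/filelist.txt
$ cd /path/to/your/images      # the script searches under the current directory
$ bash copy-found.sh           # hypothetical name for the script above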
I am currently running a script (check_files.sh) that may or may not produce match* files (e.g. match1.txt, match2.txt, etc.). I then want to go on to run an R script on the produced match* files.
However, before I run the R script, I have tried a line in my script which checks if the files are present.
if [ -s match* ]
then
    for f in match*
    do
        Rscript --vanilla format_matches.R ${f}
        rm match*
    done
else
    echo "No matches present"
fi
However, I keep getting an error message (as there are often a lot of match files produced):
./check_files.sh: line 52: [: too many arguments
Is there an alternative to [ -s match* ] which would not throw up the error message? I take it that the error message appears as there are multiple match* files produced.
If you expect filenames with spaces, this is a bit ugly but robust:
found=0
# feed find's output in via process substitution (not a pipe),
# so that found=1 set inside the loop is still visible afterwards
while IFS= read -rd $'\0' f
do
    found=1
    Rscript --vanilla format_matches.R "$f"
    rm "$f"
done < <(find . -maxdepth 1 -type f -name 'match*' -print0)
if [ "$found" -eq 0 ]; then
    >&2 echo "No matches present"
fi
You could change your logic to the following:
found=0
for f in match*; do
    [ -e "$f" ] || continue
    found=1
    Rscript --vanilla format_matches.R "$f"
    rm "$f"
done
if [ "$found" -eq 0 ]; then
    >&2 echo "No matches present"
fi
Is there an alternative to [ -s match* ] which would not throw up the error message?
The following works for my example file match1.txt
if [ -f ~/match* ]; then
    echo "yeha"
fi
I have a music archive with lots of folders and sub-folders (Cover Art etc.), so instead of manually removing hundreds of Folder.jpg, Desktop.ini and Thumb.db files, I decided to write a simple bash script, but things got really messy.
I did a simple test by creating dummy folders like this:
/home/dummy/
    sub1/
        sub1sub1/
            sub1sub1sub1/
            sub1sub1sub2/
    sub2/
        sub2sub1/
        sub2sub2/
            sub2sub2sub1/
and copied some random .jpg, .mp3 and .ini files across these folders. My bash script currently looks like this:
function delete_jpg_ini_db {
    if [[ $f == *.jpg ]]; then
        echo ".jpg file, removing $f"
        gvfs-trash $f
    elif [[ $f == *.ini ]]; then
        echo ".ini file, removing $f"
        gvfs-trash -f $f
    elif [[ $f == *.db ]]; then
        echo ".db file, removing $f"
        gvfs-trash -f $f
    else
        echo "not any .jpg, .ini or .db file, skipping $f"
    fi
}
function iterate_dir {
    for d in *; do
        if [ -d $d ]; then
            echo "entering sub-directory: $d" && cd $d
            pwd
            for f in *; do
                if [ -f $f ]; then #check if .jpg, .ini or .db, if so delete
                    delete_jpg_ini_db
                elif [ -d $f ]; then #enter sub-dir and iterate again
                    if [ "$(ls -A $f)" ]; then
                        iterate_dir
                    else
                        echo "sub-directory $f is empty!"
                    fi
                fi
            done
        fi
    done
}
pwd
iterate_dir
When I run it, it successfully iterates through sub1, sub1sub1 and sub1sub1sub1, but it halts there instead of going back to home and searching sub2 next.
I am new to Bash scripting, all help is appreciated.
Thanks.
You can do it all in one command:
find /home/dummy/sub1 \( -name "*.jpg" -o -name "*.ini" -o -name "*.db" \) -delete
And if you want to see which files would be deleted, replace -delete with -print (just filenames) or with -ls (like ls -l output).
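For example, a dry run of the same command, assuming the dummy tree from the question (the parentheses matter: without them, -delete would only apply to the last -name pattern):

# preview what would be removed, without deleting anything
find /home/dummy/sub1 \( -name "*.jpg" -o -name "*.ini" -o -name "*.db" \) -print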
Here is the changed code:
function delete_jpg_ini_db {
    if [[ $f == *.jpg ]]; then
        echo ".jpg file, removing $f"
        gvfs-trash "$f"
    elif [[ $f == *.ini ]]; then
        echo ".ini file, removing $f"
        gvfs-trash -f "$f"
    elif [[ $f == *.db ]]; then
        echo ".db file, removing $f"
        gvfs-trash -f "$f"
    else
        echo "not any .jpg, .ini or .db file, skipping $f"
    fi
}
function iterate_dir {
    for d in *; do
        if [ -d "$d" ]; then
            echo "entering sub-directory: $d" && cd "$d"
            pwd
            for f in *; do
                if [ -f "$f" ]; then #check if .jpg, .ini or .db, if so delete
                    delete_jpg_ini_db
                elif [ -d "$f" ]; then #enter sub-dir and iterate again
                    if [ "$(ls -A "$f")" ]; then
                        iterate_dir
                    else
                        echo "sub-directory $f is empty!"
                    fi
                fi
            done
            cd .. #go back up after finishing this sub-directory
        fi
    done
}
pwd
iterate_dir
Mistakes:
You did not have support for file names with spaces in them.
You did not navigate back (cd ..) after your inner for loop.
Try it...
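As a side note (my suggestion, not part of the original answer), if you'd rather not manage the cd .. yourself, here is a small sketch of iterate_dir using pushd/popd so the directory stack handles the navigation; it assumes the same delete_jpg_ini_db function as above:

function iterate_dir {
    for d in *; do
        if [ -d "$d" ]; then
            pushd "$d" > /dev/null   # enter the sub-directory, remembering where we came from
            pwd
            for f in *; do
                if [ -f "$f" ]; then
                    delete_jpg_ini_db
                elif [ -d "$f" ]; then
                    iterate_dir
                fi
            done
            popd > /dev/null         # return to the parent directory
        fi
    done
}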
I am trying to find out if a file exists in an iPhone application directory.
Unfortunately, the app's directory differs from one device to another.
On my device, I use the following command to see if the file exists:
if [[ -f "/var/mobile/Applications/D0D2B991-3CDA-457B-9187-1F02A84FF3AB/AppName.app/filename.txt" ]]; then
echo "The File Exists";
else
echo "The File Does Not Exist";
fi
I want a command that would automatically check whether the file exists, without the need to specify the "variable" name inside the path.
I tried this:
if [[ -f "/var/mobile/Applications/*/AppName.app/filename.txt" ]]; then
echo "The File Exists";
else
echo "The File Does Not Exist";
fi
But no luck, it didn't find the file.
Maybe that's because I have two paths matching /var/mobile/Applications/*/AppName.app/, since I have cloned the app.
I would like a way to find out whether the file filename.txt exists inside any folder named AppName.app under any directory matching /var/mobile/Applications/*/.
You can do this as follows:
[[ $(find /var/mobile/Applications/*/AppName.app/ -name filename.txt -print -quit | wc -l) -gt 0 ]] && echo "The File Exists" || echo "The File Does Not Exist"
The -f test can only take one argument. You would need to put it in a loop to check whether the glob matches anything and whether one of its matches is a regular file, i.e.:
shopt -s nullglob
found=
for file in /var/mobile/Applications/*/AppName.app/filename.txt; do
    [[ -f $file ]] && found=: && break
done
[[ -n $found ]] && echo "The File Exists" || echo "The File Does Not Exist"
If you're not sure specifically where the file is located, you can use find, doing something like the following, which will exit early once a match is found (this should work for GNU find; I haven't tested it on BSD).
if [[ -f $(find /some_root_directory -type f -name 'filename.txt' -print -quit) ]]; then
    echo "The File Exists"
else
    echo "The File Does Not Exist"
fi
# if a glob matches nothing, remove it instead of leaving the literal glob
shopt -s nullglob
# stick all matches in an array
files=( /var/mobile/Applications/*/AppName.app/filename.txt )
case "${#files[#]}" in
0 ) echo "Sorry, no such file." ;;
1 ) echo "The file exists: ${files[0]}" ;;
* ) echo "There are multiple files matching this pattern: ${files[*]}" ;;
esac
I like this technique for this purpose:
if find /var/mobile/Applications/*/AppName.app/ -name filename.txt -print -quit | grep -q .; then
    echo "The File Exists"
else
    echo "The File Does Not Exist"
fi
This has some advantages over this form:
[[ $(find ..... -print -quit | wc -l) -gt 0 ]]
Because:
It doesn't need a $() subshell
It doesn't need to count lines with wc
It doesn't need to compare numbers with the -gt operator
It doesn't need to be inside a [[ ... ]]
Basically it's a find ... | grep -q . versus [[ $(find ... | wc -l) -gt 0 ]]
Or find ... | grep -q . versus [[ -f $(find ...) ]]
I'm trying to put together a very simple bash script that will do the following actions (the script I have so far doesn't seem to function at all, so I won't waste your time by posting it here).
I need it to find files by their names. The script should take the user's input and search the .waste directory for a match; should the folder be empty, it should echo "No match was found because the folder is empty!", and if it simply fails to find a match, just "No match found."
I have defined: target=/home/user/bin/.waste
You can use the standard find command to do this:
find /path/to/your/.waste -name 'filename.*' -print
Alternatively, you can define it as a function in your .bash_profile:
searchwaste() {
    find /path/to/your/.waste -name "$1" -print
}
Note that there are quotes around the $1. This lets you pass glob patterns through to find instead of having the shell expand them.
searchwaste "*.txt"
The above command would search your .waste directory for any .txt files
Here you go, pretty straightforward script:
#!/usr/bin/env bash
target=/home/user/bin/.waste
if [ ! "$(ls -A "$target")" ]; then
    echo -e "Directory $target is empty"
    exit 0
fi
found=0
while IFS= read -r line; do
    found=$((found+1))
    echo -e "Found: $line"
done < <(find "$target" -iname "*$1*")
if [[ "$found" == "0" ]]; then
    echo -e "No match for '$1'"
else
    echo -e "Total: $found elements"
fi
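For instance, saved as waste-search.sh (a hypothetical name), you would pass the name fragment to look for as the first argument:

# lists every entry in .waste whose name contains "report"
bash waste-search.sh report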
Btw. in the *nix world there are no folders, only directories :)
This is a solution.
#!/bin/bash
target="/home/user/bin/.waste"
read -r name
output=$( find "$target" -name "$name" 2> /dev/null )
if [[ -n "$output" ]]; then
    echo "$output"
else
    echo "No match found"
fi
I am writing a shell script. My script goes into a directory, but I want it to proceed and execute the next commands only if the directory contains any data (i.e. the directory is not empty). Otherwise it shouldn't go any further. How can I specify such a condition in my shell script?
use test with "$(ls -A "$DIR")", like:
if [ "$(ls -A "$DIR")" ]; then
    echo "directory is not empty"
else
    echo "directory is empty"
fi
if [ "$(find dir -type f | wc -l)" = "0" ]
then
    echo no files
else
    echo files
fi
This'll check dir and its subdirectories; find dir -maxdepth 1 -type f will check only dir itself.
If you want to count subdirectories as well as files:
find dir -mindepth 1 -maxdepth 1
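Putting that together, a small sketch of the same emptiness check that counts both files and subdirectories (dir is just a placeholder path):

# count every entry directly inside dir, files and subdirectories alike
if [ "$(find dir -mindepth 1 -maxdepth 1 | wc -l)" = "0" ]
then
    echo "dir is empty"
else
    echo "dir is not empty"
fi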
if [ "$(ls -A $DIR)" ]; then
echo "Take action $DIR is not Empty"
else
echo "$DIR is Empty"
fi