BASH recursive find filename and copy to FTP - bash

I am working on a Bash script (see below) that recursively searches through directories on a SAN for files with a specific file name that are newer than 4 hours, then copies all of these files to a specific FTP location and sends an email to say the copy has completed. The script works fine except that it only copies files in the top-level directory. The error I am getting on the lower directories is this:
remote: -v
ftp: local: -v: No such file or directory
local: ./Test01/test02/folder02_01_1200_m30.mp4 remote: ./Test01/test02/folder02_01_1200_m30.mp4
229 Entering Extended Passive Mode (|||45127|)
550 ./Test01/test02/folder02_01_1200_m30.mp4: File does not exist. (2)
221 Goodbye.
Here is the script:
#!/bin/bash
# The location from where the script should search
GSPORIGIN='/Volumes/folder01/folder02'
# File names to be moved
FILE1='*1200_m30.mp4'
# FTP details
HOST='xxxx.upload.com'
USER='xxxxxxx'
PASSWD='xxxxxxxxxxxx'
# The destination directory on the FTP server
DESTDIR="/8619/_!/TEST"
# Go to the location from where the search should start
cd $GSPORIGIN
for file in `find . -type f -name "*1200_m30.mp4" -mmin -240`
do
  echo $file
  if [ -f $file ] ; then
    ftp -n -v $HOST << EOT
ascii
user $USER $PASSWD
prompt
cd $DESTDIR
mput -v $file
EOT
    echo "$file has been copied to FTP" | mail -s "$file has been copied to FTP in Directory $DESTDIR" xxx.xxx@xxx.com
  else
    exit 1
  fi
done

To do what you're doing, you'll have to recreate the directory structure on the destination FTP server.
Use the basename/dirname commands and a mkdir command like this:
for file in `find . -type f -name "*1200_m30.mp4" -mmin -240`
do
  echo $file
  if [ -f $file ] ; then
    destdirname=`dirname "$file"`
    ftp -n -v $HOST << EOT
ascii
user $USER $PASSWD
prompt
cd $DESTDIR
mkdir $destdirname
mput -v $file
EOT
    echo "$file has been copied to FTP" | mail -s "$file has been copied to FTP in Directory $DESTDIR" xxx.xxx@xxx.com
  else
    exit 1
  fi
done

To copy multiple files in nested directories, I would suggest you look at the rsync utility to do this job for you.
rsync will create all the remote directories whenever needed and will keep the files completely in sync even after frequent runs.
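A minimal sketch of that approach, assuming the destination host also accepts SSH (rsync cannot talk to a plain FTP server); the host, user, and paths are reused from the question as placeholders:
cd /Volumes/folder01/folder02
# Feed rsync the same 4-hour file list; --files-from recreates the
# directory structure on the remote side automatically.
find . -type f -name '*1200_m30.mp4' -mmin -240 -print0 |
  rsync -av --from0 --files-from=- . 'xxxxxxx@xxxx.upload.com:/8619/_!/TEST/'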

Related

Bash script save csv in array and search for part of string

After editing my script, I would like to briefly explain what I want to do:
Check if files are in the folder
Look at the beginning of the file name
Search for files less than 1 hour old
Take the file and run sqlldr on it; if this succeeds, move the file to another folder, if not, send a mail
This is my script. Can someone please tell me if this is going to work? I am not sure about the syntax, and also not sure whether steps 3 and 4 (send mail) work like this.
#!/bin/sh
# check if files are in folder
declare -a arrCSV # create array
for file in *.csv
do
  arrCSV=("${CSV[@]}" "$file")
done
shopt -s nullglob
for file in read*.csv; do
  # run on all files starting with "read" and ending with ".csv"
  for find $LOCATION -name $file -type f -mmin -60 do
    if
      sqlldr read*.csv
    then mv "$file" "$HOME/fail/" ;
    else { echo "Failed to load" | mail -s "FAIL" email@email.com }
  done
done
for file in write*.csv; do
  # run on all files starting with "write" and ending with ".csv"
  for find $LOCATION -name $file -type f -mmin -60 do
    if
      sqlldr write*.csv
    then mv "$filen" "$HOME/unisem/fail/" ;
    else { echo "Failed to load 2" | mail -s "FAIL" email@email.com }
  done
done
You don't need an array if the read... and write... files can be processed in any order:
shopt -s nullglob
for file in read*.csv; do
  # run on all files starting with "read" and ending with ".csv"
  sqlldr ...
done
for file in write*.csv; do
  # run on all files starting with "write" and ending with ".csv"
  sqlldr ...
done
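For the age check and the mail step, here is a minimal sketch of the whole flow, hedged: the sqlldr credentials, control file, destination folder, and mail address below are placeholders, not from the original post:
#!/bin/bash
shopt -s nullglob
for file in read*.csv write*.csv; do
  # Skip files older than one hour (-mmin -60 matches files
  # modified within the last 60 minutes).
  [ -n "$(find "$file" -mmin -60)" ] || continue
  if sqlldr userid=user/pass control=load.ctl data="$file"; then
    mv "$file" "$HOME/processed/"   # success: move the file away
  else
    echo "Failed to load $file" | mail -s "FAIL" email@email.com
  fi
done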

Bash script delete file inside another folder if not present in both

The goal of the script is to check whether a file name exists inside a folder. If the file name does NOT exist, then delete the file.
This is the script I have so far:
#!/bin/bash
echo "What's the folder name?"
read folderName
$fileLocation="/home/daniel/Dropbox/Code/Python/FR/alignedImages/$folderName"
for files in "/home/daniel/Dropbox/Code/Python/FR/trainingImages/$folderName"/*
do
  fileNameWithFormatFiles=${files##*$folderName/}
  fileNameFiles=${fileNameWithFormat%%.png*}
  for entry in "/home/daniel/Dropbox/Code/Python/FR/alignedImages/$folderName"/*
  do
    fileNameWithFormat=${entry##*$folderName/}
    fileName=${fileNameWithFormat%%.png*}
    if [ -f "/home/daniel/Dropbox/Code/Python/FR/alignedImages/$fileNameFiles.jpg" ]
    then
      echo "Found File"
    else
      echo $files
      rm -f $files
    fi
  done
done
read
I have two folders, alignedImages and trainingImages.
All of the images in alignedImages will be inside trainingImages, but not the other way around. So I'm trying to make it so that if trainingImages does not contain a file with the same name as a file in alignedImages, the file in trainingImages gets deleted.
Also, the pictures are not the same, so I can't just compare md5s or other hashes. Only the file names are the same, except they are .jpg instead of .png.
echo "What's the folder name?"
read folderName
fileLocation="/home/daniel/Dropbox/Code/Python/FR/alignedImages/$folderName"
rsync --delete --ignore-existing "$fileLocation" "$folderName"
The rsync command is what you are looking for: when given the --delete option it will delete from the destination dir any file that doesn't exist in the source dir, and --ignore-existing will make rsync skip copying files from the source if a file with the same name already exists in the destination dir.
The side effect of this is that it would copy any file that is in the source dir but not in the destination. You say all files in the source are in the destination, so I guess that's OK.
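A minimal sketch with both directories spelled out, assuming the intent is to mirror alignedImages into trainingImages for the chosen folder (note that rsync compares file names literally, so the .png/.jpg difference would still have to be handled separately):
echo "What's the folder name?"
read folderName
# Trailing slashes make rsync sync the contents of the directories.
rsync -a --delete --ignore-existing \
  "/home/daniel/Dropbox/Code/Python/FR/alignedImages/$folderName/" \
  "/home/daniel/Dropbox/Code/Python/FR/trainingImages/$folderName/"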
There is a better way: lists of files, not for loops!
#!/bin/bash
echo "What's the folder name?"
read folderName
cd "/home/daniel/Dropbox/Code/Python/FR/alignedImages/$folderName"
find . -type f -name "*.png" | sed 's/\.png//' > /tmp/align.list
cd "/home/daniel/Dropbox/Code/Python/FR/trainingImages/$folderName"
find . -type f -name "*.jpg" | sed 's/\.jpg//' > /tmp/train.list
Here's how to find files that are in both lists:
fgrep -f /tmp/align.list /tmp/train.list | sed 's/.*/&.jpg/' > /tmp/train_and_align.list
fgrep -v finds non-matches instead of matches, so this finds files that are in train but not in align:
fgrep -v -f /tmp/align.list /tmp/train.list | sed 's/.*/&.jpg/' > /tmp/train_not_align.list
test delete of all files in train_not_align.list:
cd "/home/daniel/Dropbox/Code/Python/FR/trainingImages/$folderName"
cat /tmp/train_not_align.list | tr '\n' '\0' | xargs -0 echo rm -f
(if this produces good output, remove the echo statement to actually delete those files.)

How to untar specific files from a number of tar files and zip them?

The requirement is to extract all the *.properties files from multiple tars and put them into a zip.
I tried this:
find . -iwholename "*/ext*/*.tar.gz" | xargs -n 1 tar --wildcards '*.properties' -xvzf | zip -@ tar-properties.zip
This is creating a zip with the .properties files in all the tars.
But the issue is that each tar contains a properties folder which holds the files, so the above command creates a zip with a single properties folder containing all the files.
Is there a way to put these in the zip with a folder structure like {name of the tar}/properties/*.properties ?
You could use this script. My solution uses --transform as well. Please check first if your tar command supports it with tar --help 2>&1 | grep -Fe --transform.
#!/bin/bash
[ -n "$BASH_VERSION" ] || {
  echo "You need bash to run this script." >&2
  exit 1
}
TEMPDIR=/tmp/properties-files
OUTPUTFILE=$PWD/tar-properties.zip ## Must be an absolute path.
IFS=
if [[ ! -d $TEMPDIR ]]; then
  mkdir -p "$TEMPDIR" || {
    echo "Unable to create temporary directory $TEMPDIR." >&2
    exit 1
  }
fi
NAMES=()
while read -r FILE; do
  NAMEOFTAR=${FILE##*/} ## Remove dir part.
  NAMEOFTAR=${NAMEOFTAR%.tar.gz} ## Remove .tar.gz extension.
  echo "Extracting $FILE."
  tar --wildcards '*.properties' -xvzf "$FILE" -C "$TEMPDIR" --transform "s#.*/#${NAMEOFTAR//#/\\#}/properties/#" || {
    echo "An error occurred extracting to $TEMPDIR." >&2
    exit 1
  }
  NAMES+=("$NAMEOFTAR")
done < <(exec find . -type f -iwholename '*/ext*/*.tar.gz')
(
  cd "$TEMPDIR" >/dev/null || {
    echo "Unable to change directory to $TEMPDIR."
    exit 1
  }
  zip -r "$OUTPUTFILE" "${NAMES[@]}"
)
Save it to a script, then run it on the directory where those files are to be searched with:
bash /path/to/script.sh
You can probably do the trick with the tar option --transform (alias --xform). This option lets you manipulate paths with a sed expression (--show-transformed-names makes the verbose listing print the new paths, so zip -@ sees them):
find . -iwholename "*/ext*/*.tar.gz" | xargs -n 1 tar --wildcards '*.properties' --xform 's#.*/#name_of_the_tar/properties/#' --show-transformed-names -xvzf | zip -@ tar-properties.zip

Unix - Shell script to find a file from any directory and move it

I'm currently working through an exercise book and I have to create a shell script that will find a file in any directory and move it.
I am having difficulties because the file could be in any directory (so I do not have a path to find it). I have used find with the -print flag, but what would be the next step to move the file using the mv command?
My code so far reads in a variable and detects whether a file has been entered, whether it is a file or a directory, and whether it exists.
The next stage, as mentioned above, is to find the file and then move it into a "test" folder.
If anyone has any recommendations, it would be greatly appreciated.
#!/bin/bash
bin="deleted"
if [ ! -e $bin ] ; then
  mkdir $bin
fi
file=$1
# error to display that no file has been entered
if [[ ! $file ]]; then
  echo "no file has been entered"
fi
# file does not exist, display error message
if [[ ! -f $file ]]; then
  echo "$file does not exist!"
fi
# check to see if the input is a directory
if [[ -d $file ]]; then
  echo "$file is a directory!"
if [[ -e $file ]]; then *** move to test folder
****This is where I am having the problems
find / -type f -name FILENAME | xargs -I foobar echo mv foobar /tmp (remove the echo to make the command actually work; I put it there just to save you from accidentally moving files while trying the command out).
Note that -I foobar means that in mv foobar /tmp the string foobar is replaced with the full path of the file found.
For example, try: find / -type f -name FILENAME | xargs -I foobar echo foobar is a cool file
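A minimal sketch tying this to the question's "test" folder, hedged: the destination path is a placeholder, and the echo is left in so nothing moves until you remove it:
# Find the named file anywhere under / and move it into the test folder.
find / -type f -name "$file" | xargs -I found_path echo mv found_path "$HOME/test/"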

Bash script to backup files to remote FTP. Deleting old files

I'm writing a bash script to send backups to a remote ftp server. The backup files are generated with a WordPress plugin so half the work is done for me from the start.
The script does several things.
It looks in the local backup dir for any files older than x and deletes them
It connects to FTP and puts the backup files in a dir with the current date as a name
It deletes any backup dirs for backups older than x
As I am not fluent in bash, this is a mishmash of a bunch of scripts I found around the net.
Here is my script:
#!/bin/bash
BACKDIR=/var/www/wp-content/backups
#----------------------FTP Settings--------------------#
FTP=Y
FTPHOST="host"
FTPUSER="user"
FTPPASS="pass"
FTPDIR="/backups"
LFTP=$(which lftp) # Path to binary
#-------------------Deletion Settings-------------------#
DELETE=Y
DAYS=3 # how many days of backups do you want to keep?
TODAY=$(date --iso) # Today's date like YYYY-MM-DD
RMDATE=$(date --iso -d $DAYS' days ago') # TODAY minus X days - too-old files
#----------------------End of Settings------------------#
if [ -e $BACKDIR ]
then
  if [ $DELETE = "Y" ]
  then
    find $BACKDIR -iname '*.zip' -type f -mtime +$DAYS -delete
    echo "Old files deleted."
  fi
  if [ $FTP = "Y" ]
  then
    echo "Initiating FTP connection..."
    cd $BACKDIR
    $LFTP << EOF
open ${FTPUSER}:${FTPPASS}@${FTPHOST}
mkdir $FTPDIR
cd $FTPDIR
mkdir ${TODAY}
cd ${TODAY}
mput *.zip
cd ..
rm -rf ${RMDATE}
bye
EOF
    echo "Done putting files to FTP."
  fi
else
  echo "No Backup directory."
  exit
fi
There are 2 specific things I can't get done:
The find command doesn't delete any of the old files in the local backup dir.
I would like mput to only put the .zip files that were created today.
Thanks in advance for the help.
To send only zip files that were created today:
MPUT_ZIPS="$(find $BACKDIR -maxdepth 1 -iname '*.zip' -type f -mtime -1 | sed -e 's/^/mput /')"
[...]
$LFTP << EOF
open ${FTPUSER}:${FTPPASS}@${FTPHOST}
mkdir $FTPDIR
cd $FTPDIR
mkdir ${TODAY}
cd ${TODAY}
${MPUT_ZIPS}
cd ..
rm -rf ${RMDATE}
bye
EOF
Hope this helps =)
2) If you put today's backup files in a separate directory, or link them into one, you can cd into that directory and just transfer those files.
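A minimal sketch of that second idea, hedged: the staging directory name is a placeholder, and hard links are assumed to work on the backup filesystem:
STAGE=$BACKDIR/today
mkdir -p "$STAGE"
rm -f "$STAGE"/*.zip
# Link today's zips (modified within the last 24 hours) into the staging dir.
find "$BACKDIR" -maxdepth 1 -iname '*.zip' -type f -mtime -1 -exec ln {} "$STAGE/" \;
cd "$STAGE"
# ...then the lftp session only needs: mput *.zip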
