Bash script to back up files to a remote FTP server, deleting old files - bash

I'm writing a bash script to send backups to a remote FTP server. The backup files are generated with a WordPress plugin, so half the work is done for me from the start.
The script does several things:
It looks in the local backup dir for any files older than x days and deletes them.
It connects to the FTP server and puts the backup files in a dir named with the current date.
It deletes any remote backup dirs for backups older than x days.
As I am not fluent in bash, this is a mishmash of a bunch of scripts I found around the net.
Here is my script:
#! /bin/bash
BACKDIR=/var/www/wp-content/backups
#----------------------FTP Settings--------------------#
FTP=Y
FTPHOST="host"
FTPUSER="user"
FTPPASS="pass"
FTPDIR="/backups"
LFTP=$(which lftp) # Path to binary
#-------------------Deletion Settings-------------------#
DELETE=Y
DAYS=3 # how many days of backups do you want to keep?
TODAY=$(date --iso) # Today's date like YYYY-MM-DD
RMDATE=$(date --iso -d $DAYS' days ago') # TODAY minus X days - too old files
#----------------------End of Settings------------------#
if [ -e $BACKDIR ]
then
    if [ $DELETE = "Y" ]
    then
        find $BACKDIR -iname '*.zip' -type f -mtime +$DAYS -delete
        echo "Old files deleted."
    fi
    if [ $FTP = "Y" ]
    then
        echo "Initiating FTP connection..."
        cd $BACKDIR
        $LFTP << EOF
open ${FTPUSER}:${FTPPASS}@${FTPHOST}
mkdir $FTPDIR
cd $FTPDIR
mkdir ${TODAY}
cd ${TODAY}
mput *.zip
cd ..
rm -rf ${RMDATE}
bye
EOF
        echo "Done putting files to FTP."
    fi
else
    echo "No Backup directory."
    exit
fi
There are two specific things I can't get done:
1. The find command doesn't delete any of the old files in the local backup dir.
2. I would like mput to only put the .zip files that were created today.
Thanks in advance for the help.

To send only zip files that were created today:
MPUT_ZIPS="$(find "$BACKDIR" -maxdepth 1 -type f -iname '*.zip' -mtime -1 | sed -e 's/^/mput /')"
[...]
$LFTP << EOF
open ${FTPUSER}:${FTPPASS}@${FTPHOST}
mkdir $FTPDIR
cd $FTPDIR
mkdir ${TODAY}
cd ${TODAY}
${MPUT_ZIPS}
cd ..
rm -rf ${RMDATE}
bye
EOF
Hope this helps =)

2) If you put today's backup files in a separate directory, or link them into one, you can cd into that directory and just transfer those files.
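A minimal sketch of that second approach, assuming lftp and the same variables as the script above; the staging directory name is purely illustrative:
#!/bin/bash
# Sketch: hard-link today's zips into a staging directory and upload only those.
# BACKDIR, FTPUSER, FTPPASS, FTPHOST, FTPDIR and TODAY are assumed to be set
# as in the original script; "staging" is just an example name.
STAGING="$BACKDIR/staging"
mkdir -p "$STAGING"
# Link files modified in the last 24 hours (same filesystem assumed for ln).
find "$BACKDIR" -maxdepth 1 -type f -iname '*.zip' -mtime -1 -exec ln {} "$STAGING/" \;
cd "$STAGING"
lftp -u "$FTPUSER,$FTPPASS" "$FTPHOST" <<EOF
mkdir -p $FTPDIR/$TODAY
cd $FTPDIR/$TODAY
mput *.zip
bye
EOF
rm -f "$STAGING"/*.zip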

Related

Shell script to archive & delete files older than 5 days based on created date of the files

I am trying to compress five days' worth of logs at a time, move the compressed files to another location, and delete the original log files. I need a bash script to accomplish this. I got the files compressed using the command below, but I am not able to move them to the archive folder. I also need to compress based on the files' creation date; right now it compresses all files starting with a specific name.
#!/bin/bash
cd "C:\Users\ann\logs"
for filename in acap*.log*; do
# this syntax emits the value in lowercase: ${var,,*} (bash version 4)
mkdir -p archive
gzip "$filename_.zip" "$filename"
mv "$filename" archive
done
#!/bin/bash
mkdir -p archive
for file in $(find . -maxdepth 1 -mtime +3 -type f -printf "%f ")
do
    if [[ "$file" =~ ^acap.*\.log$ ]]
    then
        tar -czf "archive/${file}.tar.gz" "$file"
        rm "$file"
    fi
done
This finds all files in the current directory that are older than three days and match the regex, compresses each one into its own tar.gz archive, and then deletes the original files.
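If filenames may contain spaces, a null-delimited variant is safer. A sketch under the same assumptions (GNU find, log names like acap*.log in the current directory):
#!/bin/bash
# Sketch: null-delimited file list so names with spaces survive word splitting.
mkdir -p archive
find . -maxdepth 1 -type f -name 'acap*.log' -mtime +3 -print0 |
while IFS= read -r -d '' file
do
    base=$(basename "$file")
    tar -czf "archive/${base}.tar.gz" "$base" && rm "$file"
done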

Bash script upload to ftp & delete files 7 days and older

The script is unable to delete anything older than 7 days on the FTP server. It gives the error "550 Can't remove directory: No such file or directory". All of the FTP backups are just in the main directory.
#!/bin/sh
USERNAME="user"
PASSWORD="pass"
SERVER="ftpbackupservergoeshere"
NOW="$(date +'%m-%d-%Y')"
DAYS=7
RMDATE=$(date --iso -d $DAYS' days ago')
# local directory to pickup *.tar.gz file
FILE="/root/$NOW.tar.gz"
cd /home/minecraft/multicraft/servers
find . -name \*.log -type f -delete
find . -name \*.log.gz -type f -delete
cd
tar -zcf $NOW.tar.gz /home/minecraft
# login to remote server
ftp -n -v $SERVER <<EOF
user $USERNAME $PASSWORD
binary
put $FILE $NOW.tar.gz
cd ..
rm -rf ${RMDATE}
bye
EOF
rm /root/$NOW.tar.gz
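For what it's worth, two details look suspect here: the backups are uploaded as plain files named with %m-%d-%Y, while RMDATE is an ISO date, so the names can never match; and rm -rf is lftp syntax, not a command the classic ftp client understands (it has delete and mdelete for files). A sketch of the deletion step with those two points fixed, assuming the same naming scheme as the upload:
# Sketch: remove the tarball from DAYS ago, named with the same
# date format that was used when it was uploaded.
RMDATE="$(date -d "$DAYS days ago" +'%m-%d-%Y')"
ftp -n -v $SERVER <<EOF
user $USERNAME $PASSWORD
delete ${RMDATE}.tar.gz
bye
EOF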

Backup Yesterday Files in Folder

I developed the following script to gzip yesterday's files in a directory. Any improvement suggestions?
Yesterday=`TZ=GMT+24 date +%d-%b-%y`;
mkdir $Yesterday
mv /tmp/logs/servicemix.log.* /tmp/logs/$Yesterday
for File in /tmp/logs/$Yesterday/app.log.*;
do gzip $File;
done
Regards
1. Replace mkdir $Yesterday with mkdir -p /tmp/logs/${Yesterday}.
2. gzip the files you have actually moved: you move the servicemix.log.* files, but then gzip the app.log.* files.
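For example, a minimal corrected sketch keeping the question's paths and its TZ=GMT+24 trick:
#!/bin/bash
# Sketch: move yesterday's rotated logs into a dated directory, then gzip them.
Yesterday=$(TZ=GMT+24 date +%d-%b-%y)
mkdir -p "/tmp/logs/$Yesterday"
mv /tmp/logs/servicemix.log.* "/tmp/logs/$Yesterday/"
for File in "/tmp/logs/$Yesterday"/servicemix.log.*
do
    gzip "$File"
done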
Use the following lines of code:
TIME=`date +"%b-%d-%y"` # this adds the date to the backup file name
now=$(date +"%T")
FILENAME="filename-$TIME:$now.tar.gz" # backup file name format
SRCDIR="/home" # important data directory (source of the backup)
DESDIR="/home/user/backup" # destination of the backup file
tar -cpzf $DESDIR/$FILENAME $SRCDIR # create the backup as a tar.gz file
echo
echo "Backup finished"
Were they changed or created yesterday? Use find with the correct modifier.
Yesterday=`TZ=GMT+24 date +%d-%b-%y`;
mkdir $Yesterday
# -1 means within the last 24 hours
# ctime is the last inode change time. If you need the creation date you need a different modifier; see the attached link.
find -ctime -1 -exec mv '{}' "$Yesterday/" \;
But I think this is pretty much the best option: use find to determine files modified yesterday.
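Note that -ctime -1 means "within the last 24 hours", not "yesterday". For calendar-day precision, GNU find's -daystart flag measures times from the start of today; a sketch:
# Sketch: with -daystart, -mtime 1 matches files modified during
# yesterday's calendar day rather than 24-48 hours ago (GNU find).
Yesterday=$(date -d yesterday +%d-%b-%y)
mkdir -p "/tmp/logs/$Yesterday"
find /tmp/logs -maxdepth 1 -type f -daystart -mtime 1 -exec mv '{}' "/tmp/logs/$Yesterday/" \;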

Bash script of unzipping unknown name files

I have a folder that, after an rsync, will have a zip file in it. I want to unzip it into its own folder (if the zip is L155.zip, unzip its contents into an L155 folder). The problem is that I don't know its name beforehand (although I know it will be "letter-number-number-number"), so I have to unzip an unknown file into its unknown folder, and this has to be done automatically.
The command "unzip *" (or unzip *.zip) works in the terminal, but not in a script.
These are the commands that worked in the terminal one by one, but don't work in a script:
#!/bin/bash
unzip * #also tried .zip and /path/to/file/* when script is on different folder
i=$(ls | head -1)
y=${i:0:4}
mkdir $y
unzip * -d $y
First I unzip the file, then I read the name of the first extracted file through ls and save it in a variable. I take the first 4 characters, make a directory from them, and then unzip the files again into that specific folder.
The whole procedure after the first unzip is needed because the files inside the zip all start with the name the zip already has, so if L155.ZIP is the zip, the files inside will be L155***.txt.
The zip file is at /path/to/file/NAME.zip.
When I run the script I get errors like the following:
unzip: cannot find or open /path/to/file/*.ZIP
unzip: cannot find or open /path/to/file//*.ZIP.zip
unzip: cannot find or open /path/to/file//*.ZIP.ZIP. No zipfiles found.
mkdir: cannot create directory 'data': File exists
unzip: cannot find or open data, data.zip or data.ZIP.
Original answer
Supposing that foo.zip contains a folder foo, you could simply run
#!/bin/bash
unzip \*.zip \*
And then run it as bash auto-unzip.sh.
If you want to have these files extracted into a different folder, then I would modify the above as
#!/bin/bash
cp *.zip /home/user
cd /home/user
unzip \*.zip \*
rm *.zip
This, of course, you would run from the folder where all the zip files are stored.
Another answer
Another "simple" fix is to get dtrx (also available in the Ubuntu repos, possibly for other distros). This will extract each of your *.zip files into its own folder. So if you want the data in a different folder, I'd follow the second example and change it thusly:
#!/bin/bash
cp *.zip /home/user
cd /home/user
dtrx *.zip
rm *.zip
I would try the following.
for i in *.[Zz][Ii][Pp]; do
    DIRECTORY=$(basename "$i" .zip)
    DIRECTORY=$(basename "$DIRECTORY" .ZIP)
    unzip "$i" -d "$DIRECTORY"
done
As noted, the basename program removes the indicated suffix .zip from the filename provided.
I have edited it to be case-insensitive. Both .zip and .ZIP will be recognized.
for zfile in $(find . -maxdepth 1 -type f -name "*.zip")
do
    fn=${zfile:2:4} # strip the leading ./ and keep the four-character name
    mkdir -p "$fn"
    unzip "$zfile" -d "$fn"
done
If the folder has only one file with the extension .zip, you can extract the name without the extension with the basename tool:
BASE=$(basename *.zip .zip)
This will produce an error message if there is more than one file matching *.zip.
Just to be clear about the issue here, the assumption is that the zip file does not contain a folder structure. If it did, there would be no problem; you could simply extract it into the subfolders with unzip. The following is only needed if your zipfile contains loose files, and you want to extract them into a subfolder.
With that caveat, the following should work:
#!/bin/bash
DIR=${1:-.}
BASE=$(basename "$DIR/"*.zip .zip 2>/dev/null) ||
    { echo "More than one zipfile" >&2; exit 1; }
if [[ $BASE = "*" ]]; then
    echo "No zipfile found" >&2
    exit 1
fi
mkdir -p "$DIR/$BASE" ||
    { echo "Could not create $DIR/$BASE" >&2; exit 1; }
unzip "$DIR/$BASE.zip" -d "$DIR/$BASE"
Put it in a file (anywhere), call it something like unzipper.sh, and chmod a+x it. Then you can call it like this:
/path/to/unzipper.sh /path/to/data_directory
A simple one-liner I use all the time:
$ for file in *.zip; do unzip "$file" -d "${file%%.*}"; done

BASH recursive find filename and copy to ftp

I am working on a bash script (see below) that recursively searches directories on a SAN for files with a specific name that are newer than 4 hours, then copies all these files to a specific FTP location and sends an email to say the copy has completed. The script works fine except that it only copies files in the top-level directory. The error I am getting on the lower directories is this:
#
remote: -v
ftp: local: -v: No such file or directory
local: ./Test01/test02/folder02_01_1200_m30.mp4 remote: ./Test01/test02/folder02_01_1200_m30.mp4
229 Entering Extended Passive Mode (|||45127|)
550 ./Test01/test02/folder02_01_1200_m30.mp4: File does not exist. (2)
221 Goodbye.
#
Here is the Script
#!/bin/bash
#The location from where the script should search
GSPORIGIN='/Volumes/folder01/folder02'
#File Names to be moved
FILE1='*1200_m30.mp4'
#FTP Details
HOST='xxxx.upload.com'
USER='xxxxxxx'
PASSWD='xxxxxxxxxxxx'
#the destination directory on the FTP
DESTDIR="/8619/_!/TEST"
# Go to the location from where the search should start
cd $GSPORIGIN
for file in `find . -type f -name "*1200_m30.mp4" -mmin -240`
do
echo $file
if [ -f $file ] ; then
ftp -n -v $HOST << EOT
ascii
user $USER $PASSWD
prompt
cd $DESTDIR
mput -v $file
EOT
echo "$file has been copied to FTP" | mail -s "$file has been copied to FTP in Directory $DESTDIR" xxx.xxx#xxx.com;
else exit 1
fi
done
To do what you're doing, you'll have to recreate the directories on the destination FTP server.
Use the basename/dirname commands and a mkdir command like this:
for file in `find . -type f -name "*1200_m30.mp4" -mmin -240`
do
    echo $file
    if [ -f $file ] ; then
        destdirname=`dirname "$file"`
        ftp -n -v $HOST << EOT
ascii
user $USER $PASSWD
prompt
cd $DESTDIR
mkdir $destdirname
mput $file
EOT
        echo "$file has been copied to FTP" | mail -s "$file has been copied to FTP in Directory $DESTDIR" xxx.xxx@xxx.com
    else
        exit 1
    fi
done
To copy multiple files in nested directories, I would suggest you look at the rsync utility to do this job for you.
rsync will create all the remote directories whenever needed, and it will keep files completely in sync even after frequent runs.
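A sketch of that approach, assuming the destination accepts SSH (rsync does not speak FTP); the host name is hypothetical and the paths are taken from the question:
# Sketch: mirror files modified in the last 4 hours, preserving the
# directory structure below the source root.
cd /Volumes/folder01/folder02
find . -type f -name '*1200_m30.mp4' -mmin -240 -print0 |
rsync -avz --from0 --files-from=- . 'user@backuphost:/8619/_!/TEST/'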
