Bash script upload to ftp & delete files 7 days and older - bash

The script is unable to delete anything older than 7 days on the FTP server. It fails with the error "550 Can't remove directory: No such file or directory". All of the FTP backups sit in the main directory.
#!/bin/sh
USERNAME="user"
PASSWORD="pass"
SERVER="ftpbackupservergoeshere"
NOW="$(date +'%m-%d-%Y')"
DAYS=7
RMDATE=$(date --iso -d "$DAYS days ago")
# local *.tar.gz file to upload
FILE="/root/$NOW.tar.gz"
# clear out old server logs before archiving
cd /home/minecraft/multicraft/servers
find . -name '*.log' -type f -delete
find . -name '*.log.gz' -type f -delete
cd
tar -zcf "$NOW.tar.gz" /home/minecraft
# log in to the remote server and upload
ftp -n -v "$SERVER" <<EOF
user $USERNAME $PASSWORD
binary
put $FILE $NOW.tar.gz
cd ..
rm -rf ${RMDATE}
bye
EOF
rm "/root/$NOW.tar.gz"
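Two things conspire against that rm -rf line. First, the stock ftp client has no rm command; on clients that accept unambiguous command prefixes, rm expands to rmdir, so the server is asked to remove a directory literally named -rf, which produces the 550. The real deletion commands are delete (one file) and rmdir (an empty directory). Second, the uploads are named with %m-%d-%Y while RMDATE is in ISO format (YYYY-MM-DD), so the names would never match anyway. A minimal sketch of a fix, assuming the old backups are single files in the main directory named mm-dd-YYYY.tar.gz like the uploads:

# build the expiry date in the SAME format used to name the uploads
RMDATE="$(date -d "$DAYS days ago" +'%m-%d-%Y')"

ftp -n -v "$SERVER" <<EOF
user $USERNAME $PASSWORD
binary
put $FILE $NOW.tar.gz
delete ${RMDATE}.tar.gz
bye
EOF

If you need wildcard or recursive deletes over FTP, plain ftp cannot do them; lftp (used in one of the related questions below) can.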

Related

Delete all files on ftp created before 10 days from now

Is there a command to delete all files with a certain name, created more than a couple of days ago, from my ftp server running Ubuntu 14.04?
Here is what I have
find /path/to/files* -mtime +10 -exec rm {} \;
Try this on the ftp server (with mydir changed to your directory):
find mydir -type f -mtime +5 -exec echo rm {} \;
When the echo shows what you like, remove the echo.
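For the original 10-day window, the finished command would be (echo removed, and assuming the files live under /path/to/files):

find /path/to/files -type f -mtime +10 -exec rm {} \;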

ftp shell mput file not match

I upload data with an ftp shell script, invoked like this:
find . -type d -exec ./recursive-ftp.sh {} \;
recursive-ftp.sh is below:
#!/bin/bash
ftp_site="<skip>"
username="<skip>"
passwd="<skip>"
localDir="<skip>"
folder=$1
cd <skip>
pwd
ftp -in <<EOF
open $ftp_site
user $username $passwd
mput *
mkdir $folder
cd $folder
lcd $folder
mput *
close
bye
EOF
But I found that the file size and md5 of the uploaded files differ from the originals.
FTP runs over TCP, so why are some files different?
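No answer is included in the thread, but the most likely cause, offered here as an assumption, is transfer mode: the classic ftp client defaults to ASCII mode, which rewrites line endings in flight, so TCP delivers every byte reliably yet the client itself changes the bytes it sends. Forcing binary mode before mput should make sizes and checksums match. A minimal sketch of the heredoc with that one addition:

ftp -in <<EOF
open $ftp_site
user $username $passwd
binary
mput *
mkdir $folder
cd $folder
lcd $folder
mput *
close
bye
EOF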

Bash script to backup files to remote FTP. Deleting old files

I'm writing a bash script to send backups to a remote ftp server. The backup files are generated with a WordPress plugin so half the work is done for me from the start.
The script does several things.
It looks in the local backup dir for any files older than x and deletes them
It connects to FTP and puts the backup files in a dir with the current date as a name
It deletes any backup dirs for backups older than x
As I am not fluent in bash, this is a mishmash of a bunch of scripts I found around the net.
Here is my script:
#! /bin/bash
BACKDIR=/var/www/wp-content/backups
#----------------------FTP Settings--------------------#
FTP=Y
FTPHOST="host"
FTPUSER="user"
FTPPASS="pass"
FTPDIR="/backups"
LFTP=$(which lftp) # Path to binary
#-------------------Deletion Settings-------------------#
DELETE=Y
DAYS=3 # how many days of backups do you want to keep?
TODAY=$(date --iso) # Today's date like YYYY-MM-DD
RMDATE=$(date --iso -d $DAYS' days ago') # TODAY minus X days - too old files
#----------------------End of Settings------------------#
if [ -d "$BACKDIR" ]
then
    if [ "$DELETE" = "Y" ]
    then
        find "$BACKDIR" -iname '*.zip' -type f -mtime +$DAYS -delete
        echo "Old files deleted."
    fi
    if [ "$FTP" = "Y" ]
    then
        echo "Initiating FTP connection..."
        cd "$BACKDIR"
        $LFTP << EOF
open ${FTPUSER}:${FTPPASS}@${FTPHOST}
mkdir $FTPDIR
cd $FTPDIR
mkdir ${TODAY}
cd ${TODAY}
mput *.zip
cd ..
rm -rf ${RMDATE}
bye
EOF
        echo "Done putting files to FTP."
    fi
else
    echo "No Backup directory."
    exit
fi
There are 2 specific things I can't get done:
The find command doesn't delete any of the old files in the local backup dir.
I would like mput to only put the .zip files that were created today.
Thanks in advance for the help.
To send only the zip files that were created today:
MPUT_ZIPS="$(find "$BACKDIR" -maxdepth 1 -type f -iname '*.zip' -mtime -1 | sed -e 's/^/mput /')" # -mtime -1 = modified within the last 24 hours
[...]
$LFTP << EOF
open ${FTPUSER}:${FTPPASS}@${FTPHOST}
mkdir $FTPDIR
cd $FTPDIR
mkdir ${TODAY}
cd ${TODAY}
${MPUT_ZIPS}
cd ..
rm -rf ${RMDATE}
bye
EOF
Hope this helps =)
2) If you put today's backup files in a separate directory, or link them into one, you can cd into that directory and just transfer those files.
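On the first problem (find deleting nothing locally), the thread leaves it unresolved, so treat this as a likely explanation rather than the confirmed one: find's -mtime counts whole 24-hour periods and truncates the remainder, so with DAYS=3, -mtime +$DAYS matches only files at least 4 full days old; a 3-day-old backup survives. A quick way to see exactly what would be removed is to swap -delete for -print:

find "$BACKDIR" -iname '*.zip' -type f -mtime +$DAYS -print

If files you expect to match don't appear, loosen the cutoff (for example -mtime +$((DAYS-1))) or check the files' actual modification times with ls -l.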

BASH recursive find filename and copy to ftp

I am working on a Bash script (see below) that recursively searches directories on a SAN for files with a specific name that are newer than 4 hours, copies them all to a specific FTP location, and sends an email to say the copy has completed. The script works fine, except that it only copies files in the top-level directory. The error I am getting on the lower directories is this:
#
remote: -v
ftp: local: -v: No such file or directory
local: ./Test01/test02/folder02_01_1200_m30.mp4 remote: ./Test01/test02/folder02_01_1200_m30.mp4
229 Entering Extended Passive Mode (|||45127|)
550 ./Test01/test02/folder02_01_1200_m30.mp4: File does not exist. (2)
221 Goodbye.
#
Here is the script:
#!/bin/bash
#The location from where the script should search
GSPORIGIN='/Volumes/folder01/folder02'
#File Names to be moved
FILE1='*1200_m30.mp4'
#FTP Details
HOST='xxxx.upload.com'
USER='xxxxxxx'
PASSWD='xxxxxxxxxxxx'
#the destination directory on the FTP
DESTDIR="/8619/_!/TEST"
# Go to the location from where the search should start
cd "$GSPORIGIN"
for file in `find . -type f -name "*1200_m30.mp4" -mmin -240`
do
    echo "$file"
    if [ -f "$file" ] ; then
        ftp -n -v $HOST << EOT
ascii
user $USER $PASSWD
prompt
cd $DESTDIR
mput -v $file
EOT
        echo "$file has been copied to FTP" | mail -s "$file has been copied to FTP in Directory $DESTDIR" xxx.xxx@xxx.com;
    else exit 1
    fi
done
To do what you're doing, you'll have to recreate the directory structure on the destination FTP.
Use the dirname command and an ftp mkdir, like this:
for file in `find . -type f -name "*1200_m30.mp4" -mmin -240`
do
    echo "$file"
    if [ -f "$file" ] ; then
        destdirname=`dirname "$file"`
        # binary mode: mp4 is not text, so ascii mode would corrupt it
        ftp -n -v $HOST << EOT
binary
user $USER $PASSWD
prompt
cd $DESTDIR
mkdir $destdirname
mput $file
EOT
        echo "$file has been copied to FTP" | mail -s "$file has been copied to FTP in Directory $DESTDIR" xxx.xxx@xxx.com;
    else exit 1
    fi
done
To copy multiple files in nested directories, I would suggest looking at the rsync utility to do this job for you.
rsync will create all the remote directories as needed, and it will keep files completely in sync even after frequent runs.
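A sketch of that approach, assuming the destination machine is reachable over SSH (plain rsync does not speak FTP; the host and destination path here are placeholders):

# copy only the matching mp4s, recreating directories on the far side;
# -m (--prune-empty-dirs) skips directories that end up empty
rsync -avm --include='*/' --include='*1200_m30.mp4' --exclude='*' \
    /Volumes/folder01/folder02/ user@backuphost:/path/to/dest/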

bash: After testing mtime by following a symlink, I need to delete the symlink itself and not the target file

Right now I have a script that creates symlinks in another folder to anything newer than 2 weeks in the public folders. However, I can't find a good way of getting rid of the stale symlinks individually, as opposed to wiping everything out. I need to test the symlink target's mtime and, if it's older than 2 weeks, delete the symlink itself and not the linked file.
#!/bin/bash
source="/media/public/"
dest="/pool/new/"
if [[ ! -d $dest ]]; then
    exit 1
fi
if [ `hostname` == "punk" ] && [ `uname -o` == "GNU/Linux" ]; then
    #rm -f $dest/*
    find -L $dest -mtime 14 -type f -exec echo "delete symlink: " {} \;
    find -L $source -mtime -14 -type f -exec ln -s -t $dest {} \;
fi
Right now the first find command will delete the target as opposed to the symlink.
Simply use:
-exec rm {} +
rm will delete the link itself, not the target.
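Applied to the script above, the cleanup line becomes (a sketch; note -mtime +14, since a bare 14 matches only files exactly 14 days old):

find -L $dest -mtime +14 -type f -exec rm {} +

With find -L, the mtime test follows the link to its target, but {} still expands to the symlink's own path, so rm removes the link and leaves the target alone.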
