I have a VPS running CentOS 7. I created a cron job that dumps my database (MySQL 8.0) and creates a tar archive to back up my entire site's files, and this runs every day.
I want to create another bash script / cron job that connects to my backup server and uploads the backup files stored on the VPS.
The problem is that I can't get it to upload only the newest files rather than all of them, since there will be 7 backups every week.
I want it to upload only today's files, not all available files.
Should I use rsync?
Here is my bash script so far:
#!/bin/sh
USERNAME="ftp user"
PASSWORD="ftp password"
SERVER="IP or domain"
# local directory to pick up *.tar.gz files
FILE="/path/"
# remote server directory to upload backup
BACKUPDIR="/pro/backup/sql"
# login to remote server
ftp -n -i $SERVER <<EOF
user $USERNAME $PASSWORD
cd $BACKUPDIR
mput $FILE/*.tar.gz
quit
EOF
You can use find with -ctime -1 to pick out the .tar.gz files changed in the last day (i.e. today's backups) and then loop over the results, FTPing each one. Using that logic with your existing solution:
#!/bin/sh
USERNAME="ftp user"
PASSWORD="ftp password"
SERVER="IP or domain"
# local directory to pick up *.tar.gz files
FILE="/path/"
# remote server directory to upload backups to
BACKUPDIR="/pro/backup/sql"
# find prints full paths, so each $fil can be handed straight to put
find "$FILE" -ctime -1 -name "*.tar.gz" | while read -r fil
do
# login to remote server and upload this file
ftp -n -i "$SERVER" <<EOF
user $USERNAME $PASSWORD
cd $BACKUPDIR
put $fil
quit
EOF
done
I need a script that uploads a site's files, including directories, to my hosting via FTP.
I tried to create a script, but it doesn't work: no files end up on the server. Can you help me, please?
My script:
#!/bin/bash
HOST='ip_address'
USER='user'
PASSWD='password'
SERVER_FOLDER='/site'
cd /local_folder_with_sites_files
ftp -in <<END_SCRIPT
open $HOST
user $USER $PASSWD
cd $SERVER_FOLDER
mput -r *
close
bye
END_SCRIPT
echo "Upload complete"
exit 0
Output:
directory: not a plain file.
Permission denied.
Passive mode refused.
OS: Ubuntu 16.04, panel: VestaCP.
But when I upload the files via FileZilla, the upload completes without problems.
If anybody has a script that uploads files and folders via FTP, please show me an example.
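For reference, the "directory: not a plain file" error comes from plain ftp's mput, which cannot recurse into directories (there is no working -r). One common approach is lftp's reverse mirror, roughly as in this sketch; the host, credentials and paths are the same placeholders as above:
#!/bin/bash
HOST='ip_address'
USER='user'
PASSWD='password'
SERVER_FOLDER='/site'
# mirror -R uploads the local tree (files and subdirectories) to the remote folder
lftp -u "$USER","$PASSWD" "$HOST" <<END_SCRIPT
mirror -R /local_folder_with_sites_files $SERVER_FOLDER
bye
END_SCRIPT
echo "Upload complete"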
I have this script to upload and delete files on my remote host:
#!/bin/bash
echo Starting Website upload ...
echo This may take some time depending on your internet connection ...
echo Waiting for remote connection ...
lftp -u server121.web-hosting.com << ftpEOF
prompt
cd public_html
delete index.html
cd images
mdelete *.jpg
cd ..
lcd /Applications/PlexEmail/streamnet/
put index.html
lcd images
cd images
mput *.jpg
bye
ftpEOF
echo Website upload successful ...
After upgrading my Mac to High Sierra 10.13, the ftp command no longer exists.
Any ideas on how to get this to work with lftp?
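A sketch of one way to convert it, assuming the same account: lftp takes the credentials as user,password after -u, has no prompt toggle, and uses rm/mrm where ftp has delete/mdelete, while cd, lcd, put and mput work much as before (myuser and mypassword are placeholders for the real credentials):
#!/bin/bash
echo Starting Website upload ...
echo This may take some time depending on your internet connection ...
# lftp takes user,password after -u; "myuser" and "mypassword" are placeholders
lftp -u myuser,mypassword server121.web-hosting.com << ftpEOF
cd public_html
rm index.html
cd images
mrm *.jpg
cd ..
lcd /Applications/PlexEmail/streamnet/
put index.html
lcd images
cd images
mput *.jpg
bye
ftpEOF
echo Website upload successful ...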
I am not sure if this can be remedied by programming means, but I have a mailx shell script that was working properly on the test server. When I run the script on the production server, the recipient only receives a file named 'eml' that can't even be opened. I was informed by the system administrator that the configurations of both servers are the same and that I should be adjusting my code.
But I used the exact same shell script, and it works on the test server.
cd /home/guava/autoemail
datediff=1
datetoday=$(date +%Y%m%d)
foldername=$(date --date="${datetoday} -${datediff} day" +%Y%m%d)
mv DEALS_ENTERED_TODAY_ALL_2OM_UP.xls DEALS_ENTERED_TODAY_ALL_2OM_UP_$foldername.xls
zip -P $foldername DEALS_ENTERED_TODAY_ALL_2OM_UP_$foldername.zip DEALS_ENTERED_TODAY_ALL_2OM_UP_$foldername.xls
cat /home/guava/autoemail/email_body.txt | mailx -s "AML_20M_DAILY_TRANSACTION_REPORT_GUAVA_$foldername" -a /home/guava/autoemail/DEALS_ENTERED_TODAY_ALL_2OM_UP_$foldername.zip ben#onionwank.com
rm DEALS_ENTERED_TODAY_ALL_2OM_UP_$foldername.xls
rm DEALS_ENTERED_TODAY_ALL_2OM_UP_$foldername.zip
What it does:
- computes yesterday's date
- renames an Excel file with yesterday's date
- zips the Excel file with a password
- emails it to the user
- deletes the used files
I just want to ask if there is anything I can improve in my code so I can use it on the production server. Why does the server send an 'eml' file instead of the attachment I defined?
It is possible that this is a server issue, but the system admins don't seem to know what to do.
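One thing worth checking: "mailx" is not a single program, and the -a option does not mean "attach a file" in every implementation (in some it appends a mail header instead), so two servers with the "same configuration" can still ship different mailx binaries and produce exactly this kind of broken attachment. A more implementation-independent sketch, assuming mutt happens to be installed on the production server:
# a sketch, assuming mutt is available; mutt builds the MIME attachment itself,
# so the result does not depend on which mailx implementation is installed
mutt -s "AML_20M_DAILY_TRANSACTION_REPORT_GUAVA_$foldername" \
    -a /home/guava/autoemail/DEALS_ENTERED_TODAY_ALL_2OM_UP_$foldername.zip -- \
    ben#onionwank.com < /home/guava/autoemail/email_body.txt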
Hello, I just created a ksh script to FTP .jpg images to a remote server, but the images show up in bad quality when I send them with the script. Is there a line I should fix so that the images are not altered and arrive on the remote server like the originals? Please help; should I add the bin line?
cp -r /path/dir/*.jpg /path/dir
cp -r /path/dir/*.JPG /path/dirREMOTE
USER='xxx'
PASSWORD='xxx'
source_dir='cd /path/images/'
target_dir='cd /images'
ftp -n xxx.xx.xxx.xx <<_FTP
quote USER $USER
quote PASS $PASSWORD
lcd /xxx/xxx/
cd /xxx
mput *.jpg
bye
_FTP
/home/test_scripts/test_script9.sh
/home/test_scripts/test_script7.sh
exit
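On the question itself: yes, add the bin line. The usual cause of images arriving damaged over FTP is an ASCII-mode transfer, which rewrites bytes that look like line endings; binary (bin) mode transfers the file untouched. A sketch of the transfer block with only that line added, using the same anonymised placeholders as above:
ftp -n xxx.xx.xxx.xx <<_FTP
quote USER $USER
quote PASS $PASSWORD
binary
lcd /xxx/xxx/
cd /xxx
mput *.jpg
bye
_FTP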
I want to FTP a file to my remote server and then move that file into another directory on the remote server.
The FTP upload happens correctly, but the move throws an error:
550 RNFR command failed.
Can you please help?
My script is:
#!/bin/sh
echo "Enter the version of the xml (eg:- v17.25)"
read version
HOST_FIRST='un01'
HOST_LAST='01'
USER='someuser'
PASSWD='somepassword'
HOST="$HOST_FIRST$FILE$HOST_LAST"
ftp -n $HOST <<-END_SCRIPT
quote USER $USER
quote PASS $PASSWD
cd /tmp
put myfile.xml
rename myfile.xml /tmp/test_ftp
quit
END_SCRIPT
exit 0
You have already done a cd /tmp and put myfile.xml into that directory, so there is no need to repeat the path in the rename.
Change your script from
rename myfile.xml /tmp/test_ftp
to
rename myfile.xml test_ftp
That should work. You can't specify a path in an ftp rename command and expect the file to be moved as well. And sorry, probably not what you want to hear, but there is no move command in ftp; it is that way for security.
IHTH.
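Put together, a sketch of the relevant part of your script with only the rename line changed:
ftp -n $HOST <<-END_SCRIPT
quote USER $USER
quote PASS $PASSWD
cd /tmp
put myfile.xml
rename myfile.xml test_ftp
quit
END_SCRIPT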