Issue with FTP Transfer Script for CRON

I'm trying to get a backup script working for Rackspace. The final part, where it sends the backup to a server, isn't working.
#!/bin/sh
#Set information specific to your site
webroot="/mnt/stor08-wc1-ord1/666666/www.mysite.com/"
db_host="mysql51-900.wc1.ord1.stabletransit.com"
db_user="666666_backup"
db_password="PassBackup2015"
db_name="666666_wealth"
#Set the date and name for the backup files
date=`date '+%F-%H-%M'`
backupname="backup.$date.tar.gz"
#Dump the mysql database
mysqldump -h $db_host -u $db_user --password="$db_password" $db_name > $webroot/db_backup.sql
#Backup Site
tar -czpvf $webroot/sitebackup.tar.gz $webroot/web/content/
#Compress DB and Site backup into one file
tar --exclude 'sitebackup' --remove-files -czpvf $webroot/$backupname $webroot/sitebackup.tar.gz $webroot/db_backup.sql
HOST='172.0.0.1'
USER='FILESERV\ftp'
PASSWD='PassBackup2015!'
ftp $HOST <<END_SCRIPT
user $USER $PASSWD
cd $webroot
put $backupname
quit
END_SCRIPT
exit 0

You may need to use ftp -n to prevent auto-login. Letting the client auto-login and then issuing a user command often causes problems.
In other words, something like:
ftp -n $HOST <<END_SCRIPT
user $USER $PASSWD
cd $webroot
put $backupname
quit
END_SCRIPT
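If the heredoc approach keeps misbehaving, curl (where installed) can do the same upload in one line and, unlike most ftp clients, returns a meaningful exit status that cron can act on. A minimal sketch reusing the variables from the script above:
# upload the archive to the FTP server's root directory
curl -T "$webroot/$backupname" "ftp://$HOST/" --user "$USER:$PASSWD"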

Related

Postgres database backup not working locally (Crontab + Shell script using expect)

I am having issues on my Ubuntu server: I have two scripts which each perform a pg_dump of a database (one remote, one local). However, the backup file for the local one always ends up empty.
When I run the script manually, there is no problem.
The issue is when the script is run via crontab while I am NOT logged into the machine: if I'm in an SSH session it works with crontab, but when I'm not connected it does not.
My full scripts and setup are below; feel free to suggest improvements. For now I just want it to work, but if my method is insecure or inefficient I would gladly hear about alternatives :)
So far I've tried:
Using the postgres user for the local database (instead of another user I use to access the DB with my applications)
Switching pg_dump to /usr/bin/pg_dump
Here's my setup:
Crontab entry:
0 2 * * * path/to/my/script/local_databasesBackup.sh ; path/to/my/script/remote_databasesBackup.sh
scriptInitialization.sh
set LOCAL_PWD "password_goes_here"
set REMOTE_PWD "password_goes_here"
Expect script, called by crontab (local/remote_databaseBackup.sh):
#!/usr/bin/expect -f
source path/to/my/script/scriptInitialization.sh
spawn path/to/my/script/localBackup.sh
expect "Password: "
send "${LOCAL_PWD}\r"
expect eof
exit
Actual backup script (local/remoteBackup.sh):
#!/bin/bash
DATE=$(date +"%Y-%m-%d_%H%M")
delete_yesterday_backup_and_perform_backup () {
    /usr/bin/pg_dump -U postgres -W -F t localDatabaseName > /path/to/local/backup/folder/$DATE.tar
    YESTERDAY_2_AM=$(date --date="02:00 yesterday" +"%Y-%m-%d_%H%M")
    YESTERDAY_BACKUP_FILE=/path/to/local/backup/folder/$YESTERDAY_2_AM.tar
    if [ -f "$YESTERDAY_BACKUP_FILE" ]; then
        echo "$YESTERDAY_BACKUP_FILE exists. Deleting"
        rm $YESTERDAY_BACKUP_FILE
    else
        echo "$YESTERDAY_BACKUP_FILE does not exist."
    fi
}
CURRENT_DAY_NUMBER=$(date +"%d")
FIRST_DAY_OF_THE_MONTH="01"
if [ "$CURRENT_DAY_NUMBER" = "$FIRST_DAY_OF_THE_MONTH" ]; then
    echo "First day of the month: Backup without deleting the previous backup"
    /usr/bin/pg_dump -U postgres -W -F t localDatabaseName > /path/to/local/backup/folder/$DATE.tar
else
    echo "Not the first day of the month: Delete backup from yesterday and backup"
    delete_yesterday_backup_and_perform_backup
fi
The only difference between my local and remote script is the pg_dump parameters:
Local looks like this: /usr/bin/pg_dump -U postgres -W -F t localDatabaseName > /path/to/local/backup/folder/$DATE.tar
Remote looks like this: pg_dump -U remote_account -p 5432 -h remote.address.com -W -F t remoteDatabase > /path/to/local/backup/folder/$DATE.tar
I ended up making two separate scripts because I thought combining them might be the cause of the issue, but at this point I'm fairly sure it isn't.
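This doesn't explain the empty file by itself, but the usual way to run pg_dump non-interactively under cron is a ~/.pgpass file, which removes the need for -W and the expect wrapper entirely (cron also runs with a minimal environment, so absolute paths in the crontab entry are safer). A minimal sketch, assuming the hosts, databases and accounts from the question:
# ~/.pgpass (mode 0600) in the home directory of the user the cron job runs as
# format: hostname:port:database:username:password
localhost:5432:localDatabaseName:postgres:password_goes_here
remote.address.com:5432:remoteDatabase:remote_account:password_goes_here
With that file in place, drop -W (or use -w so pg_dump never prompts):
/usr/bin/pg_dump -U postgres -w -F t localDatabaseName > /path/to/local/backup/folder/$DATE.tar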

ftp transfer works with filezilla but not command line

I am trying to transfer images to a server via ftp.
When I use Filezilla, it works: I can see my files on the server.
When I use these raw ftp commands:
ftp -p -v -n $server << EOF
quote USER $user
quote PASS $pass
prompt off
cd Stock
mput *.jpg
quit
EOF
it doesn't work: I can't see my images on the server, even though the terminal output suggests the transfer succeeded:
227 Entering Passive Mode (89,151,93,136,207,15).
150 Opening ASCII mode data connection.
226 Transfer complete.
1225684 bytes sent in 1.88 secs (651.70 Kbytes/sec)
Any idea what could cause this?
The log line "150 Opening ASCII mode data connection" is the giveaway: the transfer defaulted to ASCII mode, which rewrites line endings and silently corrupts binary files such as JPEGs. Add binary to force binary mode:
ftp -p -v -n $server << EOF
quote USER $user
quote PASS $pass
prompt off
cd Stock
binary
mput *.jpg
quit
EOF
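A quick way to verify the fix: ASCII-mode translation usually changes the byte count of a binary file, so compare the local file size against what ls -l reports on the server after the upload.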

wget: How can I download files but no folders via FTP?

I need to download the files located in the root of my FTP server, but not the folders. Is there a way to do that?
I'm currently connecting with this string:
wget --timeout 20 -m -nH --user "user" --password "pass" ftp://123.123.com
Thanks.
wget has no option to fetch only the files in a directory while skipping its subdirectories.
However, you can use a small script to collect the file names first and then fetch them with wget.
echo "open $HOST" > ftp.txt
echo "user $USER $PASS" >> ftp.txt
echo "ls" >> ftp.txt
echo "bye" >> ftp.txt
# keep only plain files (long-listing lines starting with '-') and turn the name field into a full URL
ftp -n < ftp.txt | awk '/^-/ {print "ftp://123.123.com/" $9}' > files.txt
wget --timeout 20 -nH --user "user" --password "pass" -i files.txt
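This works because a long ls listing marks plain files with a leading - (directories start with d), so keeping only lines matching ^- drops the folders; note that printing field $9 assumes file names without spaces.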

Bash - ncftpls is not working

So, I'm trying to get the list of files and folders in the uppermost directory of my server and set it as a variable.
I'm using ncftpls to get the list of files and folders. It runs, but it doesn't display any files or folders.
LIST=$(ncftpls -u $USER -p $PASSWORD -P $PORT ftp://$HOST/)
echo $LIST
I tried not setting it as a variable and just running the command ncftpls, but it still won't display any of the files or folders.
The strange thing is, though, that when I run this script
ncftp -u $USER -p $PASSWORD -P $PORT ftp://$HOST/ <<EOF
ls
EOF
it outputs all the files and folders just fine, although then I don't think I can set the output as a variable.
If anyone has any ideas on what is going on, that'd be much appreciated!
The only acceptable time to use FTP was the 1970s, when you could trust a host by the fact that it was allowed onto the internet.
Do not try to use it today. Use sftp, rsync, ssh or another suitable alternative.
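If you do switch, a minimal sftp equivalent of the listing might look like this (a sketch, assuming key-based authentication is already set up, since sftp will not take a password on the command line):
list=$(sftp -b - "$USER@$HOST" <<EOF
ls
EOF
)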
You can capture output of any command with $(..):
In your case,
list=$(
ncftp -u $USER -p $PASSWORD -P $PORT ftp://$HOST/ <<EOF
ls
EOF
)
This happens to be equivalent to
list=$(ncftp -u $USER -p $PASSWORD -P $PORT ftp://$HOST/ <<< "ls")
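One caveat when printing the result: an unquoted echo $LIST, as in the question, collapses the whole listing onto one line. Quote the variable to keep one entry per line:
echo "$list"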

Permission Issue: Creating postgres backup as postgres user

I have the following postgres backup script; it's a shell script written to run as the postgres user.
The problem is that the postgres user doesn't have permission to write to these directories. I don't have sudo on these machines, but I have changed the directory permissions to 755 and added my user to one of the groups that has read-write-execute access. Since the postgres user isn't part of that unix group, I guess that's why I'm running into this issue.
My goal is to put this in the crontab, but before that I need to get the script running with the proper permissions:
#!/bin/bash
# location to store backups
backup_dir="/location/to/dir"
# name of the backup file has the date
backup_date=`date +%d-%m-%Y`
# only keep the backup for 30 days (maintain low storage)
number_of_days=30
databases=`psql -l -t | cut -d'|' -f1 | sed -e 's/ //g' -e '/^$/d'`
for i in $databases; do
    if [ "$i" != "template0" ] && [ "$i" != "template1" ]; then
        echo Dumping $i to $backup_dir$i\_$backup_date
        pg_dump -Fc $i > $backup_dir$i\_$backup_date
    fi
done
find $backup_dir -type f -prune -mtime +$number_of_days -exec rm -f {} \;
Before doing this, be sure to log in as a superuser (sudo su) and try executing these:
usermod -aG unix postgres (add the existing postgres user to the unix group; useradd creates new users, usermod modifies existing ones)
su postgres (switch to the postgres user)
mkdir folder (in the directory where postgres needs to write files, to confirm it now can)
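If that works, a quick sanity check before setting up the crontab is to test write access as postgres against the actual backup directory (the path is the placeholder from the script, and this assumes the postgres account has a login shell):
su postgres -c "touch /location/to/dir/.write_test && rm /location/to/dir/.write_test"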
Just to illustrate with a shell script: you can capture a password with the read command and store it in a variable. Here the password is read into password (-s suppresses echo so it isn't shown while typing) and printed afterwards. I hope this helps.
#!/bin/bash
read -s -p "Password: " password
echo "$password"
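As an aside, a password captured this way can be handed to the PostgreSQL tools through the standard PGPASSWORD environment variable, which avoids the interactive -W prompt; the database name here is just a placeholder:
PGPASSWORD="$password" pg_dump -U postgres -Fc mydatabase > mydatabase.dump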
