How to fix an automated backup script for PostgreSQL on Windows?

I have looked at a duplicate question, PostgreSQL: Automated Backup in Windows, and another source, https://wiki.postgresql.org/wiki/Automated_Backup_on_Windows. I have tried to write a simple batch script of my own [e.g. set up the path, set up the password, etc.] so that I can restore the database in the future. However, my batch script for backing up the database does not seem to work at all, and I can't figure out where my mistake is.
Here is my batch script for backup postgres database.
@echo off
SET PGPASSWORD=%Ech0-5910&123
set root=C:\Program Files (x86)\pgAdmin 4\v3\runtime\
echo on
cd %root%
echo on
pg_dump.exe -h 192.168.1.161 -p 5432 -U postgres -F c -b -v -f "D:\Backup\DatabaseBackUp\SQL\123456.backup" testdb
Updated script, following @Gerhard Barnard's answer:
@echo off
echo 192.168.1.161:5432:_wolfcom:postgres:R0m3o^%%Ech0-5910^&>"%APPDATA%\postgresql\pgpass.conf"
set "root=C:\Program Files (x86)\pgAdmin 4\v3\runtime\"
cd /d "%root%"
pg_dump.exe -h 192.168.1.161 -p 5432 -U postgres -F c -b -v -f "D:\Backup\DatabaseBackUp\SQL\123456.backup" _wolfcom
pause

Your code should wrap all paths in double quotes to cope with whitespace; keep in mind that cmd interprets each space-delimited word as a new argument. We also need to escape the & because it acts as a command separator in batch. Lastly, it is preferred to use the /d option with cd in case you are coming from another drive letter:
@echo off
SET "PGPASSWORD=%Ech0-5910^&123"
set "root=C:\Program Files (x86)\pgAdmin 4\v3\runtime\"
cd /d "%root%"
pg_dump.exe -h 192.168.1.161 -p 5432 -U postgres -F c -b -v -f "D:\Backup\DatabaseBackUp\SQL\123456.backup" testdb
Also note that you never actually supply the password to your pg_dump command, so you need to consider that as well. Best practice is to edit
%APPDATA%\postgresql\pgpass.conf
and add
*:5432:*:username:password
to automate that part in your script:
@echo off
echo *:5432:*:postgres:%Ech0-5910^&123>"%APPDATA%\postgresql\pgpass.conf"
set "root=C:\Program Files (x86)\pgAdmin 4\v3\runtime\"
cd /d "%root%"
pg_dump.exe -h 192.168.1.161 -p 5432 -U postgres -F c -b -v -f "D:\Backup\DatabaseBackUp\SQL\123456.backup" testdb
If the directory "%APPDATA%\postgresql" does not exist, create it first.
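A minimal sketch of how those pieces could fit together, assuming the paths from this answer and a placeholder password:
@echo off
rem Create the pgpass directory if it is missing, then (re)write the entry.
rem "yourpassword" is a placeholder - use the real postgres password here.
if not exist "%APPDATA%\postgresql" mkdir "%APPDATA%\postgresql"
echo *:5432:*:postgres:yourpassword>"%APPDATA%\postgresql\pgpass.conf"
set "root=C:\Program Files (x86)\pgAdmin 4\v3\runtime\"
cd /d "%root%"
pg_dump.exe -h 192.168.1.161 -p 5432 -U postgres -F c -b -v -f "D:\Backup\DatabaseBackUp\SQL\123456.backup" testdb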

Related

SET PGPASSWORD with asterisk character not working

I use a batch file to copy data from one database to another, like this:
SET PGPASSWORD=passtest
"C:\Program Files\e-SUS\database\postgresql-9.6.13-4-windows-x64\bin\psql.exe" -h 10.10.10.10 -p 5433 -d esus -U postgres -c "\copy (SELECT * from mytable) to 'e:\data.csv' with csv header"
IF EXIST e:\data.csv ( "C:\Program Files\e-SUS\database\postgresql-9.6.13-4-windows-x64\bin\psql.exe" -h 11.11.11.11 -p 5433 -d esus -U postgres -c "\copy mytable from 'e:\data.csv' with csv header delimiter ','" )
This works correctly, but if my password has an asterisk character, like SET PGPASSWORD=pass*test, it does not work... I tried to use SET PGPASSWORD=pass%*test but that does not work either.
Any idea?
You need to quote the asterisk:
SET PGPASSWORD="sec*ret"
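As a related sketch (an alternative, not part of the original answer): the set "VAR=value" form keeps the surrounding quotes out of the stored value, so the password that reaches psql is exactly pass*test:
rem Quoting the whole assignment keeps the quote characters themselves out of the variable's value
SET "PGPASSWORD=pass*test"
"C:\Program Files\e-SUS\database\postgresql-9.6.13-4-windows-x64\bin\psql.exe" -h 10.10.10.10 -p 5433 -d esus -U postgres -c "\copy (SELECT * from mytable) to 'e:\data.csv' with csv header"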

Postgres database backup not working locally (Crontab + Shell script using expect)

I am having issues on my Ubuntu server: I have two scripts which perform a pg_dump of two databases (a remote and a local one). However the backup file for the local one always ends up empty.
When I run the script manually, no problem.
The issue occurs when the script is run via crontab while I am NOT logged into the machine. If I'm in an SSH session there is no problem and it works with crontab, but when I'm not connected it does not work.
Check out my full scripts/setup below, and feel free to suggest any improvements. For now I just want it to work, but if my method is insecure/inefficient I would gladly hear about alternatives :)
So far I've tried:
Using the postgres user for the local database (instead of another user I use to access the DB with my applications)
Switching pg_dump to /usr/bin/pg_dump
Here's my setup:
Crontab entry:
0 2 * * * path/to/my/script/local_databasesBackup.sh ; path/to/my/script/remote_databasesBackup.sh
scriptInitialization.sh
set LOCAL_PWD "password_goes_here"
set REMOTE_PWD "password_goes_here"
Expect script, called by crontab (local/remote_databaseBackup.sh):
#!/usr/bin/expect -f
source path/to/my/script/scriptInitialization.sh
spawn path/to/my/script/localBackup.sh
expect "Password: "
send "${LOCAL_PWD}\r"
expect eof
exit
Actual backup script (local/remoteBackup.sh):
#!/bin/bash
DATE=$(date +"%Y-%m-%d_%H%M")
delete_yesterday_backup_and_perform_backup () {
    /usr/bin/pg_dump -U postgres -W -F t localDatabaseName > /path/to/local/backup/folder/$DATE.tar
    YESTERDAY_2_AM=$(date --date="02:00 yesterday" +"%Y-%m-%d_%H%M")
    YESTERDAY_BACKUP_FILE=/path/to/local/backup/folder/$YESTERDAY_2_AM.tar
    if [ -f "$YESTERDAY_BACKUP_FILE" ]; then
        echo "$YESTERDAY_BACKUP_FILE exists. Deleting"
        rm $YESTERDAY_BACKUP_FILE
    else
        echo "$YESTERDAY_BACKUP_FILE does not exist."
    fi
}
CURRENT_DAY_NUMBER=$(date +"%d")
FIRST_DAY_OF_THE_MONTH="01"
if [ "$CURRENT_DAY_NUMBER" = "$FIRST_DAY_OF_THE_MONTH" ]; then
echo "First day of the month: Backup without deleting the previous backup"
/usr/bin/pg_dump -U postgres -W -F t localDatabaseName > /path/to/local/backup/folder/$DATE.tar
else
echo "Not the first day of the month: Delete backup from yesterday and backup"
delete_yesterday_backup_and_perform_backup
fi
The only difference between my local and remote scripts is the pg_dump parameters:
Local looks like this /usr/bin/pg_dump -U postgres -W -F t localDatabaseName > /path/to/local/backup/folder/$DATE.tar
Remote looks like this: pg_dump -U remote_account -p 5432 -h remote.address.com -W -F t remoteDatabase > /path/to/local/backup/folder/$DATE.tar
I ended up writing two separate scripts because I thought that might be the cause of the issue, but at this point I'm fairly sure it is not.
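A suggestion that is not part of the original post: -W forces pg_dump to prompt for a password, which only works while expect has a terminal to drive. A ~/.pgpass file for the user the cron job runs as lets pg_dump authenticate without any prompt, the same idea as pgpass.conf in the first answer above. A minimal sketch with placeholder credentials:
# ~/.pgpass holds one host:port:database:user:password entry per line
# and must only be readable by its owner
echo "localhost:5432:localDatabaseName:postgres:password_goes_here" > ~/.pgpass
chmod 600 ~/.pgpass

# With the entry in place, drop -W so pg_dump never prompts
/usr/bin/pg_dump -U postgres -F t localDatabaseName > /path/to/local/backup/folder/$DATE.tar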

Bash cron job on hpanel not locating directory

I have the following code in a cron job; it runs, but the code does not really do what it is supposed to. It does not create the directory, and it does not do anything else in the code either. Please help me check whether the way I pointed to the directory is wrong.
#!/bin/bash
NAMEDATE=`date +%F_%H-%M`_`whoami`
NAMEDATE2=`date `
mkdir ~/home/u3811*****/domains/website.com/public_html/cron/backup/files/$NAMEDATE -m 0755
mysqldump -u u3811*****_boss -p"*******" u3811*****_data | gzip ~/home/u3811*****/domains/website.com/public_html/cron/backup/files/$NAMEDATE/db.sql.gz
echo "This is the database backup for website.com on $NAMEDATE2" |
mailx -a ~/home/u3811*****/domains/website.com/public_html/cron/backup/files/$NAMEDATE/db.sql.gz -s "website.com Database attached" -- mail@gmail.com
chmod -R 0644 ~/home/u3811*****/domains/website.com/public_html/cron/backup/files/$NAMEDATE/*
exit 0
Your NAMEDATE variable needs to be modified a bit, as shown below; for more information about variables in Bash, see the Bash reference manual.
NAMEDATE=$(date +%F_%H-%M"_"$(whoami))
When you issue the mkdir command you will need to pass the -p option to create the complete directory structure if it doesn't exist.
mkdir -p ~/home/u3811numbers/domains/website.com/public_html/cron/backup/files/$NAMEDATE -m 0755
Also, on Linux-based distributions the ~ character is a shortcut for the home directory of the user that invokes it, so in the line below the result is /home//home/u3811*****/domains/website.com/public_html/cron/backup/files/2020-09-04_23-13_, which is not the directory you intended; you can read more about tilde expansion in the Bash manual.
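A quick illustration with a hypothetical user name:
# Run as user exampleuser, whose home directory is /home/exampleuser:
echo ~/home/exampleuser/domains/website.com   # /home/exampleuser/home/exampleuser/domains/website.com
echo /home/exampleuser/domains/website.com    # the path you actually want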
In your last command before the exit, you might need to pass a wildcard (*) so that the recursive chmod applies only to the files inside and does not strip the execute bit from the directory itself, see below:
chmod -R 0644 ~/home/u3811*****/domains/website.com/public_html/cron/backup/files/$NAMEDATE/*
The final version of your script will look something like this.
#!/bin/bash
NAMEDATE=$(date +%F_%H-%M"_"$(whoami))
NAMEDATE2=$(date)
mkdir -p ~/home/u3811******/domains/website.com/public_html/cron/backup/files/$NAMEDATE -m 0755
mysqldump -u u3811*****_boss -p"******" u3811*****_data | gzip > ~/home/u3811*****/domains/website.com/public_html/cron/backup/files/$NAMEDATE/db.sql.gz
echo "This is the database backup for website.com on $NAMEDATE2" | mailx -a ~/home/u3811*****/domains/website.com/public_html/cron/backup/files/$NAMEDATE/db.sql.gz -s "website.com Database attached" -- mail#gmail.com
chmod -R 0644 ~/home/u3811*****/domains/website.com/public_html/cron/backup/files/$NAMEDATE/*
To debug a bash script you can always pass the -x flag; for more information, take a look at the Bash manual.
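For example, either of the following (the script path is hypothetical) prints every command as it runs, which helps when a cron job silently does nothing:
# One-off run with tracing enabled
bash -x /path/to/backup.sh

# or turn tracing on from inside the script
set -x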

Importing a sql file using a batch script

I'm trying to make a batch file which creates a database in phpMyAdmin and then imports a database file. This is what I'm using below.
@echo on
C:\xampp\xampp_start
C:\xampp\mysql\bin\mysqld
C:\xampp\mysql\bin\mysql -u root -e "DROP DATABASE selkirk_stock_control";
C:\xampp\mysql\bin\mysql -u root -e "CREATE DATABASE IF NOT EXISTS selkirk_stock_control";
C:\xampp\mysql\bin\mysql -u root -p selkirk_stock_control ^< C:\xampp\htdocs\30316755\capstone-projects-2020-selkirk-stock-control\Inventory\config\dummyData.sql
pause
Can anyone see why it's not working?
I fixed the problem. I got rid of the ^ and changed the file path. It was incorrect.
@echo on
C:\xampp\xampp_start
C:\xampp\mysql\bin\mysqld
C:\xampp\mysql\bin\mysql -u root -e "DROP DATABASE selkirk_stock_control";
C:\xampp\mysql\bin\mysql -u root -e "CREATE DATABASE IF NOT EXISTS selkirk_stock_control";
C:\xampp\mysql\bin\mysql -u root -p selkirk_stock_control < C:\xampp\htdocs\30316755\capstone-projects-2020-selkirk-stock-control\Inventory\config\sql\selkirk_stock_control.sql
C:\xampp\mysql\bin\mysql -u root -p selkirk_stock_control < C:\xampp\htdocs\30316755\capstone-projects-2020-selkirk-stock-control\Inventory\config\dummyData\dummyData.sql
pause
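One caveat worth flagging (a side note, not part of the original post): -p with no value attached makes the mysql client prompt for a password interactively, which stalls an unattended batch file. If root has no password (the XAMPP default), the flag can simply be dropped; if it does, the password has to be attached directly to -p or supplied via an option file. A hedged sketch with a hypothetical password:
@echo on
rem No password set (XAMPP default): omit -p so the import runs without prompting
C:\xampp\mysql\bin\mysql -u root selkirk_stock_control < C:\xampp\htdocs\30316755\capstone-projects-2020-selkirk-stock-control\Inventory\config\dummyData\dummyData.sql
rem Password set: attach it to -p with no space (YourPassword is a placeholder)
C:\xampp\mysql\bin\mysql -u root -pYourPassword selkirk_stock_control < C:\xampp\htdocs\30316755\capstone-projects-2020-selkirk-stock-control\Inventory\config\dummyData\dummyData.sql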

Adding shell if statement inside lftp

I'm trying to use SFTP to copy some files from one server to another; this task should run every week. The script I use:
HOST='sftp://my.server.com'
USER='user1'
PASSWORD='passwd'
DIR=$HOSTNAME
REMOTE_DIR='/home/remote'
LOCAL_DIR='/home/local'
# LFTP via SFTP connection
lftp -u "$USER","$PASSWORD" $HOST <<EOF
# changing directory
cd "$REMOTE_DIR"
$(if [ ! -d "$DIR" ]; then
mkdir $DIR
fi)
put -O "$REMOTE_DIR"/$DIR "$LOCAL_DIR"/uploaded.txt
EOF
My issue is that put is executed without taking the result of the if statement into consideration.
PS: The error message I get is the following:
put: Access failed: No such file (/home/backups/myhost/upload.txt)
LFTP has no if statement!
Here is what you are actually doing:
lftp -u "$USER","$PASSWORD" $HOST <<EOF
cd "$REMOTE_DIR"
$(if [ ! -d "$DIR" ]; then
mkdir $DIR
fi)
put -O "$REMOTE_DIR"/$DIR "$LOCAL_DIR"/uploaded.txt
EOF
You call a sub command in a here document. The sub command is executed locally before lftp is started, and its output is pasted into the here document, which is then passed to lftp. This only works because mkdir has no output. You are not calling mkdir on the FTP server; you are calling the mkdir of your local shell. Effectively it is the same as putting the if statement before the lftp invocation:
if [ ! -d "$DIR" ]; then
mkdir $DIR
fi
lftp -u "$USER","$PASSWORD" $HOST <<EOF
cd "$REMOTE_DIR"
put -O "$REMOTE_DIR"/$DIR "$LOCAL_DIR"/uploaded.txt
EOF
What you are trying to do does not work this way; you have to think about a different solution.
Right now I have no FTP server to test it, but it might be possible to use the -f option of lftp's mkdir. I assume that it works like the -f option of the Unix rm command. Try this:
lftp -u "$USER","$PASSWORD" $HOST <<EOF
cd "$REMOTE_DIR"
mkdir -f "$DIR"
put -O "$REMOTE_DIR"/$DIR "$LOCAL_DIR"/uploaded.txt
EOF
Update: it works as expected. Creating a directory that already exists throws no error if you use the -f option:
lftp anonymous@localhost:/pub> mkdir -f dir
mkdir ok, `dir' created
lftp anonymous@localhost:/pub> mkdir -f dir
lftp anonymous@localhost:/pub> ls
drwx------ 2 116 122 4096 Aug 10 12:04 dir
Maybe your lftp client is outdated. I tested it with Debian 9.
