Shell script to make an FTP connection and get XML files

I need a shell script that will log in to a remote FTP server, list the files present in the root folder only, identify the XML files, and download them to the local system.
The login credentials can be specified in the script itself. This script must run only once a day.
Please help me with a UNIX bash shell script.
Thanks

script:
#!/bin/bash
SERVER=ftp://myserver
USER=user
PASS=password
EXT=xml
DESTDIR=/destinationdir
listOfFiles=$(curl "$SERVER/" --user "$USER:$PASS" 2> /dev/null | awk '{ print $9 }' | grep -E "\.${EXT}$")
for file in $listOfFiles
do
    curl "$SERVER/$file" --user "$USER:$PASS" -o "$DESTDIR/$file"
done
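If parsing the long directory listing with awk ever proves brittle, curl's -l/--list-only option requests a name-only listing (NLST) from the FTP server, so no column-guessing is needed. A minimal sketch using the same variables as above:
# NLST returns one bare filename per line, so awk is unnecessary
for file in $(curl -s --list-only "$SERVER/" --user "$USER:$PASS" | grep -E "\.${EXT}$")
do
    curl -s "$SERVER/$file" --user "$USER:$PASS" -o "$DESTDIR/$file"
done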
To schedule it to run every day, edit your crontab:
crontab -e
and add a line such as:
0 0 * * * bash /path/to/script
which will run the script every day at midnight.

If you can install ncftpget, this is a one-line operation:
ncftpget -u user -p password ftp.remote-host.com /my/local/dir '/*.xml'

Related

Postgres database backup not working locally (Crontab + Shell script using expect)

I am having issues on my Ubuntu server: I have two scripts which perform a pg_dump of two databases (a remote one and a local one). However, the backup file for the local one always ends up empty.
When I run the script manually, there is no problem.
The issue is when the script is run via crontab while I am NOT logged into the machine. If I'm in an SSH session it works with crontab, but when I'm not connected it does not work.
Check out my full scripts/setup below, and feel free to suggest any improvements. For now I just want it to work, but if my method is insecure/inefficient I would gladly hear about alternatives :)
So far I've tried:
Using the postgres user for the local database (instead of the other user I use to access the DB from my applications)
Switching pg_dump to /usr/bin/pg_dump
Here's my setup:
Crontab entry:
0 2 * * * path/to/my/script/local_databasesBackup.sh ; path/to/my/script/remote_databasesBackup.sh
scriptInitialization.sh
set LOCAL_PWD "password_goes_here"
set REMOTE_PWD "password_goes_here"
Expect script, called by crontab (local/remote_databaseBackup.sh):
#!/usr/bin/expect -f
source path/to/my/script/scriptInitialization.sh
spawn path/to/my/script/localBackup.sh
expect "Password: "
send "${LOCAL_PWD}\r"
expect eof
exit
Actual backup script (local/remoteBackup.sh):
#!/bin/bash
DATE=$(date +"%Y-%m-%d_%H%M")
delete_yesterday_backup_and_perform_backup () {
    /usr/bin/pg_dump -U postgres -W -F t localDatabaseName > /path/to/local/backup/folder/$DATE.tar
    YESTERDAY_2_AM=$(date --date="02:00 yesterday" +"%Y-%m-%d_%H%M")
    YESTERDAY_BACKUP_FILE=/path/to/local/backup/folder/$YESTERDAY_2_AM.tar
    if [ -f "$YESTERDAY_BACKUP_FILE" ]; then
        echo "$YESTERDAY_BACKUP_FILE exists. Deleting"
        rm "$YESTERDAY_BACKUP_FILE"
    else
        echo "$YESTERDAY_BACKUP_FILE does not exist."
    fi
}
CURRENT_DAY_NUMBER=$(date +"%d")
FIRST_DAY_OF_THE_MONTH="01"
if [ "$CURRENT_DAY_NUMBER" = "$FIRST_DAY_OF_THE_MONTH" ]; then
echo "First day of the month: Backup without deleting the previous backup"
/usr/bin/pg_dump -U postgres -W -F t localDatabaseName > /path/to/local/backup/folder/$DATE.tar
else
echo "Not the first day of the month: Delete backup from yesterday and backup"
delete_yesterday_backup_and_perform_backup
fi
The only difference between my local and remote scripts is the pg_dump parameters:
Local looks like this: /usr/bin/pg_dump -U postgres -W -F t localDatabaseName > /path/to/local/backup/folder/$DATE.tar
Remote looks like this: pg_dump -U remote_account -p 5432 -h remote.address.com -W -F t remoteDatabase > /path/to/local/backup/folder/$DATE.tar
I ended up making two scripts because I thought combining them might have been the cause of the issue. However, I'm pretty sure by now that it's not.
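Since improvements were invited: a frequent cause of "works in an SSH session, fails from plain cron" is cron's minimal environment, and the expect/password dance is a common casualty. PostgreSQL can read credentials from a ~/.pgpass file instead, which removes the prompt (and expect) entirely. A sketch of that alternative, assuming default settings; the password and paths are placeholders, and ~/.pgpass must belong to the user the cron job runs as:
# One line per connection, format host:port:database:user:password; file must be mode 0600
echo 'localhost:5432:localDatabaseName:postgres:password_goes_here' >> ~/.pgpass
chmod 600 ~/.pgpass
# Drop -W so pg_dump never prompts; libpq falls back to ~/.pgpass automatically
/usr/bin/pg_dump -U postgres -F t localDatabaseName > /path/to/local/backup/folder/$DATE.tar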

Script to connect to a "list.txt" of servers

I am trying to find a way to connect to a list of servers written in a simple text file, run one command on each, and write the output to a file...
The small problem is, I have to log in with a password... but it would be no problem to paste the password into the script.
The full command would be:
ssh server_from_list.txt uptime | awk -F, '{sub(".*up ",x,$1);print $1}' >> /home/kauk2/uptime.out
lets assume the password is: abcd1234
Any suggestions? I am not well versed in scripting, sorry...
Many thanks to you all in advance...
Regards,
Joerg
Ideally you should set up password-less login, but failing that you can use sshpass. First, get a single command working by trying the following:
export SSHPASS=abcd1234
Then you can try:
sshpass -e ssh user@server1 'uname -a'
When you get that debugged and working, you can use GNU Parallel to run the command on all servers in a file called list.txt
user@server1
user@server2
user@server3
user@server4
The command will be:
parallel -k -a list.txt sshpass -e ssh {} 'uptime'
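If GNU Parallel is not installed, a plain loop over the file also works; a minimal sketch, assuming the same user@host format in list.txt and SSHPASS exported as above:
# -n keeps ssh from swallowing the rest of the server list on stdin
while read -r host; do
    sshpass -e ssh -n "$host" 'uptime' >> /home/kauk2/uptime.out
done < list.txt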

Script doesn't work with crontab on Debian & Ubuntu

I have the script below that I want to automate with cron. When I run it directly with "./" it works, but from cron it doesn't.
I have tried many things, like changing sh to bash for cron, the variables in my script, etc...
#!/bin/bash
DATE=$(/bin/date +%d-%m-%Y-%H-%M-%S)
USERFW="xxxx"
IPSERVER=$(ip route get 1.2.3.4 | awk '{print $7}')
for SW in $(cat fw.txt)
do
/usr/bin/sshpass -p "Hs#Pr&v3nT!" /usr/bin/ssh -tt -o StrictHostKeyChecking=no $USERFW@$SW <<EOF
execute backup config tftp SW-$DATE $IPSERVER
exit
exit
EOF
done
I need this code to work from crontab... can anyone help me?
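For what it's worth, a frequent cause of "works interactively, fails under cron" is cron's minimal environment (PATH, HOME, no tty). The script above already uses absolute paths for sshpass and ssh, so the usual next suspects are the relative fw.txt path and silently discarded errors. A sketch of a crontab entry that pins the working directory and captures output for debugging; the paths and the script name backup_fw.sh are placeholders:
0 3 * * * cd /path/to/script/dir && /bin/bash ./backup_fw.sh >> /tmp/backup_fw.log 2>&1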

I want to delete a file from a server by bash script using ftp

I want to delete a file by bash script using ftp.
I use the code below:
file="xyz/ab/file.txt"
curl -v -u $user:$pass ftp://server.com/$file -Q "DELE $file"
but it gives this error:
* Entry path is '/'
DELE xyz/ab/file.txt
* ftp_perform ends with SECONDARY: 0
< 550 Could not delete xyz/ab/file.txt: No such file or directory
* QUOT command failed with 550
How can I delete the file with a single-line bash command?
How to delete a file from an FTP server with curl:
user="foo"
pass="bar"
dir="xyz/ab/" # with trailing slash
file="file.txt"
curl -v -u "$user:$pass" "ftp://server.com/$dir" -Q "-DELE $file"
or
curl -v -u "$user:$pass" 'ftp://server.com' -Q "-DELE /$dir$file"
or without leading /
curl -v -u "$user:$pass" 'ftp://server.com' -Q "-DELE $dir$file"
You could use the ftp command if you want to add additional commands.
user=foo
pass=bar
ftp -n 127.0.0.1 <<EOF
quote USER $user
quote PASS $pass
delete xyz/ab/file.txt
exit
EOF
Deleting must be allowed on the FTP server, however. If I remember correctly, for vsftpd you must set anon_other_write_enable=YES in /etc/vsftpd.conf.
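To confirm the deletion succeeded, you can list the directory afterwards; a small sketch reusing the variables from above:
curl -s -u "$user:$pass" --list-only "ftp://server.com/$dir"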

bash config file from remote source with an argument [duplicate]

Say I have a file at the URL http://mywebsite.example/myscript.txt that contains a script:
#!/bin/bash
echo "Hello, world!"
read -p "What is your name? " name
echo "Hello, ${name}!"
And I'd like to run this script without first saving it to a file. How do I do this?
Now, I've seen the syntax:
bash < <(curl -s http://mywebsite.example/myscript.txt)
But this doesn't seem to work like it would if I saved to a file and then executed. For example readline doesn't work, and the output is just:
$ bash < <(curl -s http://mywebsite.example/myscript.txt)
Hello, world!
Similarly, I've tried:
curl -s http://mywebsite.example/myscript.txt | bash -s --
With the same results.
Originally I had a solution like:
timestamp=`date +%Y%m%d%H%M%S`
curl -s http://mywebsite.example/myscript.txt -o /tmp/.myscript.${timestamp}.tmp
bash /tmp/.myscript.${timestamp}.tmp
rm -f /tmp/.myscript.${timestamp}.tmp
But this seems sloppy, and I'd like a more elegant solution.
I'm aware of the security issues regarding running a shell script from a URL, but let's ignore all of that for right now.
source <(curl -s http://mywebsite.example/myscript.txt)
ought to do it. Alternately, leave off the initial redirection on yours, which is redirecting standard input; bash takes a filename to execute just fine without redirection, and <(command) syntax provides a path.
bash <(curl -s http://mywebsite.example/myscript.txt)
It may be clearer if you look at the output of echo <(cat /dev/null)
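(On Linux it typically prints a path like /dev/fd/63, which is exactly what bash receives as its script argument.)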
This is the way to execute a remote script while passing it some arguments (arg1 arg2):
curl -s http://server/path/script.sh | bash /dev/stdin arg1 arg2
For bash, Bourne shell and fish:
curl -s http://server/path/script.sh | bash -s arg1 arg2
Flag "-s" makes shell read from stdin.
Use:
curl -s -L URL_TO_SCRIPT_HERE | bash
For example:
curl -s -L http://bitly/10hA8iC | bash
Using wget, which is usually part of a default system installation:
bash <(wget -qO- http://mywebsite.example/myscript.txt)
You can also do this:
wget -O - https://raw.github.com/luismartingil/commands/master/101_remote2local_wireshark.sh | bash
The best way to do it is:
curl http://domain/path/to/script.sh | bash -s arg1 arg2
which is a slight change of the answer by @user77115.
You can use curl and send it to bash like this:
bash <(curl -s http://mywebsite.example/myscript.txt)
I often find the following is enough:
curl -s http://mywebsite.example/myscript.txt | sh
But on an old system (kernel 2.4) it ran into problems; I tried many alternatives, and only the following worked:
curl -s http://mywebsite.example/myscript.txt -o a.sh && sh a.sh && rm -f a.sh
Examples
$ curl -s someurl | sh
Starting to insert crontab
sh: _name}.sh: command not found
sh: line 208: syntax error near unexpected token `then'
sh: line 208: ` -eq 0 ]]; then'
$
The problem may be caused by a slow network, or a bash version too old to handle a slow network gracefully.
However, the following solves the problem:
$ curl -s someurl -o a.sh && sh a.sh && rm -f a.sh
Starting to insert crontab
Insert crontab entry is ok.
Insert crontab is done.
okay
$
Also:
curl -sL https://.... | sudo bash -
Just combining amra's and user77115's answers:
wget -qO- https://raw.githubusercontent.com/lingtalfi/TheScientist/master/_bb_autoload/bbstart.sh | bash -s -- -v -v
It executes the remote bbstart.sh script, passing it the -v -v options.
In some unattended scripts I use the following command:
sh -c "$(curl -fsSL <URL>)"
I recommend avoiding the execution of scripts directly from URLs. You should be sure the URL is safe and check the content of the script before executing; you can use a SHA256 checksum to validate the file before executing it.
Instead of executing the script directly, first download it and then execute it:
SOURCE='https://gist.githubusercontent.com/cci-emciftci/123123/raw/123123/sample.sh'
curl $SOURCE -o ./my_sample.sh
chmod +x my_sample.sh
./my_sample.sh
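A minimal sketch of the checksum validation mentioned above; EXPECTED_SHA256 is a placeholder for the published known-good digest:
EXPECTED_SHA256="<known-good digest here>"  # placeholder, not a real value
curl -s $SOURCE -o ./my_sample.sh
# sha256sum -c expects "digest  filename" (two spaces); run only if it matches
echo "$EXPECTED_SHA256  ./my_sample.sh" | sha256sum -c - && bash ./my_sample.sh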
This way is good and conventional:
17:04:59#itqx|~
qx>source <(curl -Ls http://192.168.80.154/cent74/just4Test) Lord Jesus Loves YOU
Remote script test...
Param size: 4
---------
17:19:31#node7|/var/www/html/cent74
arch>cat just4Test
echo Remote script test...
echo Param size: $#
If you want the script run using the current shell, regardless of what it is, use:
${SHELL:-sh} -c "$(wget -qO - http://mywebsite.example/myscript.txt)"
if you have wget, or:
${SHELL:-sh} -c "$(curl -Ls http://mywebsite.example/myscript.txt)"
if you have curl.
This command will still work if the script is interactive, i.e., it asks the user for input.
Note: OpenWRT has a wget clone but not curl, by default.
curl -s http://your.url.here/script.txt | bash
actual example:
juan@juan-MS-7808:~$ curl -s https://raw.githubusercontent.com/JPHACKER2k18/markwe/master/testapp.sh | bash
Oh, wow im alive
juan@juan-MS-7808:~$
