FTP file move in shell script

I need to move a file from one server to another FTP server using a shell script.
#!/bin/sh
HOST='ftp.server.com'
USER='username'
PASSWD='password'
FILE='"/a/b/test.sh"'
DIR='/x/y/'
ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
cd $DIR
put $FILE
quit
END_SCRIPT
exit 0
Here is my question: I need to select a file from the server and put it into a particular location on the FTP server. My original file is at the path /a/b/test.sh, and it should be moved to the /x/y path on the FTP server.
What am I missing? I am new to shell scripting.

Try using the lcd command, which changes the working directory on the local machine.
Something like this:
#!/bin/sh
HOST='ftp.server.com'
USER='username'
PASSWD='password'
LOCALPATH='/a/b/'
FILE='test.sh'
DIR='/x/y/'
ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
cd $DIR
lcd $LOCALPATH
put $FILE
quit
END_SCRIPT
exit 0

Related

Command not found when trying send zip file through ftp

I'm trying to send a zip file with a variable timestamp to another server through FTP. But when I try to execute the shell script, it shows a "Command not found" error. $DESSEND contains the zip file location. Please guide me on how to solve this.
#!/bin/bash
filename = $DESSEND/T56_OBL001_${DATEFIX}.zip
hostname="IP Address"
username="Username"
password="Password"
ftp -nv $hostname <<EOF
quote USER $username
quote PASS $password
binary
put $filename
quit
You didn't close the here-doc. Also, "filename = value" (with spaces around the "=") makes the shell try to run "filename" as a command, which is exactly where "Command not found" comes from; shell assignments must have no spaces around "=". Try:
#!/bin/bash
filename=$DESSEND/T56_OBL001_${DATEFIX}.zip
hostname="IP Address"
username="Username"
password="Password"
ftp -nv $hostname <<EOF
quote USER $username
quote PASS $password
binary
put $filename
quit
EOF
And make sure your variables DESSEND and DATEFIX are set
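To guard against exactly that, the script can fail fast with a readable message when either variable is missing. A minimal sketch (the fallback path and date are made-up illustrations, not values from the question):

```shell
#!/bin/bash
# Fall back to illustrative values so the sketch is self-contained; in the
# real script, DESSEND and DATEFIX come from the calling environment.
DESSEND="${DESSEND:-/tmp/outgoing}"
DATEFIX="${DATEFIX:-20240101}"

# Fail fast with a clear message if either is unset or empty.
: "${DESSEND:?DESSEND is not set}"
: "${DATEFIX:?DATEFIX is not set}"

# No spaces around '=' -- 'filename = value' would run 'filename' as a command.
filename="$DESSEND/T56_OBL001_${DATEFIX}.zip"
echo "$filename"
```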

Reading Through A List of Files, then Sending those Files via FTP

I am making weather model charts with the GrADS scripting language, and I am using a bash script so I can use a while loop to download model data (in grib2 format) and call the GrADS scripts for each frame of the model run. Right now, I have a loop that runs through all the scripts for a given forecast hour and uploads the image output via FTP. After this for loop completes, the grib2 data for the next hour is downloaded, and the loop runs again.
for ((i=0;i<${#SCRIPTS[@]};i++)); do
#define filename
FILENAME="${FILENAMES[i]}${FORECASTHOUR}hrfcst.png"
#run grads script
/home/mint/opengrads/Contents/opengrads -lbc "run /home/mint/opengrads/Contents/plotscripts/${SCRIPTS[i]} $CTLFILE $INIT_STRINGDATE $INIT_INTDATE $INITHOUR $FILENAME $h $MODEL $MODELFORTITLE 500"
#run ftp script
#sh /home/mint/opengrads/Contents/bashscripts/ftpsample.sh $INIT_INTDATE $INITHOUR $FILENAME $MODEL
done
This is inelegant because I open and close an FTP session each time I send a single image. I would much rather write the filenames for a given forecast hour to a .txt file (e.g. put an "echo ${FILENAME} >> FILEOFFILENAMES.txt" in the loop) and have my FTP script read and send all those files in a single session. Is this possible?
It's possible. You can add this to your shell script to generate the ftp script and then have it run after you've generated the files:
echo open $HOST > ftp.txt
echo user $USER $PASS >> ftp.txt
find . -type f -name '*hrfcst.png' -printf "put destination/%f %f\n" >> ftp.txt
echo bye >> ftp.txt
ftp < ftp.txt
The above code will generate file ftp.txt with commands and pass that to ftp. The generated ftp.txt will look like:
open host
user user pass
put destination/forecast1.hrfcst.png forecast1.hrfcst.png
put destination/forecast2.hrfcst.png forecast2.hrfcst.png
put destination/forecast3.hrfcst.png forecast3.hrfcst.png
...
bye
The following script will upload all files added today from a local directory to a remote FTP directory.
#!/bin/bash
HOST='hostname'
USER='username'
PASSWD='password'
# Local directory where the files are stored.
cd "/local/directory/from where to upload files/"
# To get all the files added today only.
TODAYSFILES=$(find . -maxdepth 1 -type f -mtime -1)
# remote server directory to upload backup
REMOTEDIR="/directory on remote ftp computer/"
for FILENAME in $TODAYSFILES; do
ftp -n -v $HOST << EOT
ascii
user $USER $PASSWD
prompt
cd $REMOTEDIR
put $FILENAME
bye
EOT
done
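Note that this still opens one FTP session per file. A single-session variant along the lines of the first answer builds one command file covering every recent file and feeds it to ftp once. An untested sketch (host, credentials, and directories are placeholders; the find -printf option is GNU-specific):

```shell
#!/bin/bash
# Sketch: one ftp session for all of today's files instead of one per file.
# Host, credentials, and directories below are placeholders.
HOST='hostname'
USER='username'
PASSWD='password'
REMOTEDIR='/remote/backup/dir'

{
  echo "open $HOST"
  echo "user $USER $PASSWD"
  echo "binary"
  echo "prompt"
  echo "cd $REMOTEDIR"
  # one put line per file modified within the last 24 hours (GNU find)
  find . -maxdepth 1 -type f -mtime -1 -printf "put %f\n"
  echo "bye"
} > ftp_batch.txt

# ftp -n < ftp_batch.txt   # uncomment to run the transfer
```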

FTP upload failed

I have a bash script that backs up my iOS files over FTP and I'm getting a few problems, I'm just wondering if anyone could help me out?
Here's my script:
#!/bin/bash
mkdir zipfolder
cp /var/mobile/Library/SMS/sms.db /var/root/zipfolder/
cp /var/mobile/Library/Notes/notes.sqlite /var/root/zipfolder/
cp /var/mobile/Library/Safari/Bookmarks.db /var/root/zipfolder/
cp /var/mobile/Library/Safari/History.plist /var/root/zipfolder/
cd var/root
zip -r zippyy.zip zipfolder
HOST=HOSTNAME
USER=USERNAME
PASS=PASSWORD
ftp -inv $HOST << EOF
user $USER $PASS
cd sms
LIST=$(ls | grep zippyy*.zip)
FILECOUNT=0
for FILE in $LIST
do
if [ -f $FILE ];
then
FILECOUNT+=1
done
FILECOUNT+=1
NEXTDB="zippyy$FILECOUNT.db"
mv zippyy.zip $NEXTDB
ftp -inv $HOST << EOF
put $NEXTDB
bye
EOF
rm -f zippyy.zip
rmdir zipfolder
I get the following errors:
?Invalid command
?Invalid command
We only support non-print format, sorry.
?Invalid command
?Invalid command
?Invalid command
?Invalid command
?Invalid command
?Invalid command
?Invalid command
?Invalid command
?Invalid command
(local-file) (remote-file)
rmdir: failed to remove 'zipfolder': Not a directory
Try something like this (totally untested!)
#!/bin/bash
ROOTFOLDER="/var/root"
ZIPNAME="zipfolder"
ZIPFOLDER=$ROOTFOLDER/$ZIPNAME
LIBFOLDER="/var/mobile/Library"
ZIPFILE="zippyy.zip"
mkdir -p $ZIPFOLDER
cp $LIBFOLDER/SMS/sms.db $ZIPFOLDER/
cp $LIBFOLDER/Notes/notes.sqlite $ZIPFOLDER/
cp $LIBFOLDER/Safari/Bookmarks.db $ZIPFOLDER/
cp $LIBFOLDER/Safari/History.plist $ZIPFOLDER/
cd $ROOTFOLDER
zip -r $ZIPFILE $ZIPNAME
HOST=HOSTNAME
USER=USERNAME
PASS=PASSWORD
ftp -inv $HOST << EOF
user $USER $PASS
cd sms
dir . remote_dir.txt
bye
EOF
FILECOUNT=$(grep zippyy remote_dir.txt | wc -l)
NEXTDB="zippyy${FILECOUNT}.db"
mv $ZIPFILE $NEXTDB
ftp -inv $HOST << EOF
user $USER $PASS
put $NEXTDB
bye
EOF
Why are you using cp -i in a script? The -i switch makes the copy "interactive", so it expects input from the user, which it won't get because of the script.
It's not entirely clear to me what you're trying to do. It looks like you're trying to find out how many existing backups there are on the FTP server and rename the new backup to go at the end of the list.
You can't execute code on an FTP server (that would be a massive security hole!), so the best way to accomplish this is probably to get the remote directory listing and process it locally. Try using something like:
ftp -inv $HOST << EOF
user $USER $PASS
cd sms
dir . remote_dir.txt
bye
EOF
{process remote_dir.txt now to get new backup name}
ftp -inv $HOST << EOF
user $USER $PASS
put $NEXTDB
bye
EOF

Transfer files from FTP server to local unix server

I have to transfer files whose names contain two variables, X and Y, from the directory ABC on an FTP server to my local Unix directory XYZ. After transferring the files, I have to change to the local directory and untar them (the input files are compressed). I have to use a username and password to connect to the FTP server, and also when copying files to the local server.
Here's my current attempt. Will it work? How can I improve it?
ftp -n hostname <<EOF
user username pwd
cd ABC
get ls *X*.tar | ls *Y*.tar username1#pwd1 : XYZ
EOF
bye
for next in `ls *.tar`
do
tar -zvxf $next
done
Please try the code below. Hope this helps.
#! /bin/bash
cd local_path
USER='username'
PASSWD='password'
file_name='files'
for HOST in ftpserver
do
echo $HOST
ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
bin
prompt
cd "remote_path"
lcd "local_path"
mget $file_name.gz*
quit
END_SCRIPT
done
#extract file (no need to loop over hosts here; gunzip runs locally)
mkdir -p ../archive/$DATE
gunzip $file_name.gz
I would suggest you just look into the manual of the command-line ftp tool and script with that.
Alternative: use wget to download the FTP file to the local machine, then scp to the target machine. Assuming public-key authentication for SSH (so scp does not need a password), it ends up as simple as this:
wget --ftp-user=$USERNAME --ftp-password=$PASSWORD ftp://$HOSTNAME/ABC/$Y.tar
scp $Y.tar $SCPUSER@$SCPHOST:/targetpath/$X.tar
You can use wget to download files from an FTP server to a Unix system:
cd YOUR_DIRECTORY
wget --user=USERNAME --password='PASSWORD' HOST_NAME/REMOTE_PATH/FILE_NAME.EXTENSION

How can I upload (FTP) files to server in a Bash script?

I'm trying to write a Bash script that uploads a file to a server. How can I achieve this? Is a Bash script the right thing to use for this?
Below are two answers. First is a suggestion to use a more secure/flexible solution like ssh/scp/sftp. Second is an explanation of how to run ftp in batch mode.
A secure solution:
You really should use SSH/SCP/SFTP for this rather than FTP. SSH/SCP have the benefits of being more secure and working with public/private keys which allows it to run without a username or password.
You can send a single file:
scp <file to upload> <username>@<hostname>:<destination path>
Or a whole directory:
scp -r <directory to upload> <username>@<hostname>:<destination path>
For more details on setting up keys and moving files to the server with RSYNC, which is useful if you have a lot of files to move, or if you sometimes get just one new file among a set of random files, take a look at:
http://troy.jdmz.net/rsync/index.html
You can also execute a single command after sshing into a server:
From man ssh
ssh [...snipped...] hostname [command] If command is specified, it is
executed on the remote host instead of a login shell.
So, an example command is:
ssh username@hostname.example bunzip2 file_just_sent.bz2
If you can use SFTP with keys to gain the benefit of a secured connection, there are two tricks I've used to execute commands.
First, you can pass commands using echo and pipe
echo "put files*.xml" | sftp -p -i ~/.ssh/key_name username@hostname.example
You can also use a batchfile with the -b parameter:
sftp -b batchfile.txt -i ~/.ssh/key_name username@hostname.example
An FTP solution, if you really need it:
If you understand that FTP is insecure and more limited and you really really want to script it...
There's a great article on this at http://www.stratigery.com/scripting.ftp.html
#!/bin/sh
HOST='ftp.example.com'
USER='yourid'
PASSWD='yourpw'
FILE='file.txt'
ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
binary
put $FILE
quit
END_SCRIPT
exit 0
The -n to ftp ensures that the command won't try to get the password from the current terminal. The other fancy part is the use of a heredoc: the <<END_SCRIPT starts the heredoc and then that exact same END_SCRIPT on the beginning of the line by itself ends the heredoc. The binary command will set it to binary mode which helps if you are transferring something other than a text file.
You can use a heredoc to do this, e.g.
ftp -n $Server <<End-Of-Session
# -n option disables auto-logon
user anonymous "$Password"
binary
cd $Directory
put "$Filename.lsm"
put "$Filename.tar.gz"
bye
End-Of-Session
so the ftp process is fed everything up to End-Of-Session on standard input. This is a useful trick for spawning any process, not just ftp. Note that it saves spawning a separate process (echo, cat, etc.); not a major resource saving, but worth bearing in mind.
The ftp command isn't designed for scripts, so controlling it is awkward, and getting its exit status is even more awkward.
Curl is made to be scriptable, and also has the merit that you can easily switch to other protocols later by just modifying the URL. If you put your FTP credentials in your .netrc, you can simply do:
# Download file
curl --netrc --remote-name ftp://ftp.example.com/file.bin
# Upload file
curl --netrc --upload-file file.bin ftp://ftp.example.com/
If you must, you can specify username and password directly on the command line using --user username:password instead of --netrc.
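One practical payoff is error handling: the script can branch on curl's exit status, which classic ftp does not give you. A minimal sketch (the host is a placeholder, so the upload here is expected to fail; the point is that the failure is detectable):

```shell
#!/bin/bash
# Sketch: branch on curl's exit status. ftp.example.com is a placeholder,
# so this transfer is expected to fail; the failure is caught cleanly.
printf 'payload\n' > file.bin

if curl --silent --show-error --connect-timeout 5 \
        --upload-file file.bin ftp://ftp.example.com/ 2>/dev/null; then
    echo "upload ok"
else
    echo "upload failed" >&2
fi
```

The same --upload-file (-T) semantics work for any URL scheme curl supports, which is what makes switching protocols later a one-line change.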
Install ncftpput and ncftpget. They're usually part of the same package.
Use this to upload a file to a remote location:
#!/bin/bash
#$1 is the file name
#usage:this_script <filename>
HOST='your host'
USER="your user"
PASSWD="pass"
FILE="abc.php"
REMOTEPATH='/html'
ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
cd $REMOTEPATH
put $FILE
quit
END_SCRIPT
exit 0
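The answer above recommends ncftpput but then shows a plain ftp script. For comparison, the same upload as a single ncftpput call might look like the sketch below (untested; all values are placeholders carried over from the script above). The command is only echoed here; uncomment the last line to actually run it:

```shell
#!/bin/bash
# Sketch: ncftpput bundles login and transfer into one command, so no
# here-doc is needed. All values are placeholders.
HOST='your host'
USER='your user'
PASSWD='pass'
FILE='abc.php'
REMOTEPATH='/html'

upload_cmd=(ncftpput -u "$USER" -p "$PASSWD" "$HOST" "$REMOTEPATH" "$FILE")
echo "would run: ${upload_cmd[*]}"
# "${upload_cmd[@]}"   # uncomment to perform the upload
```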
The command in one line:
ftp -in -u ftp://username:password@servername/path/to/ localfile
#!/bin/bash
# $1 is the file name
# usage: this_script <filename>
IP_address="xx.xxx.xx.xx"
username="username"
domain=my.ftp.domain
password=password
echo "
verbose
open $IP_address
USER $username $password
put $1
bye
" | ftp -n > ftp_$$.log
Working example to put your file on root...see, it's very simple:
#!/bin/sh
HOST='ftp.users.qwest.net'
USER='yourid'
PASSWD='yourpw'
FILE='file.txt'
ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
put $FILE
quit
END_SCRIPT
exit 0
There isn't any need to complicate stuff. This should work:
#!/bin/bash
echo "
verbose
open ftp.mydomain.net
user myusername mypassword
ascii
put textfile1
put textfile2
bin
put binaryfile1
put binaryfile2
bye
" | ftp -n > ftp_$$.log
Or you can use mput if you have many files...
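A sketch of such an mput session (host and login are placeholders; "prompt" turns off the per-file confirmation mput would otherwise ask for). The commands are written to a file here so they can be inspected before piping them to ftp:

```shell
#!/bin/bash
# Sketch: mput with a wildcard replaces a long list of individual put lines.
# Host and login are placeholders; "prompt" disables per-file confirmation.
cat > mput_session.txt <<'CMDS'
verbose
open ftp.mydomain.net
user myusername mypassword
prompt
bin
mput binaryfile*
bye
CMDS

# ftp -n < mput_session.txt   # uncomment to run it
```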
If you want to use it inside a 'for' to copy the last generated files for an everyday backup...
j=0
var="`find /backup/path/ -name 'something*' -type f -mtime -1`"
# We have some files in $var with last day change date
for i in $var
do
j=$(( $j + 1 ))
dirname="`dirname $i`"
filename="`basename $i`"
/usr/bin/ftp -in >> /tmp/ftp.good 2>> /tmp/ftp.bad << EOF
open 123.456.789.012
user user_name passwd
bin
lcd $dirname
put $filename
quit
EOF
done # End of 'for' iteration
echo -e "open <ftp.hostname>\nuser <username> <password>\nbinary\nmkdir New_Folder\nquit" | ftp -nv
