ftp shell script stops after the lcd command - shell

I created a shell script to download several files, all starting with "2014", from an FTP server. I use mget for this with the filename pattern 2014*.
To make sure the files are saved in the right local directory, I use lcd beforehand.
It looks like this:
#!/bin/sh
HOST='ftpserver.name.de'
USER='user1'
PASSWD='pw1'
FILE='2014*'
LOCDIR='/home/local/data2014/'
ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
lcd $LOCDIR
mget $FILE
quit
END_SCRIPT
exit 0
When I run it, this is all the output I get:
lx9000: ftp_get.sh
Connected to ftpserver.name.de.
220 FTP-Server: ftpserver.name.de
331 Password required for user1
230 User user1 logged in
Local directory now: /home/local/data2014/
221 Goodbye.
Why does it stop before the download starts?
Thanks for your help!

Try adding:
prompt off
before the mget.
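mget asks for confirmation for every file it matches, and because the commands come from a here-document those confirmation prompts consume the remaining lines (including quit), so the session ends before anything is transferred. A minimal sketch of the fixed here-document, using the same variables as above:
ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
lcd $LOCDIR
prompt off
mget $FILE
quit
END_SCRIPT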

Related

Change shell script from ftp to sftp

I have the shell script below to get files from an FTP server.
I need to change this script to point to the same server over SFTP.
Can someone assist in changing this script from ftp to sftp?
HOST='some.site.com'
USER='yourid'
PASSWD='yourpw'
FILE='file.txt'
ftp $HOST <<END_SCRIPT
user $USER
$PASSWD
put $FILE
quit
END_SCRIPT
exit 0
It's not that complicated: just use the sftp binary instead of ftp. The commands you type stay essentially the same (SFTP is actually a different protocol that runs over SSH, but the interactive client commands such as put and get look the same), and the user can be specified as part of sftp's host argument:
sftp $USER@$HOST <<EOF
$PASSWD
put $FILE
quit
EOF
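One caveat (an addition, not part of the original answer): unlike ftp, sftp normally reads the password from the terminal rather than from standard input, so the $PASSWD line in the here-document is usually ignored. Setting up SSH key authentication avoids the prompt entirely; if the password has to stay in the script, a wrapper such as sshpass can supply it. A rough sketch, assuming sshpass is installed:
sshpass -p "$PASSWD" sftp "$USER@$HOST" <<EOF
put $FILE
quit
EOF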

Bash - uploading multiple files to FTP using real path?

I have this script to upload files to FTP (I know FTP is not secure, but the client insists on using FTP). It mostly works, but the problem is that it does not recognize the path provided for the upload: the messages say the upload succeeded, yet nothing is actually uploaded.
So the script looks like this:
#!/bin/bash
HOST=host
USER=user
PASS=pass
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
ftp -inv $HOST << EOF
user $USER $PASS
get
cd /path/in/ftp/
prompt
mput $DIR/*.csv
# End FTP Connection
bye
EOF
rm $DIR/*.csv
Here what is outputted:
Connected to host.
220 You have connected to Client's FTP server.
?Invalid command
331 User name okay, need password.
230-Welcome user from ip. You are now logged in to the server.
230 User logged in, proceed.
Remote system type is UNIX.
Using binary mode to transfer files.
?Invalid command
250 Directory changed to "/path/in/ftp/"
?Invalid command
Interactive mode on.
mput /path/inv_numbers_2016-11-21_12_09.csv? 200 PORT command successful.
150 File status okay; about to open data connection.
226 Closing data connection. Transferred 140 bytes in 1 seconds. 0KB/second.
140 bytes sent in 0.00 secs (1395.1 kB/s)
?Invalid command
221 Session Ended. Downloaded 0KB, Uploaded 1KB. Goodbye user from ip.
Now if I change mput $DIR/*.csv to mput *.csv, it works (I get the same log output as before, except the path shown is the directory the script lives in). But that only works if I run the script from the directory it is placed in.
Any ideas?
Replace
ftp -inv $HOST << EOF
by
cd "$DIR" && ftp -inv $HOST << EOF
or by
cd "$DIR" || exit 1
ftp -inv $HOST << EOF
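Put together, a minimal sketch of the corrected script using the second variant. The stray get and the # comment line are dropped (ftp treats both as invalid commands, as the ?Invalid command lines in the log show), and the prompt toggle is unnecessary because the -i flag already disables interactive prompting:
#!/bin/bash
HOST=host
USER=user
PASS=pass
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"

cd "$DIR" || exit 1
ftp -inv $HOST << EOF
user $USER $PASS
cd /path/in/ftp/
mput *.csv
bye
EOF

rm "$DIR"/*.csv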

Download files from specific folder in AIX box

I have an AIX box. I want to connect to a remote FTP server and download a specified folder, "abc".
I have created a script, but it isn't working.
Here is my Code:
#!/bin/sh HOST='ftp.abc.xysz.net' USER='ftp' PASSWD='password' FILE='ababababababababababababab.abab';
ftp $HOST user $USER $PASSWD mget $FILE
quit END_SCRIPT
exit 0
Here is the error I receive when I execute the script.
Does anyone have an idea how to download the files from the remote FTP server? Is there a single command available?
Your script doesn't look properly formatted... it would be something like this:
#!/bin/sh
HOST='10.129.151.41'
USER='technicolor'
PASSWD='blueray'
FILE='ababababababababababababab.abab'
ftp -n <<END_SCRIPT
open $HOST
user $USER $PASSWD
get $FILE
quit
END_SCRIPT
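Since the question asks about downloading a whole folder "abc" rather than a single file, a hedged variation of the same script would change into that directory and fetch everything with mget (the directory name and the * pattern are assumptions, not part of the original answer):
ftp -n <<END_SCRIPT
open $HOST
user $USER $PASSWD
cd abc
prompt off
mget *
quit
END_SCRIPT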

How do you connect to FTP server via a shell-script

I am writing my first shell-script ever and I am trying to connect to an FTP server. However, I am utterly at a loss for how to do this. I tried a google search, but I am still stumped.
I am trying to connect with a username and password (not a ssh id).
Thanks for your help. Again this is my first shell-script ever.
The command man ftp should give you the necessary pointers.
That being said, this page might help you build a complete shell script.
Here is how you connect to an FTP server via a shell script:
nano MyConnectFTPScript.sh
#!/bin/sh
HOST='hostAddress'
USER='NameOfUser'
PASSWD='YourPass'
FILEtoPut='myFile1'
FILEtoGet='myFile2'
FILEtoDelete='myFile3'
ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
put $FILEtoPut
get $FILEtoGet
delete $FILEtoDelete
quit
END_SCRIPT
exit 0
chmod +x MyConnectFTPScript.sh
and execute:
./MyConnectFTPScript.sh
I hope these will be helpful.
Samir

Deleting files using ftp

I have developed a shell script to copy files from source to destination and then delete the copied files at the source. I can copy the files, but the files are not deleted on the source side.
files='ls filename'
for file in $files
do
ftp -vn $hostname <<EOFD
quote USER $username
quote PASS $password
binary
cd $start_dir
rm -rf $file
quit
EOFD
done
I get errors like 'No such file or directory found'.
When I put the ftp command outside the for loop, I get an 'invalid command' error instead.
I also tried ssh, but it prompts for a username and password.
files=`ls filename`
Put backticks, not single quotes, around the command to capture its output.
As for "I also tried ssh but it prompts for username and password": look into SSH login without a password (key-based authentication).
Scripting FTP by feeding commands directly into ftp's input stream is usually a bad idea: there is no error handling, things can go badly wrong, and you have no way to control it. If you have any chance to use a saner command-line client, such as lftp, curl, or a similar scriptable one, do so.
Also, it's a very bad idea to iterate over files using
files=`ls files`
for file in $files
A slightly better solution is:
for file in *
but it doesn't scale: if * (or the ls output) expands to more than the command-line buffer can hold, it will fail. A fairly scalable solution is something like:
find . | while read file
do
    do_something_with "$file"
done
...and even that is probably not what you want. In fact, if you just want to transfer files from source to destination and then delete the files at the source, you can simply use lftp's mput command with the -E option to delete each file after transfer, or do something similar with rsync --remove-source-files.
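A minimal sketch of the lftp approach, reusing the question's $hostname, $username, and $password variables; the remote and local paths are placeholders:
lftp -u "$username","$password" "$hostname" -e "cd /remote/destination; mput -E /local/source/*; bye"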
Foolproof solution:
Replace the line
rm -rf $file
with
!rm -rf $file
This is because, at that point in the script, you are in the ftp console until the EOFD string is reached, so to run any command on the local system (the source), you need to prefix it with !.
The best way to test is to execute the commands manually. Here's what I tested:
mtk4@laptop:~$ ftp XXX.XXX.XXX
Connected to static-XX-XX-XX-XX.XXXXXX.com.
220---------- Welcome to Pure-FTPd [privsep] [TLS] ----------
220-You are user number 2 of 50 allowed.
220-Local time is now 07:52. Server port: 21.
220-IPv6 connections are also welcome on this server.
220 You will be disconnected after 15 minutes of inactivity.
Name (XXXXXX.XXX:XXX): XXXXXXX
331 User XXXXXXX OK. Password required
Password:
230 OK. Current restricted directory is /
Remote system type is UNIX.
Using binary mode to transfer files.
ftp> lcd test/
Local directory now /home/mtk4/test
ftp> pwd
257 "/" is your current location
ftp> !pwd
/home/mtk4/test
ftp> !ls
sample.txt
ftp> !rm sample.txt
ftp> !ls
ftp> bye
221-Goodbye. You uploaded 0 and downloaded 0 kbytes.
221 Logout.
mtk4@laptop:~$
Another solution: after the ftp session is completely done, run the same for loop again over the same set of files and delete them locally, as in the sketch below.
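A minimal sketch of that variant, assuming $files holds the same list used for the transfer and the files to remove are local:
for file in $files
do
    rm -f "$file"
done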
