Bash - uploading multiple files to FTP using real path?

I have this script to upload files to FTP (I know FTP is not secure, but the client insists on using FTP). It works, but it does not recognize the path provided when uploading: the messages say the upload succeeded, yet nothing actually gets uploaded.
So the script looks like this:
#!/bin/bash
HOST=host
USER=user
PASS=pass
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
ftp -inv $HOST << EOF
user $USER $PASS
get
cd /path/in/ftp/
prompt
mput $DIR/*.csv
# End FTP Connection
bye
EOF
rm $DIR/*.csv
Here is what is output:
Connected to host.
220 You have connected to Client's FTP server.
?Invalid command
331 User name okay, need password.
230-Welcome user from ip. You are now logged in to the server.
230 User logged in, proceed.
Remote system type is UNIX.
Using binary mode to transfer files.
?Invalid command
250 Directory changed to "/path/in/ftp/"
?Invalid command
Interactive mode on.
mput /path/inv_numbers_2016-11-21_12_09.csv? 200 PORT command successful.
150 File status okay; about to open data connection.
226 Closing data connection. Transferred 140 bytes in 1 seconds. 0KB/second.
140 bytes sent in 0.00 secs (1395.1 kB/s)
?Invalid command
221 Session Ended. Downloaded 0KB, Uploaded 1KB. Goodbye user from ip.
Now, if I change mput $DIR/*.csv to mput *.csv, then it works (I get the same log output as with the previous one, except the path shown is just the directory the script is in). But this only works if I run the script directly from the directory it is placed in.
Any ideas?

Replace
ftp -inv $HOST << EOF
by
cd "$DIR" && ftp -inv $HOST << EOF
or by
cd "$DIR" || exit 1
ftp -inv $HOST << EOF
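The path form most likely fails because mput passes each expanded local path through as the remote file name, so the files do not land in the directory you cd'd to on the server; changing to $DIR locally first makes mput send bare file names. A minimal sketch of the whole script with that fix applied (the stray get and the comment line inside the heredoc are dropped, since ftp has no comment syntax and those lines are one source of the ?Invalid command messages; -i already suppresses mput prompting, so prompt is not needed either):
#!/bin/bash
# Minimal sketch of the fixed script; host, credentials and the remote path
# are the placeholders from the question.
HOST=host
USER=user
PASS=pass
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"

cd "$DIR" || exit 1        # run from the script's own directory so mput sends bare file names
ftp -inv "$HOST" << EOF
user $USER $PASS
cd /path/in/ftp/
mput *.csv
bye
EOF

rm "$DIR"/*.csv            # remove the local copies once the transfer has finished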

Related

Change shell script from ftp to sftp

I have the below shell script to get files from an FTP server.
I need to change this script to point to the same server over SFTP.
Can someone assist in changing this script from ftp to sftp?
HOST='some.site.com'
USER='yourid'
PASSWD='yourpw'
FILE='file.txt'
ftp $HOST <<END_SCRIPT
user $USER
$PASSWD
put $FILE
quit
END_SCRIPT
exit 0
It's not so complicated: just use the sftp binary instead of ftp. The basic commands (put, get, quit) stay much the same, although strictly speaking SFTP is a different protocol that runs over SSH rather than FTP over an encrypted connection. The user may be specified as part of sftp's host argument:
sftp $USER@$HOST <<EOF
$PASSWD
put $FILE
quit
EOF
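One caveat: unlike ftp, sftp does not read the password from standard input, so the $PASSWD line in the here-document will not log you in; OpenSSH prompts for passwords on the terminal. A minimal sketch assuming key-based authentication has already been set up for $USER on $HOST (sshpass is a common workaround if a password really must be scripted):
# Assumes an SSH key for $USER is installed on $HOST, so no password prompt appears.
sftp "$USER@$HOST" <<EOF
put $FILE
quit
EOF
# With a password instead (sshpass must be installed):
#   sshpass -p "$PASSWD" sftp "$USER@$HOST" <<EOF ... EOF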

FTPing inside of a KSH script from AIX to Windows to GET a file

#!/bin/ksh
. $HOME/bin/init.ksh
# log_start
exit_if_not_dgftp
# EXPIRE_DAYS=30
# create_arc_dir
# handle_error abort $? "Command create_arc_dir failed" $USER
# purge_archive $EXPIRE_DAYS
WORK_FILE=RetriesExceeded.csv
USER=LawsonIT-FinMM@test.org
HOST=lawsonfax.test.org
# Ftp the file
# Ftp username and password is in .netrc
$FTP -v $HOST
lcd $WORK_FILE
get RetriesExceeded.csv
quit
# archive_file $WORK_FILE
# /law/bin/mpack -s "Fax Retries Exceeded" $WORK_FILE_OUTPUT $USER
# log_stop
exit 0
[dgftp@lawapp2]/lawif/bin$ get_lawson_fax.ksh
Connected to lawsonfax
220 Microsoft FTP Service
331 Password required for dgftp.
230 User logged in.
ftp> quit
221 Goodbye.
/lawif/bin/get_lawson_fax.ksh[33]: lcd: not found
/lawif/bin/get_lawson_fax.ksh[34]: get: not found
/lawif/bin/get_lawson_fax.ksh[35]: quit: not found
[dgftp@lawapp2]/lawif/bin$
Explanation: The script connects fine to the Lawson fax server, but stops at an FTP prompt. I can type in 'get' interactively and it works, but in a KSH script it just stops at the ftp prompt, and when I quit it gives the three 'not found' errors. If I am on the AIX server and FTP manually (open lawsonfax, get RetriesExceeded.csv), there is no issue pulling the file.
Try using shell-redirection:
ftp -v "$HOST" <<DONE
lcd $WORK_FILE
get RetriesExceeded.csv
quit
DONE
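The original stalls because ftp keeps reading its commands from the script's standard input (the terminal), so the lines after $FTP -v $HOST are never fed to it; once you quit ftp interactively, ksh tries to run lcd, get and quit as shell commands, which produces the three 'not found' errors. A sketch of the relevant part with the here-document applied (login still comes from .netrc as in the original; lcd is dropped here because it expects a directory, while WORK_FILE is a file name):
WORK_FILE=RetriesExceeded.csv
HOST=lawsonfax.test.org

# Feed the ftp commands through a here-document; credentials come from ~/.netrc.
$FTP -v "$HOST" <<DONE
get $WORK_FILE
quit
DONE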

ftp shell script breaks up after the lcd-command

I created a shell script to download several files, all starting with "2014", from an FTP server. I use mget for this with the filename pattern 2014*.
To make sure that the files are saved in the right local place, I use lcd first.
It looks like this:
#!/bin/sh
HOST='ftpserver.name.de'
USER='user1'
PASSWD='pw1'
FILE='2014*'
LOCDIR='/home/local/data2014/'
ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
lcd $LOCDIR
mget $FILE
quit
END_SCRIPT
exit 0
When I try this, the script just runs through:
lx9000: ftp_get.sh
Connected to ftpserver.name.de.
220 FTP-Server: ftpserver.name.de
331 Password required for user1
230 User user1 logged in
Local directory now: /home/local/data2014/
221 Goodbye.
Why does it stop before the download proceeds?
Thanks for your help!
Try adding:
prompt off
before the mget.
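Applied to the script in the question, the here-document would look something like this (note that some ftp clients only support the bare prompt toggle rather than prompt off):
ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
prompt off
lcd $LOCDIR
mget $FILE
quit
END_SCRIPT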

cannot login to the ftp server

Does anyone have an idea what the problem might be?
#!/bin/bash -x
HOST='192.163.3.3'
USER='ftpuser'
PASSWD='apple'
Logfile=a.log
while :; do
ftp -n -p -v $HOST < example.script >> a.log
grep -qF "Connected" a.log &&
grep -qF "File successfully transferred" a.log && break
done
quote USER $USER
quote PASS $PASSWD
example.script contains
put example.txt
After running it, I get:
grep: a.log: No such file or directory
grep: a.log: No such file or directory
.
.
example.script and the created a.log are in the /home directory.
a.log contains:
Connected to 192.163.3.3.
220---------- Welcome to Pure-FTPd [privsep] [TLS] ----------
220-You are user number 9 of 50 allowed.
220-Local time is now 14:38. Server port: 21.
220-This is a private system - No anonymous login
220-IPv6 connections are also welcome on this server.
220 You will be disconnected after 15 minutes of inactivity.
Remote system type is UNIX.
Using binary mode to transfer files.
local: example.txt remote: example.txt
530 You aren't logged in
Passive mode refused.
221-Goodbye. You uploaded 0 and downloaded 0 kbytes.
221 Logout.
Why can't I log in?
HOST='192.163.3.3'
USER='ftpuser'
PASSWD='apple'
FILE='example.txt'
Logfile=a.log
ftp -n -p -v $HOST << SCRIPT_END >> a.log
quote USER $USER
quote PASS $PASSWD
put $FILE
SCRIPT_END
With this it works, but why? What is the difference?
Your FTP daemon is refusing passive-mode file transfers (the PASV command). You have to enable passive-mode transfers on Pure-FTPd. This site has a nice tutorial on how to do it:
Getting passive FTP connections to work through a firewall properly - scroll down to the section Setting up the FTP Server (Pure-FTPD).
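As a rough sketch of what that amounts to on a Debian-style Pure-FTPd installation (the file location and service name are assumptions that vary by distribution): define a fixed passive port range, restart the daemon, and open the same range in the firewall.
# Debian-style layout keeps one setting per file under /etc/pure-ftpd/conf; adjust paths to your system.
echo "30000 35000" | sudo tee /etc/pure-ftpd/conf/PassivePortRange
sudo service pure-ftpd restart
# Then allow TCP ports 30000-35000 (plus 21) through the firewall.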
I usually find that creating a netrc file, and using that at least to start my FTP session, gets me around a lot of issues with scripting an ftp session. Most ftp programs give you the option of using an alternate netrc file, so you don't need to set one up at $HOME/.netrc.
I believe you just need a one-liner like this:
machine 192.163.3.3 login ftpuser password apple
Unfortunately, I can't test it because I don't have ftp set up anywhere. Everything here is scp/ssh/sftp. Here's an example .netrc file.
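A sketch of how the invocation might look once that one-line netrc is in place (the file normally has to be readable only by you, e.g. chmod 600, and -n has to be dropped because it disables auto-login from .netrc):
# ~/.netrc (chmod 600) contains:
#   machine 192.163.3.3 login ftpuser password apple
ftp -p -v 192.163.3.3 <<SCRIPT_END >> a.log
put example.txt
quit
SCRIPT_END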
It is complaining about being in passive mode; I would remove the -p from the command or allow passive FTP.
-p
Use passive mode for data transfers. Allows use of ftp in environments where a firewall prevents connections from the outside world back to the client machine. Requires that the ftp server support the PASV command. This is the default now for all clients (ftp and pftp) due to security concerns using the PORT transfer mode. The flag is kept for compatibility only and has no effect anymore.

Deleting files using ftp

I have developed a shell script to copy files from source to destination and at the same time delete the copied files at the source. I can copy the files, but the files are not deleted on the source side.
files='ls filename'
for file in $files
do
ftp -vn $hostname <<EOFD
quote USER $username
quote PASS $password
binary
cd $start_dir
rm -rf $file
quit
EOFD
done
I got errors like 'No such file or directory found'.
Putting ftp outside the for loop, I also got an 'invalid command' error.
I also tried ssh, but it prompts for a username and password.
files=`ls filename`
Put backticks, not plain single quotes, around the command to capture its output.
"I also tried ssh but it prompts for username and password" - check SSH login without a password.
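Setting up key-based login is usually just two commands (the host and user names below stand in for the question's variables):
# One-time setup for passwordless SSH/SFTP login.
ssh-keygen -t ed25519                 # generate a key pair if you don't already have one
ssh-copy-id "$username@$hostname"     # install the public key on the remote host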
Scripting FTP by feeding an input stream directly to ftp is usually a bad idea: it lacks any error handling, it can go totally wrong, and you have no way to control it. If you have any chance to use a saner command-line client, such as lftp, curl or a similar scriptable one, do so.
Also, it's a very bad idea to iterate over files using
files=`ls files`
for file in $files
A slightly better solution is:
for file in *
but it doesn't scale: if * (or the ls output) expands to more than the command-line buffer can hold, it will fail. A fairly scalable solution is something like:
find . | while read -r file; do
    do_something_with "$file"
done
...and yet it's probably not what you want. In fact, if you just want to transfer files from source to destination and then delete the files at the source, you can just use lftp's mput command with the -E option to delete each file after transfer, or something similar with rsync --remove-source-files.
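For instance, a minimal sketch (host, credentials and the *.csv pattern stand in for the question's variables; lftp's mput -E deletes each local file after a successful transfer, and rsync --remove-source-files does the same over SSH):
# Upload everything matching the pattern and delete the local copies afterwards.
lftp -u "$username","$password" -e "cd $start_dir; mput -E *.csv; bye" "$hostname"

# Or, if SSH access to the destination is available, let rsync move the files in one step.
rsync -av --remove-source-files *.csv "$username@$hostname:$start_dir/"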
Foolproof solution:
Replace the line
`rm -rf $file`
with
`!rm -rf $file`
This is because, at that point in the code, you are at the ftp console until the EOFD string is reached, so to run any command on the local system (the source), you need to prefix it with !.
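Applied to the loop from the question, that would look roughly like this (still one ftp session per file, exactly as in the original):
for file in $files
do
ftp -vn $hostname <<EOFD
quote USER $username
quote PASS $password
binary
cd $start_dir
!rm -rf $file
quit
EOFD
done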
The best way to test is to execute the commands manually. Here's what I tested:
mtk4@laptop:~$ ftp XXX.XXX.XXX
Connected to static-XX-XX-XX-XX.XXXXXX.com.
220---------- Welcome to Pure-FTPd [privsep] [TLS] ----------
220-You are user number 2 of 50 allowed.
220-Local time is now 07:52. Server port: 21.
220-IPv6 connections are also welcome on this server.
220 You will be disconnected after 15 minutes of inactivity.
Name (XXXXXX.XXX:XXX): XXXXXXX
331 User XXXXXXX OK. Password required
Password:
230 OK. Current restricted directory is /
Remote system type is UNIX.
Using binary mode to transfer files.
ftp> lcd test/
Local directory now /home/mtk4/test
ftp> pwd
257 "/" is your current location
ftp> !pwd
/home/mtk4/test
ftp> !ls
sample.txt
ftp> !rm sample.txt
ftp> !ls
ftp> bye
221-Goodbye. You uploaded 0 and downloaded 0 kbytes.
221 Logout.
mtk4@laptop:~$
Or, another solution:
use the same for loop again, after the ftp part is completely done, to iterate over the same set of files and delete them, as sketched below.
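A sketch of that variant, assuming $files still holds the same list that was used for the transfer:
# Once all ftp transfers have finished, remove the local copies.
for file in $files
do
    rm -f "$file"
done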
