store ftp command output in a variable - bash

I am using a bash script to connect to an FTP server and delete a file.
I would like to store the output message and the return code of the delete command executed on the FTP server into a variable of my script.
How could I do this?
Here is my snippet :
...
function delete_on_ftp {
ftp -i -n $ftp_host $ftp_port <<EOF
quote USER $ftp_login
quote PASS $ftp_pass
delete $1
quit
EOF
}
output_cmd=$(delete_on_ftp $myfile)
...
The way I do it above, I only get the message; there is no way to get the return code. Is there another way that would let me get both the code and the message, in 1 or 2 variables?
Thanks, Cheers

I just tested the following curl command, which makes your task easy.
curl --ftp-ssl -vX "DELE oldfile.pdf" ftp://$user:$pass@$server/public_html/downloads/
Please do not forget the slash at the end of the directory path; it is necessary.
When the file does not exist, the output looks like this:
curl: (19) RETR response: 550
550 oldfile.pdf: No such file or directory
When the delete succeeds:
curl: (19) RETR response: 250
250 DELE command successful
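If you need the message and the status in separate variables, a minimal sketch (assuming the same $user, $pass and $server variables, and merging stderr so curl's messages are captured too) would be:
output_msg=$(curl --ftp-ssl -vX "DELE oldfile.pdf" "ftp://$user:$pass@$server/public_html/downloads/" 2>&1)
output_code=$?
echo "curl exit code: $output_code"
echo "message: $output_msg"
Note that $output_code is curl's exit status (for example 19 above), not the raw FTP reply code; the 550/250 reply still has to be parsed out of $output_msg.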
curl is available at http://curl.haxx.se/.

One of the ways to get FTP to act automatically is to use a netrc file. By default, FTP will use $HOME/.netrc, but you can override that via the -N parameter. The format of a netrc file is fairly straightforward. A line either contains login information for a machine or starts a macro definition (macdef). Here's an example:
Netrc File
machine mysystem login renard password swordfish
machine another login renard password 123456
default login renard password foofighter
macdef init
binary
cd foo
get bar
delete bar
quit
macdef fubar
...
The first three lines are the logins for various systems. The default entry is the login used for any system you don't define a particular login for. The lines that start with macdef are macros you define to do a series of steps for you. The init macro runs automatically right after login. If its last line is quit, it will quit out of FTP for you. A macro should be ended with a blank line, although most systems will also take end of file as the end of the macdef.
You can create a Netrc file on the fly, enter your FTP command in that, and then, run your FTP command with that Netrc file:
cat > $netrc_file <<EOF
machine $ftp_host login $ftp_login password $ftp_password
macdef init
delete $my_file
quit

EOF
ftp -N $netrc_file
You can capture the output on stdout, for example by teeing it to a file, or into a variable, and then parse it for what you need:
ftp -N $netrc_file | tee $ftp_output
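To get it into a variable instead of (or as well as) a file, a minimal sketch, assuming $netrc_file was written as above and the same invocation is used:
ftp_output=$(ftp -N $netrc_file 2>&1)
ftp_code=$?
Be aware that most ftp clients exit 0 even when an individual command such as delete fails, so $ftp_code mostly tells you whether the client itself ran; the server's reply messages captured in $ftp_output are what you need to parse.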

Other answers to this question should provide what you want.
However, if you are keen on using the ftp command specifically, you can use the expect command for this...
Note that this is not the best way to achieve what you are trying to do.
expect -c "log_user 0;
spawn ftp -i -n $ftp_host $ftp_port;
expect \"<add ftp login username: prompt details here>\"
send \"quote USER $ftp_login\r\n\"
expect \"<add ftp login password: prompt details here>\"
send \"quote PASS $ftp_pass\r\n\"
expect \"<add ftp shell prompt details here>\"
log_user 1; send \"delete $1\r\n\"
log_user 0;
expect \"<add ftp shell prompt details here>\"
send \"quit\r\n\";
interact"
You may need to add some more lines in the above for the login & shell prompts returned by the ftp command.
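Because log_user is enabled only around the delete, the command substitution from your original snippet still applies; a sketch of how the pieces fit together, assuming the expect block above is wrapped in your delete_on_ftp function:
output_cmd=$(delete_on_ftp $myfile)
expect_code=$?
Here $expect_code only tells you whether the expect session itself ran; the server's numeric reply (250, 550, ...) has to be parsed out of $output_cmd.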

Related

automate getting diagnostic files from a controller via ssh commands

I'd like to automate getting diagnostic files from a controller that responds to ssh commands, e.g.
ssh diag@controller tarred > diags.tgz
Unfortunately, I have to type a password to make the above command go through.
Here is what I have considered to get around that:
using ssh keys: not possible, since I can't log in to the controller; it just expects commands and doesn't offer a shell
using the sshpass package: I don't have admin rights on the machine and can't install packages
using expect: works to some extent, but the resulting file is corrupted.
Here's the "expect" script I've used:
#!/usr/bin/expect -f
log_user 0
set timeout 300
spawn ssh diag@controller tarred
expect "?assword:"
send "unrealpassword\r"
expect \r\n
log_user 1
expect eof
The log_user commands make sure that only the required output gets stored, until eof is encountered.
I've redirected the output of this script to a file, and that file is corrupted, i.e. it's either too short (because of a timeout?) or too long (?).
Any idea what goes wrong here?

Suppress welcome message on bash remote command execution

I'm executing some commands on remote server within a shell script like this:
ssh user@host <<ENDSSH
...
ENDSSH
Upon login I'm getting a standard server welcome message echoed. Is there a way to send it to /dev/null while still displaying the output of the executed commands?
Thanks.
Create a file ~user/.hushlogin on the remote host. This will suppress output from the login program when user logs in (such as time of last login and any message of the day).
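A minimal way to create that file without logging in interactively, assuming the same user@host as in the question:
ssh user@host 'touch ~/.hushlogin'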
You can edit /etc/ssh/sshd_config (that path is for Debian/Ubuntu; it may be a different file on your server) and turn the following settings to 'no':
PrintMotd no
PrintLastLog no
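After editing sshd_config, the SSH daemon must be reloaded before new sessions pick up the change; on Debian/Ubuntu something like the following should work (the service name may be ssh or sshd depending on the distribution):
sudo service ssh restart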

BASH scripting for username/password constructs

I want to write a simple bash script using ncat to open a connection to an ISP host and port.
The first command would be:
nc address port
Upon doing this, I am prompted first to provide a username. I must hit ENTER, and then I will be prompted to provide a password and then I must hit ENTER again.
After this, I want to open a Terminal process window. Can anyone point me to sufficient resources for this type of scripting?
I know the username and password already, but I'm not too sure how to work around the fact that I must provide them and then hit enter. I'm also unsure how to open a new Terminal process.
Thanks in advance!
Check out the expect scripting language:
Expect
Example:
# Assume $remote_server, $my_user_id, $my_password, and $my_command were read in earlier
# in the script.
# Open a telnet session to a remote server, and wait for a username prompt.
spawn telnet $remote_server
expect "username:"
# Send the username, and then wait for a password prompt.
send "$my_user_id\r"
expect "password:"
# Send the password, and then wait for a shell prompt.
send "$my_password\r"
expect "%"
# Send the prebuilt command, and then wait for another shell prompt.
send "$my_command\r"
expect "%"
# Capture the results of the command into a variable. This can be displayed, or written to disk.
set results $expect_out(buffer)
# Exit the telnet session, and wait for a special end-of-file character.
send "exit\r"
expect eof
The secret lies in the HEREDOC
You can solve this problem with something akin to:
$ command-that-needs-input <<EOF
authenticate here
issue a command
issue another command
EOF
Look at the link I provided for here documents - it includes support for variable substitution and lots of other useful things. Enjoy!
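Applied to the nc invocation from the question, a sketch might look like this (assuming $username and $password hold your credentials, and that the remote service reads the login from standard input rather than insisting on a terminal):
nc address port <<EOF
$username
$password
EOF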

Error logging in via FTP script in KornShell

I am trying to FTP a file using a script in KornShell (ksh) and I am getting a login failed message. I can login manually just fine but when I try the automated script, it does not like the password portion of the login information.
Here's my script:
ftp -n ftp.stmp.com <<EOF
user quser pass Sky3s3ch
binary
hash
prompt
put chr*.dat
EOF
And this is the error that I get:
dns: /u04/lms/ora_shell/clients/STMP > LMS_STMP_ECHI_FTP.ksh
Not logged in.
Login failed.
Please login with USER and PASS.
Hash mark printing on (1024 bytes/hash mark).
Interactive mode off.
Please login with USER and PASS.
Please login with USER and PASS.
I would appreciate any help I can get in figuring this out. Thanks in advance.
There are many different ftp clients, but I'm not familiar with one that requires the word pass as part of a single-line login like you are using. Try
ftp -n ftp.stmp.com <<EOF
user quser Sky3s3ch
. . .
EOF
Another common form is to move the hostname inside the ftp input stream, i.e.
ftp -in <<EOF
open ftp.stmp.com
quser Sky3s3ch
. . .
EOF
I don't have my sample code available. You may need user on the 2nd line of input, but I don't think so.
Edit
Finally, I noticed you have put chr*.dat in your input script. To transfer multiple files at the same time, you'll need the mput command instead.
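Putting both suggestions together, a sketch of the corrected script (an assumption based on the two fixes above, reusing the host and credentials from the question) might be:
ftp -n ftp.stmp.com <<EOF
user quser Sky3s3ch
binary
hash
prompt
mput chr*.dat
quit
EOF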
I hope this helps.

Unix shell script with iSeries command

I am trying to FTP a file from Unix to an AS/400 and execute an iSeries command in the script.
The FTP itself is working fine, but I am getting an error on the SBMJOB command with the jobd parameter, as shown below:
HOST=KCBNSXDD.svr.us.bank.net
USER=test
PASS=1234 #This is the password for the FTP user.
ftp -env $HOST << EOF
# Call 2. Here the login credentials are supplied by calling the variables.
user $USER $PASS
# Call 3. Here you will change to the directory where you want to put or get
cd "\$QARCVBEN"
# Call4. Here you will tell FTP to put or get the file.
#Ebcdic
#Mode b
quote site crtccsid *user
quote site crtccsid *sysval
put prod.txt
quote rcmd sbmjob cmd(call pgm(pmtiprcc0) parm('prod' 'DEV')) job(\$pmtiprcc) jobd(orderbatch)
550-Error occurred on command SBMJOB cmd(call pgm(pmtiprcc0)) job($pmtiprcc) jobd(orderbatch).
550 Errors occurred on SBMJOB command..
221 QUIT subcommand received.
Since visiting this link, I cannot help thinking that you could be missing this...
QUOTE RCMD OS/400 CL command | program [parameter1, parameter2, . . .parameterx] (Remote Command)
Should your FTP command script be...
quote os/400 cl sbmjob cmd(call pgm(pmtiprcc0) parm('prod' 'DEV')) job(\$pmtiprcc) jobd(orderbatch)
