Hello, I am writing the following script to automatically send several files using the FTP protocol. I decided to use expect to achieve this, and I tried the following:
#!/usr/bin/expect
Defining variables:
set user myUser
set gate myGate
set password myPassword
Defining a token that changes every 3 minutes:
set token [lindex $argv 0]
Here is the second parameter, which is the name of the file to send:
set file [lindex $argv 1]
set server myServer
set password2 myPassword2
spawn ftp ${gate}
expect "some lines of response:"
send "${user}\r"
expect "password:"
send "${password}${token}\r"
expect "ftp>"
send "user myServer\r"
expect "Password:"
send "myPassword2"
Changing to the appropriate directory:
send "pwd\r"
send "cd myFolder\r"
expect "successfully changed."
Here is where the problem appears:
send "put ${file}\r"
interact
I run it as follows:
expect.exe ftpScript myToken filesToSend/
Everything is ok until the part where it tries to send the directory with the files:
ftp> myFolder/
250 Directory successfully changed.
ftp> put filesToSend/
filesToSend/: not a plain file.
ftp>
My folder is located at the same level as my script:
ls:
filesToSend ftpScript
If I do an ls in the directory called filesToSend, it looks as follows:
$ ls
file1 file10 file11 file12 file2 file3 file4 file5 file6 file7 file8 file9
I would appreciate any ideas on how to achieve this and how to improve my code, since I recently started learning expect. Thanks in advance for the help.
Following an answer, I copied the script into the same folder as the files to send and tried:
send "mput ${file}\r"
but after running it:
expect.exe ftpScript token file10 file11 file12
with those three files for testing, I was only able to send file10 successfully, after a confirmation:
mput file10? y
200 PORT command successful [translated to PASV by DeleGate].
150 Ok to send data.
226 File receive OK.
I believe it is only sending one file because of the way I am reading the parameters; I only have this line to get them:
set file [lindex $argv 1]
I am not sure whether I need to use a list here, and whether that is the reason only one file is sent (the script only considers argv index 1). Thanks for the support; I would appreciate suggestions to fix it.
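For example, would it work to take everything after the token as a list of file names and loop over it, something like this (just a sketch, untested)?
# token is still argv 0; every remaining argument is a file name
set token [lindex $argv 0]
set files [lrange $argv 1 end]
# ... same login and cd sequence as above, then:
expect "ftp>"
foreach f $files {
    send "put ${f}\r"
    expect "ftp>"
}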
I believe you could solve your problem by using the command mput with a wildcard instead of put.
Edit your expect script to replace the put line with the following:
send "mput ${file}\r"
Then invoke it with:
expect ftpScript myToken filesToSend/*
I solved the issue as follows:
If we perform:
cd filesToSend
and then we modify the script as follows:
spawn ftp -i ${gate}
The -i flag turns off interactive mode, which prompts for confirmation for every file.
Then we use:
send "mput *\r"
Finally we are able to send all the files to the destination without error, running just:
expect.exe /myPath/ftpScript myToken
We receive all the files in the correct destination:
file1 file10 file11 file12 file2 file3 file4 file5 file6 file7 file8 file9
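Putting it all together, the whole script now looks roughly like this (untested as a single piece; adjust the expect patterns to whatever your gateway really prints, and note that I end with bye and expect eof instead of interact):
#!/usr/bin/expect
set user myUser
set gate myGate
set password myPassword
# the token is the only argument now
set token [lindex $argv 0]
set server myServer
set password2 myPassword2
# -i turns off the per-file confirmation prompt of mput
spawn ftp -i ${gate}
expect "some lines of response:"
send "${user}\r"
expect "password:"
send "${password}${token}\r"
expect "ftp>"
send "user ${server}\r"
expect "Password:"
send "${password2}\r"
expect "ftp>"
send "cd myFolder\r"
expect "successfully changed."
# the script is run from inside filesToSend, so * matches every file
send "mput *\r"
expect "ftp>"
send "bye\r"
expect eof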
Thanks for all the useful comments. After researching, I decided to post my answer since someone may find this script useful for making their life easier when sending multiple files using FTP.
Related
This Expect script is part of my UNIX Bash script:
expect -c "
spawn scp yoko@sd.lindeneau.com:\"encryptor *.enc\" .
expect password: { send \"$PASS\r\" }
expect 100%
sleep 1
exit
"
I am trying to copy both 'encryptor' and '*.enc' with this one scp command. The console tells me it cannot find ' *.enc" '.
The syntax for multiple files:
$ scp your_username@remotehost.edu:~/\{foo.txt,bar.txt\} .
I would guess in your case (untested):
scp yoko@sd.lindeneau.com:\{encryptor,\*.enc\} .
Not sure if it helps, but I was looking for a similar objective: copying selected files based on their filenames/extensions, but located in different subfolders (at the same level of subfolders).
This works, e.g., for copying .csv files from Server:/CommonPath/SpecificPath/:
scp -r Server:/CommonPath/\*/\*csv YourRecordingLocation
I even tested more complex "Perl-like" patterns.
I am not sure the -r option is still needed.
#!/bin/bash
expect -c "
set timeout 60; # 1 min
spawn scp {yoko@sd.lindeneau.com:encryptor *.enc} .
expect \"password:\"
send \"mypassword\r\"
expect eof
"
You can increase the timeout if your file copy takes more time to complete. I have used expect eof, which will wait until the scp command closes; i.e., we are waiting for the end of file (EOF) from scp after sending the password.
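If you prefer to react to a timeout explicitly instead of only raising it, the script could look like this (a hedged sketch shown as a standalone expect script, without the bash escaping; the braced remote argument is handed to scp as a single word so the remote shell expands encryptor and *.enc):
#!/usr/bin/expect
set timeout 300    ;# allow up to 5 minutes for the copy
spawn scp {yoko@sd.lindeneau.com:encryptor *.enc} .
expect "password:"
send "mypassword\r"
expect {
    timeout { puts "scp timed out"; exit 1 }
    eof     { }
}
# eof means scp exited, i.e. the transfer finished (or scp printed its own error)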
I am using a bash script to connect to an FTP server to delete a file.
I would like to store the output message and code of the delete command executed on the FTP server into a variable of my script.
How could I do this?
Here is my snippet :
...
function delete_on_ftp {
ftp -i -n $ftp_host $ftp_port <<EOF
quote USER $ftp_login
quote PASS $ftp_pass
delete $1
quit
EOF
}
output_cmd=$(delete_on_ftp $myfile)
...
The way I do it above, I only get the message and have no way to get the return code. Is there another way that would let me get the code and the message, in one or two variables?
Thanks, Cheers
I just tested the following curl command, which makes your task easy.
curl --ftp-ssl -vX "DELE oldfile.pdf" ftp://$user:$pass@$server/public_html/downloads/
Please do not forget the slash at the end of your directory, it is necessary.
Example output when the file does not exist:
curl: (19) RETR response: 550
550 oldfile.pdf: No such file or directory
and when the delete succeeds:
curl: (19) RETR response: 250
250 DELE command successful
curl is available at http://curl.haxx.se/.
One of the ways to get FTP to act automatically is to use a netrc file. By default, FTP will use $HOME/.netrc, but you can override that via the -N parameter. The format of a netrc file is fairly straightforward. A line is either a macro definition (macdef) or a line that contains login information. Here's an example below:
Netrc File
machine mysystem login renard password swordfish
machine another login renard password 123456
default login renard password foofighter
macdef init
binary
cd foo
get bar
delete bar
quit

macdef fubar
...
The first three lines are the logins for various systems. The default entry is the login for any system you don't define a particular login for. The lines that start with macdef are macros you define to perform a series of steps for you. The init macro runs automatically right after login. If its last line is quit, it will quit out of FTP for you. A blank line ends the macro (although most systems will also take the end of the file as the end of the macdef).
You can create a netrc file on the fly, put your FTP commands in it, and then run FTP with that netrc file:
cat > $netrc_file <<EOF
machine $ftp_host login $ftp_login password $ftp_password
macdef init
delete $my_file
quit

EOF
ftp -N $netrc_file
You can capture the output via STDOUT, or in a variable and then parse that for what you need:
ftp -N $netrc_file | tee $ftp_output
Other answers on this question should give you what you want.
However, if you are keen on specifically using the ftp command, you can use the expect command for that...
Note that this is not the best way to achieve what you are trying to do.
expect -c "log_user 0;
spawn ftp -i -n $ftp_host $ftp_port;
expect \"<add ftp login username: prompt details here>\"
send \"quote USER $ftp_login\r\n\"
expect \"<add ftp login password: prompt details here>\"
send \"quote PASS $ftp_pass\r\n\"
expect \"<add ftp shell prompt details here>\"
log_user 1; send \"delete $1\r\n\"
log_user 0;
expect \"<add ftp shell prompt details here>\"
send \"quit\r\n\";
interact"
You may need to add some more lines in the above for the login & shell prompts returned by the ftp command.
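If you specifically want the reply code and message in variables (as asked), one hedged option is to match the numeric FTP reply right after the delete; this fragment is written as a standalone expect script (so the file name comes from argv instead of bash's $1), and the regexp simply grabs the next three-digit reply:
# in a standalone expect script the file name is an argument of the script itself
set file_to_delete [lindex $argv 0]
send "delete $file_to_delete\r"
# FTP replies look like "250 DELE command successful" or "550 ...: No such file or directory"
expect -re {(\d{3}) ([^\r\n]*)} {
    set reply_code $expect_out(1,string)
    set reply_msg  $expect_out(2,string)
}
puts "code=$reply_code message=$reply_msg"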
I have a problem with an expect script. When I grep for a file, I need to put its name into the line below, so it should look like:
/opt/ericsson/arne/bin/import.sh -f bla_bla_bla.xml -val:rall
but I don't know how to put the file name into that line. When I put the grep command in between, it didn't work; maybe the problem is the -val:rall that comes after?
Does anyone know how I could put the name of the file in place of File1?
#!/usr/local/bin/expect --
set env(TERM) vt100
set env(SHELL) /bin/sh
set env(HOME) /usr/local/bin
set PASSWORD ter
set DUL [lindex $argv 0]
set VAR _cus_ipsec
match_max 1000
spawn ssh mashost
expect {
"assword" {send "$PASSWORD\r"}
}
expect "ran#rn23"
send -- "cd /tih/opt/bla/tih/ \r"
expect "ran#rn23"
send -- "grep -il $DUL * \r*"
expect "ran#rn23"
send -- "/opt/bl/arne/bin/imp.sh -f File1 -val:rall\r"
expect "ran#rn03"
send -- "/opt/bl/arne/b/imp.sh -f File1 -import\r"
expect "ran#rn23"
interact
Ok, thanks for the clarification, I believe I do understand what you're trying to do now.
What you need to do is change the expect statement after you send the grep command to one that will capture your filename. You will probably benefit from using the regexp mode of the expect command (-re), and possibly from using parentheses to capture the filename (not used in my sample below). I do not know what filenames your grep can return, so you will probably need to tweak the code below quite a bit, but assuming your grep gives you a single .xml file beginning with "NAME", you could do something like the following:
send -- "grep -il $DUL * \r*"
expect -re "NAME.*\.xml"
send -- "/opt/bl/arne/bin/imp.sh -f $expect_out(0,string) -val:rall\r"
As a suggestion, you should really include some timeout options for your expect statements, and some error checking; otherwise this script will not stop if anything does not go as expected (e.g., only send if you have found what you expected).
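For instance, a hedged sketch with a timeout branch and a capturing group (the NAME...xml pattern is just the assumption from above, and the variable name fname is mine):
set timeout 30
send -- "grep -il $DUL *\r"
expect {
    -re {(NAME[^\r\n]*\.xml)} {
        # the parenthesised part of the match is available as $expect_out(1,string)
        set fname $expect_out(1,string)
        send -- "/opt/bl/arne/bin/imp.sh -f $fname -val:rall\r"
    }
    timeout {
        puts "grep did not return a matching file name"
        exit 1
    }
}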
Your regexp probably will be more complicated than the one I showed you, but you can get the idea. Also, include exp_internal 1 somewhere near the top of your script to get good, solid debugging info on what your script is matching (or not matching). It will be very useful as you test it.
Let me know how that goes.
I am trying to transfer a file from one server to a remote server with the help of a TCL script.
But my script stops after the message "200 Port set okay" and then continues to run from the telnet session below.
I have checked the destination location; my file is not transferred.
Please suggest what I can do or where I am going wrong.
#!/usr/bin/tclsh
#!/usr/bin/expect
package require Expect
set p "mm155_005.006.010.200_bt.fw"
#**************************************************************\
FILE TRANSFER TO REMOTE SERVER \
***************************************************************
spawn ftp 10.87.121.26
expect "User (10.87.121.26:(none)):"
send "user\r"
expect "Password:"
send "pswd\r"
expect "ftp>"
send "cd FW\r"
expect "ftp>"
send "ha\r"
expect "ftp>"
send "bi\r"
expect "ftp>"
send "mput \"$p\"\r"
expect "mput $p? "
send "yes\r"
expect "ftp>"
send "ls\r"
#**************************************************************\
RUNNING THE TRANSFERED FILE \
***************************************************************
spawn telnet 10.87.121.26
expect "Login: "
send "user\r"
expect "password: "
send "pswd\r"
expect "*? > "
send "cd FW\r"
expect "*? > "
send "burnboot 30 5.6(10.200)\r"
Output
spawn ftp 10.87.121.26
Connected to 10.87.121.26.
220 VxWorks FTP server (VxWorks VxWorks5.4.2) ready.
Name (10.87.121.26:vkumar): user
331 Password required
Password:
230 User logged in
Remote system type is UNIX.
Using binary mode to transfer files.
ftp> cd FW
250 Changed directory to "C:/FW"
ftp> ha
Hash mark printing on (1024 bytes/hash mark).
ftp> bi
200 Type set to I, binary mode
ftp> mput "mm155_005.006.010.200_bt.fw"
mput mm155_005.006.010.200_bt.fw? yes
200 Port set okay      <-- I am unable to see the hash progress bar after this line
spawn telnet 10.87.121.26
Trying 10.87.121.26...
Connected to 10.87.121.26.
Escape character is '^]'.
Login: user
password:
node84.7.PXM.a > cd FW
node84.7.PXM.a > bash-2.05b$
Instead of rolling your own solution in Expect (I also did this about 9 years ago), use the FTP module from tcllib -- it's already battle-hardened.
http://tcllib.sourceforge.net/doc/ftp.html
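A hedged sketch of what the transfer part might look like with that package (same host, credentials and file name as in the question; untested):
package require ftp

# ::ftp::Open returns a session handle, or -1 if the connection/login fails
set handle [::ftp::Open 10.87.121.26 user pswd]
if {$handle < 0} {
    puts "could not connect or log in"
    exit 1
}
::ftp::Cd $handle FW
::ftp::Type $handle binary
# ::ftp::Put transfers one local file and returns 1 on success
if {[::ftp::Put $handle mm155_005.006.010.200_bt.fw]} {
    puts "transfer OK"
} else {
    puts "transfer failed"
}
::ftp::Close $handle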
The script as reported is unlikely to produce exactly that output; there is nothing from the ls done after the mput. However, if the mput is hanging the most likely problem is that there is a firewall issue; FTP uses multiple sockets to do file transfers (which is why FTP is such a pain when it comes to overall firewall management). In particular, it has a command channel (the socket which you communicate with the FTP server over) and a separate data channel per file (and also with the output of some remote commands, such as ls); that's what that Port set okay is about. This is not firewall-friendly, and it's easy to misconfigure firewalls in this area (especially when there is NAT also in place).
You might (i.e., try this first) want to use passive mode instead, as that reduces the complexity at the firewall level. Try issuing a passive before the mput (just as you currently issue a binary).
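In the script from the question, that would look something like this (untested; assuming the local ftp client understands the passive command):
expect "ftp>"
send "bi\r"
expect "ftp>"
# switch the data connection to passive mode before transferring
send "passive\r"
expect "ftp>"
send "mput \"$p\"\r"
expect "mput $p? "
send "yes\r"
expect "ftp>"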
Here's a bash script I wrote with a similar function; you should be able to adapt it to your needs.
Adding a few lines to SSH in and run the script should be quite trivial.
As a side note: why are you using Tcl for this?
#!/bin/bash
fileName=`ls /home/user/downloads -t1 | head -n1`
latestFile="/home/user/downloads/$fileName"
echo $latestFile
hostname="HOSTNAME"
username="USER"
password="PASS"
ftp -inv $hostname << EOF
quote USER $username
quote PASS $password
cd transfer/jirabackup
binary
put $latestFile $fileName
quit
EOF
How can we copy files recursively to a remote server using an expect script, or any other script?
Constraints:
1. We cannot limit the number of files to be copied.
2. File size may be from 1 MB up to 10 MB.
I tried the following script, but it only transfers 4 or 5 files. (I need to transfer nearly 200 or 300 files, or more.)
spawn scp -r /home/test root@example.com:/home/test
sleep 2
expect "password"
send "XXXXXX"
sleep 2
Before the spawn command, add the line
set timeout -1
and replace the 2nd sleep command with
expect eof
Don't forget to add \r when you send your password: send "password\r"
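Putting those points together, a hedged sketch of the corrected script (the sleep calls are no longer needed once you expect the prompts):
#!/usr/bin/expect
# no overall timeout: a few hundred files can take a while
set timeout -1
spawn scp -r /home/test root@example.com:/home/test
expect "password"
send "XXXXXX\r"
# wait until scp itself finishes instead of sleeping for a fixed time
expect eof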
I'd recommend you set up SSH keys -- then you won't be prompted for a password and you don't need the expect script at all.