pick up files from remote server and then delete - expect

I have an expect script that retrieves files from a remote location based on a pattern and places them in the current directory. After the transfer I want to delete the files that were retrieved from the remote location. My script does an mget to perform the transfer. I tried using the rm command with the same pattern that was used for the mget, but that can remove more files than I retrieved. Can you help me delete only the files that I retrieved from the remote location?
$env(SSH_FILE)\r" is the pattern variable in the script. Ex: .835
sftp> rm *.835*
Removing /EdifecsFTP/Inbound835/W6BA/90404B277947nCOR0.835
Removing /EdifecsFTP/Inbound835/W6BA/90404B277947nCORS.835
I DON"T WANT TO DELETE BASED ON THE PATTERN.
send "CD $env(SSH_CDIR)\r"
expect "No such file or directory" {quit;exit 1}
send "LCD $env(SSH_LCD)\r"
expect "path name does not exist" {quit;exit 1}
send "mget $env(SSH_FILE)\r"
send "rm $env(SSH_FILE)\r"

What you can do is:
After mget completes, get the list of files in the local directory that match the pattern:
set local_files [glob -directory $env(SSH_LCD) -tails $env(SSH_FILE)]
and delete any remote file name that matches:
foreach f $local_files {
    send "rm $f\r"
    expect $prompt
}
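To make that concrete, here is a rough end-to-end sketch wrapped in a shell script, in the same style as the sftp/expect snippets further down this page. The SSH_USER and SSH_HOST variables and the "sftp> " prompt are assumptions for illustration (the original script may log in differently); only the glob/foreach deletion step comes from the answer above.

#!/bin/bash
# Sketch only: fetch everything matching $SSH_FILE, then remove from the server
# only the files that actually arrived locally. Key-based auth is assumed.
/usr/bin/expect - <<'EOF'
set prompt "sftp> "
spawn sftp $env(SSH_USER)@$env(SSH_HOST)   ;# SSH_USER/SSH_HOST are made-up names
expect $prompt
send "cd $env(SSH_CDIR)\r"
expect $prompt
send "lcd $env(SSH_LCD)\r"
expect $prompt
send "mget $env(SSH_FILE)\r"
expect $prompt
# Delete only what glob finds in the local download directory.
foreach f [glob -nocomplain -directory $env(SSH_LCD) -tails $env(SSH_FILE)] {
    send "rm $f\r"
    expect $prompt
}
send "bye\r"
expect eof
EOF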

Related

inotifywait: wait for some time after the first file was uploaded to server [duplicate]

I want to send an e-mail notification to guys in our company if a file changed in their staff folder on the server.
I have a script that works fine, sending an e-mail on every file change using inotifywait.
What I would like to do on multiple file uploads (let's say 10 JPGs uploaded to somebody's staff folder) is to send out only one email.
This script sends an email on every file change:
inotifywait --recursive --exclude '.DS_Store' -e create -e moved_to -m /media/server/Staff/christoph |
while read path action file ; do
    echo "The file '$file' appeared in directory '$path' via '$action'"
    sendEmail -f server@email.com -t user@gmail.com -s smtpout.secureserver.net:80 -xu user@email.com -xp password \
        -u "The file $file appeared in your directory" -m "To view your file go to $path"
done
What is the smartest way to go about this? Does it make sense to have inotify wait for further input for, let's say, 2 minutes?
BTW I'm using sendemail for this since port 25 is blocked by the ISP.
I would likely do this by writing the modification notices to a file (if I were doing it, I would probably use an SQLite database), then running a cron job every few minutes to check the database and send an aggregated email.
Another option would be to use inotifywait to trigger a script that watches that specific file: it would just loop, checking the size/modified time of the file and then sleeping for some period of time. Once the file stopped growing, the script would append the file info to a message file. The cron job would send the message file (if it was not empty) and then truncate it. This would avoid the need to read and write data from a log file.
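A minimal sketch of the first idea, using a plain log file rather than SQLite; the queue file path, script names, and mail subject are made up, while the sendEmail flags and addresses are the ones from the question.

#!/bin/bash
# Watcher (e.g. /usr/local/bin/staff-watcher.sh, a hypothetical path): queue one
# line per new file instead of mailing immediately.
inotifywait --recursive --exclude '.DS_Store' -e create -e moved_to -m /media/server/Staff/christoph |
while read path action file ; do
    echo "$(date '+%F %T') $action $path$file" >> /var/tmp/staff-changes.log
done

#!/bin/bash
# Mailer (e.g. /usr/local/bin/staff-mailer.sh), run from cron every few minutes:
#   */5 * * * * /usr/local/bin/staff-mailer.sh
# Sends one mail for everything queued, then empties the queue.
queue=/var/tmp/staff-changes.log
if [ -s "$queue" ]; then
    sendEmail -f server@email.com -t user@gmail.com -s smtpout.secureserver.net:80 \
        -xu user@email.com -xp password \
        -u "New files in your staff folder" -m "$(cat "$queue")"
    : > "$queue"
fi

There is a small window between mailing and truncating in which a new entry could be lost; moving the queue file aside before mailing would close it.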

"Couldn't stat remote file: No such file or directory" in SFTP shell script

I have a Bash script on a RHEL 5 system that I use to perform various secure file transfer operations between the server and many external systems. Normally the script works fine; it uses expect to spawn an sftp session to do things like mget and then rm the files from the target server. However, occasionally the script errors with the output "Couldn't stat remote file: No such file or directory" when attempting the file deletion after a successful mget transfer. At first glance this error is straightforward and appears to indicate the file doesn't exist, but the script output shows the files exist and have been transferred normally:
SFTP protocol version 3
sftp> get /home/sistran/parchment/*.xml /nfs/ftp-banp/college_xml
Fetching /home/sistran/parchment/TQB10G4D.xml to /nfs/ftp-banp/college_xml/TQB10G4D.xml
Fetching /home/sistran/parchment/TQB1343C.xml to /nfs/ftp-banp/college_xml/TQB1343C.xml
Then my output shows the stat error:
sftp> quit
Result of error check: 617
!SFTP - Successful File Transfer Performed.
Connecting to abc.xyz.edu...
sftp>
sftp> rm /home/sistran/parchment/TQB10G4D.xml
Couldn't stat remote file: No such file or directory
Removing /home/sistran/parchment/TQB10G4D.xml
Couldn't delete file: No such file or directory
sftp> quit
\r
This sometimes occurs a handful of times during a batch transfer of many files. Unfortunately I cannot get my hands on the external server for troubleshooting. Below is a snippet of the script that performs the mget and rm:
....if [ "$F1" = "$ftp_success_string" ]; then
if [ "$password_option" = "1" ]; then
# ----- SFTP command here (password) -----
/usr/bin/expect - << EOF
spawn /usr/bin/sftp -oPort=$PORT $userid#$dest
expect "assword: "
send "${send_password}\r"
expect "sftp> "
send "rm $F2\r"
expect "sftp> "
send "bye \r"
EOF
err=$?
else
# ----- SFTP command here (public Key) -----
echo "
rm $F2
quit
"|sftp -oPort=$PORT -oPubkeyAuthentication=yes $userid#$dest 2>&1
fi....
Help! I'm also open to not using expect, if there is a better method.

Check if remote file exist and perform some other operation in expect script

I am trying to write an expect script in Linux which needs to do the following:
1. Log in to the remote Windows machine using ssh
2. Check if a certain file exists
3. Compare the timestamp of another file on the same path and print if it is newer
In step 1, I am trying to send one command to check if the file exists, but it does not work:
log_user 1
spawn ssh -o "StrictHostKeyChecking=no" $username@$hostname
expect {
..... user and password checks
.....
send "IF EXIST C:\\path\\to\\file\\temp.zip (echo FOUND) else (echo NOTFOUND)\r"
expect "path" {
set result $expect_out(buffer)
puts $result
if{$result=="FOUND"} {
#compare with temp2.zip here
}
}
The result always contains the command I am sending, not the output FOUND or NOTFOUND. Can someone let me know what I am doing wrong here?
Instead of using pattern matching, your script tries to process the buffer manually, but it makes the incorrect assumption that the buffer will only contain the text "(NOT)FOUND". The buffer will actually contain everything received since the last time the expect command was used. Even if it did match the buffer correctly (e.g., with string match *NOTFOUND* $result), it would still be affected by the echo problem: the strings "FOUND" and "NOTFOUND" are part of the command you send, which is most likely echoed back to you by the SSH server.
The following modification of the script hacks around the echo problem by not sending the literal strings it expects.
It works for me with the Bitvise SSH Server on the Windows side.
log_user 1
spawn ssh -o "StrictHostKeyChecking=no" $username@$hostname
# Log in here.
send "set prefix=___\r" ;# Combat the echo problem.
send "IF EXIST C:\\path\\to\\file\\temp.zip (echo %prefix%FOUND) else (echo %prefix%NOTFOUND)\r"
expect {
    ___NOTFOUND {
        error {file not found}
    }
    ___FOUND {
        send_user Found!\n
        # Do things with the file.
    }
}

Copying SCP multiple files from remote server with wildcard argument in UNIX BASH

This Expect script is a part of my UNIX Bash script
expect -c "
spawn scp yoko@sd.lindeneau.com:\"encryptor *.enc\" .
expect password: { send \"$PASS\r\" }
expect 100%
sleep 1
exit
"
I am trying to copy both 'encryptor' and '*.enc' with this one scp command. The console tells me it cannot find '*.enc'.
The syntax for multiple files:
$ scp your_username@remotehost.edu:~/\{foo.txt,bar.txt\} .
I would guess in your case (untested):
scp yoko@sd.lindeneau.com:\{encryptor,\*.enc\} .
Not sure it helps, but I was looking for a similar objective: copying some selected files based on their filenames/extensions, located in different subfolders (at the same level of subfolders).
This works, e.g. for copying .csv files from Server:/CommonPath/SpecificPath/:
scp -r Server:/CommonPath/\*/\*csv YourRecordingLocation
I even tested more complex "Perl-like" regular expressions.
Not sure the -r option is still useful.
#!/bin/bash
expect -c "
set timeout 60; # 1 min
spawn scp yoko@sd.lindeneau.com:{encryptor *.enc} .
expect \"password: $\"
send \"mypassword\r\"
expect eof
"
You can increase the timeout if your file copy takes more time to complete. I have used expect eof, which will wait until the scp command closes, i.e. we are waiting for the end of file (EOF) of the spawned scp after sending the password.

Non sequential ftp script

Scenario:
I have to transfer approx 3000 files, 30 to 35 MB each, from one server to another (both servers are IBM AIX servers).
These files are in .gz format. They are unzipped at the destination using the gunzip command to be of use.
The way I am doing it now:
I have made .sh files containing ftp scripts of 500 files each. These .sh files, when run, transfer the files to the destination. At the destination I keep checking how many files have arrived; as soon as 100 files have arrived, I run gunzip for those 100 files, then again for the next 100, and so on. I run gunzip in batches of 100 just to save time.
What I have in mind:
I am looking for a command or any other way to ftp my files to the destination and, as soon as 100 files are transferred, start unzipping them, BUT this unzipping should not pause the transfer of the remaining files.
Script that I tried:
ftp -n 192.168.0.22 << EOF
quote user username
quote pass password
cd /gzip_files/files
lcd /unzip_files/files
prompt n
bin
mget file_00028910*gz
! gunzip file_00028910*gz
mget file_00028911*gz
! gunzip file_00028911*gz
mget file_00028912*gz
! gunzip file_00028912*gz
mget file_00028913*gz
! gunzip file_00028913*gz
mget file_00028914*gz
! gunzip file_00028914*gz
bye
EOF
The drawback in the above code is that while the
! gunzip file_00028910*gz
line is executing, the ftp for the next batch, i.e. the mget for file_00028911*gz, is paused, wasting a lot of time and bandwidth.
The ! mark is used to run operating system commands from within the ftp prompt.
Hope I have explained my scenario properly. I will update the post if I get a solution; if anyone already has a solution, do reply.
Regards
Yash.
Since you seem to be doing this on a UNIX system, you probably have Perl installed. You might try the following Perl code:
use strict;
use warnings;
use Net::FTP;
my @files = @ARGV;       # get files from command line
my $server = '192.168.0.22';
my $user = 'username';
my $pass = 'password';
my $gunzip_after = 100;  # collect up to 100 files

my $ftp = Net::FTP->new($server) or die "failed connect to the server: $!";
$ftp->login($user,$pass) or die "login failed";

my $pid_gunzip;
while (1) {
    my @collect4gunzip;
    GET_FILES:
    while (my $file = shift @files) {
        my $local_file = $ftp->get($file);
        if ( ! $local_file ) {
            warn "failed to get $file: ".$ftp->message;
            next;
        }
        push @collect4gunzip, $local_file;
        last if @collect4gunzip == $gunzip_after;
    }
    @collect4gunzip or last; # no more files ?

    while ( $pid_gunzip && kill(0,$pid_gunzip)) {
        # gunzip is still running, wait because we don't want to run multiple
        # gunzip instances at the same time
        warn "wait for last gunzip to return...\n";
        wait();
        # instead of waiting for gunzip to return we could go back to retrieve
        # more files and add them to @collect4gunzip
        # goto GET_FILES;
    }

    # last gunzip is done, start to gunzip collected files
    defined( $pid_gunzip = fork()) or die "fork failed: $!";
    if ( ! $pid_gunzip ) {
        # child process should run gunzip
        # maybe one needs to split it into multiple gunzip calls to make
        # sure that the command line does not get too long!!
        system('gunzip', @collect4gunzip);
        # child will exit once done
        exit(0);
    }
    # parent continues with getting more files
}
It's not tested, but at least it passes the syntax check.
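A possible invocation, assuming the script is saved as get_and_gunzip.pl (a made-up name) with hypothetical file names matching the question's pattern:

perl get_and_gunzip.pl file_00028910_01.gz file_00028910_02.gz
# or feed it a previously prepared list of remote file names:
perl get_and_gunzip.pl $(cat files_to_fetch.txt)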
One of two solutions. Don't call gunzip directly; call "blah", where "blah" is a script:
#!/bin/sh
gunzip "$#" &
so the gunzip is put into the background, the script returns immediately, and you continue with the FTP. The other thought is to just add the & to the shell command -- I bet that would work just as well, i.e. within the ftp script, do:
! gunzip file_00028914*gz &
But... I believe you are somewhat leading yourself astray. rsync and other solutions are the way to go for many reasons.
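For what it's worth, here is a hedged sketch of what the rsync route could look like; it assumes rsync and ssh access are available on both AIX boxes, reuses the host and paths from the question, and unzips completed files in the background while the transfer continues. It is an illustration, not part of the answer above.

#!/bin/bash
# Sketch only: pull all .gz files in one resumable rsync run over ssh.
rsync -av --partial "192.168.0.22:/gzip_files/files/*.gz" /unzip_files/files/ &
RSYNC_PID=$!

# While the transfer runs, periodically unzip whatever has fully arrived.
# rsync writes incoming data to hidden temporary names and renames on completion,
# so *.gz only matches finished files.
while kill -0 "$RSYNC_PID" 2>/dev/null; do
    sleep 60
    gunzip /unzip_files/files/*.gz 2>/dev/null
done

# Catch anything that finished after the last pass.
gunzip /unzip_files/files/*.gz 2>/dev/null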
