put *.* is moving only a few files from local folder to SFTP - shell

I am trying to move all files under a folder to an SFTP folder using a shell script for my batch job. But every time it runs, only a few files are moved, not all of them.
/usr/local/bin/expect -c "spawn sftp -o Identityfile=/export/home/user/.ssh/example.ppk $SFTP_USERID@$SFTP_SERVER
expect \"password: \"
send \"$PASSWORD\n\"
expect \"sftp> \"
send \"cd $DESTDIR\r\"
expect \"sftp> \"
send \"lcd $LOCALDIR\r\"
expect \"sftp> \"
send \"put *.* \r \"
expect \"sftp> \"
send \"quit\r\"
expect \"sftp> \"" >> $BATCH_DIR/logs/batch"$todaydatetime".log
This script runs successfully every time, but only a few files are moved to the SFTP destination folder. In the logs I always see that only 19 files are uploaded from the local folder to the SFTP folder (the same files every time).
I understand why it is the same files every time, but I am not able to figure out why it is only a few files.
Is there any limit on how long the SFTP command will stay active?
Kindly also help me with how to change the command to pick up only new files. "rsync" is not working.
Hi kenster, mput didn't work for me. The files that do get transferred are the files in my local folder whose names start with numbers. There are 236 files in my local folder, and the 19 that start with numbers are transferred, regardless of spaces in the file name or whether the extension is pdf, xls, or anything else; it is always the same 19 files. In other words, not a single file whose name starts with a letter is transferred. I tried the same steps manually to check whether file names or permissions were causing the issue, but manually it works fine :( all files are transferred.
Sorry guys, my mistake.
I just added the lines below after the lcd step. Now only 6 files are transferred.
expect \"sftp> \"
send \"lls -ltr\r\"
The issue looks like something else, not the commands or file names.
I echoed date '+%Y%m%d%H%M%S' before and after all the steps; the program runs for only 12 seconds every time. This must be something with my environment; I am working in a restricted environment. Thanks, guys. But can you still help me with how to pick only new files? (Moving old files to a backup folder is not accepted by my boss.)

Change the line:
send \"put *.* \r \"
to
send \"mput * \r \"
as PUT *.* is an ugly Windows-ism.
You should also consider putting double quotes around $DESTDIR and $LOCALDIR in case they contain spaces.
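Putting both suggestions together, the transfer part of the script might look like the sketch below (untested in your environment; the variables are the same placeholders as in the question, and the extra backslashes are there so the quotes around the directories survive both the shell and expect). It may also be worth adding set timeout -1, since expect's default timeout is 10 seconds and a long mput could be cut short by it, which might be what limits your runs to roughly 12 seconds:
/usr/local/bin/expect -c "spawn sftp -o IdentityFile=/export/home/user/.ssh/example.ppk $SFTP_USERID@$SFTP_SERVER
# wait as long as the transfer takes; expect's default timeout is 10 seconds
set timeout -1
expect \"password: \"
send \"$PASSWORD\n\"
expect \"sftp> \"
send \"cd \\\"$DESTDIR\\\"\r\"
expect \"sftp> \"
send \"lcd \\\"$LOCALDIR\\\"\r\"
expect \"sftp> \"
send \"mput *\r\"
expect \"sftp> \"
send \"quit\r\"
expect eof" >> $BATCH_DIR/logs/batch"$todaydatetime".log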

You could try using sshfs and then use "regular" commands like cp; the mount should behave like a regular filesystem and give you more options.
sshfs $SFTP_USERID@$SFTP_SERVER: /temporary/mount/path
cp -R $LOCALDIR/* /temporary/mount/path/$DESTDIR
fusermount -u /temporary/mount/path/
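Because the remote end then looks like an ordinary directory, picking up only new files becomes a matter of ordinary file tests rather than sftp commands. A minimal sketch, assuming GNU/BSD find (for -maxdepth) and a made-up marker file /var/tmp/last_upload that records when the previous run finished:
MARKER=/var/tmp/last_upload

sshfs $SFTP_USERID@$SFTP_SERVER: /temporary/mount/path

if [ -f "$MARKER" ]; then
    # copy only files modified since the previous run
    find "$LOCALDIR" -maxdepth 1 -type f -newer "$MARKER" \
        -exec cp -p {} /temporary/mount/path/"$DESTDIR"/ \;
else
    # first run: copy everything
    cp -p "$LOCALDIR"/* /temporary/mount/path/"$DESTDIR"/
fi

touch "$MARKER"
fusermount -u /temporary/mount/path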

Related

How to scp multiple files from remote to local in a folder other than ~/?

I'm trying to make a bash expect script that takes input like user, host, password, and file names and then copies the files from remote to local. From what I've read so far, scp'ing multiple files from remote to local works just fine when you assume the files are coming from ~/, i.e.:
scp -r user@host:"file1 file2 file3" .
would copy file1, file2, and file3 from ~/ into the current working directory. But I need to be able to pass in another directory as an argument, so my bash script looks like this (but doesn't work, I'll explain how):
eval spawn scp -oStrictHostKeyChecking=no -oCheckHostIP=no -r $user@$host:$dir/"$file1 $file2 $file3" ~/Downloads
This doesn't work past the first file; the shell raises a "No such file or directory" error after the first file, which I assume means that $dir is applied only to the first file, while the rest are looked for in ~/ and of course can't be found there. I've looked everywhere for an answer on this but can't find one, and it would be super tedious to do this one file at a time.
Assuming your remote login shell understands brace expansion, this should work:
scp $user@$host:$dir/"{$file1,$file2,$file3}" ~/Downloads
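The quoting matters: it stops your local shell from expanding the braces, so the remote shell receives a single word and expands it there, prefixing $dir to every name. A quick illustration with made-up values (user, host, directory, and file names are only placeholders):
user=alice host=example.com dir=reports
file1=jan.csv file2=feb.csv file3=mar.csv

# the remote shell expands reports/{jan.csv,feb.csv,mar.csv}
# into reports/jan.csv reports/feb.csv reports/mar.csv
scp "$user@$host:$dir/{$file1,$file2,$file3}" ~/Downloads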
If you want to download multiple files matching a specific pattern, for example all zip files, you can do the following:
scp -r user@host:/path/to/files/"*.zip" /your/local/path

Download a file from ftp subdirectories using cmd script

I was trying to download multiple files from our ftp server using the script:
mget cd\dir_here\subdir_here\sample*.txt
but it didn't work, so I tried changing to forward slashes:
mget cd/dir_here/subdir_here/sample*.txt
a message
Type set to A
appeared. What does that mean?
Type set to A
This means that you've told the FTP server you will be transferring by ASCII.
You almost never actually want ASCII; it is better to make a habit of using binary for all transfers. (ASCII breaks the chain of custody: ASCII transfers can modify file contents, such as changing newline characters and mangling Unicode characters, and they are completely unsuited to binary file contents, while the speed benefits are trivial.)
You also want to make sure your FTP script issues each command separately; it looks like you've combined a cd with an mget here...?
Example FTP Script:
user USERNAME PASSWORD
binary
lcd C:\Local\File\System\Folder
cd /root/dir_here/subdir_here
mget sample*.txt
quit
This would be in a script file which would be specified in the FTP command when you call it from the command line or from a script.
eg:
ftp -n -i -v "-s:C:\Local\Path\To\FTP\Script.txt" SERVER_IP

wget hangs after large file download

I'm trying to download a large file (a 5 GB file) over FTP. Here is my script:
read ZipName
wget -c -N -q --show-progress "ftp://Password@ftp.server.com/$ZipName"
unzip $ZipName
The file downloads to 100% but the script never moves on to the unzip command. There is no special error message and no output in the terminal, just a blank new line. I have to press Ctrl+C and re-run the script to unzip, since wget then detects that the file is already fully downloaded.
Why does it hang like this? Is it because of the large file, or because I'm passing an argument into the command?
By the way, I can't use ftp because it's not installed on the VM I'm working on, and it's a temporary VM, so I have no root privilege to install anything.
I've run some tests, and I think the size of the disk was the reason.
I tried with curl -O and it worked with the same disk space.
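If low disk space is the suspect, one way to fail fast is to compare the free space on the target filesystem with roughly twice the archive size (the zip plus its extracted contents) before unzipping. A rough sketch, assuming a GNU userland (the df --output and stat -c options are GNU-specific):
read ZipName
wget -c -N -q --show-progress "ftp://Password@ftp.server.com/$ZipName"

# rough check: we need room for the extracted contents as well as the zip itself
zip_bytes=$(stat -c %s "$ZipName")
free_bytes=$(( $(df --output=avail -k . | tail -1) * 1024 ))

if [ "$free_bytes" -lt $(( zip_bytes * 2 )) ]; then
    echo "Not enough free disk space to unzip $ZipName" >&2
    exit 1
fi
unzip "$ZipName"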

FileMaker Terminal command

I want to run a Terminal command from within FileMaker. I use the Perform AppleScript script step with a native AppleScript:
do shell script "rsync -r Documents/Monturen/ fakeuser@fakeserverhosting.be:www/"
I installed an SSH key on the remote server. The goal is to automate the sync of images.
I do get a 23 error. Any advice on what I'm doing wrong?
This is rsync error 23 - some files could not be transferred. Try to transfer one file with an explicitly defined full file path.
I think there is a problem with the source filepath as well. Shouldn't this be
~/Documents/Monturen
or
~/Documents/Monturen/*
If you have any spaces in your file names or folder names, they have to be escaped with \\. The same applies to any apostrophes.
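Putting these suggestions together, the shell command inside the Perform AppleScript step could look like the sketch below; the file name containing a space is made up for illustration, and remember that, as noted above, each backslash has to be doubled inside the AppleScript string literal:
# test a single file first, with a full explicit path (the file name is a placeholder)
rsync -v ~/Documents/Monturen/sample\ photo.jpg fakeuser@fakeserverhosting.be:www/

# then sync the whole folder; -v shows which file triggers the partial-transfer error
rsync -rv ~/Documents/Monturen/ fakeuser@fakeserverhosting.be:www/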

Copy file with dynamic name in unix

I have a business scenario where a unix user ftps files to a unix box in the format 'BusinessData_date.dat'. Please note that the date part is dynamic and changes daily, e.g. 'BusinessData_20131210.dat'.
How can I run a copy command to copy the file to a different directory daily, and also archive the previous day's file so that it is not read twice?
I'm trying the following and getting an error:
$ cp -pr /Tickets/data/BusinessData_"$(date+%Y%m%d)".dat /sftpdata/dataloader/data/BusinessData_"$(date+%Y%m%d)".csv
You need a space between the command and its arguments: date +%Y%m%d, not date+%Y%m%d. Also, you don't need the quotes.
cp -pr ..../BusinessData_$(date +%Y%m%d).dat ..../BusinessData_$(date +%Y%m%d).csv
cp -p /Tickets/data/BusinessData_"$(date +%Y%m%d)".dat /sftpdata/dataloader/data/BusinessData_"$(date +%Y%m%d)".csv
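For the archiving half of the question, one common approach is to move the previous day's file into an archive directory once today's copy has succeeded. A minimal sketch, assuming GNU date for the -d 'yesterday' option and a hypothetical /sftpdata/dataloader/archive directory:
today=$(date +%Y%m%d)
yesterday=$(date -d 'yesterday' +%Y%m%d)   # GNU date; other unixes need a different idiom

# today's file for the loader
cp -p /Tickets/data/BusinessData_"$today".dat /sftpdata/dataloader/data/BusinessData_"$today".csv

# move yesterday's source out of the way so it is not read twice (archive dir is made up)
mv /Tickets/data/BusinessData_"$yesterday".dat /sftpdata/dataloader/archive/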
