ftp shell script: mput files do not match the originals - shell

I use an FTP shell script to upload data, as below:
find . -type d -exec ./recursive-ftp.sh {} \;
recursive-ftp.sh is as follows:
#!/bin/bash
ftp_site="<skip>"
username="<skip>"
passwd="<skip>"
localDir="<skip>"
folder=$1
cd <skip>
pwd
ftp -in <<EOF
open $ftp_site
user $username $passwd
mput *
mkdir $folder
cd $folder
lcd $folder
mput *
close
bye
EOF
But I found that the file size and MD5 checksum differ from the original files.
FTP uses TCP, so why are some files different?
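For reference, a common cause of this kind of mismatch is the ftp client's default ASCII transfer mode, which rewrites line endings and so changes the size and checksum of binary files. Below is a minimal sketch of the same heredoc with an explicit binary command added; the variables are the same placeholders as in the script above, and ASCII mode is an assumed cause, not a confirmed diagnosis.
# Sketch only: force binary (image) mode so files are transferred byte-for-byte.
# $ftp_site, $username, $passwd and $folder are set exactly as in recursive-ftp.sh.
ftp -in <<EOF
open $ftp_site
user $username $passwd
binary
mput *
mkdir $folder
cd $folder
lcd $folder
mput *
close
bye
EOF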

Related

Bulk copy files over sftp with a delay between each file

I want to move my files from one directory to an SFTP server, and then on to another directory, in sequence rather than all together.
Let's say my directories are A and B.
Here is my code:
#!/bin/bash
cp -R "/usr/sap/tmp/Dir A/." "/usr/sap/tmp/Dir B/"
lftp <<_EOF_
open sftp://User:Password@Host -p Port
lcd /usr/sap/tmp/Dir A
cd /
pwd
mput -E /usr/sap/tmp/Dir A/*.dat
exit
_EOF_
This works fine. The only problem is that it moves all the files together, at the same time, from dir A to SFTP. How can I get it to move the files one by one (in sequence, with at least one second between the files moved to SFTP)?
First create a file with commands for all files.
cat <<# > inputfile
open sftp://User:Password@Host -p Port
lcd /usr/sap/tmp/Dir A
cd /
pwd
#
find . -type f -name sa\*.txt -print0 |
xargs -n1 --null -I'{}' printf "%s\n" "mput '{}'" '!'"sleep 1" >> inputfile
echo "exit" >> inputfile
Next, stream that file to lftp (the !sleep 1 lines make lftp pause for a second between uploads):
lftp -f inputfile
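If you would rather not build an intermediate command file, a rough alternative sketch (untested, using the same User, Password, Host and Port placeholders as above) is a bash loop that runs one lftp transfer per file and sleeps between iterations:
#!/bin/bash
# Sketch: upload the .dat files one at a time, pausing one second between them.
cd "/usr/sap/tmp/Dir A" || exit 1
for f in *.dat; do
    # put -E removes the local file after a successful transfer, like mput -E above.
    lftp -p Port -u User,Password -e "put -E '$f'; bye" sftp://Host
    sleep 1
done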

In sftp I am not able to copy all files in a ksh script

I am connecting through sftp in a shell script and trying to -mget all files from multiple directories, like below:
sshpass -p $password sftp -oBatchMode=no -o StrictHostKeyChecking=no -b - $user@$host 1>$currentdir/ftp.log << !
cd /DAILY_FEEDS/abc/in/xml/
-mget *.*
cd /DAILY_FEEDS/abc/in/pdf/
ls * tp_pdf.lst
-mget *.*
cd /DAILY_FEEDS/abc/in/images/
-mget *.*
cd /DAILY_FEEDS/xyz/in/xml/
-mget *.*
cd /DAILY_FEEDS/xyz/in/pdf/
-mget *.*
cd /DAILY_FEEDS/xyz/in/images/
-mget *.*
cd /DAILY_FEEDS/xyz/in/pdfmetaxml/
-mget *.*
bye
!
When any directory is empty, the process skips everything after it. In the above scenario, if /DAILY_FEEDS/abc/in/images/ contains no images and the later /DAILY_FEEDS/xyz/in/xml/ directory does have files, nothing is picked up after abc/in/images.
You have to check first whether any files exist inside the folder. If files exist, use "mget" to download them; otherwise go on to the next folder.
"mget" is working for me.

Bash script upload to ftp & delete files 7 days and older

The script is unable to delete anything older than 7 days on the FTP server. It gives the error "550 Can't remove directory: No such file or directory". All of the FTP backups are just in the main directory.
#!/bin/sh
USERNAME="user"
PASSWORD="pass"
SERVER="ftpbackupservergoeshere"
NOW="$(date +'%m-%d-%Y')"
DAYS=7
RMDATE=$(date --iso -d $DAYS' days ago')
# local directory to pickup *.tar.gz file
FILE="/root/$NOW.tar.gz"
cd /home/minecraft/multicraft/servers
find . -name \*.log -type f -delete
find . -name \*.log.gz -type f -delete
cd
tar -zcf $NOW.tar.gz /home/minecraft
# login to remote server
ftp -n -v $SERVER <<EOF
user $USERNAME $PASSWORD
binary
put $FILE $NOW.tar.gz
cd ..
rm -rf ${RMDATE}
bye
EOF
rm /root/$NOW.tar.gz
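One likely cause, offered as an assumption rather than a verified fix: the backups are uploaded as $NOW.tar.gz using an %m-%d-%Y date, while $RMDATE is ISO-formatted and lacks the .tar.gz suffix, so the name being removed never exists on the server; and the usual ftp command for removing a single remote file is delete, not rm -rf. A minimal sketch of the upload-and-prune step under those assumptions:
# Sketch: name the week-old backup exactly as it was uploaded, then delete it.
# $SERVER, $USERNAME, $PASSWORD, $FILE, $NOW and $DAYS are set as in the script above.
RMDATE=$(date -d "$DAYS days ago" +'%m-%d-%Y')
ftp -n -v $SERVER <<EOF
user $USERNAME $PASSWORD
binary
put $FILE $NOW.tar.gz
delete $RMDATE.tar.gz
bye
EOF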

shell script to upload the last 24 hrs of data files to FTP; unable to identify the issue

file_list=$( find . -type f -name * -mtime -1 )ftp -n << EOF
open ftpip
user uname pwd
cd directory
prompt
hash
bin
mput $file_list
bye
EOF
I am unable to upload with the above script, and it throws an "invalid command" error.
Aside from the problem with quoting the asterisk, and the fact that your "ftp" statement needs to start on a new line, I suspect your $file_list variable could get far too long to be handled well. I have made you a little script that uses "tar" to collect up the files you want into a single archive named after today's date. Then you can FTP that instead of 8 million files ;-)
Here you go:
#!/bin/bash
#
# Make dated tar file of everything from last 24 hrs, filename like "Backup2013-12-14.tgz"
#
FILENAME=`date +"Backup%Y-%m-%d.tgz"`
find . -type f -mtime -1 | tar -cvz -T - -f "$FILENAME"
ftp -n << EOF
open somehost
user joe bloggs
prompt
hash
bin
mput "$FILENAME"
bye
EOF
You either need to put * in quotes so the shell doesn't expand it immediately, or remove -name * altogether, since matching every name is the default.
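To illustrate with the question's own assignment, either of these forms avoids the premature expansion:
# Quote the pattern so find, not the shell, expands it ...
file_list=$( find . -type f -name '*' -mtime -1 )
# ... or drop -name entirely, since matching every name is the default.
file_list=$( find . -type f -mtime -1 )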

BASH recursive find filename and copy to ftp

I am working on a Bash script (see below) that recursively searches through directories on a SAN for files with a specific file name that are newer than 4 hours, then copies all these files to a specific FTP location and sends an email to say the copy has been completed. The script works fine except that it only copies files in the top-level directory. The error I am getting on the lower directories is this:
#
remote: -v
ftp: local: -v: No such file or directory
local: ./Test01/test02/folder02_01_1200_m30.mp4 remote: ./Test01/test02/folder02_01_1200_m30.mp4
229 Entering Extended Passive Mode (|||45127|)
550 ./Test01/test02/folder02_01_1200_m30.mp4: File does not exist. (2)
221 Goodbye.
#
Here is the script:
#!/bin/bash
#The location from where the script should search
GSPORIGIN='/Volumes/folder01/folder02'
#File Names to be moved
FILE1='*1200_m30.mp4'
#FTP Details
HOST='xxxx.upload.com'
USER='xxxxxxx'
PASSWD='xxxxxxxxxxxx'
#the destination directory on the FTP
DESTDIR="/8619/_!/TEST"
# Go to the location from where the search should start
cd $GSPORIGIN
for file in `find . -type f -name "*1200_m30.mp4" -mmin -240`
do
echo $file
if [ -f $file ] ; then
ftp -n -v $HOST << EOT
ascii
user $USER $PASSWD
prompt
cd $DESTDIR
mput -v $file
EOT
echo "$file has been copied to FTP" | mail -s "$file has been copied to FTP in Directory $DESTDIR" xxx.xxx#xxx.com;
else exit 1
fi
done
To do what you're doing, you'll have to recreate the directories on the destination FTP.
Use the basename/dirname commands and a mkdir command, like this:
for file in `find . -type f -name "*1200_m30.mp4" -mmin -240`
do
echo $file
if [ -f $file ] ; then
destdirname=`dirname "$file"`
ftp -n -v $HOST << EOT
ascii
user $USER $PASSWD
prompt
cd $DESTDIR
mkdir $destdirname
mput -v $file
EOT
echo "$file has been copied to FTP" | mail -s "$file has been copied to FTP in Directory $DESTDIR" xxx.xxx#xxx.com;
else exit 1
fi
done
To copy multiple files in nested directories, I would suggest you look at the rsync utility to do this job for you.
rsync will create all the remote directories whenever needed, and it will keep the files completely in sync even across frequent runs.
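A minimal sketch of that suggestion, reusing the question's host and destination path as placeholders; the key assumption is that the destination accepts SSH/rsync, since rsync cannot talk to a plain FTP server:
#!/bin/bash
# Sketch: mirror the recent .mp4 files, recreating their directory structure remotely.
cd /Volumes/folder01/folder02 || exit 1
find . -type f -name "*1200_m30.mp4" -mmin -240 -print0 |
  rsync -av --files-from=- --from0 . 'xxxxxxx@xxxx.upload.com:/8619/_!/TEST/'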
