ftpcmd.bat not uploading the complete pdf file - windows

I am using this code to upload via a batch file:
@echo off
echo user user@site.com> ftpcmd.dat
echo password>> ftpcmd.dat
echo cd public_html/new/data>> ftpcmd.dat
echo put abc.pdf>> ftpcmd.dat
echo quit>> ftpcmd.dat
ftp -n -s:ftpcmd.dat 11.111.111.111
del ftpcmd.dat
If I upload this same file (a 4 MB PDF) with FileZilla, it uploads in its entirety to the public_html/new/data folder. I have uploaded many files without issue, mainly txt, htm, csv and xls files, and have never had a problem with the above code.
But for this PDF file, the same file size shows as having been uploaded, yet when I open the file I get an error message that it has been corrupted, and it only partially displays the page contents.
I have tried removing the @echo off and setting the folder permissions to 777, but I end up with the same result.

The Windows ftp command supports an ASCII (text) mode and a binary mode; the former is the default.
In text mode (entered with the ascii command), end-of-line markers are converted and end-of-file characters may be interpreted. In binary mode (entered with the binary command), no such conversions occur and files are transferred byte for byte.
Since a PDF file is not a text file, binary mode is required; otherwise its contents are altered in transit, corrupting the file. Adding a line echo binary>> ftpcmd.dat before the put line fixes the script.
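This is easy to demonstrate without a server. The following Python sketch (the byte values are made up for illustration, not taken from the question's PDF) models the line-ending translation that ASCII mode performs and shows it changing a binary payload:

```python
# Model of what FTP ASCII (text) mode does to a file's bytes:
# line endings are treated as text and normalised, so any binary
# byte sequence that happens to look like a line ending is rewritten.
def ascii_mode_transfer(data: bytes) -> bytes:
    # Here we model a transfer that normalises CRLF to LF.
    return data.replace(b"\r\n", b"\n")

# A PDF-like blob: text header plus arbitrary binary bytes that
# happen to contain CRLF (0x0D 0x0A) sequences, as any real PDF will.
pdf_like = b"%PDF-1.4\n%\xe2\xe3\xcf\xd3\r\n\x10\x0d\x0a\xff"

transferred = ascii_mode_transfer(pdf_like)

print(transferred != pdf_like)            # True: the bytes no longer match
print(len(pdf_like) - len(transferred))   # 2: one byte lost per CRLF
```

Binary mode skips this translation entirely, which is why plain text files survive ASCII-mode transfers but PDFs do not.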

Related

Batch for removing last 2 pages from PDF files on a folder

I am using PDFtk to remove the last 2 pages from a bunch of PDFs in a specific folder.
For a single file, this code works perfectly fine: the last two pages are removed from original.pdf and a newly created reduced.pdf copy is written without them.
@echo off
cd "C:\Program Files (x86)\PDFtk\bin"
start pdftk.exe C:\Desktop\long\original.pdf cat 1-r3 output C:\Desktop\short\reduced.pdf
pause
FYI, the PDF files all have various alphanumeric filenames with - as the separator between words, e.g. the-march-event-2022.pdf.
What I need now is to automate this so the script goes through each PDF file in the long folder and creates a copy with an identical filename in the short folder.
The task can be done with a batch file with only following single command line:
@for %%I in ("C:\Desktop\long\*.pdf") do @"C:\Program Files (x86)\PDFtk\bin\pdftk.exe" "%%I" cat 1-r3 output "C:\Desktop\short\%%~nxI" && echo Successfully processed "%%~nxI" || echo ERROR: Failed to process "%%~nxI"
This command line uses the Windows command FOR to process all PDF files in the specified folder. For each PDF file, pdftk is executed with the fully qualified name of the current PDF file as input file name, and the file name + extension with a different directory path as output file name. Run for /? in a command prompt window for help on this command.
The success message is output if pdftk.exe exits with value 0; otherwise the error message is output.
The two @ suppress the echoing of the FOR command line and of each executed pdftk command line while the PDF files are processed.
Please see single line with multiple commands using Windows batch file for an explanation of the conditional operators && and ||.
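For comparison, here is the same loop sketched in Python; the pdftk path and folders are taken from the question, and subprocess.run stands in for the direct command invocation:

```python
import subprocess
from pathlib import Path

PDFTK = r"C:\Program Files (x86)\PDFtk\bin\pdftk.exe"
SRC = r"C:\Desktop\long"
DST = r"C:\Desktop\short"

def build_command(pdf_path: str) -> list:
    # Mirrors %%~nxI: keep the file name + extension, swap the folder.
    name = pdf_path.replace("\\", "/").rsplit("/", 1)[-1]
    return [PDFTK, pdf_path, "cat", "1-r3", "output", DST + "\\" + name]

def process_all() -> None:
    for pdf in sorted(Path(SRC).glob("*.pdf")):
        ok = subprocess.run(build_command(str(pdf))).returncode == 0
        print(("Successfully processed" if ok else "ERROR: Failed to process")
              + f' "{pdf.name}"')
```

The delete-on-success / report-on-failure split mirrors the && and || operators of the one-liner above.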

How can I take file names from a text document, find them in windows explorer, then transfer the specific files to another folder?

I have a .txt with 900+ file names of pictures with full directories (ex: 2017/conference/tsd-60545).
My windows explorer is pulling from an ftp for my company, and has over 160K photos in it.
I only need the 900 images in my .txt file.
Is there a way to automate this? Manual is bringing me to a slow death.
This is untested as I do not have an FTP server to test with right now, but it should work.
Create a batch file with a .cmd or .bat extension, and make sure your text file is in the same directory, or specify its full path in the batch file.
Basically, you echo the entire connection script to a file: a for loop reads your file list and writes a get (or put) line for each entry. The batch file then runs the ftp command, which reads from that script file.
MyFTPscript.cmd
@echo off
echo user username password> ftpto.dat
for /F "tokens=*" %%A in (myfile.txt) do echo get %%A >> ftpto.dat
echo quit>> ftpto.dat
ftp -n -s:ftpto.dat ftp.imagesserver.com
You can test it first, to make sure it writes the .dat file without actually ftp'ing anything, by adding rem before the last command, like this:
rem ftp -n -s:ftpto.dat ftp.imagesserver.com
Once you are happy that the content of the dat file looks ok, you can remove the rem and run the script which will then do everything.
No need to delete the file as it will re-write it each time you run the command file.
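The generation step is easy to check in isolation. A Python sketch of the same logic (user name and file names are the placeholders from the answer above) builds the script text that ftp -n -s: would consume:

```python
def build_ftp_script(user: str, password: str, names: list) -> str:
    # Same layout as ftpto.dat: a login line, one "get" per listed
    # file name, then quit.
    lines = [f"user {user} {password}"]
    lines += [f"get {name}" for name in names]
    lines.append("quit")
    return "\n".join(lines) + "\n"

# File names as they would appear in myfile.txt (the question's
# example path, plus one invented for illustration).
names = ["2017/conference/tsd-60545", "2017/conference/tsd-60546"]
print(build_ftp_script("username", "password", names))
```

Writing this string to ftpto.dat reproduces what the batch file's echo and for /F lines generate.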
This downloads all jpgs from ftp://ftp.mydomain.com/2017/conference/tsd-60545
Create a file called "commands.ftp" based on the contents of your txt file, using a tool like vim, awk, or sed. Then use the Windows ftp command with option -s to process it.
commands.ftp
open ftp.mydomain.com
user myusername mypassword
cd mysubdirectory
prompt
mget 2017/conference/tsd-60545/*.jpg
close
quit
batch file
ftp -s:commands.ftp -n

How to change the extension of all files in a directory using a Windows batch script

I need to change the file extensions in a directory. Doing it manually would take too much time. Is there any Windows shell command or batch file to do this?
For example, so that all .html files in a folder become .php?
Make a .bat in the folder and put this code in it:
@ren *.Old_extension *.New_Extension
Also, if you want to change more extensions, just copy the line and paste it below, edited like this:
@ren *.Old_extension *.New_Extension
@ren *.Old_extension2 *.New_Extension2
Yes, you can easily do this from the Command Prompt.
Suppose you have a folder with many .txt files in it.
Open that folder, press SHIFT + right-click, and select Open command window here.
Then type the following command to change all .txt files to .doc:
ren *.txt *.doc
It will rename all the .txt files to .doc files.
That's all.
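If you prefer renaming file by file (cmd's ren wildcard matching has some surprising corner cases), the same job can be sketched in Python:

```python
from pathlib import Path

def change_extensions(folder: str, old: str, new: str) -> list:
    """Rename every file ending in `old` inside `folder` to end in
    `new` (e.g. ".html" -> ".php"), returning the new file names."""
    renamed = []
    for path in sorted(Path(folder).glob("*" + old)):
        target = path.with_suffix(new)
        path.rename(target)
        renamed.append(target.name)
    return renamed

# e.g. change_extensions(r"C:\my\folder", ".html", ".php")
```

Both extension arguments must include the leading dot, matching how ren's *.ext patterns are written.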

put *.* is moving only few files from local folder to SFTP

I am trying to move all files under a folder to an SFTP folder using a shell script for my batch job. But every time it runs, only a few of the files are moved, not all of them.
/usr/local/bin/expect -c "spawn sftp -o Identityfile="/export/home/user/.ssh/example.ppk" $SFTP_USERID@$SFTP_SERVER
expect \"password: \"
send \"$PASSWORD\n\"
expect \"sftp> \"
send \"cd $DESTDIR\r\"
expect \"sftp> \"
send \"lcd $LOCALDIR\r\"
expect \"sftp> \"
send \"put *.* \r \"
expect \"sftp> \"
send \"quit\r\"
expect \"sftp> \"" >> $BATCH_DIR/logs/batch"$todaydatetime".log
This script runs successfully every time, but only a few files are moved to the SFTP destination folder. In the logs I always see only 19 files uploaded from the local folder to the SFTP folder (the same files every time).
I understand why it is the same files every time, but I am not able to figure out why only a few.
Is there any limit on how long the sftp command stays active?
Kindly also help me with how I can change the command to take only new files; rsync is not working.
Hi kenster, mput didn't work for me. The files that are transferred are the ones in my local folder whose names start with numbers. In my local folder there are 236 files, of which the 19 starting with numbers get transferred, even if there are spaces in the file name or the extension is pdf or xls or whatever, but always the same 19 files, MEANING not a single file that starts with a letter is transferred. I tried the same steps manually to check whether file names or permissions were causing some issue, but manually it works fine :( and all files are transferred.
Sorry guys, my mistake.
I just added the lines below after the lcd step. Now only 6 files are transferred.
expect \"sftp> \"
send \"lls -ltr\r\"
The issue looks like something else, not the commands or file names.
I echo date '+%Y%m%d%H%M%S' at the start and end of all steps; the program runs for only 12 seconds every time. This must be something with my environment; I am working in a restricted environment. Thanks, guys. But you can still help me with how to pick only new files (moving old files to a backup folder is not accepted by my boss).
Change the line:
send \"put *.* \r \"
to
send \"mput * \r \"
as PUT *.* is an ugly Windows-ism.
You should also consider putting double quotes around $DESTDIR and $LOCALDIR in case they contain spaces.
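The pattern difference is easy to see with Python's fnmatch, which follows the same glob rules: *.* only matches names that contain a dot, so extension-less files are skipped silently (the file names below are invented for illustration):

```python
from fnmatch import fnmatch

# Hypothetical directory listing: some names have no extension.
names = ["report.pdf", "2017-notes.xls", "README", "backup"]

matched_by_star_dot_star = [n for n in names if fnmatch(n, "*.*")]
matched_by_star = [n for n in names if fnmatch(n, "*")]

print(matched_by_star_dot_star)  # ['report.pdf', '2017-notes.xls']
print(matched_by_star)           # all four names
```

This alone does not explain why only number-prefixed names transferred for the asker, but it is one common way put *.* silently drops files that mput * would send.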
You could try using sshfs and then use "regular" commands like cp, which should give you more options and which should behave like a regular filesystem.
sshfs $SFTP_USERID@$SFTP_SERVER: /temporary/mount/path
cp -R $LOCALDIR/* /temporary/mount/path/$DESTDIR
fusermount -u /temporary/mount/path/

FTP get and delete multiple files

I have to get and delete multiple files via FTP so I wrote this script:
open ftp.myftpserver.org
user
pass
cd folder
lcd E:\localdir
mget *
mdel *
bye
This works but is not safe, since the folder is being fed from other sources and the mdel * step may delete files uploaded in the meantime.
I guess a solution could be to move the files remotely to a different folder, building a file list at the beginning of the process, but I have no idea how to do that.
Is it possible?
FTR I followed the nice hint and managed to successfully make something that works; maybe not elegant, but it works:
First step to get the file list:
getfilelist.bat
open ftp.myserver.it
myuser
pass1234
cd ftpfolder
prompt n
lcd E:\localdir
ls *.??? filelist.txt
bye
Second step to download and delete the above files
movefiles.bat
#echo off
setlocal enableextensions
setlocal enabledelayedexpansion
echo open ftp.myserver.it>>myscript
echo user myuser pass1234>>myscript
echo cd ftpfolder>>myscript
echo prompt n>>myscript
echo ascii>>myscript
echo lcd E:\downloaddir>>myscript
for /F "usebackq tokens=1,2* delims=," %%G IN ("E:\localdir\filelist.txt") DO ECHO rename %%G %%G_TMP>>myscript
echo mget *_TMP>>myscript
echo mdelete *_TMP>>myscript
echo bye>>myscript
ftp -n -s:myscript
del filelist.txt
del myscript
e:
cd E:\downloaddir
ren *.???_TMP *.???
A bat file to recall the above steps:
E:
cd E:\localdir
ftp -i -s:E:\localdir\getfilelist.bat
E:\localdir\movefiles.bat
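The script-generation half of movefiles.bat can be sanity-checked without touching the server. A Python sketch of the same generation logic (host, credentials, and folders are the placeholders from the batch file above):

```python
def build_move_script(filenames: list) -> str:
    # Same shape as the generated "myscript": log in, rename each
    # listed file to a *_TMP name, then mget/mdelete only those,
    # so files uploaded in the meantime are left alone.
    lines = [
        "open ftp.myserver.it",
        "user myuser pass1234",
        "cd ftpfolder",
        "prompt n",
        "ascii",
        r"lcd E:\downloaddir",
    ]
    lines += [f"rename {name} {name}_TMP" for name in filenames]
    lines += ["mget *_TMP", "mdelete *_TMP", "bye"]
    return "\n".join(lines)

print(build_move_script(["report.csv", "data.xls"]))
```

The rename-to-_TMP step is what makes this safe: only files that existed when the list was built ever match the *_TMP wildcards.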
Hope it helps.
I had a similar problem. It looks like a lot of people struggle here ;-)
I have to download multiple files and then remove them on the remote server after successfully downloading them, to avoid double processing. There is a small chance that while an mget is running the remote system adds further files, so an mdelete might delete untransferred files.
My approach was to use nothing other than FTP commands. Here we go:
cd <dir on remote server>
mkdir Transfer
mv *.* Transfer
mget Transfer\*.*
del Transfer\*.*
rmdir Transfer
In short, I first move all files to an extra directory on the remote server. This should be very fast, as it happens locally on the server. Once the files are shifted to the transfer dir, I perform the real mget download. This might take longer, and meanwhile further files can safely be uploaded to the remote server's main dir.
After the download is done, I delete all files in the transfer dir and remove the dir itself.
I don't know if you can do that with a script from within the ftp client.
It might be better to do it as a program or script using a language of your choice with an FTP library, so you have much more control over the FTP operations, e.g. Perl with Net::FTP, Java, etc.
You could then implement an algorithm like:
remote cd to required folder
localcd to required folder
list current files in remote folder
for each file in list that matches required pattern
get file
if get ok then
delete file
else
log error,exit or whatever error handling you want
endif
endfor
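The loop above can be sketched with the FTP operations injected as plain callables, which makes the get-before-delete ordering testable without any server (the function names here are illustrative, not a particular FTP library's API):

```python
def fetch_and_delete(listing, matches, get, delete, log_error):
    """For each matching name in the remote listing, download it and
    delete it only if the download succeeded, so a failed get can
    never lose a file."""
    for name in listing:
        if not matches(name):
            continue
        if get(name):
            delete(name)
        else:
            log_error(f"failed to get {name}")

# Dry run with fakes standing in for real FTP calls.
downloaded, deleted, errors = [], [], []
fetch_and_delete(
    listing=["a.txt", "b.txt", "notes.tmp"],
    matches=lambda n: n.endswith(".txt"),
    get=lambda n: n != "b.txt" and (downloaded.append(n) or True),
    delete=deleted.append,
    log_error=errors.append,
)
print(downloaded, deleted, errors)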
You also need to make sure you don't try to get a file that is in the process of being written. Depending on the OS this might be handled for you with file locks, but you should make sure files are written to the remote dir in a two-stage process: first to a temporary directory or file name that does not match the pattern you are checking for, then renamed or moved into the correct location and name that you will detect.
OK, then I think you can implement the algorithm I've described as a batch script using two separate ftp calls and scripts: the first to list the files to be transferred from the remote dir, the second to get and delete a single file from the list.
The file-get script would have to be created dynamically on each loop iteration to fetch the next file.
cd to working dir
Run FTP with script to:
cd to remote folder
list files matching required pattern and store in local file
(e.g: ls files_*.txt filelist.txt)
for each file in file list created above (e.g. use 'for /f ...' command to loop through filelist.txt)
create temp ftp script to:
cd to remote dir
lcd to local dir
get file
del file
run FTP with temp script
endfor
This site has an example of doing something similar (note the script as shown doesn't work; the comments give details of the corrections needed).
http://www.computing.net/answers/programming/batch-ftp-script-list-rename-get-delete/25728.html
