I have to get and delete multiple files via FTP, so I wrote this script:
open ftp.myftpserver.org
user
pass
cd folder
lcd E:\localdir
mget *
mdel *
bye
This works, but it is not safe: the folder is being fed from other sources, so the mdel * step may delete files uploaded in the meantime.
I guess a solution could be moving the files remotely to a different folder, building a file list at the beginning of the process, but I have no idea how to do it.
Is it possible?
FTR I followed the nice hint and managed to successfully put something together. It is maybe not elegant, but it works:
First step to get the file list:
getfilelist.bat
open ftp.myserver.it
myuser
pass1234
cd ftpfolder
prompt n
lcd E:\localdir
ls *.??? filelist.txt
bye
Second step to download and delete the above files
movefiles.bat
@echo off
setlocal enableextensions
setlocal enabledelayedexpansion
echo open ftp.myserver.it>myscript
echo user myuser pass1234>>myscript
echo cd ftpfolder>>myscript
echo prompt n>>myscript
echo ascii>>myscript
echo lcd E:\downloaddir>>myscript
for /F "usebackq tokens=1,2* delims=," %%G IN ("E:\localdir\filelist.txt") DO ECHO rename %%G %%G_TMP>>myscript
echo mget *_TMP>>myscript
echo mdelete *_TMP>>myscript
echo bye>>myscript
ftp -n -s:myscript
del filelist.txt
del myscript
e:
cd E:\downloaddir
ren *.???_TMP *.???
A bat file to call the above two steps:
E:
cd E:\localdir
ftp -i -s:E:\localdir\getfilelist.bat
E:\localdir\movefiles.bat
Hope it helps.
I had a similar problem. It looks like a lot of people struggle here ;-)
I have to download multiple files and remove them from the remote server after successfully downloading them, to avoid double processing. There is a very small chance that, while doing an mget, the remote system adds further files, so an mdelete might delete files that were never transferred.
My approach is to use nothing but FTP commands. Here we go:
cd <dir on remote server>
mkdir Transfer
mv *.* Transfer
mget Transfer/*.*
mdelete Transfer/*.*
rmdir Transfer
In short: I first move all files to an extra directory on the remote server. This should be very fast, as it happens entirely server-side. Once the files are shifted to the transfer dir, I perform the real mget download. This might take longer, and meanwhile further files can safely be uploaded to the remote server's main dir.
After the download is done, I delete all files in the transfer dir and remove the dir itself.
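One caveat: the stock Windows ftp.exe client has no wildcard move or rename, so with that client the mv step has to be expanded into one rename per file, driven by a listing taken first. A minimal batch sketch of the same idea, assuming stock ftp.exe (host, credentials and paths are placeholders):
@echo off
rem Session 1: capture a listing of the remote dir into files.txt
rem (assumes the listing contains plain file names only)
(
  echo open ftp.example.com
  echo user myuser mypass
  echo cd remotedir
  echo ls . files.txt
  echo bye
) > list.ftp
ftp -n -s:list.ftp

rem Session 2: rename every listed file into Transfer, then fetch and clean up
(
  echo open ftp.example.com
  echo user myuser mypass
  echo cd remotedir
  echo mkdir Transfer
  for /f "usebackq delims=" %%F in ("files.txt") do echo rename %%F Transfer/%%F
  echo prompt n
  echo lcd E:\localdir
  echo mget Transfer/*
  echo mdelete Transfer/*
  echo rmdir Transfer
  echo bye
) > move.ftp
ftp -n -s:move.ftp
del list.ftp move.ftp files.txt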
I don't know if you can do that with a script from within the ftp client.
It might be better to do it as a program or script using a language of your choice with an FTP library, so you have much more control over the FTP operations, e.g. Perl with Net::FTP, Java, etc.
You could then implement an algorithm like:
remote cd to required folder
local cd to required folder
list current files in remote folder
for each file in list that matches required pattern
    get file
    if get ok then
        delete file
    else
        log error, exit or whatever error handling you want
    endif
endfor
You also need to make sure you don't try to get a file that is in the process of being written. Depending on the OS, this might be handled for you with file locks, but to be safe you need to make sure files are written to the remote dir in a two-stage process: first to a temporary directory or file name that does not match the file pattern you are checking for, then renamed or moved into the correct location and name that you will detect.
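For instance, an uploading client could stage each file under a name the downloader ignores and rename it only once the upload is complete; a minimal sketch of the idea (file names hypothetical):
put report1.csv report1.csv.tmp
rename report1.csv.tmp report1.csv
Since the downloader only matches the final pattern (e.g. *.csv), it never sees the half-written .tmp file.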
OK, then I think you can implement the algorithm I've described as a batch script using two separate ftp calls and scripts: the first to list the files to be transferred from the remote dir, the second to get and delete a single file from the list.
The file-get script would have to be created dynamically on each loop iteration to fetch the next file; a sketch follows the outline below.
cd to working dir
run FTP with script to:
    cd to remote folder
    list files matching required pattern and store in local file
    (e.g. ls files_*.txt filelist.txt)
for each file in the file list created above (e.g. use a 'for /f ...' loop over filelist.txt)
    create temp ftp script to:
        cd to remote dir
        lcd to local dir
        get file
        del file
    run FTP with temp script
endfor
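A minimal batch sketch of that outline, assuming stock ftp.exe (host, credentials, paths and the file pattern are placeholders). Since ftp.exe does not return a useful exit code, the "get ok" test is approximated by checking that the file actually arrived locally before deleting it remotely:
@echo off
setlocal
cd /d E:\localdir

rem Step 1: list matching remote files into filelist.txt
(
  echo open ftp.example.com
  echo user myuser mypass
  echo cd remotedir
  echo ls files_*.txt filelist.txt
  echo bye
) > list.ftp
ftp -n -s:list.ftp

rem Step 2: per file, build a temp script to get it, then delete it
rem remotely only if the local copy exists
for /f "usebackq delims=" %%F in ("filelist.txt") do (
  (
    echo open ftp.example.com
    echo user myuser mypass
    echo cd remotedir
    echo lcd E:\localdir
    echo get %%F
    echo bye
  ) > get.ftp
  ftp -n -s:get.ftp
  if exist "E:\localdir\%%F" (
    (
      echo open ftp.example.com
      echo user myuser mypass
      echo cd remotedir
      echo delete %%F
      echo bye
    ) > del.ftp
    ftp -n -s:del.ftp
  ) else (
    echo ERROR: %%F failed to download >> errors.log
  )
)
del list.ftp filelist.txt get.ftp del.ftp 2>nul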
This site has an example of doing something similar (note: the script as shown doesn't work; the comments give details of the corrections needed).
http://www.computing.net/answers/programming/batch-ftp-script-list-rename-get-delete/25728.html
I have a directory which contains CSV files needing to be moved to another directory:
C:\Users\JohnSmith\Desktop\Testing\report-20180819040000-20180826040000-4
We receive a new file weekly, where the dates in the directory name are updated. I want to create a batch file to copy this data using a wildcard on report*, but I am running into some issues.
The wildcard appears to work without any issues when I first navigate to:
C:\Users\JohnSmith\Desktop\Testing\
then use:
dir report*
It also works fine when I navigate to:
C:\Users\JohnSmith\Desktop\Testing\
then run
copy * C:\Users\JohnSmith\Desktop\Testing\Destination
My goal is to be able to run something simple in my batch file like the below:
copy C:\Users\JohnSmith\Desktop\Testing\report* C:\Users\JohnSmith\Desktop\Testing\Destination
Whenever I try running the above, I receive the following error:
The system cannot find the file specified.
0 file(s) copied.
Does anyone have any suggestions?
Use For /D with a wildcard for your directory name; then you can use Copy with a wildcard too. (Plain Copy fails here because its wildcard matches only files, and report-20180819040000-20180826040000-4 is a directory.)
From the Command Prompt:
For /D %A In ("%UserProfile%\Desktop\Testing\report-*-*-*") Do @Copy "%A\*.csv" "%UserProfile%\Desktop\Testing\Destination">Nul 2>&1
From a batch file:
@For /D %%A In ("%UserProfile%\Desktop\Testing\report-*-*-*") Do @Copy "%%A\*.csv" "%UserProfile%\Desktop\Testing\Destination">Nul 2>&1
Alternatively, you could do it directly at the PowerShell prompt:
Cp "$($Env:UserProfile)\Desktop\Testing\report-*-*-*\*.csv" "$($Env:UserProfile)\Desktop\Testing\Destination"
I have a .txt file with 900+ file names of pictures, with full directories (ex: 2017/conference/tsd-60545).
My Windows Explorer is pulling from an FTP server for my company, which has over 160K photos on it.
I only need the 900 images listed in my .txt file.
Is there a way to automate this? Doing it manually is bringing me to a slow death.
This is untested, as I do not have an FTP server to test with right now, but it should work.
Create a batch file with a .cmd or .bat extension; ensure your text file is in the same directory, or specify the full path to it in the batch file.
Basically, you echo the entire connection script to a file, using a for loop that reads your list file and writes a get (or put) for each entry. Once that is done, the batch file runs the ftp command, which reads from the generated file.
MyFTPscript.cmd
@echo off
echo user username password> ftpto.dat
rem Pictures must be fetched in binary mode; ftp.exe defaults to ASCII.
echo binary>> ftpto.dat
for /F "tokens=*" %%A in (myfile.txt) do echo get %%A >> ftpto.dat
echo quit>> ftpto.dat
ftp -n -s:ftpto.dat ftp.imagesserver.com
You can test it first, to make sure it actually writes the .dat file without FTPing anything, by adding rem before the last command, like this:
rem ftp -n -s:ftpto.dat ftp.imagesserver.com
Once you are happy that the content of the .dat file looks OK, remove the rem and run the script, which will then do everything.
There is no need to delete the .dat file afterwards, as it is rewritten each time you run the command file.
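For illustration, if myfile.txt contained two entries (hypothetical names), the generated ftpto.dat would look like this:
user username password
binary
get 2017/conference/tsd-60545/img001.jpg
get 2017/conference/tsd-60545/img002.jpg
quit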
This downloads all jpgs from ftp://ftp.mydomain.com/2017/conference/tsd-60545.
Create a file called "commands.ftp" based on the contents of your txt file, using a tool like vim, awk, or sed. Then use the Windows ftp command with the -s option to process it.
commands.ftp
open ftp.mydomain.com
user myusername mypassword
cd mysubdirectory
binary
prompt
mget 2017/conference/tsd-60545/*.jpg
close
quit
batch file
ftp -s:commands.ftp -n
I need to copy all the files of an FTP folder to my local Windows folder, but without replacing the files that already exist. This would need to be a job/task that runs unattended every hour.
This is what the job would need to do:
1. Connect to FTP server.
2. In ftp, move to folder /var/MyFolder.
3. In local PC, move to c:\MyDestination.
4. Copy all files in /var/MyFolder that do not exist in c:\MyDestination.
5. Disconnect.
I had previously tried the following script using MGET * (run from a .bat), but it copies and overwrites everything, which means that even if 1000 files were previously copied, it copies them all again.
open MyFtpServer.com
UserName
Password
lcd c:\MyDestination
cd /var/MyFolder
binary
mget *
Any help is appreciated.
Thanks.
Use wget for Windows.
If you want to include subdirectories (adjust the cut-dirs number according to the depth of your actual remote path):
cd /d C:\MyDestination
wget.exe --mirror -np -nH --cut-dirs=2 ftp://UserName:Password@MyFtpServer.com/var/MyFolder
If you don't want subdirectories:
cd /d C:\MyDestination
wget.exe -nc ftp://UserName:Password@MyFtpServer.com/var/MyFolder/*
The "magic" bit (for this second form) is the -nc option, which tells wget not to overwrite files that are already there locally. Do keep in mind that old files are also left alone, so if a file on your FTP server gets edited or updated, it won't get re-downloaded. If you want to also update files, use -N instead of -nc.
(Note that you can also type wget instead of wget.exe; I just included the extension to point out that these are Windows batch file commands.)
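Since the job has to run unattended every hour, you can put either variant in a batch file and register it with the Windows Task Scheduler; a minimal sketch, assuming the commands above were saved as C:\Scripts\ftpsync.bat (a hypothetical path):
schtasks /create /tn "FtpSync" /tr "C:\Scripts\ftpsync.bat" /sc hourly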
Hi, I want to loop through files on an FTP server and copy them one by one. Everything is fine with the FTP connection and accessing folders.
My question is: how can I loop through all files on the FTP server? It looks like there is no "for" type of functionality available for accessing FTP files, because each line is considered a complete command.
open MyServerName 21
MyUserName
MyPassword
lcd E:\LocalDirectory
cd /FTPDirectory/upload
I WANT TO LOOP THROUGH ALL FILES AND COPY THEM ONE BY ONE TO THE LOCAL DIRECTORY
disconnect
bye
The reason I want to loop through all files on the FTP server is that I want to copy only those files which are not locked and are available for copying.
Use a different FTP client: wget.
With the -m option (for --mirror), use the following in a script
cd mylocaldirectory
wget -m ftp://username:password@hostname/theremotedirectory
Is there some way to specify a directory (let's say "C:\images") and move every .jpg file from there to another directory (say "D:\media") but preserve the directory structure (so if the file were "C:\images\paintball\july\07\headshot.jpg" after moving it would be "D:\media\paintball\july\07\headshot.jpg")?
I'm using cygwin (but would be happy to use DOS if that works too).
Yup.
Do a tar archive of the *.jpg files (tar preserves the directory structure), then extract it in the target directory. Should be a one-liner.
( cd /cygdrive/c/images
  find . -name '*.jpg' | tar --create --file - --files-from - ) | ( cd /cygdrive/d/media
  tar --extract --file - )
There's also a --directory (-C) option in some versions of tar with which you can avoid the complexity of piping between subshells (note that this variant, as written, copies the whole tree rather than just the .jpg files), but I never use it myself, so I may be missing something:
tar --create --file - -C /cygdrive/c/images . | tar --extract --file - -C /cygdrive/d/media
If you need more power/flexibility, take the time to investigate rsync.
Since you're on Windows, you could also take a look at xxcopy. It's great for this kind of stuff, and much else.
You can also use the xcopy command, as in this example (old is a directory):
xcopy cvs_src\*.jpg old /e /i /h /y /d /exclude:files_to_exclude
(/e copies subdirectories, including empty ones; /i assumes the destination is a directory; /h copies hidden and system files; /y suppresses overwrite prompts; /d copies only files newer than the destination copy; /exclude: names a file listing patterns to skip.)
Thanks for the XCOPY solution; it solved my similar problem, so I thought I'd share the details for anyone else needing it.
I wanted a list (not a copy) of all the files in a directory (and sub-directories) that were not of a particular type, such as *.jpg, but the DIR command doesn't have an exclude function. So I:
Created a file named exclist.txt containing the single line ".jpg"
Ran the command "xcopy c:\files c:\test /exclude:exclist.txt /l /d /e /h /i /y > found.txt"
Opened found.txt in Notepad to see the list of non-jpg files
Note the XCOPY /l parameter, which lists the files that would be copied without copying them. Since XCOPY executes in "list mode", the destination folder c:\test is not created and no files are copied. "> found.txt" redirects the output of the XCOPY command to the file found.txt rather than displaying the results on screen.