Copy Contents From Folder Using Wildcard On The Directory - windows

I have a directory which contains CSV files needing to be moved to another directory:
C:\Users\JohnSmith\Desktop\Testing\report-20180819040000-20180826040000-4
We receive a new file weekly where the dates in the directory name will be updated. I want to create a batch file to copy this data using a wildcard on report* but I am running into some issues.
The wildcard appears to work without any issues when I first navigate to:
C:\Users\JohnSmith\Desktop\Testing\
then use:
dir report*
It also works fine when I navigate to:
C:\Users\JohnSmith\Desktop\Testing\
then run
copy * C:\Users\JohnSmith\Desktop\Testing\Destination
My goal is to be able to run something simple in my batch file like the below:
copy C:\Users\JohnSmith\Desktop\Testing\report* C:\Users\JohnSmith\Desktop\Testing\Destination
Whenever I try running the above, I receive the following error:
The system cannot find the file specified.
0 file(s) copied.
Does anyone have any suggestions?

Use For /D with a wildcard for your directory name; then you can use Copy with a wildcard too!
From the Command Prompt:
For /D %A In ("%UserProfile%\Desktop\Testing\report-*-*-*") Do @Copy "%A\*.csv" "%UserProfile%\Desktop\Testing\Destination">Nul 2>&1
From a batch file:
@For /D %%A In ("%UserProfile%\Desktop\Testing\report-*-*-*") Do @Copy "%%A\*.csv" "%UserProfile%\Desktop\Testing\Destination">Nul 2>&1
Alternatively, you could do it directly at the PowerShell prompt:
Cp "$($Env:UserProfile)\Desktop\Testing\report-*-*-*\*.csv" "$($Env:UserProfile)\Desktop\Testing\Destination"
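For completeness, the same idea can be sketched in Python, where glob expands the wildcard on the directory name that plain copy rejects. The paths and the report-*-*-* pattern follow the question; the helper name is our own:

```python
import glob
import os
import shutil

def copy_report_csvs(testing_dir, dest_dir):
    """Copy *.csv from every report-*-*-* folder under testing_dir to dest_dir."""
    os.makedirs(dest_dir, exist_ok=True)
    copied = []
    # glob expands the wildcard on the directory name, which plain copy cannot do
    for report_dir in glob.glob(os.path.join(testing_dir, "report-*-*-*")):
        for csv_path in glob.glob(os.path.join(report_dir, "*.csv")):
            shutil.copy(csv_path, dest_dir)
            copied.append(os.path.basename(csv_path))
    return copied
```

A call such as `copy_report_csvs(r"C:\Users\JohnSmith\Desktop\Testing", r"C:\Users\JohnSmith\Desktop\Testing\Destination")` would then pick up each week's newly dated folder without editing the script.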

Related

Update timestamp of a directory on Windows

There is a more or less well-known command-line way to update the modification time of a file on Windows (described at Update file or folder Date Modified, for example):
copy /b somePath\fileName+,, somePath\
In my experience it works for a file, but not for a directory (tested on WinXP: the command did not fail, but the directory modification time was not updated).
I tried to adapt it for a directory using the trick that you can "point" into a directory via the special "NUL" filename on Windows. I tried two variants, but neither works either:
copy /b somePath\fileName\NUL+,, somePath\filename\
copy /b somePath\fileName\NUL+,, somePath\
Could anyone explain why it does not work, or what I am doing wrong?
It doesn't make any changes to the directory because the filename nul is not stored in the directory. Since the directory is not changed, its modification time doesn't change. You can do this instead:
type nul > somePath\fileName\SomeFileThatDoesNotExist.tmp && del somePath\fileName\SomeFileThatDoesNotExist.tmp
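The same "touch a directory" effect can be sketched in Python, which sidesteps the copy /b limitation entirely because os.utime accepts directories as well as files (the helper name is our own):

```python
import os
import time

def touch_dir(path, when=None):
    """Update a directory's modification time in place.

    Unlike the copy /b trick, os.utime works directly on a directory,
    so no temporary file needs to be created and deleted inside it.
    """
    when = when if when is not None else time.time()
    os.utime(path, (when, when))  # (atime, mtime)
```

Calling `touch_dir(r"somePath\fileName")` with no second argument sets the directory's timestamps to "now".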

Issue on output directory while executing commands with windows batch command "FOR /R"

I was planning to use the following Windows batch command to loop through some files that I want to execute some commands on:
FOR /R [[drive:]path] %%parameter IN (set) DO command
In my case:
FOR /R %%v IN (*.pyc) DO uncompyle6 -o . "%%v"
where "%%v" is a file path (not a directory path), and uncompyle6 -o . "%%v" is the command I would like to execute to decompile all .pyc files, including those under subdirectories.
Since the DO command executes relative to the current working directory, all my decompiled files are also output to the current working directory.
My question is therefore: how do I change my code so that decompiled files from a subdirectory are output to their own directory, instead of the current working directory where the batch file is executed?
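One way to sketch the fix (assuming the uncompyle6 command line shown in the question): walk the tree and point -o at each file's own directory instead of ".". The helper below only builds the commands, so the per-file output-directory logic is visible and testable:

```python
import os

def decompile_commands(root):
    """Build one uncompyle6 command per .pyc file under root.

    The -o argument is set to the directory containing each .pyc file,
    so output lands next to its source instead of the current directory.
    """
    commands = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(".pyc"):
                pyc = os.path.join(dirpath, name)
                commands.append(["uncompyle6", "-o", dirpath, pyc])
    return commands
```

Each command list can then be executed with `subprocess.run(cmd, check=True)`. The batch equivalent of the same idea is to use `%%~dpv` (the drive and path of the loop variable) as the -o argument.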

FTP get and delete multiple files

I have to get and delete multiple files via FTP so I wrote this script:
open ftp.myftpserver.org
user
pass
cd folder
lcd E:\localdir
mget *
mdel *
bye
This works, but is not safe, since the folder is being fed from other sources and the mdel * step may delete files uploaded in the meantime.
I guess a solution could be moving the files remotely to a different folder, building a file list at the beginning of the process, but I have no idea how to do it.
Is it possible?
FTR, I followed the nice hint and managed to successfully put together something that works. Maybe not elegant, but it works:
First step to get the file list:
getfilelist.bat
open ftp.myserver.it
myuser
pass1234
cd ftpfolder
prompt n
lcd E:\localdir
ls *.??? filelist.txt
bye
Second step to download and delete the above files
movefiles.bat
@echo off
setlocal enableextensions
setlocal enabledelayedexpansion
echo open ftp.myserver.it>>myscript
echo user myuser pass1234>>myscript
echo cd ftpfolder>>myscript
echo prompt n>>myscript
echo ascii>>myscript
echo lcd E:\downloaddir>>myscript
for /F "usebackq tokens=1,2* delims=," %%G IN ("E:\localdir\filelist.txt") DO ECHO rename %%G %%G_TMP>>myscript
echo mget *_TMP>>myscript
echo mdelete *_TMP>>myscript
echo bye>>myscript
ftp -n -s:myscript
del filelist.txt
del myscript
e:
cd E:\downloaddir
ren *.???_TMP *.???
A bat file to run the two steps above:
E:
cd E:\localdir
ftp -i -s:E:\localdir\getfilelist.bat
E:\localdir\movefiles.bat
Hope it helps.
I had a similar problem. It looks like a lot of people struggle here ;-)
I have to download multiple files and then remove them from the remote server after successfully downloading them, to avoid double processing. There is a very small chance that while doing an mget the remote system adds further files, so an mdelete might delete untransferred files.
My approach was to use nothing other than FTP commands. Here we go:
cd <dir on remote server>
mkdir Transfer
mv *.* Transfer
mget Transfer\*.*
del Transfer\*.*
rmdir Transfer
In short: I first move all files to an extra directory on the remote server. This should be very fast, as it happens entirely on the server. Once the files are shifted to the transfer dir, I perform the real mget download. This might take longer, and meanwhile further files can safely be uploaded to the remote server's main dir.
After the download is done, I delete all files in the transfer dir and remove the dir itself.
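This rename-then-download approach can be sketched with Python's ftplib. Since plain FTP has no wildcard move, the sketch renames files one at a time with ftp.rename (RNFR/RNTO, a server-side operation, so still fast). The Transfer directory name follows the answer above; the function name is our own:

```python
from ftplib import FTP
import os
import posixpath

def drain_via_transfer_dir(ftp, remote_dir, local_dir, transfer="Transfer"):
    """Snapshot the remote dir by renaming its files into a Transfer
    subdirectory, then download and delete only that snapshot.
    Files uploaded after the renames are left untouched."""
    ftp.cwd(remote_dir)
    try:
        ftp.mkd(transfer)            # may already exist from a previous run
    except Exception:
        pass
    names = [n for n in ftp.nlst() if n != transfer]
    for name in names:
        ftp.rename(name, posixpath.join(transfer, name))  # server-side, fast
    for name in names:
        local_path = os.path.join(local_dir, name)
        with open(local_path, "wb") as f:
            ftp.retrbinary("RETR " + posixpath.join(transfer, name), f.write)
        ftp.delete(posixpath.join(transfer, name))
    ftp.rmd(transfer)
    return names

# Usage (assumed credentials/paths):
#   ftp = FTP("ftp.myftpserver.org")
#   ftp.login("user", "pass")
#   drain_via_transfer_dir(ftp, "folder", r"E:\localdir")
```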
I don't know if you can do that with a script from within the ftp client.
It might be better to do it as a program or script using a language of your choice with an FTP library, so you have much more control of the FTP operations, e.g. Perl with Net::FTP, Java, etc.
You could then implement an algorithm like:
remote cd to required folder
localcd to required folder
list current files in remote folder
for each file in list that matches required pattern
get file
if get ok then
delete file
else
log error, exit or whatever error handling you want
endif
endfor
You also need to make sure you don't try to get a file that is in the process of being written. Depending on the OS this might be handled for you with file locks, but you need to make sure files are written to the remote dir in a two-stage process: first to a temporary directory or file name that does not match the file pattern you are checking for, then renamed or moved into the correct location and name that you will detect.
OK, then I think you can implement the algorithm I've described as a batch script using two separate ftp calls and scripts: the first to list the files to be transferred from the remote dir, the second to get and delete a single file from the list.
The file get script would have to be created dynamically on each loop iteration to get the next file.
cd to working dir
Run FTP with script to:
cd to remote folder
list files matching required pattern and store in local file
(e.g: ls files_*.txt filelist.txt)
for each file in file list created above (e.g. use 'for /f ...' command to loop through filelist.txt)
create temp ftp script to:
cd to remote dir
lcd to local dir
get file
del file
run FTP with temp script
endfor
This site has an example of doing something similar (note the script as shown doesn't work; the comments give details of the corrections needed).
http://www.computing.net/answers/programming/batch-ftp-script-list-rename-get-delete/25728.html
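The loop above can be sketched in Python's ftplib (the function name and the files_*.txt pattern are illustrative). Each remote file is deleted only after its download has completed, which is the "if get ok then delete" check from the pseudocode:

```python
from ftplib import error_perm
import fnmatch
import os

def get_then_delete(ftp, remote_dir, local_dir, pattern="files_*.txt"):
    """Download matching files one at a time; delete each remote file
    only after its local copy has been fully written."""
    ftp.cwd(remote_dir)
    fetched = []
    for name in fnmatch.filter(ftp.nlst(), pattern):
        local_path = os.path.join(local_dir, name)
        try:
            with open(local_path, "wb") as f:
                ftp.retrbinary("RETR " + name, f.write)
        except error_perm:
            os.remove(local_path)   # incomplete download: keep remote copy
            continue                # log / handle the error as needed
        ftp.delete(name)            # get ok -> safe to delete
        fetched.append(name)
    return fetched
```

Because each file is handled individually, files uploaded while the loop runs are simply not in the list and are never touched.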

Batch File to store backup files with timestamps

I have two folders in the same drive. I want to create backup of an access database. I need to copy the main file, append the name with the date and time and store it in a different folder.
Source Folder: G:\PMO\Talent Mgt\Data
Source file: Talent_Management_Data.accdb
Destination File: G:\PMO\Talent Mgt\Archive\Talent_Management_Data.accdb_20120101
Any suggestions?
You can achieve this by using the for command to execute copy for each file. A simple batch file would be:
cd "G:\PMO\Talent Mgt\Data"
for %%A in (*.accdb) do copy "%%A" "..\Archive\%%A_%date:-=%"
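Note that %date% formatting is locale-dependent. If that is a concern, the same backup can be sketched in Python (helper name is ours); strftime gives a fixed YYYYMMDDHHMMSS stamp regardless of locale:

```python
import datetime
import os
import shutil

def backup_with_timestamp(src_file, archive_dir):
    """Copy src_file into archive_dir, appending _YYYYMMDDHHMMSS to the name."""
    os.makedirs(archive_dir, exist_ok=True)
    stamp = datetime.datetime.now().strftime("%Y%m%d%H%M%S")
    dest = os.path.join(archive_dir,
                        os.path.basename(src_file) + "_" + stamp)
    shutil.copy2(src_file, dest)   # copy2 also preserves the file's timestamps
    return dest
```

For the paths in the question: `backup_with_timestamp(r"G:\PMO\Talent Mgt\Data\Talent_Management_Data.accdb", r"G:\PMO\Talent Mgt\Archive")`.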

Recursively search directories moving specified file type from one location to another preserving directory structure

Is there some way to specify a directory (let's say "C:\images") and move every .jpg file from there to another directory (say "D:\media") but preserve the directory structure (so if the file were "C:\images\paintball\july\07\headshot.jpg" after moving it would be "D:\media\paintball\july\07\headshot.jpg")?
I'm using cygwin (but would be happy to use DOS if that works too).
Yup.
Do a tar archive of *.jpg files while preserving directory structure (there's a switch) then extract it to the target directory. Should be a one-liner.
( cd /cygdrive/c/images; tar --create --file - . ) | ( cd /cygdrive/d/media; tar --extract --file - )
There's also a --directory option in some versions of tar with which you can avoid the complexity of piping between subshells, but I never use it myself, so I may be missing something:
tar --create --file - -C /cygdrive/c/images . | tar --extract --file - -C /cygdrive/d/media
If you need more power/flexibility, take the time to investigate rsync.
Since you're on windows, you could also take a look at xxcopy. It's great for this kind of stuff and much else.
You can also use xcopy command, like in this example (old is a directory):
xcopy cvs_src\*.jpg old /e/i/h/y/d/exclude:files_to_exclude
Thanks for the XCOPY solution, it solved my similar problem, so I thought I'd share the details for anyone else needing it.
I wanted a list (not a copy) of all the files in a directory (and sub-directories) that were not of a particular type, such as *.jpg. But the DIR command doesn't have an exclude function. So I:
Created a file named exclist.txt that contained a single line ".jpg"
Ran the command "xcopy c:\files c:\test /exclude:exclist.txt /l /d /e /h /i /y > found.txt"
Opened found.txt in Notepad to see the list of non-jpg files
Note the XCOPY /l parameter, which lists the files to be copied without copying them. Since XCOPY is executed in "list mode", the destination folder c:\test is not created and no files are copied. "> found.txt" saves the output from the XCOPY command to the file found.txt, rather than displaying the results on screen.
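For reference, the move-while-preserving-structure operation from this question can also be sketched in Python, without tar or xcopy (helper name is ours):

```python
import os
import shutil

def move_preserving_tree(src_root, dest_root, ext=".jpg"):
    """Move every file ending in ext from src_root to dest_root,
    recreating the same relative directory structure on the destination."""
    moved = []
    for dirpath, _dirnames, filenames in os.walk(src_root):
        rel = os.path.relpath(dirpath, src_root)
        for name in filenames:
            if name.lower().endswith(ext):
                target_dir = os.path.normpath(os.path.join(dest_root, rel))
                os.makedirs(target_dir, exist_ok=True)
                shutil.move(os.path.join(dirpath, name),
                            os.path.join(target_dir, name))
                moved.append(os.path.join(rel, name))
    return moved
```

So `move_preserving_tree(r"C:\images", r"D:\media")` would move C:\images\paintball\july\07\headshot.jpg to D:\media\paintball\july\07\headshot.jpg, as the question asks.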
