I have this script scheduled every hour:
@echo off
set path1="E:\Document\Backup"
set path2="E:\Document\NewDoc"
set path3="C:\ScanDoc"
forfiles -p %path1% -s -m *.pdf /D -30 /C "cmd /c del @path"
xcopy %path2%\*.pdf* %path1% /c
start /d %path3% ScanBatch.exe
Files in "NewDoc" folder are created by manual document scanning (PDF FORMAT), so sometimes documents are in use.
The Scanbatch program read files in "Backup" folder, so if PDF is copied from "NewDoc" to "Backup" while in use, it's result as corrupted and the Scanbatch go in error.
Is there a way to copy files ONLY IF NOT IN USE?
In the end the real problem wasn't xcopy but "ScanBatch.exe", which crashes when it finds an open file. Problem solved by changing the schedule times.
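For anyone who still needs the check itself: plain batch can test a lock by trying to open each file for append before copying; the open fails while another process holds the file exclusively. A minimal sketch, assuming the scanner keeps an exclusive handle on files it is still writing (paths taken from the script above):
@echo off
set "src=E:\Document\NewDoc"
set "dst=E:\Document\Backup"
for %%F in ("%src%\*.pdf") do (
    rem "type nul >>file" opens the file for append without changing it;
    rem the redirect fails (error suppressed by 2>nul) while the file is locked.
    2>nul ( type nul >>"%%F" ) && (
        copy /y "%%F" "%dst%" >nul
    ) || (
        echo Skipping %%~nxF - still in use
    )
)
Note the check needs write access, so a read-only PDF would also be skipped.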
I used this command in a Windows command line:
C:\Users\myuser\Desktop>C:\Windows\System32\ForFiles.exe /P C:\myfolder\mysubfolder /S /M *.* /D +09/15/2022 /C "cmd /C echo @FSIZE >> sizes.txt"
I wanted to echo the sizes of all files in the folder modified in the last 5 days.
I couldn't find the output file.
I then solved the problem by changing the command to:
C:\Users\myuser\Desktop>C:\Windows\System32\ForFiles.exe /P C:\myfolder\mysubfolder /S /M *.* /D +09/15/2022 /C "cmd /C echo @FSIZE" > sizes.txt
Anyway, I'd like to know if I created a sizes.txt file somewhere on my hard drive.
I searched the folder, its subfolders, the desktop, my home folder, C:\, C:\Windows, C:\Windows\System32... nothing.
I finally found them, yes, "them": one in each directory containing recently modified file(s).
It seems that the command forfiles executes actually runs in the directory where the matched file is located.
It's strange, because if you specify cmd /C echo %CD% as the command, it prints the directory you ran forfiles from, in my case the Desktop! (That is because %CD% is expanded by the parent shell before forfiles ever runs, so the spawned commands never see it.)
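A quick way to confirm the per-file working directory (same folder as above): cd with no arguments prints the current directory of each spawned cmd, and unlike %CD% it is evaluated inside the child process:
C:\Windows\System32\ForFiles.exe /P C:\myfolder\mysubfolder /S /M *.* /C "cmd /C cd"
Each matched file should print the directory it lives in, not the Desktop.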
This command is supposed to copy multiple files from a static source folder into each folder for a set of saved web pages:
forfiles /m *.htm /c "cmd /c copy /y _core/*.* @fname_files"
However, each call fails with the status "The system cannot find the file specified."
If this is tried:
forfiles /m *.htm /c "cmd /c copy /y 0x22_core/*.*0x22 @fname_files"
the status displayed shows the name of each source file and the same error message.
I've also tried adding setlocal/endlocal around the call but it still fails.
Searching on the web brought up lots of discussions, but nothing showing forfiles, cmd, and copying into a destination directory using @fname.
Would someone with deeper knowledge of batch scripting "fix" this line so it works as intended?
If I understand correctly what you are trying to do:
1) You might be getting "The system cannot find the file specified." because the directory _core is not in the same directory as the *.htm files.
2) In order to copy the _core files into each @fname_files folder, you must create the folder first. Here is the command line with mkdir added:
forfiles /m *.htm /c "cmd /c mkdir @fname_files & copy /y _core\*.* @fname_files"
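If _core actually lives in one fixed location, an absolute source path (C:\templates\_core below is only a placeholder) keeps the copy independent of whichever directory the command happens to run in:
forfiles /m *.htm /c "cmd /c mkdir @fname_files & copy /y C:\templates\_core\*.* @fname_files"
Note mkdir will report an error on a second run because the folder already exists; adding 2>nul after it inside the quoted command suppresses that.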
I am able to create a batch file that deletes files older than n days, but I am having a problem saving the deletion results. What I want is a list of the deletion results in a text file after the process.
I have tried the code below. It works and deletes all the files. It also creates result.txt, but the text file is empty. I can see that files are being deleted. Any idea why they are not being saved in the text file?
forfiles -p "C:\Log\" -s -m *.* /D -1 /C "cmd /c del #path" >> "c:\result.txt"
This will give you a log of the files that were passed to the del command, but not a confirmed result:
if a file is read-only, for example, the del fails and the file will still exist.
forfiles -p "C:\Log\" -s -m *.* /D -1 /C "cmd /c del #path & echo #path >>c:\result.txt"
Is there a way to see what time a file was copied to a directory? It looks like the Date modified column in Windows Explorer shows the date and time when the file was created.
You could try checking this via the command line.
Here are some commands that may help.
Using 'dir'
This, for example, gives the last modified time of this file:
dir /T:W d:\test.pdf
And this gives the date/time for all files and sub-directories in the current one:
dir /T:W
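A copied file on Windows normally keeps the original's last-modified time but gets a fresh creation time, so the creation timestamp is usually the better clue to when the copy was made:
dir /T:C d:\test.pdf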
Using 'forfiles'
Modified datetime for all files in current dir:
forfiles /C "cmd /c echo @file @fdate @ftime"
Or just the pdf files within the directory:
forfiles /M *.pdf /C "cmd /c echo @file @fdate @ftime"
Using 'cp' on Linux
I don't know how useful this would be for you, but on Linux machines you can use this option when copying files:
$ cp --no-preserve=timestamps my-file.pdf my-copy.pdf
This way the copy will have its own timestamps and won't depend on the original file's.
I have a batch script that does the following:
@ECHO OFF
REM move files older than 2 days from the incoming directory to the incoming archive directory
robocopy D:\Agentrics\integration\download D:\Agentrics\integration\download\archive /MOV /MINAGE:2
REM Zip files in the archive directory that are older than one week
FOR %%A IN (D:\Agentrics\integration\download\archive\*.txt*, D:\Agentrics\integration\download\archive\*.cpi*) DO "C:\Program Files\WinRAR\WinRAR.exe" a -r -to7d D:\Agentrics\integration\download\archive\"%%~nA.zip" "%%A"
REM Delete Original files after they are zipped
forfiles /p D:\Agentrics\integration\download\archive /s /m *.txt* /d -7 /c "cmd /c del /q @path"
forfiles /p D:\Agentrics\integration\download\archive /s /m *.cpi* /d -7 /c "cmd /c del /q @path"
REM Delete files that are older than 6 months from the archive directory
forfiles /p D:\Agentrics\integration\download\archive /s /m *.zip* /d -180 /c "cmd /c del /q @path"
pause
Question 1:
When I run the script I get WinRAR diagnostic messages for some files. For example, if there are files in the incoming directory that are not older than two days, I get the message "WinRAR Diagnostic messages: No File To Add". Because of this message the script stops until I click the close button of the dialog box. I am using the free version of WinRAR and it is not expired.
Question 2: I have two separate commands in the script above: one zips the files older than a week, and the other deletes the original files after they are zipped. How can I link those two commands so that if for some reason the files did not get zipped, they are also not deleted? Or is there a command to break the script if the files did not get zipped? I just want to zip the files first and then delete the originals.
I suggest using:
@ECHO OFF
REM Move files older than 2 days from the incoming directory to the incoming archive directory.
robocopy D:\Agentrics\integration\download D:\Agentrics\integration\download\archive /MOV /MINAGE:2
REM Move each file in the archive directory that is older than one week into a ZIP archive.
FOR %%A IN (D:\Agentrics\integration\download\archive\*.txt*, D:\Agentrics\integration\download\archive\*.cpi*) DO "C:\Program Files\WinRAR\WinRAR.exe" m -afzip -ep -inul -to7d -y "D:\Agentrics\integration\download\archive\%%~nA.zip" "%%A"
REM Delete files that are older than 6 months from the archive directory.
forfiles /p D:\Agentrics\integration\download\archive /s /m *.zip* /d -180 /c "cmd /c del /q @path"
The entire process can be simplified by using command m, which means move to archive, instead of command a, which means add to archive. WinRAR removes a file only after it has been compressed successfully.
Using switch -afzip informs WinRAR explicitly to use ZIP instead of RAR compression.
The switch -ep results in removing path from the file names inside the archive.
The output of any error or warning message can be suppressed with switch -inul. This switch is mainly meant for the console version Rar.exe (which does not support ZIP compression) to suppress output to stdout and stderr, but it may work for WinRAR as well. In my tests with WinRAR.exe version 4.20 I never saw a diagnostic message to confirm when the ZIP file was not created because the file was not older than 7 days. I did see the warning when using Rar.exe to create a RAR archive without -inul, but even then no key press was required, even without switch -y.
I removed switch -r for recursive archiving, as it is not needed here: only one file is ever moved into each ZIP archive.
The unmodified switch -to7d results in archiving only files older than 7 days.
Lastly, switch -y is added to assume Yes on all queries, although I have never seen one in my tests.
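If you additionally want the batch file to stop as soon as WinRAR reports a problem (the "break the script" part of question 2), its exit code can be checked after each call. A sketch, assuming any non-zero code should be fatal; note that "no files to add" may also produce a non-zero code, so you may prefer to treat that case as benign:
FOR %%A IN (D:\Agentrics\integration\download\archive\*.txt*, D:\Agentrics\integration\download\archive\*.cpi*) DO (
    "C:\Program Files\WinRAR\WinRAR.exe" m -afzip -ep -inul -to7d -y "D:\Agentrics\integration\download\archive\%%~nA.zip" "%%A"
    rem Stop the whole script on the first failed archiving operation.
    if errorlevel 1 (
        echo WinRAR failed on "%%A" - stopping.
        exit /b 1
    )
)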
One more hint:
On NTFS partitions the compressed attribute can be set on a folder, resulting in automatic, transparent NTFS compression of all files copied to or created in that folder, which saves disk storage.
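From the command line the attribute can be set with the built-in compact tool, for example on the archive folder used above:
compact /c /s:D:\Agentrics\integration\download\archive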