How do I force Robocopy to overwrite files? - windows

By default, Robocopy skips files whose last-write time and size are identical. How can I override this behavior? I'd like to force overwriting with Robocopy.
I expected dst\sample.txt to end up containing test001.
But Robocopy treats the two files as the same file and does not overwrite the destination. The /IS option has no effect in this case.
New-Item src -itemType Directory
New-Item dst -itemType Directory
New-Item src\sample.txt -itemType File -Value "test001"
New-Item dst\sample.txt -itemType File -Value "test002"
Set-ItemProperty src\sample.txt -Name LastWriteTime -Value "2016/1/1 15:00:00"
Set-ItemProperty dst\sample.txt -Name LastWriteTime -Value "2016/1/1 15:00:00"
ROBOCOPY.exe src dst /COPYALL /MIR
Get-Content src\sample.txt, dst\sample.txt
> test001
> test002
ROBOCOPY.exe src dst /COPYALL /MIR /IS
Get-Content src\sample.txt, dst\sample.txt
> test001
> test002

From the documentation:
/is Includes the same files.
/it Includes "tweaked" files.
"Same files" means files that are identical (name, size, times, attributes). "Tweaked files" means files that have the same name, size, and times, but different attributes.
robocopy src dst sample.txt /is # copy if attributes are equal
robocopy src dst sample.txt /it # copy if attributes differ
robocopy src dst sample.txt /is /it # copy irrespective of attributes
This answer on Super User has a good explanation of what kind of files the selection parameters match.
That said, I could reproduce the behavior you describe, but based on my understanding of the documentation and the output robocopy generated in my tests, I would consider this a bug.
PS C:\temp> New-Item src -Type Directory >$null
PS C:\temp> New-Item dst -Type Directory >$null
PS C:\temp> New-Item src\sample.txt -Type File -Value "test001" >$null
PS C:\temp> New-Item dst\sample.txt -Type File -Value "test002" >$null
PS C:\temp> Set-ItemProperty src\sample.txt -Name LastWriteTime -Value "2016/1/1 15:00:00"
PS C:\temp> Set-ItemProperty dst\sample.txt -Name LastWriteTime -Value "2016/1/1 15:00:00"
PS C:\temp> robocopy src dst sample.txt /is /it /copyall /mir
...
Options : /S /E /COPYALL /PURGE /MIR /IS /IT /R:1000000 /W:30
------------------------------------------------------------------------------
1 C:\temp\src\
Modified 7 sample.txt
------------------------------------------------------------------------------
Total Copied Skipped Mismatch FAILED Extras
Dirs : 1 0 0 0 0 0
Files : 1 1 0 0 0 0
Bytes : 7 7 0 0 0 0
...
PS C:\temp> robocopy src dst sample.txt /is /it /copyall /mir
...
Options : /S /E /COPYALL /PURGE /MIR /IS /IT /R:1000000 /W:30
------------------------------------------------------------------------------
1 C:\temp\src\
Same 7 sample.txt
------------------------------------------------------------------------------
Total Copied Skipped Mismatch FAILED Extras
Dirs : 1 0 0 0 0 0
Files : 1 1 0 0 0 0
Bytes : 7 7 0 0 0 0
...
PS C:\temp> Get-Content .\src\sample.txt
test001
PS C:\temp> Get-Content .\dst\sample.txt
test002
The file is listed as copied, and since it becomes a "same" file after the first robocopy run, at least the times are synced. However, even though seven bytes were reportedly copied, no data was actually written to the destination file in either case, despite the data flag being set (via /copyall). The behavior also doesn't change if the data flag is set explicitly (/copy:d).
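A quick way to confirm that no data actually changed (not part of the original transcript, just a sanity check) is to compare file hashes after the run; identical hashes would mean robocopy really copied the data:
# Assumes the same C:\temp\src and C:\temp\dst layout as above
Get-FileHash .\src\sample.txt, .\dst\sample.txt -Algorithm SHA256 | Format-Table Hash, Path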
I had to modify the last write time to get robocopy to actually synchronize the data.
PS C:\temp> Set-ItemProperty src\sample.txt -Name LastWriteTime -Value (Get-Date)
PS C:\temp> robocopy src dst sample.txt /is /it /copyall /mir
...
Options : /S /E /COPYALL /PURGE /MIR /IS /IT /R:1000000 /W:30
------------------------------------------------------------------------------
1 C:\temp\src\
100% Newer 7 sample.txt
------------------------------------------------------------------------------
Total Copied Skipped Mismatch FAILED Extras
Dirs : 1 0 0 0 0 0
Files : 1 1 0 0 0 0
Bytes : 7 7 0 0 0 0
...
PS C:\temp> Get-Content .\dst\sample.txt
test001
An admittedly ugly workaround would be to bump the last write time of same/tweaked files so that robocopy considers them changed and copies the data:
# List (but don't copy, /l) the same/tweaked files, then nudge their timestamps by one second
& robocopy src dst /is /it /l /ndl /njh /njs /ns /nc |
    Where-Object { $_.Trim() } |
    ForEach-Object {
        $f = Get-Item $_.Trim()
        $f.LastWriteTime = $f.LastWriteTime.AddSeconds(1)
    }
& robocopy src dst /copyall /mir
Switching to xcopy is probably your best option:
& xcopy src dst /k/r/e/i/s/c/h/f/o/x/y
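For reference, the switches in that xcopy call break down as follows (per xcopy /?):
# /K keep attributes            /R overwrite read-only files   /E copy subdirs, incl. empty
# /I assume dest is a directory /S copy non-empty subdirs      /C continue on errors
# /H copy hidden/system files   /F show full file names        /O copy ownership and ACLs
# /X copy audit settings        /Y suppress overwrite prompts
& xcopy src dst /k /r /e /i /s /c /h /f /o /x /y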

This is really weird: why is nobody mentioning the /IM switch? I've been using it for a long time in backup jobs, but when I searched just now I couldn't find a single web page that says anything about it, even on Microsoft's website, while there are plenty of user posts complaining about this same issue.
Anyway, to make Robocopy overwrite everything, whatever the size or time in source or destination, you must include these three switches in your command: /IS /IT /IM (a minimal example follows the switch list).
/IS :: Include Same files (same size files).
/IT :: Include Tweaked files (same files with different attributes).
/IM :: Include Modified files (same files with different times).
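A minimal sketch of just the overwrite-forcing part (source and destination paths here are placeholders; add your other options around it):
robocopy C:\Source D:\Destination /E /COPYALL /IS /IT /IM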
This is the exact command I use to transfer a few terabytes of mostly 1 GB+ files (ISOs, disk images, 4K videos):
robocopy B:\Source D:\Destination /E /J /COPYALL /MT:1 /DCOPY:DATE /IS /IT /IM /X /V /NP /LOG:A:\ROBOCOPY.LOG
I did a small test for you, and here is the result:
Total Copied Skipped Mismatch FAILED Extras
Dirs : 1028 1028 0 0 0 169
Files : 8053 8053 0 0 0 1
Bytes : 649.666 g 649.666 g 0 0 0 1.707 g
Times : 2:46:53 0:41:43 0:00:00 0:41:44
Speed : 278653398 Bytes/sec.
Speed : 15944.675 MegaBytes/min.
Ended : Friday, August 21, 2020 7:34:33 AM
Destination disk: WD Gold 6 TB (compare the write speed with my result).
Even the "Extras" shown there are for reporting only, because of the /X switch. As you can see, nothing was skipped, and the total number and size of all files equal the copied values. It sometimes shows a small number of skipped files when I abuse it and cancel it multiple times during an operation, but even then the values in the first two columns are always equal. I also confirmed this once by running a PowerShell script that scans all files in the destination and generates a report of all timestamps.
Some performance tips from my history with it, after many tests and troubles (a combined sketch follows these lists):
Despite what most users online advise, namely using the maximum thread count /MT:128 as a general performance trick: PLEASE DON'T USE /MT:128 WITH VERY LARGE FILES. That's a big mistake, and it will degrade your drive's performance dramatically after several runs. It creates very heavy fragmentation, or can even cause the file system to fail in some cases, and you end up spending valuable time trying to recover a RAW partition. On top of all that, it performs 4-6 times slower!
For very large files:
Use Only "One" thread "/MT:1" | Impact: BIG
Must use "/J" to disable buffering. | Impact: High
Use "/NP" with "/LOG:file" and Don't output to the console by "/TEE" | Impact: Medium.
Put the "/LOG:file" on a separate drive from the source or destination | Impact: Low.
For regular big files:
Use multiple threads; I would not exceed /MT:4 | Impact: BIG
If the destination disk has a small cache, use /J to disable buffering | Impact: High
3 & 4: same as above.
For thousands of tiny files:
Go nuts :) with multiple threads. I would start with 16 and multiply by 2 while monitoring disk performance; once it starts dropping, fall back to the previous value and stick with it. | Impact: BIG
Don't use /J | Impact: High
Use /NP with /LOG:file and don't output to the console with /TEE. | Impact: HIGH
Put the /LOG:file on a separate drive from the source or destination. | Impact: HIGH

I did this for a home folder setup where all the folders are on the desktops of the corresponding users, reachable through a shortcut that did not have the appropriate permissions, so users couldn't see it even though it was there. So I used Robocopy with the parameters to overwrite the file with the right settings:
FOR /F "tokens=*" %G IN ('dir /b') DO robocopy "\\server02\Folder with shortcut" "\\server02\home\%G\Desktop" /S /A /V /log+:C:\RobocopyShortcut.txt /XF *.url *.mp3 *.hta *.htm *.mht *.js *.IE5 *.css *.temp *.html *.svg *.ocx *.3gp *.opus *.zzzzz *.avi *.bin *.cab *.mp4 *.mov *.mkv *.flv *.tiff *.tif *.asf *.webm *.exe *.dll *.dl_ *.oc_ *.ex_ *.sy_ *.sys *.msi *.inf *.ini *.bmp *.png *.gif *.jpeg *.jpg *.mpg *.db *.wav *.wma *.wmv *.mpeg *.tmp *.old *.vbs *.log *.bat *.cmd *.zip /SEC /IT /ZB /R:0
As you can see, there are many file types I set to be excluded (just in case); adjust them to your needs or scenario.
It was tested on Windows Server 2012, and every switch is documented on Microsoft's sites and others.

Related

Batch to move files based on total size

I have 900k files in a particular folder that I need to move daily to another folder, based on a total size of 100 MB. I've tried the following code, but it moves 100 files, not a total of 100 MB.
@echo on
set Source=C:\Users\%Username%\Desktop\Dropbox\Pre
set Target=C:\Users\%Username%\Desktop\Dropbox\Pos
set MaxLimit=10
for /f "tokens=1* delims=[]" %%G in ('dir /A-D /B "%Source%\*.*" ^| find /v /n ""') do (
move /y "%Source%\%%~nxH" "%Target%"
if %%G==%MaxLimit% exit /b 0
)
pause
I need the batch to move a selection of files whose combined size is less than or equal to 10 KB. Basically, to do the same as if I manually selected N files that add up to 100 MB.
I believe it didn't work because it only checks each file's individual size.
The basic problem is that you must keep track of how much data has been moved and check whether it is over MaxLimit. This is not difficult in PowerShell. The script below will move up to $MaxLimit bytes to $TargetDir. It requires PowerShell Core 6+; the current stable version at https://github.com/PowerShell/PowerShell is 7.1.5. It can be made to work with the old Windows PowerShell 5.1 by changing the Join-Path statements, as shown after the script.
Place these two (2) files in the same directory. When you are confident that the correct files will be copied to the correct directory, remove the -WhatIf from the Move-Item command.
=== Move-PictureBatch.bat
@pwsh -NoLogo -NoProfile -File "%~dp0Move-PictureBatch.ps1"
=== Move-PictureBatch.ps1
#Requires -Version 6
$MaxLimit = 100MB
$SourceDir = Join-Path -Path $Env:USERPROFILE -ChildPath 'Desktop' -AdditionalChildPath 'Pre'
$TargetDir = Join-Path -Path $Env:USERPROFILE -ChildPath 'Desktop' -AdditionalChildPath 'Pos'
$CurrentSize = 0
Get-ChildItem -File -Path $SourceDir |
    ForEach-Object {
        if (($CurrentSize + $_.Length) -lt $MaxLimit) {
            # Still under the limit: move the file and add its size to the running total
            Move-Item -Path $_.FullName -Destination $TargetDir -WhatIf
            $CurrentSize += $_.Length
        } else {
            # Limit reached: break stops the pipeline (and, at script top level, the script)
            break
        }
    }
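As mentioned above, Windows PowerShell 5.1 lacks the -AdditionalChildPath parameter (added in PowerShell 6), so the two path lines would need nested Join-Path calls instead; a sketch:
# Windows PowerShell 5.1 compatible form of the two path assignments
$SourceDir = Join-Path -Path (Join-Path -Path $Env:USERPROFILE -ChildPath 'Desktop') -ChildPath 'Pre'
$TargetDir = Join-Path -Path (Join-Path -Path $Env:USERPROFILE -ChildPath 'Desktop') -ChildPath 'Pos'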

Trying to find a way to keep an up-to-date /MIR'ed directory of 7 days' worth of info

I am trying to use Robocopy (Win10) to maintain a directory of 7 days' worth of music that will "age out", with the 8th prior day's content being deleted. I run this script nightly.
I have tried this, simply:
robocopy c:\music c:\music7 /MIR /MAXAGE:7 /S
The initial copy works great; what happens, though, is that files that are 8 days old (or older) are not "purged" from the music7 directory on the 8th day. Am I missing a parameter in robocopy to cull those?
You can repro this also by running the first command:
robocopy c:\music c:\music7 /MIR /MAXAGE:7
and then re-running with a lower MAXAGE:
robocopy c:\music c:\music7 /MIR /MAXAGE:2
and note that the content from days 3-7 is not removed.
I think you have a misconception about the purge logic (/purge) implied by the /mir option:
Only files that no longer exist in the source directory are removed from the destination directory; this is unrelated to the /maxage:<N> option.
/maxage:<N> only restricts which of the existing source files are copied, by when they were last modified.
It does not mean that older files are automatically removed from the destination directory.
Therefore, assuming that you want to keep older files around in the source folder and only purge them from the destination folder:
Run robocopy c:\music c:\music7 /mir /maxage:7, as before, which will copy only the last 7 days' worth of files to the destination folder.
As an aside: no need for option /s (copy subdirs., except empty ones), because /mir implies /e (copy subdirs., including empty ones)
Then use a different tool / shell, such as PowerShell, to remove older files from the destination folder:
$cutOffDate = (Get-Date).Date.AddDays(-7)
Get-ChildItem -File -Recurse -LiteralPath c:\Music7 |
Where-Object { $_.LastWriteTime -lt $cutOffDate } |
Remove-Item -WhatIf
-WhatIf previews what will be deleted; remove it to perform actual deletion.
Thanks very much for the comments. After additional research I was able to get what I was after while staying in CMD, using the ForFiles command in conjunction with Robocopy in a batch file.
Similar to these lines:
ForFiles /p "c:\music7" /d -7 /c "cmd /c del #path /Q /S"
ForFiles /p "c:\music7" /d -7 /c "cmd /c rd #path /Q"
Robocopy c:\music c:\music7 /MIR /MAXAGE:6
Things can get a little quirky since I can have some directories whose dates appear to be modified earlier or later than their files under them, so I can't just do a destructive RD /S; robocopy will add the files back on its run.
PS - For those curious, this methodology allows me to effectively emulate time based autoplaylists for Groove which doesn't support it natively like WMP or Zune did. I can push the 7 days worth of songs into OneDrive\Music and filter on that in the player clients to play that subset of my collection. Now I have a way to task scheduler automate the purge older than 7 days content. Thanks again.
In PowerShell, after your robocopy run or as a separate process:
Get-ChildItem -File -Path C:\music | Where-Object { $_.CreationTime -le (Get-Date).AddDays(-8) } | Remove-Item

Delete folders having a particular naming pattern in a path in Windows

I have a directory that contains an .exe executable and its required input files.
For each individual request, a separate directory (of the format Output_) is created in the same path and the respective output files are directed there.
Now I need to purge the older output directories. Can somebody explain how to achieve this in Windows?
I went through the forfiles documentation and other Stack Overflow answers, but did not find any information on providing a name pattern for the folders to be deleted, although this option is available for file deletion.
I am trying the code below, but I want to specify the folder pattern as well, so that I do not delete other files.
forfiles /p "C:\work\Analysis\" /d +7 /c "cmd /c IF #isdir == TRUE rd /S /Q #path"
Here is an example in PowerShell:
$Now = Get-Date
$Hours = 4
$TargetFolder = "<folder in which old data exists>"  # placeholder: set this to your path
$LastWrite = $Now.AddHours(-$Hours)
# Find subfolders whose last write time is older than the cutoff
$Folders = Get-ChildItem -Path $TargetFolder |
    Where-Object { $_.PSIsContainer -eq $true } |
    Where-Object { $_.LastWriteTime -le $LastWrite }
foreach ($Folder in $Folders) {
    Write-Host "Deleting $Folder" -ForegroundColor Red
    Remove-Item -Path $Folder.FullName -Recurse -Force
}
The command below did the job for me; /M works with directories as well. Thanks to JJS's answer at https://superuser.com/q/764348
forfiles /P C:\work\Analysis\ /M guid_* /d +5 /C "cmd /c if @isdir==TRUE rmdir /s /q @file"
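If you prefer to stay in PowerShell, a sketch that combines the name pattern with an age check (the Output_* pattern and 7-day cutoff are taken from the question, my reading of "older"; requires PowerShell 3.0+ for -Directory, and drop -WhatIf once it lists the right folders):
Get-ChildItem -Directory -Path C:\work\Analysis -Filter 'Output_*' |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } |
    Remove-Item -Recurse -Force -WhatIf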

CMD - Copy null to multiple files

I want to make lots of files 0 bytes on Windows.
The command copy /y nul test.vtx was working. I need to change the files' size without changing their names.
How can I use the copy command to automatically detect the filenames and use them to erase their contents? Would a batch file be helpful?
Thanks.
I don't know if it is possible with break>; anyway, you can iterate:
for %G in (*.vtx) do (copy /Y nul "%G")
Including subfolders :
for /R %G in (*.vtx) do (copy /Y nul "%G")
Another way using PowerShell. When you are satisfied that the correct files will be overwritten, remove the -WhatIf from the Set-Content statement.
Get-ChildItem -Path C:\src\t\adir -Filter *.vtx |
ForEach-Object { Set-Content -Path $_.FullName -Value $null -WhatIf }
Usage :: ROBOCOPY source destination [file [file]...] [options]
source :: Source Directory (drive:\path or \\server\share\path).
destination :: Destination Dir (drive:\path or \\server\share\path).
file :: File(s) to copy (names/wildcards: default is "*.*").
/CREATE :: CREATE directory tree and zero-length files only.
I see you specified "copy" but I assumed you or others might accept robocopy too. My guess would be:
ROBOCOPY C:\vtxfolder1 C:\vtxfolder2 *.vtx /CREATE

How to find large files on the command line in Windows version 5.1

How do I find large files from the Windows version 5.1 command line?
For Windows ver 6.1, I can run the following command:
forfiles /p c:\ /s /m *.* /c "cmd /c if @fsize gtr 100000 echo @file @fsize"
However what is the equivalent command for version 5.1 windows?
Appreciate your quick help!
To be run from the command line:
for /r c:\ %f in (*) do if %~zf gtr 100000 echo %f %~zf
To run it from a batch file, change % to %%.
EDIT: As stated in the comments, arithmetic in batch command lines has some limits on the operands. In this case, the %~zf (size of the file referenced by the for command) on the left side of the if command causes no problems, but the value on the right side is limited to values from -2147483647 to 2147483646.
To handle that, if the command is executed under an administrator account, the wmic command can be used. The command in the OP's question can be written as:
wmic datafile where "drive='c:' AND filesize>'100000'" get caption, filesize
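If PowerShell is an option on that machine (v2.0 can be installed on XP/2003), here is a sketch that sidesteps the 32-bit limit of cmd arithmetic entirely; the path and threshold are just examples:
Get-ChildItem -Path C:\ -Recurse -ErrorAction SilentlyContinue |
    Where-Object { -not $_.PSIsContainer -and $_.Length -gt 100000 } |
    ForEach-Object { "{0} {1}" -f $_.FullName, $_.Length }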
First, find the biggest directories in the current directory:
diruse /* /m /c /, . |sort
cd to the chosen directory; you can use diruse again, or
search for the biggest files in the chosen directory, larger than 500 MB for example:
forfiles /s /c "cmd /c if @fsize gtr 500000000 echo @path @fsize bytes"
For Windows, I couldn't find one answer that suited me for the command line, so I coded my answer in PowerShell.
The following code will help you find the files above a size threshold (in KB here) under the path you want to scan.
#Run via PowerShell
#$PathToScan refers to the directory where you want to find your files above the size threshold.
$PathToScan = "E:\"
"Path to scan:`t $PathToScan"

# Countdown must be defined before it is called at the end of the script
Function Countdown {
    Param ($TimeInSec)
    Do {
        Write-Host "$TimeInSec"
        Start-Sleep -s 1
        $TimeInSec--
    }
    Until ($TimeInSec -eq 0)
}

$hash = @{}
Get-ChildItem -Path $PathToScan -Recurse | Where-Object {
    #If you want to find files greater than 1 GB, replace all KB in this script with GB.
    ($_.Length / 1KB) -gt 2
} | ForEach-Object {
    # Key: path relative to $PathToScan plus the file name; value: size rounded to whole KB
    $hash.Add(
        $_.DirectoryName.Replace("$PathToScan", "") + $_.Name,
        [math]::Round(($_.Length / 1KB), 0))
}
$TableStyle = @{Expression = {$_.Value}; Label = "Size (KB)"; Width = 10; Alignment = 'Left'},
              @{Expression = {$_.Name}; Label = "Name"; Width = 100}
$hash.GetEnumerator() | Sort-Object Value -Descending | Format-Table -Property $TableStyle
Countdown 5
Or you may view my code here:
[Find large files in KB for Windows]:
https://github.com/worldpeacez0991/powershell
Hope my answer helps someone, have a great day ^ ^
