Appending date/time to file AND updating last modified/save date - Windows

I am trying to automate some file renaming, but I also need the file to update its "last modified" time, as I have a field inserted within the Word document that dynamically updates the last time the file was edited.
copy C:\path\to\file\test\test.docx "C:\path\to\file\test2\test-%date:~-7,2%-%date:~-10,2%-%date:~-4,4% %time:~-11,2%%time:~-8,2%.docx"
I tried to integrate the following syntax:
copy /b filename.ext +,,
That I got from:
https://superuser.com/questions/10426/windows-equivalent-of-the-linux-command-touch/764716
However, it did not output anything when I put the + after the source file.
copy /b "C:\path\to\file\test\test.docx" + "C:\path\to\file\test2\test-
%date:~-7,2%-%date:~-10,2%-%date:~-4,4% %time:~-11,2%%time:~-8,2%.docx"
I also tried invoking a PowerShell script within the batch file to update the last modified date:
$file = Get-Item C:\Path\TO\test.docx
$file.LastWriteTime = (Get-Date)
and then calling it from the batch file:
copy C:\path\to\file\test\test.docx "C:\path\to\file\test2\test-%date:~-7,2%-%date:~-10,2%-%date:~-4,4% %time:~-11,2%%time:~-8,2%.docx"
powershell -file C:\path\to\powershell.ps1
I can't get it to work either way; I'm new to this, so I'm probably missing something simple.

I was able to figure this one out. My batch file is now as follows:
powershell -command "(Get-Item "C:\path\to\file\test\test.docx").LastWriteTime = (Get-Date)"
copy C:\path\to\file\test\test.docx "C:\path\to\file\test2\test-%date:~-7,2%-%date:~-10,2%-%date:~-4,4%%time:~-11,2%%time:~-8,2%.docx"
This first modifies the last-edited date of the file, and then copies it over to the test2 folder with the time and date appended.
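For reference, the same two steps can also be collapsed into one PowerShell call from the batch file, which sidesteps the locale-dependent %date%/%time% substring parsing. A sketch using the example paths from above; the dd-MM-yyyy HHmm format string mirrors the substrings in the copy command and is easy to adjust:
powershell -command "$f = Get-Item 'C:\path\to\file\test\test.docx'; $f.LastWriteTime = Get-Date; Copy-Item $f.FullName ('C:\path\to\file\test2\test-{0:dd-MM-yyyy HHmm}.docx' -f $f.LastWriteTime)"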

I'm a bit curious what you mean by:
I have a field inserted within the Word document that dynamically updates the last time the file was edited.
If you copy a file, its content is not altered and the LastWriteTime stays the same,
so why do you want to set LastWriteTime to the current date and time?
In your attempt with copy in cmd.exe you omitted the two commas; this should do it:
copy /b "C:\path\to\file\test\test.docx" + , , "C:\path\to\file\test2\test-%date:~-7,2%-%date:~-10,2%-%date:~-4,4% %time:~-11,2%%time:~-8,2%.docx"
The PowerShell suggestion from my comment:
'C:\path\to\file\test\test.docx' |
    Get-Item |
        Copy-Item -Destination {'{0}\{1}-{2:MMddyyyy\ HHmm}{3}' -f `
            $_.Directory, $_.Basename,
            $_.LastWriteTime, $_.Extension} -WhatIf
This could be modified to rename all files that already carry a date/time suffix so that the suffix reflects the actual LastWriteTime.
Get-ChildItem -File -Filter *.docx |
    Where BaseName -Match '-\d{8} \d{4}$' |
        Rename-Item -NewName {'{0}-{1:MMddyyyy\ HHmm}{2}' -f `
            $_.Basename.Replace($Matches[0],''),
            $_.LastWriteTime, $_.Extension} -WhatIf
If the NewName is the same, Rename-Item ignores it.
In contrast to Copy-Item, Rename-Item doesn't allow a directory in -NewName, only a bare file name.
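For example (hypothetical paths):
# Works: rename in place
Rename-Item -Path 'C:\temp\a.txt' -NewName 'b.txt'
# Fails: -NewName cannot move the item to another directory; use Move-Item for that
Rename-Item -Path 'C:\temp\a.txt' -NewName 'C:\other\a.txt'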

Related

Windows: compare filenames and delete one version of the filename

I have several hundred folders in which I'll have multiple files called filename.ext, but also another file called filename.ext.url.
I need a way of checking: if filename.ext.url exists, does filename.ext also exist? If they both exist, delete filename.ext.url.
I can't just search for and delete all *.url files, as they will be needed if the normal file does not exist.
I then need to repeat that in all subdirectories of a specific directory.
I don't mind if it's a batch script or a PowerShell script that does it, or any other way really; I'm just stumped on how to do what I want.
Currently I'm doing it folder by folder, manually comparing file names, file sizes, and file icons.
foreach ($file in ls -Recurse c:\files\*.url) {
    if (ls -ErrorAction Ignore "$($file.PSParentPath)\$($file.basename)") {
        remove-item $file.fullname -whatif
    }
}
Remove -WhatIf when you're ready to delete.
BaseName removes the final extension, so if they are all .ext.url it will check whether the corresponding .ext file exists. BaseName also removes the path, so we pull the parent path back in as well.
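To illustrate with a hypothetical file:
PS C:\> $file = Get-Item 'C:\files\report.ext.url'
PS C:\> $file.BaseName
report.ext
PS C:\> $file.PSParentPath
Microsoft.PowerShell.Core\FileSystem::C:\files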
An alternative way (that matches what you're explaining more closely) is something like:
foreach ($file in ls -Recurse "c:\files\*.url") {
    # '\.url$' matches .url at the end of the string in regex
    if (ls -ErrorAction Ignore ($file.FullName -replace '\.url$')) {
        remove-item $file.fullname -whatif
    }
}
for /r "startingdirectoryname" %b in (*.url) do if exist "%~dpnb" ECHO del "%b"
This is expected to be executed directly from the prompt. If it's required as a batch line, each % needs to be doubled (i.e. %%).
This also assumes that the whatever.ext file is in the same directory as the whatever.ext.url file.
Note that the filenames to be deleted will merely be echoed to the console. To actually delete the files, remove the ECHO keyword.
Test against a test directory first!
[untested]
To check for "filename.ext", you can check for "filename.ext." instead; they are the same file.
In CMD.EXE, you can do: IF EXIST "filename.ext." CALL :DoIt

How can I get a list of folders/subfolders that DOES NOT contain a specific type of file?

For example, I wanted to print a list of folders which contain an .html file. Correct me if I am wrong, but this worked for me:
dir /S *.html > C:\Users\PC\Desktop\with-html.txt
Now I would like to find the folders which do not contain .html files.
How do I go about that?
EDIT:
The folders are structured in a way that only the child folders (the last subfolders) contain any files. I would like to get a list of the paths to those subfolders. So the above command line is giving me:
C:\...\ml\regression\lasso-regression
C:\...\ml\regression\linear-regression
There is no output of just C:\...\ml or C:\...\ml\regression.
The folder structure looks like this:
C:\...\ml
    classification
    regression
        lasso-regression
        linear-regression
There are about 10 folders in the folder ml and no files. There are again about 10 folders at the second level, where C:\...\ml\regression\linear-regression contains an HTML file while C:\...\ml\regression\lasso-regression does not contain a file with the extension .html. Only the folders at the last level of the folder tree contain files at all.
I'd be grateful for just the list of the last folders in the folder tree that do not contain a file with the extension .html.
I basically output the above command line into a .csv file, filtered it with MS Excel, and now have a list of the folders with .html file(s). I'm basically working with R-markdown files, and this will be a status report: the list of folders with .html files is what I have completed already. So I now need only the opposite list of folders.
Not difficult using PowerShell.
Get-ChildItem -Recurse -Directory |
    ForEach-Object {
        if ((Get-ChildItem -File -Path $_.FullName -Filter '*.html').Length -eq 0) { $_.FullName }
    }
If you must run this in a .bat file script, the following might be used.
powershell.exe -NoLogo -NoProfile -Command ^
    "Get-ChildItem -Recurse -Directory |" ^
    "ForEach-Object {" ^
    "if ((Get-ChildItem -File -Path $_.FullName -Filter '*.html').Length -eq 0) { $_.FullName }" ^
    "}"

Issue with Robocopy on Windows Server 2019

I've had a robocopy script running on Windows Server 2008 (PowerShell 4) with no issues for about 6 months now.
We've recently had to migrate off of this machine and initiate the same script from a Windows Server 2019 (PowerShell 5) machine. The exact same script no longer works, and I'm not quite sure what the issue is. The script is:
# Run BCP bat file to get the list of folders that have changed in the past 12 days and save them to CSV
\\xxxx\xxxx\xxx\xxx\xxx.bat
# Select the starting directory to pull the CSV from. The assumption is that the DB file can
# be sent to a dedicated directory that has only the CSV files
$Filerecentdir = "\\ZZZ\Z\ZZZ\ZZZZZ\ZZZZ\ZZZZZZ"
#
# Filtering for only CSV files
$Filter = "*.csv"
#
# This PS command will search for the file with the latest timestamp on it. Coupling this with
# the filter above, we turn $Filerecent into a variable holding the most recent CSV
$Filerecent = Get-ChildItem -Path $Filerecentdir -Filter $Filter | Sort-Object LastAccessTime -Descending | Select-Object -First 1
#
# I concatenate $Filerecentdir with the $Filerecent variable to form the full path to the CSV
$FullPath = Join-Path -Path $Filerecentdir -ChildPath $Filerecent
#
# The $roboSource variable uses the Import-Csv cmdlet to parameterize the one (headerless) column
# that the DB file creates
$roboSource = Import-Csv -Header @("a") -Path $FullPath
#
# Arbitrary directory that I'm using to store the logs. We'll change this to something on the
# file server so it can be viewable
$logPath = "\\AAAA\A\AAAA\AAAA\AAAA\AAAA\"
# Creates a folder to separate weekly logs based off the output of the bat script
$weeklylogfolder = $Filerecent -replace '_.csv'
New-Item -ItemType Directory -Force -Path "$($logPath)$($weeklylogfolder)"
#
# For each loop to iterate over every single row in the CSV
Foreach ($script in $roboSource)
{
    # I used the two below variables to replace the two last trailing '\' in each entry
    $prefix = $script.a -replace '\{.+'
    $suffix = $script.a.Substring($prefix.Length) -replace '\\', '_' # keeping the {} in the file name
    #$suffix = $script.a.Substring($prefix.Length) -replace '[{}]' -replace '\\', '_'
    #
    #$logFileName = $prefix + $suffix
    $logFileName = $suffix
    #
    # Same switches that we used
    #$StandardSwitches = "/copy:DAT /s /dcopy:DAT /V /L" # no copy
    $StandardSwitches = "/copy:DAT /s /dcopy:DAT /V" # copy
    #
    # Creates the log file in the same format that we used, so one log file per entry
    $log = "/log:`"$($logPath)$($weeklylogfolder)\$($logFileName).log`""
    #
    # Iterates through each row to create the source and destination
    $FileSource = "$($script.a)"
    $FileDestination = "$($script.a)"
    #
    # Used this to surround certain variables with double quotes, otherwise Robocopy fails
    $RoboArgs = '"I:\{0}" "Z:\{1}" {2} {3}' -f
        $FileSource, $FileDestination, $StandardSwitches, $log
    #
    Robocopy $RoboArgs
}
I can't seem to pinpoint what is causing the issues I'm seeing. I've tried running each of the commands within the script alone, and when I do, I notice that the $RoboArgs variable seems to get cut off, which then generates an incomplete $FileDestination.
$RoboArgs built as above generates the exact same output as it does on Server 2008/PowerShell 4. However, it seems that PowerShell 5 processes this differently. Is there something I'm doing wrong, or need to add, to get this to process correctly?
EDIT:
Here's an example of what $RoboArgs is defined as on Win2019/PowerShell 5:
PS C:\> echo $RoboArgs
"I:\Fake-directory\Fake-directory\D\{DAFD6721-E854-46F3-B5CF-7EE1861348A6}\Fake\FakeDate" "Z:\Fake-Directory\Fake-Directory\D\{DAFD6721-E854-46F3-B
5CF-7EE1861348A6}\Responses\01182020" /copy:DAT /s /dcopy:DAT /V /log:"I:\FakeDirectory\FakeDirectory\Fake\Logs\Folders_Changed_20200118_21.1
\{DAFD6721-E854-46F3-B5CF-7EE1861348A6}_Responses_01182020.log"
It basically cuts off where the first line ends, and generates the destination in the same manner:
PS C:\> Robocopy "I:\Fake-directory\Fake-directory\D\{DAFD6721-E854-46F3-B5CF-7EE1861348A6}\Fake\FakeDate" "Z:\Fake-Directory\Fake-Directory\D\{DAFD6721-E854-46F3-B
5CF-7EE1861348A6}\Responses\01182020" /copy:DAT /s /dcopy:DAT /V /log:"I:\FakeDirectory\FakeDirectory\Fake\Logs\Folders_Changed_20200118_21.1
\{DAFD6721-E854-46F3-B5CF-7EE1861348A6}_Responses_01182020.log"
2020/01/18 23:17:21 ERROR 123 (0x0000007B) Opening Log File I:\FakeDirectory\FakeDirectory\Fake\Logs\Folders_Changed_20200118_21.1
\{DAFD6721-E854-46F3-B5CF-7EE1861348A6}_Responses_01182020.log
The filename, directory name, or volume label syntax is incorrect.
-------------------------------------------------------------------------------
ROBOCOPY :: Robust File Copy for Windows
-------------------------------------------------------------------------------
Started : Saturday, January 18, 2020 11:17:21 PM
Source - I:\FakeDirectory\FakeDirectory\Fake\D\{DAFD6721-E854-46F3-B5CF-7EE1861348A6}\Responses\01182020\
Dest - Z:\FakeDirectory\FakeDirectory\Fake\D\{DAFD6721-E854-46F3-B
Edit 2:
Comparing this to Win2008/PowerShell 4, it actually does the same thing with regard to failing if there's a line break. I think the real problem is how the script is defining the source and destination. On 2008/PowerShell 4, running the script generates everything correctly, with the correct source, destination, and log file. Running the exact same script on 2019/PowerShell 5 seems to create a problem with how Robocopy receives these paths, placing Source, Destination, and Log path all into the source argument, even though that's not how it's defined:
PS C:\> echo $FileSource
FakeDirectory\FakeDirectory\D\{DAFD6721-E854-46F3-B5CF-7EE1861348A6}\Responses\01182020
PS C:\> echo $FileDestination
FakeDirectory\FakeDirectory\D\{DAFD6721-E854-46F3-B5CF-7EE1861348A6}\Responses\01182020
PS C:\> echo $log
/log:"I:\FakeDirectory\FakeDirectory\FakeDirectory\Logs\Folders_Changed_20200118_21.1\{DAFD6721-E854-46F3-B5CF-7EE1861348A6}_Responses_01182020.log"
PS C:\> echo $StandardSwitches
/copy:DAT /s /dcopy:DAT /V
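One thing worth checking (an assumption on my part, not something confirmed in this thread): Robocopy $RoboArgs passes the whole formatted string as a single argument, leaving robocopy to re-split it, and that re-splitting is exactly where quoting can behave differently across versions. Building an argument array instead lets PowerShell pass each path as its own argument, with no manual quoting. A sketch under that assumption:
# Inside the foreach loop: each array element reaches robocopy as a separate argument
$RoboArgs = @(
    "I:\$($script.a)"
    "Z:\$($script.a)"
    '/copy:DAT'
    '/s'
    '/dcopy:DAT'
    '/V'
    "/log:$($logPath)$($weeklylogfolder)\$($logFileName).log"
)
Robocopy.exe @RoboArgs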

powershell copy-item doesn't copy when filter is used

I'm trying to copy files using copy-item. Specifically, I want to copy files with a particular extension that are within a folder or its subfolders to another location, and to retain the subfolder hierarchy. I've tried using -filter and -include to specify the file extension, but no files are copied.
My source and destination paths are stored in the variables $packageSourcePath and $objPath. When called, $packageSourcePath will be something like ".\src\projects\Project1\PackageFiles" and $objPath something like ".\bld\Project1\obj".
The command I've tried using is this:
Copy-Item -Path $packageSourcePath\* -Filter *.resw -Destination $objPath -Recurse
I've also tried variations, such as leaving off * from the path, or using -Include instead of -Filter. Nothing works. If I leave out the -Filter argument, then files copy, but all of the files are copied. I only want files with the particular extension.
I've given up on Copy-Item. JohnLBevan's answer didn't actually do what I want since all files in the source root get copied, even though they don't match the filter. I tried piping Convert-Path | Select-String | Copy-Item but still got all files in the source root being copied.
A contact in a different context provided a couple of suggestions:
1)
Get-ChildItem -Force -Recurse -ErrorAction Ignore -Path $packageSourcePath -Filter *.resw | % {
    $src = $_.FullName
    $dst = Join-Path $objPath $src.SubString($packageSourcePath.Length)
    echo "copy ""$src"" ""$dst"""
}
I think this is a bit harder to follow, hence less maintainable for the next person (likely another PS-neophyte like me) a year from now. ("Why is the -ErrorAction parameter needed here? What's the behaviour of the Substring() method, and why can't I find that using Get-Help?")
This suggestion is a bit clearer, after re-familiarizing with attrib and checking the effect of the xcopy switches:
2)
cd $packageSourcePath
attrib -a /s
attrib +a *.resw /s
xcopy /eidlm $packageSourcePath $objPath
But if we're going to use xcopy, we don't need to call attrib:
xcopy $packageSourcePath\*.resw $objPath /s /i > $null
The only problem with this for my scenario is that xcopy emits an error if no matching files are found. My script is being used for a VSTS build task, and the xcopy errors cause the build task to fail. (For that reason, I'm guessing that suggestion 2 also wouldn't work for me.)
So, I've opted for this:
# In PS version 5.1, nothing gets copied using Copy-Item $packageSourcePath\* -Filter *.resw ...
# so resorting to using xcopy, which mostly works. The one issue is that xcopy will output an
# error if no matching file is found, so using GCI first to test for a matching file.
if ($(Get-ChildItem $packageSourcePath\*.resw -Recurse).count -gt 0) {
    xcopy $packageSourcePath\*.resw $objPath /s /i > $null
}
The condition using GCI is added to check there are matching files before calling xcopy, thereby avoiding any errors.
I'm still amazed that Copy-Item -Filter -Recurse didn't work.
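For completeness, a Get-ChildItem-based sketch (untested here; it assumes the same $packageSourcePath and $objPath variables) that copies only the matching files and recreates the subfolder hierarchy by hand:
$srcRoot = Convert-Path $packageSourcePath
Get-ChildItem -Path $srcRoot -Filter *.resw -Recurse -File |
    ForEach-Object {
        # Rebuild the path relative to the source root under the destination
        $dest = Join-Path $objPath $_.FullName.Substring($srcRoot.Length).TrimStart('\')
        New-Item -ItemType Directory -Path (Split-Path $dest) -Force | Out-Null
        Copy-Item -Path $_.FullName -Destination $dest
    }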
This should do it (obviously this could be done in 1 line; I've assigned values to the variables just to help make it readable / self-explanatory):
[string]$filter = '*.resw'
[string]$source = Join-Path -Path $packageSourcePath -ChildPath '*'
[string]$target = $objPath
$source | Convert-Path | Copy-Item -Filter $filter -Recurse -Destination $target -Container #-Force
Notes:
We append the asterisk to the source path to ensure that we copy the contents of the source folder to the destination, without copying the source's root folder into the destination (i.e. say we're copying c:\temp\from to c:\temp\to, we don't want c:\temp\to\from (unless it's a copy of c:\temp\from\from)).
We use the Join-Path cmdlet to append this asterisk to ensure the appropriate slashes are inserted into the path.
We do a Convert-Path on the source to resolve the asterisk to the child folder/file names... for some reason copy-item doesn't handle these asterisks well. NB: Convert-Path will potentially return an array of paths; i.e. if there's more than one file/subfolder directly under the source folder. Get-Item or Resolve-Path could equally be used for this; I prefer Convert-Path since it returns a simple string array, rather than a more complex type; but there's no strong argument for using any one over the others.
We pipe these source paths to the Copy-Item command so it can be applied to each path returned by Convert-Path.
We include -Recurse to say we're interested in anything in the subfolders of the copied path.
We include the -Container parameter to say that we want to preserve any folder structure when copying. Strictly this is not needed, as this switch defaults to true (i.e. rather, we'd have to specify -Container:$false if we didn't want this behaviour); but I like to be clear that I deliberately want to preserve the directory structure, as opposed to leaving the assumption that I may not have thought of this. There's a better explanation of this here: https://stackoverflow.com/a/21798660/361842.
You could optionally include -Force; this would mean that should an item of the same name already exist in the target we overwrite it instead of getting an error.
Related documentation:
Join-Path
Convert-Path
Copy-Item
Update 2018-01-03
Per comments, this solution should ensure that only those items you want get copied, and pre-existing directories shouldn't cause issues.
[string]$filter = '*.resw'
[string]$source = $packageSourcePath
[string]$target = $objPath
#copy all files in subfolders of the source
$source | Get-ChildItem -Directory | Copy-Item -Filter $filter -Recurse -Destination $target -Container -Force
#copy all files in root of the source
$source | Get-ChildItem -File -Filter $filter | Copy-Item -Destination $target -Container -Force
This solution uses 2 steps; there's probably a better option, but due to the peculiarities / bug in this cmdlet the above's a reliable option.

Rename multiple files in a folder and its subfolders to a fixed pattern

I have a large number of files to be renamed to a fixed pattern on my Windows 8 computer.
I am looking for a way so that I can rename the files in a quick way, any way: like any software that does it or any command on command window.
Following is what I need to do:
Original file name: Body Begger Power.docx.htm
Need this to be: body-begger-power.html
From the root directory you can try this:
Get-ChildItem -Force -Recurse | Move-Item -Destination {$_.FullName.ToLower() -replace ' ', '-'}
I don't see any pattern for removing the extension. If all files end in .docx.htm and you want them changed to .html, you could simply do another replace like this:
Get-ChildItem -Force -Recurse | Move-Item -Destination {($_.FullName.ToLower() -replace ' ', '-') -replace '\.docx\.htm$', '.html'}
for /r %i in (.) do if exist "%i\body begger power.docx.htm" ECHO ren "%i\body begger power.docx.htm" body-begger-power.html
from the prompt will scan the tree from . (the current directory; or substitute the directory name from which you want to start). This will simply ECHO the detected filenames to the screen; you need to remove the ECHO keyword to actually rename the files.
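Whichever variant you use, a dry run is cheap: Move-Item supports -WhatIf. A preview pass over the PowerShell version above (note the escaped dots in the regex) might look like this:
Get-ChildItem -Force -Recurse -File |
    Move-Item -Destination {($_.FullName.ToLower() -replace ' ', '-') -replace '\.docx\.htm$', '.html'} -WhatIf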
