PowerShell -match getting the wrong file - Windows

I have the following code:
$MoveSheet0101AfterRender = get-ChildItem $Sheet01 -recurse | where {$_.name -match "Model 01-02 - $TodayDate"} | Move-Item -Destination (new-item -type directory -force ($OldSheets + $newSub)) -force -ea 0
Basically it asks PowerShell to look for files with Model 01-02 - $TodayDate in the name and move them to the directory defined in $OldSheets, but for some reason it is also moving files whose names differ slightly.
For example: if a file is named Model 01-03 - $TodayDate, the script moves it too.
I tried changing -match to -contains, but that way no files were moved.
Note that Model 01-02 - $TodayDate is only part of the file name, which is why I cannot use -eq.
How can I resolve this?
Update:
I have another variable called $MoveSheet0103AfterRender and its code is this:
$MoveSheet0103AfterRender = get-ChildItem $Sheet01 -recurse | where {$_.name -eq "$ClientName - Modelo 01-03 - $TodayDate.txt"} | Move-Item -Destination (new-item -type directory -force ($OldSheets + $newSub)) -force -ea 0
What seems to be happening is that this variable is executed even though it is never called anywhere in the script, because when I delete it the code works as expected.
What is the reason for this?
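For what it's worth, the behaviour described in the update is almost certainly because assigning a pipeline to a variable runs the pipeline immediately; the variable only captures its output. So merely defining $MoveSheet0103AfterRender performs that move, which makes it look as though the 01-02 line matched the wrong files. A minimal sketch of one way around it, assuming you want the moves defined up front but executed on demand, is to store script blocks and invoke them explicitly:

# Hedged sketch: store the pipeline in a script block so nothing runs at definition time.
$MoveSheet0101AfterRender = {
    Get-ChildItem $Sheet01 -Recurse |
        Where-Object { $_.Name -match "Model 01-02 - $TodayDate" } |
        Move-Item -Destination (New-Item -Type Directory -Force ($OldSheets + $newSub)) -Force -ErrorAction SilentlyContinue
}

# Nothing has moved yet; the files move only when the block is invoked:
& $MoveSheet0101AfterRender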

Related

Gather similar files into separate folders based on keywords in the filenames of multi-part archives

I have a folder that contains many rar or zip files. I want to put similar files (grouped by the "part" word in the filename, if it exists) into their own folders. By default there are no folders in the parent folder. In the future another part of a file may be added to the parent directory; in that case the script should move the file into its existing folder instead of creating a new one.
For example assume the files are:
Visual_Studio_2015.part1.rar
Visual_Studio_2015.part2.rar
Visual_Studio_2015.part3.rar
SQL-Server-Enterprise-2016-SP1.part1.rar
SQL-Server-Enterprise-2016-SP1.part2.rar
VSCodeSetup x64 1.29.1.rar
Microsoft.Visual.Studio.Ultimate.2012.update.3.part1.rar
Microsoft.Visual.Studio.Ultimate.2012.update.3.part12.rar
After moving, it should look like this:
Parent Directory
├───Visual_Studio_2015
│ ├───Visual_Studio_2015.part1.rar
│ ├───Visual_Studio_2015.part2.rar
│ ├───Visual_Studio_2015.part3.rar
├───VSCodeSetup x64 1.29.1
│ ├───VSCodeSetup x64 1.29.1.rar
├───SQL-Server-Enterprise-2016-SP1
│ ├───SQL-Server-Enterprise-2016-SP1.part1.rar
│ ├───SQL-Server-Enterprise-2016-SP1.part2.rar
├───Microsoft.Visual.Studio.Ultimate.2012.update.3
│ ├───Microsoft.Visual.Studio.Ultimate.2012.update.3.part1.rar
│ ├───Microsoft.Visual.Studio.Ultimate.2012.update.3.part2.rar
I can't use any extra software or a compiled programming language for this problem.
Update:
In PowerShell, something like this:
Get-ChildItem -File |
    Group-Object { $_.Name -replace '.part.*' } |
    ForEach-Object {
        $dir = New-Item -Type Directory -Name $_.Name
        $_.Group | Move-Item -Destination $dir
    }
can separate the files that have "part" in the filename, but it doesn't work for files without it. I should also mention that all filenames end with .partX (where X is a digit) if they are multi-part archives.
If all the files are in one root folder and have the naming convention you specify, then here is one way to move them into appropriate subfolders:
Get-ChildItem -Path "C:\Test" -File |
    ForEach-Object {
        if ($_.Name -match "^(?<folder>.*)\.part\d+|(?<folder>.*)\.rar$") {
            New-Item -Path "$($_.Directory)\$($matches.Folder)" -ItemType Directory -Force | Out-Null
            Move-Item -Path $_.FullName -Destination "$($_.Directory)\$($matches.Folder)\$($_.Name)" -Force
        }
    }
Change the path in Get-Childitem as appropriate. Also, you can modify the paths for New-Item and Move-Item if you want them to be located somewhere else instead of as subfolders of the root directory.
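As a quick, hypothetical sanity check of that regex (the sample names are taken from the question), $matches.Folder ends up holding the part of the name that becomes the subfolder:

'Visual_Studio_2015.part2.rar', 'VSCodeSetup x64 1.29.1.rar' | ForEach-Object {
    if ($_ -match "^(?<folder>.*)\.part\d+|(?<folder>.*)\.rar$") {
        # the first alternative strips '.partN', the second strips only '.rar'
        '{0} -> {1}' -f $_, $matches.Folder
    }
}
# Visual_Studio_2015.part2.rar -> Visual_Studio_2015
# VSCodeSetup x64 1.29.1.rar -> VSCodeSetup x64 1.29.1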
Another way to do it would be this:
$parentFolder = '<THE PARENTFOLDER THAT HOLDS ALL .RAR AND .ZIP FILES>'
# Get all files inside the parent folder with extension '.rar' or '.zip'
# Because '-Filter' only accepts a single string, we need to use a 'Where-Object' clause.
# Another way would be to use the '-Include' parameter on Get-Childitem, but for that to work
# you must either also use '-Recurse' or append '\*' to the $parentfolder like this:
# Get-ChildItem -Path "$parentFolder\*" -File -Include *.rar, *.zip
Get-ChildItem -Path $parentFolder -File | Where-Object { $_.Extension -match '\.(rar|zip)$' } | ForEach-Object {
    # create the name of the subfolder by removing the '.partX' from the basename if that exists
    $subFolder = Join-Path -Path $parentFolder -ChildPath ($_.BaseName -replace '\.part\d+', '')
    # create this subfolder if it does not already exist
    if (!(Test-Path -Path $subFolder -PathType Container)) {
        New-Item -Path $subFolder -ItemType Directory -Force | Out-Null
    }
    # move the file to the subfolder
    $_ | Move-Item -Destination $subFolder
}

Deleting files with PowerShell script

I have a script in PowerShell that scans a directory of folders that are named with the following convention: yyyymmdd. It scans the directory, finds all the folders that are current and up to one week old, then copies them over to another directory. After it copies them over, I would like it to delete the folders in the new directory which are named the same way and are older than 18 months. Would there be an easy way to do this? I have pasted the script below.
$targetdirectory = "\\DPR320-W12-1600\PRTG"
$sourcedirectory = "C:\Users\Public\Documents\PRTG Traffic Grapher"
$todaysdate=get-date
$minusoneweek=$todaysdate.adddays(-7)
$minusdate=($minusoneweek.Month).tostring(),($minusoneweek.day).tostring(),($minusoneweek.year).tostring()
$todaysdatestring=($todaysdate.Month).tostring(),($todaysdate.day).tostring(),($todaysdate.year).tostring()
$oldfilename=$minusdate[0]+$minusdate[1]+$minusdate[2]+" backup"
$newfilename=$todaysdatestring[0]+$todaysdatestring[1]+$todaysdatestring[2]+" backup"
Get-ChildItem $sourcedirectory\config | Where-Object {
    $_.PsIsContainer -and
    $_.BaseName -match '\d{6}' -and
    ([DateTime]::ParseExact($_.BaseName, 'yyyyMMdd', $null) -gt (Get-Date).AddDays(-7))
} | Copy-Item -Recurse -Force -Destination $targetdirectory\$oldfilename\config
Copy-Item -Force $sourcedirectory\config.csv -Destination $targetdirectory\$oldfilename
Copy-Item -Force $sourcedirectory\config.prtg -Destination $targetdirectory\$oldfilename
rename-item $targetdirectory\$oldfilename $newfilename
Based on your discussion with FoxDeploy:
This will loop through $YourDirectory and check whether each name represents a date older than 18 months.
I used this for my own cleanup, except I have names in a dotted format.
$CleanupList = Get-ChildItem $YourDirectory
$Threshold = (get-date).AddMonths(-18)
foreach ($DirName in $CleanupList)
{
    If (([datetime]::ParseExact($DirName.BaseName,'yyyyMMdd',$null)) -lt $Threshold)
    {
        Remove-Item $DirName.FullName -Force -Recurse
    }
}
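One caveat with ParseExact: it throws if a folder name is not a valid yyyyMMdd date. A hedged variation that simply skips anything that does not parse (the -Directory switch needs PowerShell 3.0+ and the -WhatIf is only there for a dry run; both are additions to the answer above):

$Threshold = (Get-Date).AddMonths(-18)
foreach ($DirName in (Get-ChildItem $YourDirectory -Directory))
{
    $parsed = [datetime]::MinValue
    # TryParseExact returns $false for names that are not yyyyMMdd dates instead of throwing
    if ([datetime]::TryParseExact($DirName.BaseName, 'yyyyMMdd',
            [System.Globalization.CultureInfo]::InvariantCulture,
            [System.Globalization.DateTimeStyles]::None, [ref]$parsed) -and
        $parsed -lt $Threshold)
    {
        Remove-Item $DirName.FullName -Force -Recurse -WhatIf   # drop -WhatIf once the output looks right
    }
}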
Assuming $targetDirectory contains the files you'd like to remove (Those older than 18 months), just add this to the end of your script:
#resolve this day, 18 months ago
$18Months = (get-date).AddMonths(-18)
#get the name in the right format (i.e. 20141224)
$18MonthString = get-date -Date $18Months -UFormat "%Y%m%d"
#find the files with a name like this and delete them
$OldPrtgFiles = dir $targetDirectory | Where Name -like "$18MonthString*"
$OldPrtgFiles | remove-item -whatif
The first execution will display the -WhatIf preview, showing which files would be removed. If you're happy with the result, remove -WhatIf.

Compare a log file of file paths to a directory structure and remove files not in log file

I have a file transfer/sync job that is copying files from the main network into a totally secure network using a custom protocol (ie no SMB). The problem is that because I can't look back to see what files exist, the destination is filling up, as the copy doesn't remove any files it hasn't touched (like robocopy MIR does).
Initially I wrote a script that:
1. Opens the log file and grabs the file paths out (this is quite quick and painless)
2. Does a Get-ChildItem on the destination folder (now using dir /s /b as it's way faster than gci)
3. Compared the two, and then removed the differences.
The problem is that there are more jobs that require this clean-up, but the log files are 100MB and the folders contain 600,000 files, so it's taking ages and using tons of memory. I actually have yet to see one finish. I'd really like some ideas on how to make this faster (memory/CPU use doesn't bother me too much, but speed is essential).
$destinationMatch = "//server/fileshare/folder/"
The log file contains some headers and footers and then 600,000 lines like this one:
"//server/fileshare/folder/dummy/deep/tags/20140826/more_stuff/Deeper/2012-07-02_2_0.dat_v2" 33296B 0B completed
Here's the script:
[CmdletBinding(SupportsShouldProcess=$True)]
param(
    [Parameter(Mandatory=$True)]
    [String]$logName,
    [Parameter(Mandatory=$True)]
    [String]$destinationMatch
)
$logPath = [string]("C:\Logs\" + $logName)
$manifestFile = gci -Path $logPath | where {$_.name -match "manifest"} | sort creationtime -descending | select Name -first 1
$manifestFileName = [string]$manifestFile.name
$manifestFullPath = $logPath + "\" + $manifestFileName
$copiedList = @()
(gc $manifestFullPath -ReadCount 0) | where {$_.trim() -match $DestinationMatch} | % {
    if ( $_ -cmatch '(?<=")[^"]*(?=")' ){
        $copiedList += ($matches[0]).replace("/","\")
    }
}
$dest = $destinationMatch.replace("/","\")
$actualPathString = (gci -Path $dest -Recurse | select fullname).fullname
Compare-Object -ReferenceObject $copiedList -DifferenceObject $actualPathString -PassThru | % {
    $leaf = Split-Path $_ -leaf
    if ($leaf.contains(".")){
        $fsoData = gci -Path $_
        if (!($fsoData.PSIsContainer)){
            Remove-Item $_ -Force
        }
    }
}
$actualDirectory | where {$_.PSIsContainer -and @(gci -LiteralPath $_.FullName -Recurse -WarningAction SilentlyContinue -ErrorAction SilentlyContinue | where {!$_.PSIsContainer}).Length -eq 0} | remove-item -Recurse -Force
OK, so let's assume that your file copy preserves the last-modified date/time stamp. If you really need to pull a directory listing and compare it against a log, I think you're doing a decent job of it. The biggest slowdown is obviously going to be pulling your directory listing; I'll address that shortly. For right now I would propose the following modification of your code:
[CmdletBinding(SupportsShouldProcess=$True)]
param(
    [Parameter(Mandatory=$True)]
    [String]$logName,
    [Parameter(Mandatory=$True)]
    [String]$destinationMatch
)
$logPath = [string]("C:\Logs\" + $logName)
$manifestFile = gci -Path $logPath | where {$_.name -match "manifest"} | sort creationtime -descending | select -first 1
$RegExPattern = [regex]::escape($DestinationMatch)
$FilteredManifest = gc $manifestFile.FullName | where {$_ -match "`"($RegexPattern[^`"]*)`""} |%{$matches[1] -replace '/','\'}
$dest = $destinationMatch.replace("/","\")
$DestFileList = gci -Path $dest -Recurse | select Fullname,Attributes
$DestFileList | Where {$FilteredManifest -notcontains $_.FullName -and $_.Attributes -notmatch "Directory"} | ForEach-Object { Remove-Item -LiteralPath $_.FullName -Force }
$DestFileList | Where {$FilteredManifest -notcontains $_.FullName -and $_.Attributes -match "Directory" -and @(gci -LiteralPath $_.FullName -Recurse -WarningAction SilentlyContinue -ErrorAction SilentlyContinue).Length -eq 0} | ForEach-Object { Remove-Item -LiteralPath $_.FullName -Recurse -Force }
This stops you from duplicating effort. There's no need to get your manifest file and then assign different variables to different properties of the file object; just reference them directly. Then, when you pull your directory listing of the drive (the slow part here), keep the full name and attributes of the files/folders. That way you can easily filter against Attributes to see what's a directory and what's not, deal with the files first, and clean up the directories afterwards.
That script should be a somewhat more streamlined version of yours. Now, about pulling that directory listing: using Get-ChildItem is going to be slower than some alternatives (such as dir /s /b), but it saves you from duplicating effort later to check what's a file and what's a directory. If the files/folders you actually care about are a small percentage of the total, the double work may be worth it: pull the list with something like dir /s /b, parse it against the log, and only fetch folder/file info for the specific items you need to address.
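Since speed is the stated priority, one further tweak worth considering (a sketch reusing $FilteredManifest and $DestFileList from the script above): -notcontains rescans the entire manifest list for every destination file, which hurts with 600,000 entries on each side, whereas a HashSet lookup is effectively constant time.

# Build a case-insensitive lookup of everything the manifest says was copied
$manifestSet = New-Object 'System.Collections.Generic.HashSet[string]' ([System.StringComparer]::OrdinalIgnoreCase)
foreach ($path in $FilteredManifest) { [void]$manifestSet.Add($path) }

# Files first: anything on disk that the manifest does not mention gets removed
$DestFileList |
    Where-Object { $_.Attributes -notmatch 'Directory' -and -not $manifestSet.Contains($_.FullName) } |
    ForEach-Object { Remove-Item -LiteralPath $_.FullName -Force }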

Move and Manipulate files across directories - Powershell

I am trying to move a list of files from one directory to another. The catch is, When the items are moved to the new directory, I want to automatically organize them.
Example:
I have a folder of thousands of filenames.
All filenames are based on a user's userID. Some users have multiple files in this folder, so a number is appended to the end of the name, i.e. susy.txt, susy1.txt, susy2.txt, carl.txt, carl1.txt, etc.
What I am trying to do is create a specific folder (in the new directory) for each user that has multiple files, and move all associated files into that folder. For example, I notice there are multiple susy documents, so I want to create a folder named Susy and place susy.txt, susy1.txt, and susy2.txt into it, and so on for all files.
Is it even possible to do this as a batch file? If so, can someone point me in the right direction? I have a small amount of knowledge of writing batch scripts and would like to take this as an opportunity to learn more.
This is very similar to a question I have asked earlier. File and Folder Manipulation in Powershell. I am very thankful for the responses I received, they helped me greatly. The answer from Adi Inbar was exactly what I needed, at the time. However, I was forced to make a modification, which I have tried myself.
Adi Inbar's Answer
Get-ChildItem | ?{$_.Name -match '(\D+)\d*\.txt'} | %{
    md $matches[1] -ea SilentlyContinue
    Move-Item $_ $matches[1]
}
Short, sweet, and to the point, exactly what I needed. However, it only works for files that are going to be organized but stay in the same parent folder.
This is what I have attempted:
Get-ChildItem –path "P:\My Documents\Org Test\Test1" | Where{$_.Name -match '(\D+)\d*\.txt'} | Foreach{
md P:\'My Documents'\'Org Test'\Test2\$matches[1] -ea SilentlyContinue
Move-Item $_ P:\'My Documents'\'Org Test'\Test2\$matches[1]
}
To my knowledge and basic understanding this should work... But I am getting an error saying Move-Item : Cannot create a file when that file already exists.
At P:\My Documents\Org Test\Test.ps1:3 char:3
+ Move-Item -Path P:\'My Documents'\'Org Test'\Test1\$_ -destination P:\'M ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : WriteError: (P:\My Documents...t1\Johnny123.txt:FileInfo) [Move-Item], I
+ FullyQualifiedErrorId : MoveFileInfoItemIOError,Microsoft.PowerShell.Commands.MoveItemCommand
I am sure that it is on the tip of my tongue, but I cannot get it. I have very basic powershell scripting experience and just need a quick fix.
EDIT:
I have been able to "resolve" my issue by using this script:
Get-ChildItem -Path P:\'My Documents'\'PST Org Script'\Test1 | Foreach-Object {
    Move-Item -Path $_.FullName -Destination "P:\My Documents\PST Org Script\Test2" -ea SilentlyContinue }
cd P:\'My Documents'\'PST Org Script'\Test2
Get-ChildItem | ?{$_.Name -match '(\D+)\d*\.txt'} | %{
    md $matches[1] -ea SilentlyContinue
    Move-Item $_ $matches[1]
}
I am curious, though. I feel like this can be done in the three lines of code I have above; this seems a little redundant. But what do I know.
Thanks
Try this:
$srcPath = 'P:\My Documents\PST Org Script\Test1'
$dstPath = 'P:\My Documents\PST Org Script\Test2'
Get-ChildItem $srcPath | Where {$_.Name -match '(\D+)\d*\.txt'} |
    Foreach {
        $targetDir = Join-Path $dstPath $matches[1]
        md $targetDir -ea 0
        Move-Item $_ $targetDir -WhatIf
    }
I have been able to resolve my issue using this:
Get-ChildItem -Path P:\'My Documents'\'PST Org Script'\Test1 | Foreach-Object {
    Move-Item -Path $_.FullName -Destination "P:\My Documents\PST Org Script\Test2" -ea SilentlyContinue }
cd P:\'My Documents'\'PST Org Script'\Test2
Get-ChildItem | ?{$_.Name -match '(\D+)\d*\.txt'} | %{
    md $matches[1] -ea SilentlyContinue
    Move-Item $_ $matches[1]
}
However, I feel there is a shorter, simpler way of doing this via some variation of the method I entered earlier in my original question.
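On the "shorter, simpler way" point, here is a hedged single-pipeline sketch (the same Test1/Test2 layout is assumed) that groups each user's files by base name with the trailing digits stripped, so each target folder is created once and the whole group is moved together:

$src = 'P:\My Documents\PST Org Script\Test1'
$dst = 'P:\My Documents\PST Org Script\Test2'

Get-ChildItem -Path $src -Filter *.txt |
    Group-Object { $_.BaseName -replace '\d+$' } |
    ForEach-Object {
        # -Force returns the existing folder if it is already there
        $targetDir = New-Item -Path (Join-Path $dst $_.Name) -ItemType Directory -Force
        $_.Group | Move-Item -Destination $targetDir.FullName
    }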

Counting folders with Powershell

Does anybody know a PowerShell 2.0 command/script to count all folders and subfolders (recursively; no files) in a specific folder (e.g. the number of all subfolders in C:\folder1\folder2)?
In addition, I also need the number of all "leaf" folders, in other words, folders which don't have any subfolders.
In PowerShell 3.0 you can use the -Directory switch:
(Get-ChildItem -Path <path> -Directory -Recurse -Force).Count
You can use get-childitem -recurse to get all the files and folders in the current folder.
Pipe that into Where-Object to filter it to only those items that are containers.
$files = get-childitem -Path c:\temp -recurse
$folders = $files | where-object { $_.PSIsContainer }
Write-Host $folders.Count
As a one-liner:
(get-childitem -Path c:\temp -recurse | where-object { $_.PSIsContainer }).Count
To answer the second part of your question, getting the leaf-folder count, just modify the Where-Object clause to add a non-recursive search of each directory, keeping only those that return a count of 0:
(dir -rec | where-object{$_.PSIsContainer -and ((dir $_.fullname | where-object{$_.PSIsContainer}).count -eq 0)}).Count
It looks a little cleaner if you can use PowerShell 3.0:
(dir -rec -directory | where-object{(dir $_.fullname -directory).count -eq 0}).count
Another option:
(ls -force -rec | measure -inp {$_.psiscontainer} -Sum).sum
This is a pretty good starting point:
(gci -force -recurse | where-object { $_.PSIsContainer }).Count
However, I suspect that this will include .zip files in the count. I'll test that and try to post an update...
EDIT: Have confirmed that zip files are not counted as containers. The above should be fine!
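A quick hypothetical check confirms the same thing (the file name here is made up): archives are ordinary files, so PSIsContainer is false and they are not counted.

(Get-Item 'C:\Test\example.zip').PSIsContainer   # False - a .zip is a file, not a container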
Get the path's child items with the -Recurse option, pipe to filter only containers, then pipe again to Measure-Object to get the item count:
((get-childitem -Path $the_path -recurse | where-object { $_.PSIsContainer }) | measure).Count
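Since the question asks specifically for PowerShell 2.0 and for both numbers, here is a hedged sketch (C:\folder1\folder2 is the example path from the question) that avoids the 3.0-only -Directory switch and reports both counts:

$root    = 'C:\folder1\folder2'
# every folder under $root, recursively
$folders = Get-ChildItem -Path $root -Recurse -Force | Where-Object { $_.PSIsContainer }
# leaf folders are those whose direct children contain no further folders
$leaves  = $folders | Where-Object {
    @(Get-ChildItem -Path $_.FullName -Force | Where-Object { $_.PSIsContainer }).Count -eq 0
}
'Total folders: {0}' -f @($folders).Count
'Leaf folders:  {0}' -f @($leaves).Count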
