Deleting files with PowerShell script - Windows

I have a PowerShell script that scans a directory of folders named with the following convention: yyyymmdd. It scans the directory, finds all the folders dated between today and one week ago, and copies them to another directory. After it copies them, I would like it to delete the folders in the new directory that are named the same way and are older than 18 months. Is there an easy way to do this? I have pasted the script below.
$targetdirectory = "\\DPR320-W12-1600\PRTG"
$sourcedirectory = "C:\Users\Public\Documents\PRTG Traffic Grapher"
$todaysdate = Get-Date
$minusoneweek = $todaysdate.AddDays(-7)
$minusdate = ($minusoneweek.Month).ToString(), ($minusoneweek.Day).ToString(), ($minusoneweek.Year).ToString()
$todaysdatestring = ($todaysdate.Month).ToString(), ($todaysdate.Day).ToString(), ($todaysdate.Year).ToString()
$oldfilename = $minusdate[0] + $minusdate[1] + $minusdate[2] + " backup"
$newfilename = $todaysdatestring[0] + $todaysdatestring[1] + $todaysdatestring[2] + " backup"
Get-ChildItem $sourcedirectory\config | Where-Object {
    $_.PsIsContainer -and
    $_.BaseName -match '^\d{8}$' -and   # yyyyMMdd names are 8 digits; anything else would make ParseExact throw
    ([DateTime]::ParseExact($_.BaseName, 'yyyyMMdd', $null) -gt (Get-Date).AddDays(-7))
} | Copy-Item -Recurse -Force -Destination $targetdirectory\$oldfilename\config
Copy-Item -Force $sourcedirectory\config.csv -Destination $targetdirectory\$oldfilename
Copy-Item -Force $sourcedirectory\config.prtg -Destination $targetdirectory\$oldfilename
Rename-Item $targetdirectory\$oldfilename $newfilename
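As an aside, the Month/Day/Year concatenation above produces unpadded names like 6172016 backup (M-d-yyyy order), not the yyyymmdd form the source folders use. If the padded form is wanted, a date format string builds it directly; a minimal sketch using the same variables:
$oldfilename = '{0} backup' -f $minusoneweek.ToString('yyyyMMdd')   # e.g. "20160610 backup"
$newfilename = '{0} backup' -f $todaysdate.ToString('yyyyMMdd')     # e.g. "20160617 backup"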

Based on your discussion with FoxDeploy:
This will loop through $YourDirectory and check whether each name represents a date older than 18 months.
I used this for my own cleanup, except my names are in a dotted format.
$CleanupList = Get-ChildItem $YourDirectory
$Threshold = (Get-Date).AddMonths(-18)
foreach ($DirName in $CleanupList)
{
    If (([datetime]::ParseExact($DirName.BaseName, 'yyyyMMdd', $null)) -lt $Threshold)
    {
        Remove-Item $DirName.FullName -Force -Recurse
    }
}
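One caveat with the loop above: ParseExact throws if a folder's name isn't a valid yyyyMMdd date, so a single stray folder aborts the run. A more defensive sketch (assuming the same $YourDirectory, with -WhatIf as a dry run):
$Threshold = (Get-Date).AddMonths(-18)
$parsed = [datetime]::MinValue
foreach ($DirName in Get-ChildItem $YourDirectory -Directory) {
    # TryParseExact returns $false for names that don't match, instead of throwing
    if ([datetime]::TryParseExact($DirName.BaseName, 'yyyyMMdd', [cultureinfo]::InvariantCulture, [System.Globalization.DateTimeStyles]::None, [ref]$parsed) -and $parsed -lt $Threshold) {
        Remove-Item $DirName.FullName -Force -Recurse -WhatIf   # drop -WhatIf once verified
    }
}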

Assuming $targetDirectory contains the files you'd like to remove (those older than 18 months), just add this to the end of your script:
# resolve this day, 18 months ago
$18Months = (Get-Date).AddMonths(-18)
# get the name in the right format (i.e. 20141224)
$18MonthString = Get-Date -Date $18Months -UFormat "%Y%m%d"
# find the folders with a name like this and delete them
$OldPrtgFiles = dir $targetDirectory | Where Name -like "$18MonthString*"
$OldPrtgFiles | Remove-Item -WhatIf
The first execution will display the -WhatIf view, to show you which folders would be removed. If you're happy with the result, remove -WhatIf. Note that the -like filter only matches the folder dated exactly 18 months ago, so it assumes the script runs every day; comparing parsed dates, as in the loop above, catches older folders even if a day is skipped.

Related

Removing leading and trailing blank spaces in folder and file names on Windows in bulk

I tried following Remove leading spaces in Windows file names but it's not working for my use case.
I have a lot of folders and filenames that either have a blank space at the front or at the end. How would I go about removing those spaces in bulk?
This was the command-line command I used after following the linked post:
for /R %A IN ("* ") do @for /F "tokens=*" %B IN ("%~nxA") do @ren "%A" "%B"
But it didn't work out.
Update: thank you to all who replied trying to help. I think there is just a Windows-level glitch in the file system. I ended up manually creating new folders without leading and trailing spaces, dragging all the files over, and then renaming those to names without leading or trailing spaces as well.
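For what it's worth, that "glitch" is usually Win32 path normalization: names with trailing spaces are legal on NTFS, but most tools strip the space before acting on the path, so the rename targets a name that doesn't exist. The \\?\ prefix bypasses that normalization. A sketch, assuming PowerShell 7 (whose .NET runtime passes \\?\ paths through) and a hypothetical path:
# The trailing space in the source path is intentional; both paths use the \\?\ literal prefix
[System.IO.Directory]::Move('\\?\C:\data\badfolder ', '\\?\C:\data\badfolder')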
It's unclear whether you want a PowerShell solution, but it's a reasonable assumption that you might. If so, you could try this:
function Test-LeadingTrailingWhitespace {
    param(
        [Parameter(Mandatory)]
        [String]$String
    )
    $String[0] -eq ' ' -Or $String[-1] -eq ' '
}
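A quick check of the helper, just for illustration:
Test-LeadingTrailingWhitespace -String ' foo'   # True
Test-LeadingTrailingWhitespace -String 'foo '   # True
Test-LeadingTrailingWhitespace -String 'foo'    # False
Then apply it over a directory listing: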
Get-ChildItem -Path "<path_to_folder>" | ForEach-Object {
    if ($_.PSIsContainer -And (Test-LeadingTrailingWhitespace -String $_.Name)) {
        $Destination = Split-Path -Path $_.FullName -Parent
        $NewName = $_.Name.Trim()
        Move-Item -Path $_ -Destination (Join-Path -Path $Destination -ChildPath $NewName)
    }
    elseif (Test-LeadingTrailingWhitespace -String $_.BaseName) {
        $Destination = Split-Path -Path $_.FullName -Parent
        $NewName = $_.BaseName.Trim() + $_.Extension
        Move-Item -Path $_ -Destination (Join-Path -Path $Destination -ChildPath $NewName)
    }
}
To be on the safe side, you could add -WhatIf or -Confirm to the Move-Item cmdlet. The former tells you what would have changed without actually making any changes (like a 'dry run'). The latter prompts you for confirmation before each change, giving you a chance to validate incrementally rather than making changes en masse the moment you hit enter.
Trim() is a method available for all strings in PowerShell:
Returns a new string in which all leading and trailing occurrences of a set of specified characters from the current string are removed.
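For example:
'   20141224 backup   '.Trim()   # returns '20141224 backup'; interior spaces are untouched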
You can loop over files and folders and check whether they actually have leading or trailing whitespace before renaming; this avoids errors like:
Rename-Item: Source and destination path must be different.
We can use the -match matching operator with a simple regex ^\s|\s$ (starts with whitespace or ends with whitespace - regex101 link for a simple example) to see if the file or folder should be renamed:
Get-ChildItem path\to\startingfolder -Recurse | ForEach-Object {
    $newName = switch ($_) {
        # handle folders
        { $_.PSIsContainer -and $_.Name -match '^\s|\s$' } {
            $_.Name.Trim()
            break
        }
        # handle files
        { $_.BaseName -match '^\s|\s$' -or $_.Extension -match '^\s|\s$' } {
            $_.BaseName.Trim() + $_.Extension.Trim()
            break
        }
        # if none of the above conditions were true, continue with next item
        Default {
            return
        }
    }
    Rename-Item -LiteralPath $_.FullName -NewName $newName
}
Personally, I'd do this in two steps, renaming folders and files separately, to overcome the problem that when a folder is renamed, the items inside that folder all have a new path.
Using the -Force switch allows renaming items such as hidden or read-only files.
Using -ErrorAction SilentlyContinue swallows the error when the new name is equal to the existing name.
$rootPath = 'X:\thepath'
# first the folders and subfolders (deepest nesting first)
(Get-ChildItem -Path $rootPath -Directory -Recurse | Sort-Object FullName -Descending) |
    Rename-Item -NewName {$_.Name.Trim()} -Force -ErrorAction SilentlyContinue
# next the files
(Get-ChildItem -Path $rootPath -File -Recurse) |
    Rename-Item -NewName {'{0}{1}' -f $_.BaseName.Trim(), $_.Extension.Trim()} -Force -ErrorAction SilentlyContinue

PS - Find Folders that Haven't Had their Files Modified in Some Time

We're migrating our FTP and I would like to only migrate folders that have had files in them that have been used/written in the last 6 months. I would think this would be something that I find all over the place with google, but all the scripts I've found have the same fatal flaw.
It seems with everything I find, it depends on the "Date modified" of the folder. The problem with that is, I have PLENTY of folders that show a "Date Modified" of years ago, yet when you dig into it, there are files that are being created and written as recently as today.
Example:
D:/Weblogs may show a date modified of 01/01/2018 however, when you dig into it, there is some folder, idk, called "Log6" let's say, and THAT folder has a log file in it that was modified as recently as yesterday.
All these scripts I'm seeing pull the date modified of the top folder, which just doesn't seem to be accurate.
Is there any way around this? I would expect something like
Get all folders at some top level, then foreach through the CONTENTS of those folders looking for files with the datemodified -lt adddays(-180) filter. If stuff that's "New" is found, then don't add the overarching directory to the array, but if not, then list the directory.
Any ideas?
Edit: I've tried this
$path = <some dir>
gci -Path $path -Directory | Where-Object { $_.LastWriteTime -lt (Get-Date).AddMonths(-6) }
and
$filter = {$_.LastWriteTime -lt (Get-Date).AddDays(-180)}
#$OldStuff = gci "D:\FTP\BELAMIINC" -file | Where-Object $filter
$OldFolders = gci "D:\FTP\BELAMIINC" -Directory | ForEach-Object {
gci "D:\FTP\BELAMIINC" -file | Where-Object $filter
}
Write-Host $OldFolders
Give this a try; I added comments for you to follow along with the thought process.
The use of -Force is mainly to find hidden files and folders.
$path = '/path/to/parentFolder'
$limit = [datetime]::Now.AddMonths(-6)
# Get all the child folders recursively
$folders = Get-ChildItem $path -Recurse -Directory -Force
# Loop over the folders to see if there is at least one file
# that has been modified or accessed after the limit date
$result = foreach($folder in $folders)
{
    :inner foreach($file in Get-ChildItem $folder -File -Force)
    {
        if($file.LastAccessTime -gt $limit -or $file.LastWriteTime -gt $limit)
        {
            # If this condition is true at least once, break this loop and return
            # the folder's FullName, the file and the file's dates for reference
            [pscustomobject]@{
                FullName       = $folder.FullName
                File           = $file.Name
                LastAccessTime = $file.LastAccessTime
                LastWriteTime  = $file.LastWriteTime
            }
            break inner
        }
    }
}
$result | Out-GridView
If you need to find the folders that don't have recently modified files, you can take the $folders array and compare it against $result (folders that contain no files at all will also land in this list):
$folders.where({$_.FullName -notin $result.FullName}).FullName
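And since the point is a migration, $result can feed the copy directly. A sketch with a hypothetical destination root (note that folders sharing a leaf name under different parents would collide):
$migrationRoot = 'E:\FtpMigration'   # hypothetical destination
foreach ($active in $result) {
    # Recreate each recently-active folder under the destination root
    Copy-Item -Path $active.FullName -Destination (Join-Path $migrationRoot (Split-Path $active.FullName -Leaf)) -Recurse -Force
}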

Copy the latest modified log files from one server to another server (the servers are in different domains)

I am trying to write a script that copies the most recently modified log file from one server to another (the servers are in different domains). While copying, it should check the credentials and then execute.
Please let me know if the script is correct or what corrections need to be made.
$sourcePath = 'sourcepath'
$destPath = 'Destinationpath'
$compareDate = (Get-Date).AddDays(-1);
$LastFileCaptured = Get-ChildItem -Path $sourcePath |
    where {$_.Extension.EndsWith('.log') -and $_.LastWriteTime -gt $compareDate } |
    Sort LastAccessTime -Descending |
    select -First 1 |
    select -ExcludeProperty Name, LastAccessTime
Write-Host $LastFileCaptured.Name
$LastFileCaptured.LastAccessTime
$LastFileCaptured = Get-ChildItem -Recurse |
Where-Object{$_.LastWriteTime.AddDays(-1) -gt (Get-Date)}
Write-Host $LastFileCaptured
Get-ChildItem $sourcePath -Recurse -Include '.log' | Where-Object {
    $_.LastWriteTime.AddDays(-1).ToString("yyyy/MM/dd") -gt (get-date).ToString("yyyy/mm/dd")
} | ForEach-Object {
    $destDir = Split-Path ($_.FullName -replace [regex]::Escape($sourcePath), $destPath)
    if (!(Test-Path $destDir)) {
        New-Item -ItemType directory $destDir | Out-Null
    }
    Copy-Item $_ -Destination $destDir
}
The "correctness" of your script is determined easily by running it! But, while this isn't a direct answer, I would suggest robocopy for this task.
In particular note these options:
/mon: Monitors the source, and runs again when more than N changes are detected.
/maxage: Specifies the maximum file age (to exclude files older than N days or date).
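For example, something along these lines (server and share names are placeholders; verify the switches with robocopy /? first):
# Map the remote share with explicit credentials, since the servers are in different domains
net use \\newserver\logs /user:OTHERDOMAIN\svcaccount *
# Copy .log files changed in the last day, including subfolders, with a couple of retries
robocopy \\oldserver\logs \\newserver\logs *.log /S /MAXAGE:1 /R:2 /W:5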

Get-ChildItem returns no items in script, but works from command line

I have a script that does some processing and then needs to delete files from a folder that haven't been modified for 10 days.
Firstly I get the date 10 days ago with:
$deleteDate = (Get-Date).AddDays(-10)
I then try and get the file list with:
$deleteFiles = Get-ChildItem -Path $destinationPath | Where-Object { $_.LastWriteTime -le $deleteDate }
However, this doesn't return any items (I output $deleteFiles.Length). If I run the exact same command from the PowerShell command line, setting the variables first, it returns files.
I have tried adding the -Force parameter without luck.
Does the destination folder contain only files, or are there files in subfolders?
$del = Get-ChildItem -Path $destinationPath -Recurse | Where-Object { !$_.PsIsContainer -and ($_.LastWriteTime -le $deleteDate) }
This will list files in all subfolders (-Recurse) and only files (!$_.PsIsContainer).
This works for me:
$destinationPath = "c:\temp"
$deleteDate = (Get-Date).AddDays(-10)
$del=Get-ChildItem -Path $destinationPath -recurse | Where-Object {!$_.PsIsContainer -and ($_.LastWriteTime -le $deleteDate) }
$del.length
It returns the number of files.
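If the recursive version still comes up empty in the script, it's worth printing the inputs from inside the script itself. One classic gotcha: when exactly one file matches, $deleteFiles.Length is the FileInfo's Length property (the file's size in bytes), not a count. A small check, for illustration:
# @() forces an array, so .Count is always an element count, never a file size
$deleteFiles = @(Get-ChildItem -Path $destinationPath -Recurse -File | Where-Object { $_.LastWriteTime -le $deleteDate })
Write-Host "Matched $($deleteFiles.Count) file(s) under '$destinationPath'"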

How to delete all files except those created on a specific day of the week?

I have a folder structure containing SQL full and incremental backups, all with the file extension .bak. I want to delete only the incrementals. All my full backups are created on a Sunday, so essentially I want to delete all *.bak files created on any day other than a Sunday.
How can this be achieved?
PowerShell can filter files on specific criteria, including the day of the week they were created. The following block shows how; comments are inline.
# Get all files which are NOT folders, where the extension matches your required extension. Use -Recurse to do this recursively - i.e. all subfolders
$allfiles = Get-ChildItem "G:\SQL Backups" -Recurse | where {!$_.PSIsContainer -and $_.Extension -eq '.bak'}
# Filter your file list and select objects which do NOT have the Day Of Week value equal to 0 (which is a Sunday)
$nonsundayfiles = $allfiles | Where-Object { $_.CreationTime.DayOfWeek.value__ -ne 0 }
# Iterate through all the non-Sunday files and delete each one. Verbose logging to screen.
$nonsundayfiles | ForEach-Object {
    Write-Host "The file" $_.FullName "was created on" $_.CreationTime.DayOfWeek "and is therefore being deleted"
    Remove-Item $_.FullName -WhatIf
}
# Empty variables - useful if in PowerShell ISE
$allfiles = @()
$nonsundayfiles = @()
If you're feeling confident, you can do this all in one line:
Get-ChildItem "G:\SQL Backups" -Recurse | where {!$_.PSIsContainer -and $_.Extension -eq '.bak' -and $_.CreationTime.DayOfWeek.value__ -ne 0} | Remove-Item -WhatIf
Remove the -WhatIf parameter if you're happy with the end result and want to actually delete the files.
