Delete user profiles based on the date of the AppData\Local folder - Windows

Thanks to Microsoft breaking the "Delete user profiles older than a specified number of days on system restart" GPO and never fixing it after all these years, I need a script that deletes old user profiles. The thing is that instead of looking at the modification date of the user profile folder itself, I need it to delete profiles based on the modification date of the AppData\Local folder inside each profile. I've noticed that the modification date of the profile folder itself can stay unchanged for years even if the user logs in daily, but the Local folder's date does change depending on when you log in.
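To see the difference for yourself, a quick read-only check like the following lists both dates side by side (nothing here deletes anything; the property names are just for illustration):
# read-only check: compare each profile folder's own LastWriteTime with that of its AppData\Local subfolder
Get-ChildItem C:\Users -Directory | ForEach-Object {
    [pscustomobject]@{
        Profile      = $_.Name
        ProfileDate  = $_.LastWriteTime
        AppDataLocal = (Get-Item (Join-Path $_.FullName 'AppData\Local') -ErrorAction SilentlyContinue).LastWriteTime
    }
}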
So I have this snippet that I grabbed from a Spiceworks post by user cxr-aus:
$useraccounts = Get-ChildItem -Path C:\users\ | Where-Object LastWriteTime -lt (Get-Date).AddDays(-90) | Select-Object Name
$sort = $useraccounts | ForEach-Object {$_.Name}
$removeaccounts = $sort -join "|"
Get-WmiObject -Class Win32_UserProfile | Where-Object {($_.LocalPath -match "$removeaccounts") -and (!$_.Special)} | Remove-WmiObject -WhatIf
You remove the -WhatIf at the end of the code to have it actually remove the profiles.
The first problem I ran into is that I need this to remove multiple user profiles. Remove-WmiObject does not work for me because Get-WmiObject returns multiple profiles, so to work around it I use % { $_.Delete() } instead, like the following.
WARNING: Be very careful with the following code. -WhatIf does not work with it, and it may start deleting multiple profiles from your machine.
$useraccounts = Get-ChildItem -Path C:\users\ | Where-Object {$_.LastWriteTime -lt (Get-Date).AddDays(-90)} | Select-Object Name
foreach ($user in $useraccounts) {
    $Username = $user.Name
    Get-WmiObject -Class Win32_UserProfile | Where-Object {($_.LocalPath -match "$Username") -and (!$_.Special)} | % { $_.Delete() }
}
You can see that I altered some other aspects of the code to break it up so that it runs on one profile at a time. This kind of works, in that it starts deleting profiles based on the modification date of the user profile folder, but the problem is that it will delete a profile that may have been used yesterday if the modification date of the profile folder did not change.
So what I need the script to do is:
1. Get all of the user profile folders in the C:\users directory.
2. Go into each user profile folder and get the modification date of its AppData\Local folder.
3. Return only the user profile folders whose AppData\Local folder has not been modified in, in this case, 90 days.
I have tried some things to alter this code, but they all seem to be dead ends. Could someone help me figure this out?

Alright.
Make sure you test this thoroughly before using it in production.
I've added some comments to help you follow what the code is doing.
$VerbosePreference = 'Continue'
## getting the name of the users
## this returns the filtered results below
$useraccounts = Get-ChildItem -Path $env:HOMEDRIVE\users\ | Where-Object LastWriteTime -LT (Get-Date).AddDays(-90)
## $useraccounts returns
<#
Directory: /Users
UnixMode User Group LastWriteTime Size Name
-------- ---- ----- ------------- ---- ----
drwxr-x--- jacoby staff 5/2/2022 19:04 1056 jacoby
drwxrwxrwt root wheel 3/26/2022 00:21 128 Shared
#>
## $useraccounts.name returns
<#
PS > $useraccounts.name
jacoby
Shared
#>
## if we use get-childitem | get-member we can see that we have access to a lot of
## properties returned like the .name value
#### we don't need any of this
####$sort = $useraccounts | ForEach-Object {$_.Name}
#####$removeaccounts = $sort -join '|'
## ok let's do something here
## for each account in my list
foreach ($account in $useraccounts) {
## let's write some info so we know what's happening
Write-Verbose -Message ('currently working with {0}' -f $account)
## we want to check modification time of the folder so we gotta see if it exists
## we want to test c:\users\ the current account \ appdata\local
## $account.FullName gives us c:\users\accountname so we just need to add the rest
$testpath1 = $account.FullName + '\appdata\local'
Write-Verbose -Message ('we generated the path {0}' -f $testpath1)
## imma give you some error checking
## this is ugly because it could be there but we can't access it
if ((Test-Path -Path $testpath1 -ErrorAction SilentlyContinue) -eq $true) {
## if the path exists here's what we'll do
## get a count of all the file modified in the last 90 days
## add -Recurse (or -Depth) to Get-ChildItem below if you want to check subfolders too
$count = (Get-ChildItem -Path $testpath1 |
    Where-Object {
        $_.LastWriteTime -gt (Get-Date).AddDays(-90)
    }).Count
## now that we have a count we can test if the count is less than 1 (0)
## that means no files in this folder were modified in the last 90 days
if ($count -lt 1) {
####
##
## this is the area where we can take action on the
## folders/files that have not been modified in the
## last 90 days
## you might delete them or just log them somewhere
##
####
Write-Verbose -Message 'no file modified in the last 90 days'
####
## this is your original deletion pipeline
##Get-WmiObject -Class Win32_userprofile | Where-Object {($_.LocalPath -match "$account") -and (!$_.special)} | Remove-WmiObject -whatif
##i have not tested this. be careful.
####
}
else {
Write-Verbose -Message ('{0} files have been modified in the last 90 days! We do not want to delete this.' -f $count)
####
##
## this is the area where we can take action if the
## files/folders HAVE been modified recently
## we would NOT want to delete these files
##
####
}
}
## do some stuff before ending the for each loop
## maybe write our changes somewhere permanent
}
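If you do wire the deletion into the $count -lt 1 branch above, here is a hedged sketch (untested here) that uses the newer CIM cmdlets and targets only the single profile the loop is currently looking at:
## hedged sketch for the deletion step - match the exact profile path instead of a regex
Get-CimInstance -ClassName Win32_UserProfile |
    Where-Object { $_.LocalPath -eq $account.FullName -and -not $_.Special } |
    Remove-CimInstance -WhatIf    ## drop -WhatIf only after verifying the output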

Thanks again, "another victim of the mouse", for the help with the script. I have altered the script a bit for my environment:
$VerbosePreference = 'Continue'
$ExcludedUsers="Public","default","defaultuser0","public","administrator"
$path = "$Env:SystemDrive\users"
## getting the name of the users
## this returns the filtered results below
$useraccounts = Get-ChildItem -Path $path -Exclude $ExcludedUsers | Where-Object {$_.lastwritetime -lt (Get-Date).AddDays(-30)}
## $useraccounts returns
<#
Directory: /Users
UnixMode User Group LastWriteTime Size Name
-------- ---- ----- ------------- ---- ----
drwxr-x--- jacoby staff 5/2/2022 19:04 1056 jacoby
drwxrwxrwt root wheel 3/26/2022 00:21 128 Shared
#>
## $useraccounts.name returns
<#
PS > $useraccounts.name
jacoby
Shared
#>
## if we use get-childitem | get-member we can see that we have access to a lot of
## properties returned like the .name value
#### we don't need any of this
####$sort = $useraccounts | ForEach-Object {$_.Name}
#####$removeaccounts = $sort -join '|'
## ok let's do something here
## for each account in my list
foreach ($account in $useraccounts)
{
## let's write some info so we know what's happening
Write-Verbose -Message ('currently working with {0}' -f $account)
## we want to check modification time of the folder so we gotta see if it exists
## we want to test c:\users\ the current account \ appdata\local
## $account.FullName gives us c:\users\accountname so we just need to add the rest
$testpath1 = $account.FullName + '\appdata\local'
Write-Verbose -Message ('we generated the path {0}' -f $testpath1)
## imma give you some error checking
## this is ugly because it could be there but we can't access it
if ((Test-Path -Path $testpath1 -ErrorAction SilentlyContinue) -eq $true)
{
## if the path exists here's what we'll do
## get a count of all the file modified in the last 30 days
$count = (Get-ChildItem -Path $testpath1 | Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-30) }).Count
## now that we have a count we can test if the count is less than 1 (0)
## that means no files in this folder were modified in the last 30 days
if ($count -lt 1) {
####
##
## this is the area where we can take action on the
## folders/files that have not been modified in the
## last 30 days
## you might delete them or just log them somewhere
##
####
Write-Verbose -Message 'no file modified in the last 30 days'
####
## this is your original deletion pipeline
#Get-WmiObject -Class Win32_userprofile | Where-Object {($_.LocalPath -contains "$account") -and (!$_.special)} | Remove-WmiObject -whatif
##i have not tested this. be careful.
####
}
else {
Write-Verbose -Message ('{0} files have been modified in the last 30 days! We do not want to delete this.' -f $count)
####
##
## this is the area where we can take action if the
## files/folders HAVE been modified recently
## we would NOT want to delete these files
##
####
}
}
## do some stuff before ending the for each loop
## maybe write our changes somewhere permanent
}
I first changed $env:HOMEDRIVE to $env:SystemDrive because my environment is set up differently. I also added an $ExcludedUsers list so that the script does not grab the Administrator or other system profile folders. Thanks to the changes you made, Remove-WmiObject works for the multiple profiles in the Users folder, and I was able to delete over 20 profiles by running this script once. For some reason I cannot figure out, -match no longer works (it does not treat $account as a full LocalPath), so I changed it to -contains, and that seems good enough for me. For everyone else, be sure to test the script thoroughly before using it. It is a very powerful script.
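For what it's worth, one possible explanation (an assumption on my part, not verified against this exact setup) is that -match treats its right-hand side as a regular expression, so any backslashes in the expanded $account value are read as escape sequences. Escaping the name and matching only the leaf of LocalPath sidesteps that:
## hypothetical fix for the -match issue: escape the profile name and anchor on the end of LocalPath
$Username = [regex]::Escape($account.Name)
Get-WmiObject -Class Win32_UserProfile |
    Where-Object { ($_.LocalPath -match "\\$Username$") -and (-not $_.Special) } |
    Remove-WmiObject -WhatIf    ## remove -WhatIf only after testing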

Related

PS - Find Folders that Haven't Had their Files Modified in Some Time

We're migrating our FTP, and I would like to migrate only the folders that have had files in them used/written in the last 6 months. I would think this is something I could find all over the place with Google, but all the scripts I've found have the same fatal flaw.
Everything I find depends on the "Date Modified" of the folder. The problem with that is, I have PLENTY of folders that show a "Date Modified" of years ago, yet when you dig into them, there are files being created and written as recently as today.
Example:
D:\Weblogs may show a date modified of 01/01/2018; however, when you dig into it, there is some folder, say "Log6", and THAT folder has a log file in it that was modified as recently as yesterday.
All these scripts I'm seeing pull the date modified of the top folder, which just doesn't seem to be accurate.
Is there any way around this? I would expect something like
Get all folders at some top level, then foreach through the CONTENTS of those folders looking for files with the datemodified -lt adddays(-180) filter. If stuff that's "New" is found, then don't add the overarching directory to the array, but if not, then list the directory.
Any ideas?
Edit: I've tried this
$path = <some dir>
gci -Path $path -Directory | Where-Object { $_.LastWriteTime -lt (Get-Date).AddMonths(-6) }
and
$filter = {$_.LastWriteTime -lt (Get-Date).AddDays(-180)}
#$OldStuff = gci "D:\FTP\BELAMIINC" -file | Where-Object $filter
$OldFolders = gci "D:\FTP\BELAMIINC" -Directory | ForEach-Object {
gci "D:\FTP\BELAMIINC" -file | Where-Object $filter
}
Write-Host $OldFolders
Give this a try; I added comments for you to follow along with the thought process.
The use of -Force is mainly to find hidden files and folders.
$path = '/path/to/parentFolder'
$limit = [datetime]::Now.AddMonths(-6)
# Get all the child folders recursive
$folders = Get-ChildItem $path -Recurse -Directory -Force
# Loop over the folders to see if there is at least one file
# that has been modified or accessed after the limit date
$result = foreach($folder in $folders)
{
:inner foreach($file in Get-ChildItem $folder -File -Force)
{
if($file.LastAccessTime -gt $limit -or $file.LastWriteTime -gt $limit)
{
# If this condition is true at least once, break this loop and return
# the folder's FullName, the File and File's Date for reference
[pscustomobject]@{
FullName = $folder.FullName
File = $file.Name
LastAccessTime = $file.LastAccessTime
LastWriteTime = $file.LastWriteTime
}
break inner
}
}
}
$result | Out-GridView
If you need to find the folders that don't have files recently modified you can use the $folders array and compare it against $result:
$folders.where({$_.FullName -notin $result.FullName}).FullName
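If it helps to hand these lists to the migration job afterwards, a small follow-on sketch (the output paths are placeholders) just persists both of them:
# rough sketch: write the keep/skip lists to files the migration job can consume
$result | Export-Csv -Path 'C:\temp\folders-to-migrate.csv' -NoTypeInformation
$folders.Where({ $_.FullName -notin $result.FullName }).FullName |
    Set-Content -Path 'C:\temp\folders-to-skip.txt'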

Copy the latest modified log files from one server to another server (the servers are in different domains)

I am trying to write a script that copies the most recently modified log file from one server to another (the servers are in different domains). While copying, it should check the credentials and then execute the copy.
Please let me know if the script is correct or what corrections need to be made.
$sourcePath = 'sourcepath'
$destPath = 'Destinationpath'
$compareDate = (Get-Date).AddDays(-1);
$LastFileCaptured = Get-ChildItem -Path $sourcePath |
where {$_.Extension.EndsWith('.log') -and $_.LastWriteTime -gt $compareDate } |
Sort LastAccessTime -Descending |
select -First 1 |
select -ExcludeProperty Name, LastAccessTime
Write-Host $LastFileCaptured.Name
$LastFileCaptured.LastAccessTime
$LastFileCaptured = Get-ChildItem -Recurse |
Where-Object{$_.LastWriteTime.AddDays(-1) -gt (Get-Date)}
Write-Host $LastFileCaptured
Get-ChildItem $sourcePath -Recurse -Include '.log' | Where-Object {
$_.LastWriteTime.AddDays(-1).ToString("yyyy/MM/dd") -gt (get-date).ToString("yyyy/mm/dd")
} | ForEach-Object {
$destDir = Split-Path ($_.FullName -replace [regex]::Escape($sourcePath), $destPath)
if (!(Test-Path $destDir)) {
New-Item -ItemType directory $destDir | Out-Null
}
Copy-Item $_ -Destination $destDir
}
The "correctness" of your script is determined easily by running it! But, while this isn't a direct answer, I would suggest robocopy for this task.
In particular note these options:
/mon: Monitors the source, and runs again when more than N changes are detected.
/maxage: Specifies the maximum file age (to exclude files older than N days or date).
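As a rough illustration only (the server names, share paths and age window are placeholders, and credential handling will depend on how trust between the two domains is set up):
# map the destination share with explicit credentials from the other domain (prompts for the password),
# then copy only .log files modified within the last day and keep monitoring for further changes
net use \\destserver\logshare /user:DESTDOMAIN\svc_copy *
robocopy \\sourceserver\logs \\destserver\logshare *.log /maxage:1 /mon:1 /r:2 /w:5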

How to compress log files older than 30 days in Windows?

I wrote the below PowerShell script to compress logs older than 30 days:
$LastWrite=(get-date).AddDays(-30).ToString("MM/dd/yyyy")
Get-ChildItem -Filter "server.log*" -Recurse -File | Where-Object {$_.LastWriteTime -le $LastWrite}
Now, I am unable to get a compress command in PowerShell via which I can compress (zip/tar) the server.log* files older than 30 days.
I am hoping for a single command that I can append with a pipe to the command above.
You can use the Compress-Archive cmdlet to zip files if you have PowerShell version 5 or above:
$LastWrite = (get-date).AddDays(-30)
$Files = Get-ChildItem -Filter "server.log*" -Recurse -File | Where-Object {$_.LastWriteTime -le $LastWrite}
ForEach ($File in $Files) {
$File | Compress-Archive -DestinationPath "$($File.fullname).zip"
}
If you have an older version of Powershell you can use ZipFileExtensions' CreateEntryFromFile method, but there are a lot of considerations if you want a robust script that runs unattended.
In months of testing a script developed for this purpose, I encountered some issues that have made this small problem more complicated:
Will any of the files be locked? CreateEntryFromFile may fail if so.
Did you know that you can have multiple copies of the same file in a Zip archive? It's harder to extract them because you can't put them in the same folder. My script checks the file path and the archived file time stamp (+/- 2 seconds due to the lost date precision in Zip format) to determine if it's been already archived, and doesn't create a duplicate.
Are the files created in a time zone with Daylight Savings? Zip format doesn't preserve that attribute, and may lose or gain an hour when uncompressed.
Do you want to delete the original if it was successfully archived?
If unsuccessful due to a locked/missing file or very long path, should the process continue?
Will any error leave you with an unusable zip file? You need to Dispose() the archive to finalize it.
How many archives do you want to keep? I prefer one per run-month, adding new entries to an existing zip.
Do you want to preserve the relative path? Doing so will partially eliminate the problem of duplicates inside the zip file.
Mark Wragg's script should work if you don't care about these issues and you have Powershell 5, but it creates a zip for every log, which may not be what you want.
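For reference, the bare-bones shape of that approach, with none of the safeguards above, looks roughly like this (a sketch only, not the tested script that follows):
# minimal ZipFileExtensions sketch: add matching files older than 30 days to a single archive
Add-Type -AssemblyName System.IO.Compression, System.IO.Compression.FileSystem
$zip = [System.IO.Compression.ZipFile]::Open('C:\Logs\archive.zip', [System.IO.Compression.ZipArchiveMode]::Update)
try {
    Get-ChildItem -Filter 'server.log*' -Recurse -File |
        Where-Object { $_.LastWriteTime -le (Get-Date).AddDays(-30) } |
        ForEach-Object { [void][System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($zip, $_.FullName, $_.Name) }
} finally {
    $zip.Dispose()   # finalize the archive even if something failed
}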
Here's the current version of the script - in case GitHub ever becomes unavailable:
#Sends $FileSpecs files to a zip archive if they match $Filter - deleting the original if $DeleteAfterArchiving is true.
#Files that have already been archived will be ignored.
param (
[string] $ParentFolder = "$PSScriptRoot", #Files will be stored in the zip with path relative to this folder
[string[]] $FileSpecs = @("*.log","*.txt","*.svclog","*.log.*"),
$Filter = { $_.LastWriteTime -lt (Get-Date).AddDays(-7)}, #a Where-Object function - default = older than 7 days
[string] $ZipPath = "$PSScriptRoot\archive-$(get-date -f yyyy-MM).zip", #create one archive per run-month - it may contain older files
[System.IO.Compression.CompressionLevel]$CompressionLevel = [System.IO.Compression.CompressionLevel]::Optimal,
[switch] $DeleteAfterArchiving = $true,
[switch] $Verbose = $true,
[switch] $Recurse = $true
)
@( 'System.IO.Compression','System.IO.Compression.FileSystem') | % { [void][System.Reflection.Assembly]::LoadWithPartialName($_) }
Push-Location $ParentFolder #change to the folder so we can get relative path
$FileList = (Get-ChildItem $FileSpecs -File -Recurse:$Recurse | Where-Object $Filter) #CreateEntryFromFile raises UnauthorizedAccessException if item is a directory
$totalcount = $FileList.Count
$countdown = $totalcount
$skipped = @()
Try{
$WriteArchive = [IO.Compression.ZipFile]::Open( $ZipPath, [System.IO.Compression.ZipArchiveMode]::Update)
ForEach ($File in $FileList){
Write-Progress -Activity "Archiving files" -Status "Archiving file $($totalcount - $countdown) of $totalcount : $($File.Name)" -PercentComplete (($totalcount - $countdown)/$totalcount * 100)
$ArchivedFile = $null
$RelativePath = (Resolve-Path -LiteralPath "$($File.FullName)" -Relative) -replace '^.\\'
$AlreadyArchivedFile = ($WriteArchive.Entries | Where-Object {#zip will store multiple copies of the exact same file - prevent this by checking if already archived.
(($_.FullName -eq $RelativePath) -and ($_.Length -eq $File.Length) ) -and
([math]::Abs(($_.LastWriteTime.UtcDateTime - $File.LastWriteTimeUtc).Seconds) -le 2) #ZipFileExtensions timestamps are only precise within 2 seconds.
})
If($AlreadyArchivedFile -eq $null){
If($Verbose){Write-Host "Archiving $RelativePath $($File.LastWriteTimeUtc -f "yyyyMMdd-HHmmss") $($File.Length)" }
Try{
$ArchivedFile = [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($WriteArchive, $File.FullName, $RelativePath, $CompressionLevel)
}Catch{
Write-Warning "$($File.FullName) could not be archived. `n $($_.Exception.Message)"
$skipped += [psobject]@{Path=$file.FullName; Reason=$_.Exception.Message}
}
If($File.LastWriteTime.IsDaylightSavingTime() -and $ArchivedFile){#HACK: fix for buggy date - adds an hour inside archive when the zipped file was created during PDT (files created during PST are not affected). Not sure how to introduce DST attribute to file date in the archive.
$entry = $WriteArchive.GetEntry($RelativePath)
$entry.LastWriteTime = ($File.LastWriteTime.ToLocalTime() - (New-TimeSpan -Hours 1)) #TODO: This is better, but maybe not fully correct. Does it work in all time zones?
}
}Else{#Write-Warning "$($File.FullName) is already archived$(If($DeleteAfterArchiving){' and will be deleted.'}Else{'. No action taken.'})"
Write-Warning "$($File.FullName) is already archived - No action taken."
$skipped += [psobject]@{Path=$file.FullName; Reason="Already archived"}
}
If((($ArchivedFile -ne $null) -and ($ArchivedFile.FullName -eq $RelativePath)) -and $DeleteAfterArchiving) { #delete original if it's been successfully archived.
Try {
Remove-Item $File.FullName -Verbose:$Verbose
}Catch{
Write-Warning "$($File.FullName) could not be deleted. `n $($_.Exception.Message)"
}
}
$countdown = $countdown -1
}
}Catch [Exception]{
Write-Error $_.Exception
}Finally{
$WriteArchive.Dispose() #close the zip file so it can be read later
Write-Host "Sent $($totalcount - $countdown - $($skipped.Count)) of $totalcount files to archive: $ZipPath"
$skipped | Format-Table -Autosize -Wrap
}
Pop-Location
Here's a command line that will compress all server.log* files older than 30 days under the current folder:
.\ArchiveOldLogs.ps1 -FileSpecs @("server.log*") -Filter { $_.LastWriteTime -lt (Get-Date).AddDays(-30)}

How to delete all files except those created on a specific day of the week?

I have a folder structure containing SQL full and incremental backups, all with the file extension .bak. I want to delete only the incrementals. All my full backups are created on Sunday, so essentially I want to delete all *.bak files that are created on any day other than a Sunday.
How can this be achieved?
PowerShell can filter files based on specific criteria, including the day of the week they were created. The following block shows how it can be achieved; comments are inline.
# Get all files which are NOT folders, where the extension matches your required extension. Use -Recurse to do this recursively - i.e. all subfolders
$allfiles = Get-ChildItem "G:\SQL Backups" -Recurse | where {!$_.PSIsContainer -and $_.Extension -match ".bak"}
# Filter your file list and select objects whose CreationTime day of week is NOT Sunday
$nonsundayfiles = $allfiles | Where-Object { $_.CreationTime.DayOfWeek -ne [DayOfWeek]::Sunday }
#Iterate through all the non-Sunday files and delete each one. Verbose logging to screen.
$nonsundayfiles | ForEach-Object {
Write-Host "The File " $_.FullName "was created on" $_.CreationTime.DayOfWeek "and therefore is being deleted"
Remove-Item $_.FullName -WhatIf
}
#Empty variables - useful if in PowerShell ISE
$allfiles = @()
$nonsundayfiles = @()
If you're feeling confident you can do this all in one line.
Get-ChildItem "G:\SQL Backups" -Recurse | where {!$_.PSIsContainer -and $_.Extension -match ".bak" -and !$_.CreationTime.DayOfWeek.value__ -eq 0} | Remove-Item -WhatIf
Remove the -WhatIf parameter if you're happy with the end result and you want to actually delete the file.

Compare a log file of file paths to a directory structure and remove files not in log file

I have a file transfer/sync job that is copying files from the main network into a totally secure network using a custom protocol (ie no SMB). The problem is that because I can't look back to see what files exist, the destination is filling up, as the copy doesn't remove any files it hasn't touched (like robocopy MIR does).
Initially I wrote a script that:
1. Opens the log file and grabs the file paths out (this is quite quick and painless)
2. Does a Get-ChildItem on the destination folder (now using dir /s /b as it's way faster than gci)
3. Compared the two, and then removed the differences.
The problem is that there are more jobs that require this cleanup, but the log files are 100MB and the folders contain 600,000 files, so it's taking ages and using tons of memory. I actually have yet to see one finish. I'd really like some ideas on how to make this faster (memory/CPU use doesn't bother me too much, but speed is essential).
$destinationMatch = "//server/fileshare/folder/"
the log file contains some headers and footers and then 600,000 lines like this one:
"//server/fileshare/folder/dummy/deep/tags/20140826/more_stuff/Deeper/2012-07-02_2_0.dat_v2" 33296B 0B completed
Here's the script:
[CmdletBinding(SupportsShouldProcess=$True)]
param(
[Parameter(Mandatory=$True)]
[String]$logName,
[Parameter(Mandatory=$True)]
[String]$destinationMatch
)
$logPath = [string]("C:\Logs\" + $logName)
$manifestFile = gci -Path $logPath | where {$_.name -match "manifest"} | sort creationtime -descending | select Name -first 1
$manifestFileName = [string]$manifestFile.name
$manifestFullPath = $logPath + "\" + $manifestFileName
$copiedList = @()
(gc $manifestFullPath -ReadCount 0) | where {$_.trim() -match $DestinationMatch} | % {
if ( $_ -cmatch '(?<=")[^"]*(?=")' ){
$copiedList += ($matches[0]).replace("/","\")
}
}
$dest = $destinationMatch.replace("/","\")
$actualPathString = (gci -Path $dest -Recurse | select fullname).fullname
Compare-Object -ReferenceObject $copiedList -DifferenceObject $actualPathString -PassThru | % {
$leaf = Split-Path $_ -leaf
if ($leaf.contains(".")){
$fsoData = gci -Path $_
if (!($fsoData.PSIsContainer)){
Remove-Item $_ -Force
}
}
}
$actualDirectory | where {$_.PSIsContainer -and @(gci -LiteralPath $_.FullName -Recurse -WarningAction SilentlyContinue -ErrorAction SilentlyContinue | where {!$_.PSIsContainer}).Length -eq 0} | remove-item -Recurse -Force
Ok, so let's assume that your file copy preserves the last modified date/time stamp. If you really need to pull a directory listing, and compare it against a log, I think you're doing a decent job of it. The biggest slow down is obviously going to be pulling your directory listing. I'll address that shortly. For right now I would propose the following modification of your code:
[CmdletBinding(SupportsShouldProcess=$True)]
param(
[Parameter(Mandatory=$True)]
[String]$logName,
[Parameter(Mandatory=$True)]
[String]$destinationMatch
)
$logPath = [string]("C:\Logs\" + $logName)
$manifestFile = gci -Path $logPath | where {$_.name -match "manifest"} | sort creationtime -descending | select -first 1
$RegExPattern = [regex]::escape($DestinationMatch)
$FilteredManifest = gc $manifestfile.FullName | where {$_ -match "`"($RegexPattern[^`"]*)`""} |%{$matches[1] -replace '/','\'}
$dest = $destinationMatch.replace("/","\")
$DestFileList = gci -Path $dest -Recurse | select Fullname,Attributes
$DestFileList | Where-Object {$FilteredManifest -notcontains $_.FullName -and $_.Attributes -notmatch "Directory"} | ForEach-Object {Remove-Item -LiteralPath $_.FullName -Force}
$DestFileList | Where-Object {$FilteredManifest -notcontains $_.FullName -and $_.Attributes -match "Directory" -and (gci -LiteralPath $_.FullName -Recurse -WarningAction SilentlyContinue -ErrorAction SilentlyContinue).Length -eq 0} | ForEach-Object {Remove-Item -LiteralPath $_.FullName -Recurse -Force}
This stops you from duplicating efforts. There's no need to get your manifest file and then assign different variables to different properties of the file object; just reference them directly. Then later, when you pull your directory listing of the drive (the slow part here), keep the full name and attributes of the files/folders. That way you can easily filter against Attributes to see what is a directory and what is not, so we can deal with files first and then clean up empty directories afterwards.
That script should be a bit more streamlined version of yours. Now, about pulling that directory listing... Here's the deal, using Get-ChildItem is going to be slower than some alternatives (such as dir /s /b) but it stops you from having to duplicate efforts by later checking what's a file, and what's a directory. I suppose if the actual files/folders that you are concerned with are a small percentage of the total, then the double work may actually be worth the time and effort to pull the list with something like dir /s /b, and then parse against the log, and only pull folder/file info for the specific items you need to address.
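To illustrate that last idea, here is a sketch only (it reuses the $dest and $FilteredManifest names from the code above and assumes the manifest holds full destination paths): list the files with cmd's dir /s /b, and test membership against a HashSet so the lookup stays cheap even with 600,000 entries.
# fast plain-text listing via dir /s /b (files only), checked against a HashSet built from the manifest
$manifestSet = [System.Collections.Generic.HashSet[string]]::new([string[]]$FilteredManifest, [System.StringComparer]::OrdinalIgnoreCase)
cmd /c "dir /s /b /a-d `"$dest`"" | ForEach-Object {
    if (-not $manifestSet.Contains($_)) {
        Remove-Item -LiteralPath $_ -Force -WhatIf   # drop -WhatIf once the output looks right
    }
}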
