Get-ChildItem returns no items in script, but works from command line - powershell-4.0

I have a script that does some processing and then needs to delete files from a folder that haven't been modified for 10 days.
Firstly I get the date 10 days ago with:
$deleteDate = (Get-Date).AddDays(-10)
I then try to get the file list with:
$deleteFiles = Get-ChildItem -Path $destinationPath | Where-Object { $_.LastWriteTime -le $deleteDate }
However, this doesn't return any items (I output $deleteFiles.Length). If I run the exact same command from the PowerShell command line, setting the variables first, it returns files.
I have tried adding the -Force parameter without luck.
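For reference, the relevant part of the script condensed into one runnable sketch (the path is a placeholder); wrapping the result in @() makes the count reliable whether zero, one, or many files match:
$destinationPath = 'C:\some\folder'   # placeholder path
$deleteDate = (Get-Date).AddDays(-10)
# @() guarantees an array, so .Count works for 0, 1 or many matches
$deleteFiles = @(Get-ChildItem -Path $destinationPath |
    Where-Object { $_.LastWriteTime -le $deleteDate })
Write-Host "Found $($deleteFiles.Count) file(s) older than $deleteDate"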

Does this destination folder contain only files, or are the files in subfolders?
$del=Get-ChildItem -Path $destinationPath -recurse | Where-Object {!$_.PsIsContainer -and ($_.LastWriteTime -le $deleteDate) }
This will list files in all subfolders (-Recurse) and only files (!$_.PsIsContainer).
This works for me:
$destinationPath = "c:\temp"
$deleteDate = (Get-Date).AddDays(-10)
$del=Get-ChildItem -Path $destinationPath -recurse | Where-Object {!$_.PsIsContainer -and ($_.LastWriteTime -le $deleteDate) }
$del.length
And it returns the number of files.
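If the goal from the original question is to actually delete what was found, a minimal follow-up sketch, assuming $del has been populated as above; -WhatIf only previews the deletions and can be dropped once the output looks right:
# Preview first; remove -WhatIf to actually delete
$del | Remove-Item -Force -WhatIf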

Related

PowerShell: Copy files from multiple folders to one folder, appending the source server name to the file name

I currently have multiple servers on the farm and want to copy the files with the same extension from these servers to one folder. Each server has the same folder name. The goal is to copy all files from these servers (same path) to another remote folder, appending the source server name to the file name. The PowerShell script I wrote works, but I was wondering if there is a more elegant way that excludes the second foreach loop in the script (using a ForEach-Object after the Get-ChildItem command doesn't work).
Clear-Variable source
Clear-Variable destination
Clear-Variable files
Clear-Variable file
Clear-Variable newPath
Clear-Variable server
$machines = Get-Content "C:\Users\localadmin\Desktop\servers.txt"
$destination = '\\nprodserver1\C$\Users\localadmin\Documents'
foreach($server in $machines){
$source = "\\$server\L$\Logs\W3SVC21"
#$files = Get-ChildItem -File "\\$server\L$\Logs\W3SVC21" -Filter *.log -Recurse -Force | Where-Object {$_.LastWriteTime -gt (Get-Date).AddDays(-1)}
$files = Get-ChildItem -Path $source -Filter *.log -Recurse -Force | Where-Object {$_.LastWriteTime -gt (Get-Date).AddDays(-1)}
foreach ($file in $files) {
$newPath = Join-Path -Path "$destination" -ChildPath "$($file.BaseName)-$("$server")-$(Split-Path -Path $source -Leaf).log" #change the file output format here (e.g. .txt)
Copy-Item -Path $($file.FullName) -Destination $newPath
}
}
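Not a definitive rewrite, but one sketch of how the inner foreach could be folded into the pipeline with ForEach-Object, keeping the same paths and output-name format as the script above ($machines, $destination and the log share are unchanged):
foreach ($server in $machines) {
    $source = "\\$server\L$\Logs\W3SVC21"
    $leaf = Split-Path -Path $source -Leaf
    Get-ChildItem -Path $source -Filter *.log -Recurse -Force |
        Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-1) } |
        ForEach-Object {
            # Build the target name inside the pipeline instead of a second foreach
            $newPath = Join-Path -Path $destination -ChildPath "$($_.BaseName)-$server-$leaf.log"
            Copy-Item -Path $_.FullName -Destination $newPath
        }
}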

PS - Find Folders that Haven't Had their Files Modified in Some Time

We're migrating our FTP and I would like to only migrate folders that have had files in them that have been used/written in the last 6 months. I would think this would be something that I find all over the place with google, but all the scripts I've found have the same fatal flaw.
It seems with everything I find, it depends on the "Date modified" of the folder. The problem with that is, I have PLENTY of folders that show a "Date Modified" of years ago, yet when you dig into it, there are files that are being created and written as recently as today.
Example:
D:/Weblogs may show a date modified of 01/01/2018; however, when you dig into it, there is some folder, idk, called "Log6" let's say, and THAT folder has a log file in it that was modified as recently as yesterday.
All these scripts I'm seeing pull the date modified of the top folder, which just doesn't seem to be accurate.
Is there any way around this? I would expect something like
Get all folders at some top level, then foreach through the CONTENTS of those folders looking for files with the datemodified -lt adddays(-180) filter. If stuff that's "New" is found, then don't add the overarching directory to the array, but if not, then list the directory.
Any ideas?
Edit: I've tried this
$path = <some dir>
gci -Path $path -Directory | Where-Object { $_.LastWriteTime -lt (Get-Date).AddMonths(-6) }
and
$filter = {$_.LastWriteTime -lt (Get-Date).AddDays(-180)}
#$OldStuff = gci "D:\FTP\BELAMIINC" -file | Where-Object $filter
$OldFolders = gci "D:\FTP\BELAMIINC" -Directory | ForEach-Object {
gci "D:\FTP\BELAMIINC" -file | Where-Object $filter
}
Write-Host $OldFolders
Give this a try, I added comments for you to follow along the thought process.
The use of -Force is mainly to find hidden files and folders.
$path = '/path/to/parentFolder'
$limit = [datetime]::Now.AddMonths(-6)
# Get all the child folders recursive
$folders = Get-ChildItem $path -Recurse -Directory -Force
# Loop over the folders to see if there is at least one file
# that has been modified or accessed after the limit date
$result = foreach($folder in $folders)
{
:inner foreach($file in Get-ChildItem $folder -File -Force)
{
if($file.LastAccessTime -gt $limit -or $file.LastWriteTime -gt $limit)
{
# If this condition is true at least once, break this loop and return
# the folder's FullName, the File and File's Date for reference
[pscustomobject]@{
FullName = $folder.FullName
File = $file.Name
LastAccessTime = $file.LastAccessTime
LastWriteTime = $file.LastWriteTime
}
break inner
}
}
}
$result | Out-GridView
If you need to find the folders that don't have recently modified files, you can use the $folders array and compare it against $result:
$folders.where({$_.FullName -notin $result.FullName}).FullName
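For completeness, an alternative sketch closer to the wording of the question itself: keep a top-level folder only when nothing underneath it (recursively) has been written or accessed after the cutoff. It assumes the same $path and $limit as above.
Get-ChildItem $path -Directory -Force |
    Where-Object {
        # No file anywhere under this folder is newer than $limit
        -not (Get-ChildItem $_.FullName -File -Recurse -Force |
            Where-Object { $_.LastWriteTime -gt $limit -or $_.LastAccessTime -gt $limit })
    } |
    Select-Object -ExpandProperty FullName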

Copy the latest modified log files from one server to another server (the servers are in different domains)

I am trying to write a script that copies the latest modified log file from one server to another (the servers are in different domains); while copying, it should check the credentials and then run the copy.
Please let me know if the script is correct or whether any corrections need to be made.
$sourcePath = 'sourcepath'
$destPath = 'Destinationpath'
$compareDate = (Get-Date).AddDays(-1);
$LastFileCaptured = Get-ChildItem -Path $sourcePath |
where {$_.Extension.EndsWith('.log') -and $_.LastWriteTime -gt $compareDate } |
Sort LastAccessTime -Descending |
select -First 1 |
select -ExcludeProperty Name, LastAccessTime
Write-Host $LastFileCaptured.Name
$LastFileCaptured.LastAccessTime
$LastFileCaptured = Get-ChildItem -Recurse |
Where-Object{$_.LastWriteTime.AddDays(-1) -gt (Get-Date)}
Write-Host $LastFileCaptured
Get-ChildItem $sourcePath -Recurse -Include '.log' | Where-Object {
$_.LastWriteTime.AddDays(-1).ToString("yyyy/MM/dd") -gt (get-date).ToString("yyyy/mm/dd")
} | ForEach-Object {
$destDir = Split-Path ($_.FullName -replace [regex]::Escape($sourcePath), $destPath)
if (!(Test-Path $destDir)) {
New-Item -ItemType directory $destDir | Out-Null
}
Copy-Item $_ -Destination $destDir
}
The "correctness" of your script is determined easily by running it! But, while this isn't a direct answer, I would suggest robocopy for this task.
In particular note these options:
/mon: Monitors the source, and runs again when more than N changes are detected.
/maxage: Specifies the maximum file age (to exclude files older than N days or date).
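As a rough sketch of the robocopy route, assuming the placeholder source path from the question, and that the destination share in the other domain needs explicit credentials (the server, share and account names below are made up):
# Establish the cross-domain session first; '*' prompts for the password
# (server, share and account names are placeholders)
net use \\otherdomainserver\logshare * /user:OTHERDOMAIN\svcaccount
# Copy only *.log files modified within the last day; /S includes subfolders
robocopy 'sourcepath' '\\otherdomainserver\logshare' *.log /MAXAGE:1 /S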

Deleting files with PowerShell script

I have a script in PowerShell that scans a directory of folders that are named with the following convention: yyyymmdd. It scans the directory and finds all the folders that are current and up to one week old, then copies them over to another directory. After it copies them over to the other directory, I would like to have it delete the folders in the new directory which are named the same way and that are older than 18 months old. Would there be an easy way to do this? I have pasted the script below.
$targetdirectory = "\\DPR320-W12-1600\PRTG"
$sourcedirectory = "C:\Users\Public\Documents\PRTG Traffic Grapher"
$todaysdate=get-date
$minusoneweek=$todaysdate.adddays(-7)
$minusdate=($minusoneweek.Month).tostring(),($minusoneweek.day).tostring(),($minusoneweek.year).tostring()
$todaysdatestring=($todaysdate.Month).tostring(),($todaysdate.day).tostring(),($todaysdate.year).tostring()
$oldfilename=$minusdate[0]+$minusdate[1]+$minusdate[2]+" backup"
$newfilename=$todaysdatestring[0]+$todaysdatestring[1]+$todaysdatestring[2]+" backup"
Get-ChildItem $sourcedirectory\config | Where-Object {
$_.PsIsContainer -and
$_.BaseName -match '\d{6}' -and
([DateTime]::ParseExact($_.BaseName, 'yyyyMMdd', $null) -gt (Get-Date).AddDays(-7))
} |Copy-Item -Recurse -Force -Destination $targetdirectory\$oldfilename\config
Copy-Item -Force $sourcedirectory\config.csv -Destination $targetdirectory\$oldfilename
Copy-Item -Force $sourcedirectory\config.prtg -Destination $targetdirectory\$oldfilename
rename-item $targetdirectory\$oldfilename $newfilename
Based on your discussion with FoxDeploy:
This will loop through $YourDirectory and check whether each name represents a date older than 18 months.
I used this for my own cleanup, except that my names are in dotted format.
$CleanupList = Get-ChildItem $YourDirectory
$Threshold = (get-date).AddMonths(-18)
foreach ($DirName in $CleanupList)
{
If (([datetime]::ParseExact($DirName.BaseName,'yyyyMMdd',$null)) -lt $Threshold)
{
Remove-Item $DirName.FullName -Force -Recurse
}
}
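If the directory can also contain names that don't parse as yyyyMMdd (the dotted names mentioned above, for example), a variation using TryParseExact skips them instead of throwing; -WhatIf is left in as a safety net:
foreach ($DirName in $CleanupList)
{
    $parsed = [datetime]::MinValue
    # TryParseExact returns $false for names that aren't in yyyyMMdd form
    if ([datetime]::TryParseExact($DirName.BaseName, 'yyyyMMdd',
            [System.Globalization.CultureInfo]::InvariantCulture,
            [System.Globalization.DateTimeStyles]::None, [ref]$parsed) -and
        $parsed -lt $Threshold)
    {
        Remove-Item $DirName.FullName -Force -Recurse -WhatIf
    }
}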
Assuming $targetDirectory contains the files you'd like to remove (those older than 18 months), just add this to the end of your script:
#resolve this day, 18 months ago
$18Months = (get-date).AddMonths(-18)
#get the name in the right format (i.e. 20141224)
$18MonthString = get-date -Date $18Months -UFormat "%Y%m%d"
#find the files with a name like this and delete them
$OldPrtgFiles = dir $targetDirectory | Where Name -like "$18MonthString*"
$OldPrtgFiles | remove-item -whatif
The first execution will display the -WhatIf output, showing which files would be removed. If you're happy with this, remove -WhatIf.

Counting folders with PowerShell

Does anybody know a PowerShell 2.0 command/script to count all folders and subfolders (recursive; no files) in a specific folder (e.g. the number of all subfolders in C:\folder1\folder2)?
In addition, I also need the number of all "leaf" folders; in other words, I only want to count folders that don't have subfolders.
In PowerShell 3.0 you can use the -Directory switch:
(Get-ChildItem -Path <path> -Directory -Recurse -Force).Count
You can use get-childitem -recurse to get all the files and folders in the current folder.
Pipe that into Where-Object to filter it to only those items that are containers.
$files = get-childitem -Path c:\temp -recurse
$folders = $files | where-object { $_.PSIsContainer }
Write-Host $folders.Count
As a one-liner:
(get-childitem -Path c:\temp -recurse | where-object { $_.PSIsContainer }).Count
To answer the second part of your question, getting the leaf folder count, just modify the Where-Object clause to add a non-recursive search of each directory, keeping only those that return a count of 0:
(dir -rec | where-object{$_.PSIsContainer -and ((dir $_.fullname | where-object{$_.PSIsContainer}).count -eq 0)}).Count
It looks a little cleaner if you can use PowerShell 3.0:
(dir -rec -directory | where-object{(dir $_.fullname -directory).count -eq 0}).count
Another option:
(ls -force -rec | measure -inp {$_.psiscontainer} -Sum).sum
This is a pretty good starting point:
(gci -force -recurse | where-object { $_.PSIsContainer }).Count
However, I suspect that this will include .zip files in the count. I'll test that and try to post an update...
EDIT: Have confirmed that zip files are not counted as containers. The above should be fine!
Get the path's child items with the -Recurse option, pipe them to a filter that keeps only containers, then pipe again to measure the item count:
((get-childitem -Path $the_path -recurse | where-object { $_.PSIsContainer }) | measure).Count
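Combining the answers above into one snippet that reports both counts, sticking to PSIsContainer so it also works on PowerShell 2.0 (the path below is a placeholder):
$root = 'C:\folder1\folder2'
$allFolders = Get-ChildItem -Path $root -Recurse -Force | Where-Object { $_.PSIsContainer }
# A leaf folder has no subfolders of its own
$leafFolders = $allFolders | Where-Object {
    @(Get-ChildItem -Path $_.FullName -Force | Where-Object { $_.PSIsContainer }).Count -eq 0
}
Write-Host "Total folders: $(@($allFolders).Count)"
Write-Host "Leaf folders: $(@($leafFolders).Count)"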
