Recursively check all folders and export the list of folders that have no backup file created in them during the last 24 hours - powershell-4.0

There is a folder containing database backup files. I need to recursively check all folders and export the list of the last backup file in each folder.
I have code here that contains my idea; I just have to add this part:
- for each selected file (the selected file is the last created backup file), check whether its creation time is older than 24 hours and, if so, export it to a CSV file.
Thanks in advance
[CmdletBinding()]
param(
    [Parameter(Position=0,Mandatory=$false,ValueFromPipeline=$true)]
    $path = "F:\backups",
    [Parameter(Position=1,Mandatory=$false,ValueFromPipeline=$true)]
    $OutPutFilepath = "F:\backup-daily.csv"
)

function Get-LatestWroteFile{
    [CmdletBinding()]
    param(
        [Parameter(Position=0,Mandatory=$true)]$Folder
    )
    begin{
        $Latest = Get-ChildItem $Folder.FullName -File |
            Select-Object FullName, CreationTime, LastAccessTime, LastWriteTime, Attributes,
                @{N='SizeInMb';E={$_.Length/1mb}}, Name |
            Sort-Object CreationTime | Select-Object -Last 1
    }
    process{
    }
    end{
        # new custom object with the properties to export
        if($Latest){
            return New-Object PSObject -Property @{
                "FullName"      = $Latest.Name
                "LastWriteTime" = $Latest.LastWriteTime
                "Folder"        = $Folder.FullName
                "SizeInMB"      = [math]::Round($Latest.SizeInMb, 3)
            } # FileInfo = $Latest
        }
    }
}

$OutPut = @()
Get-ChildItem -Directory -Path $path -Recurse | ForEach-Object {
    $OutPut += Get-LatestWroteFile $_
}
$OutPut | ConvertTo-Csv -NoTypeInformation -Delimiter '|' | Out-File -FilePath $OutPutFilepath

An advanced function would not be required here; try the below:
Get-ChildItem -Path $Path -Recurse -File | Where-Object -FilterScript {
    ([datetime]::Now - $_.CreationTime).TotalHours -gt 24
} | Select-Object -Property Name, LastWriteTime, FullName, @{N='SizeInMb';E={$_.Length/1mb}}
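To cover the original requirement end to end (report folders whose newest backup is missing or older than 24 hours and write them to a CSV), here is a minimal sketch that reuses the $path and $OutPutFilepath parameters from the question; the $cutoff, $latest, $props and $report names are just illustrative:

# Sketch: report every folder whose newest file (by CreationTime) is missing
# or older than 24 hours, and export the result to the CSV path from the question.
$cutoff = (Get-Date).AddHours(-24)

$report = Get-ChildItem -Path $path -Directory -Recurse | ForEach-Object {
    # newest file in this folder, if any
    $latest = Get-ChildItem -Path $_.FullName -File |
              Sort-Object CreationTime |
              Select-Object -Last 1

    # keep the folder when it has no backup at all, or the newest one is too old
    if (-not $latest -or $latest.CreationTime -lt $cutoff) {
        $props = @{
            Folder       = $_.FullName
            LastBackup   = ''
            CreationTime = ''
            SizeInMB     = ''
        }
        if ($latest) {
            $props.LastBackup   = $latest.Name
            $props.CreationTime = $latest.CreationTime
            $props.SizeInMB     = [math]::Round($latest.Length / 1MB, 3)
        }
        New-Object PSObject -Property $props
    }
}

$report | Export-Csv -Path $OutPutFilepath -NoTypeInformation -Delimiter '|'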

Related

How to select the file with the maximum version number of the specified file

I want to keep only the file with the largest version number of the specified zip file in the folder, using PowerShell. I wrote a script, but it returns all the files. How can I modify the script to select only the file with the largest version?
$files = Get-ChildItem -Filter "*.zip"
$max = $files |Measure-Object -Maximum| ForEach-Object {[int]($_.Split("_")[-1].Split(".")[0])}
$largestFiles = $files | Where-Object {[int]($_.Split("_")[-1].Split(".")[0]) -eq $max}
Write-Output $largestFiles
Expectation:
A1_Fantasic_World_20.zip
A1_Fantasic_World_21.zip
B1_Mythical_Realms_11.zip
B1_Mythical_Realms_12.zip
C1_Eternal_Frame_Corporation_2.zip
C1_Eternal_Frame_Corporation_3.zip
↓
A1_Fantasic_World_21.zip
B1_Mythical_Realms_12.zip
C1_Eternal_Frame_Corporation_3.zip
A1_Fantasic_World's biggest number is 21, B1_Mythical_Realms's is 12, and C1_Eternal_Frame_Corporation's is 3. So I want to pick the biggest version of each zip.
First you add calculated properties to the file system objects you use for filtering. Then, with a combination of Group-Object, Sort-Object and Select-Object, you can filter out the desired files.
$FileList =
    Get-ChildItem -Filter *.zip |
        Select-Object -Property *,
        @{
            Name       = 'Title'
            Expression = { ($_.BaseName -split '_')[0..$(($_.BaseName -split '_').Count - 2)] -join '_' }
        },
        @{
            Name       = 'Counter'
            Expression = { [int]($_.BaseName -split '_')[-1] }
        }

$LastOnesList =
    $FileList |
        Group-Object -Property Title |
        ForEach-Object {
            $_.Group | Sort-Object -Property Counter | Select-Object -Last 1
        }

$LastOnesList |
    Select-Object -Property Name
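The same result can also be produced in a single pipeline; this is just a compact variation of the answer above, relying on the same assumption that the version number is the last underscore-separated token of the base name:

# Sketch: group on everything before the last underscore, then keep the
# highest-numbered file of each group.
Get-ChildItem -Filter *.zip |
    Group-Object -Property { ($_.BaseName -split '_')[0..(($_.BaseName -split '_').Count - 2)] -join '_' } |
    ForEach-Object {
        $_.Group |
            Sort-Object -Property { [int]($_.BaseName -split '_')[-1] } |
            Select-Object -Last 1
    } |
    Select-Object -ExpandProperty Name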

Getting root folder name with PS

I am trying to create a PowerShell script that fetches the name of the root-level folder whose subdirectories contain error-named files with today's date. Below is the sample code I have tried so far to pick the folder names.
Root log folder: C:\Errorlogs; it contains many other application-level log folders.
$targetDir="C:\Errorlogs"
Get-ChildItem $targetDir -Recurse -ErrorAction SilentlyContinue -Force -Filter "*Error*"|
where {([datetime]::now.Date -eq $_.lastwritetime.Date)} |
select FullName
I have tried the above code; however, it gives me the whole path as the result, whereas I only need the folder name.
Result - C:\Errorlogs\AsyncCreateUsersAPIProcessor\202302\04\Error.txt
Required - AsyncCreateUsersAPIProcessor
Use a regular expression with a named capture group to pull the folder name out of the full path:
$rootPath = "C:\Temp\Errorlogs"
$date = [DateTime]::Now.ToString("yyyyMM\\\\dd")
$pattern = '\\(?<folder>\w+)\\' + $date + '\\Error.*$'
$files = Get-ChildItem -Path $rootPath -Recurse | Select-Object -Property FullName | Where-Object { $_.FullName -match $pattern }
foreach ($file in $files)
{
    $file.FullName -match $pattern
    Write-Host "folder = " $Matches.folder
}
Looks like you can do it just by splitting the path using \ as the delimiter and then picking the 3rd token (index 2 of the array):
$targetDir = "C:\Errorlogs"
Get-ChildItem $targetDir -Recurse -ErrorAction SilentlyContinue -Force -Filter "*Error*" |
    Where-Object { [datetime]::Now.Date -eq $_.LastWriteTime.Date } |
    Select-Object @{ N='Name'; E={ $_.FullName.Split('\')[2] }}
Another option, if you want to go 2 levels up in the folder hierarchy, is to query the .Directory property of the file and then the .Parent property of the parent folder (2 times, or as many times as needed):
$targetDir = "C:\Errorlogs"
Get-ChildItem $targetDir -Recurse -ErrorAction SilentlyContinue -Force -Filter "*Error*" |
    Where-Object { [datetime]::Now.Date -eq $_.LastWriteTime.Date } |
    Select-Object @{ N='Name'; E={ $_.Directory.Parent.Parent.Name }}
As long as the subfolders inside the folder you are after all have numeric-only names, you can walk up the parent chain until you reach the first non-numeric folder name and output that.
$targetDir = "C:\Errorlogs"
Get-ChildItem -Path $targetDir -File -Filter "*Error*" -Recurse -Force -ErrorAction SilentlyContinue |
Where-Object { [datetime]::Now.Date -eq $_.LastWriteTime.Date } | ForEach-Object {
$parentDir = $_.Directory
while ($parentDir.Name -match '^\d+$') { $parentDir = $parentDir.Parent }
$parentDir.Name
}
That way, even a path C:\Errorlogs\AsyncCreateUsersAPIProcessor\202302\02\04\1234\567\Error.txt would produce folder name AsyncCreateUsersAPIProcessor
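Yet another way (a sketch; it assumes every matching file sits somewhere below $targetDir) is to strip the root path off the full name and keep only the first remaining segment:

# Sketch: take the path relative to $targetDir and keep its first segment.
$targetDir = "C:\Errorlogs"
Get-ChildItem $targetDir -Recurse -Force -Filter "*Error*" -ErrorAction SilentlyContinue |
    Where-Object { [datetime]::Now.Date -eq $_.LastWriteTime.Date } |
    ForEach-Object {
        $relative = $_.FullName.Substring($targetDir.Length).TrimStart('\')
        $relative.Split('\')[0]
    } |
    Sort-Object -Unique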

Compare mkv's creationtime

I've been tasked with creating a script that checks to see if the office cameras we've set up have stopped uploading their feeds to the "Camera" share located on our Windows 2016 storage server. If the NEWEST .mkv is over an hour old compared to the current time (get-date) then the "problem" camera needs to be restarted manually. (No need to script that part.)
Here's what my Director has written so far:
#Variable Definitions start here
$numhours = 1
Get-ChildItem "d:\Shares\Cameras" | Foreach {
$folderToLookAt = ($_.FullName + "\*.mkv")
$result = Get-ChildItem -Recurse $folderToLookAt | Sort-Object CreationTime -Descending
echo $result[0].FullName
echo $result[0].CreationTime
}
The first variable really isn't used yet, but I'm kind of dumbstruck as to what to do next. The above successfully returns the full names and creation times of the newest .mkvs.
Suggestions on the next part?
Invert the logic - instead of searching all the files, sorting them, finding the most recent, and checking the date, do it the other way round.
Look for files created since the cutoff, and alert if there were none found:
$cutOffTime = [datetime]::Now.AddHours(-1)
Get-ChildItem "d:\Shares\Cameras" | Foreach {
    $folderToLookAt = ($_.FullName + "\*.mkv")
    $result = Get-ChildItem -Recurse $folderToLookAt | Where-Object { $_.CreationTime -gt $cutOffTime }
    if (-not $result)
    {
        "$($_.Name) has no files since the cutoff time"
    }
}
I'm assuming your paths look like:
D:\Shares\Cameras\Camera1\file1.mkv
D:\Shares\Cameras\Camera1\file2.mkv
D:\Shares\Cameras\Camera2\file1.mkv
D:\Shares\Cameras\Camera2\file2.mkv
D:\Shares\Cameras\Camera3\file1.mkv
.
.
.
If so, I would do something like this:
# The path to your files
$CameraShareRoot = 'D:\Shares\Cameras';
# Number of Hours
$NumberOfHours = 1;
# Date and time of significance. It's $NumberOfHours in the past.
$MinFileAge = (Get-Date).AddHours( - $NumberOfHours);
# Get all the folders at the camera share root
Get-ChildItem -Path $CameraShareRoot -Directory | ForEach-Object {
# Get the most recently created file in each folder
$_ | Get-ChildItem -Recurse -Filter '*.mkv' -File | Sort-Object -Property CreationTime -Descending | Select-Object -First 1
} | Where-Object {
# Remove any files that were created after our datetime
$_.CreationTime -lt $MinFileAge;
} | Select-Object -Property FullName, CreationTime
This will just output the full file name and creation time for stale cameras.
You could do something like this to email yourself a report when the results have any files:
# The path to your files
$CameraShareRoot = 'D:\Shares\Cameras';
# Number of Hours
$NumberOfHours = 1;
# Date and time of significance. It's $NumberOfHours in the past.
$MinFileAge = (Get-Date).AddHours( - $NumberOfHours);
# Get all the folders at the camera share root, save the results to $StaleCameraFiles
$StaleCameraFiles = Get-ChildItem -Path $CameraShareRoot -Directory | ForEach-Object {
# Get the most recently created file in each folder
$_ | Get-ChildItem -Recurse -Filter '*.mkv' -File | Sort-Object -Property CreationTime -Descending | Select-Object -First 1;
} | Where-Object {
# Remove any files that were created after our datetime
$_.CreationTime -lt $MinFileAge;
}
# If there are any stale camera files
if ($StaleCameraFiles) {
# Send an email
$MailMessage = @{
    SmtpServer = 'mail.example.com';
    To         = 'youremail@example.com';
    From       = 'youremail@example.com';
    Subject    = 'Stale Camera Files';
    Body       = $StaleCameraFiles | Select-Object -Property FullName, CreationTime | ConvertTo-Html -Fragment | Out-String;
    BodyAsHtml = $true;
}
Send-MailMessage @MailMessage;
}
Generally you will want to use LastWriteTime instead of CreationTime since the latter can be updated by a file move or copy, but maybe that's what you want here.
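If you do switch to LastWriteTime, only the sort and the filter properties change; here is a sketch of that variation, reusing the $CameraShareRoot and $MinFileAge variables defined above:

# Sketch: the same stale-camera check, keyed on LastWriteTime instead of CreationTime.
Get-ChildItem -Path $CameraShareRoot -Directory | ForEach-Object {
    # most recently written file in each camera folder
    $_ | Get-ChildItem -Recurse -Filter '*.mkv' -File |
        Sort-Object -Property LastWriteTime -Descending |
        Select-Object -First 1
} | Where-Object {
    # keep only cameras whose newest file is older than the cutoff
    $_.LastWriteTime -lt $MinFileAge
} | Select-Object -Property FullName, LastWriteTime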
You have to compare the CreationTime date with (Get-Date).AddHours(-1). The AddHours method allows you to add hours to the DateTime, but also to subtract.
You can use the following example:
$Path = 'd:\Shares\Cameras'
$CreationTime = Get-ChildItem -Path $Path -Filter *.mkv |
Sort-Object -Property CreationTime -Descending |
Select-Object -First 1 -ExpandProperty CreationTime
if ($CreationTime -lt (Get-Date).AddHours(-1)) {
# your action here (restart, send mail, write output, ...)
}
It also optimizes your code a bit. ;)
$LatestFile = Get-ChildItem C:\Users\Connor\Desktop\ | Sort CreationTime | Select -Last 1
if ($LatestFile.CreationTime -gt (Get-Date).AddHours(-1)){
#It's Currently Working
} else {
#Do Other Stuff
}
try this :
Get-ChildItem "c:\temp" -Filter *.mkv -File | sort CreationTime -Descending |
select -First 1 | where CreationTime -lt (Get-Date).AddHours(-1) |
%{Write-Host "Alert !!" -ForegroundColor Red}

Powershell: How to output folder name, lastwritetime and folder size in one line?

I want to output the folder name, LastWriteTime and folder size; how can I combine both results into one line?
For folder name and lastwritetime:
get-item "\\server-01\Y$\Server1" | select name,lastwritetime
For folder size:
$folder = (Get-ChildItem "\\server-01\Y$\Server1" -recurse | Measure-Object -property length -sum)
$size = "{0:N2}" -f ($folder.sum / 1024MB) + " GB"
I need output format like this:
Name LastWriteTime Size
Server1 2014-05-05 55G
Also, how do I make a loop to run this through a list of PCs?
Any ideas, please?
For folder name and LastWriteTime:
Get-Item $Path | Select-Object BaseName, LastWriteTime
For folder size:
$log="C:\log.txt"
$Path = "C:\Test"
$Items = Get-ChildItem $Path | Where-Object {$_.PSIsContainer -eq $True} | Sort-Object
foreach ($f in $Items){
$itemSum = Get-ChildItem ("$Path\" + $f.Name) | Select-Object #{ l="Path" ; e = {$f}},LastWriteTime,#{l="Size" ; e={((Get-childitem -recurse | measure-object length -sum).Sum /1KB)}}
}
Enjoy!!
FYI
Query Folder tree for Size and export to a log on a server
Select-Object will be your friend here:
foreach ($c in (Get-Content .\Servers.txt))
{ Get-ChildItem \\$c\y$\mydirectory | Select-Object @{ l="Name"; e = {$c} }, LastWriteTime, @{ l="Size"; e = { (Get-ChildItem $_.FullName -Recurse | Measure-Object Length -Sum).Sum } } }
But you could also do yourself a favor and add a function like get-foldersize to your profile or to a standard tools module.
http://gallery.technet.microsoft.com/Get-FolderSize-b3d317f5
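A minimal sketch of such a helper; the Get-FolderInfo name and its parameter are illustrative here, not the API of the gallery script linked above:

# Sketch: a small helper that returns name, last write time and size for a folder.
# The function name and parameter are illustrative, not the gallery module's API.
function Get-FolderInfo {
    param(
        [Parameter(Mandatory = $true)]
        [string]$Path
    )
    $item = Get-Item -Path $Path
    $sum  = (Get-ChildItem -Path $Path -Recurse -File -ErrorAction SilentlyContinue |
             Measure-Object -Property Length -Sum).Sum
    New-Object PSObject -Property @{
        Name          = $item.Name
        LastWriteTime = $item.LastWriteTime
        SizeGB        = "{0:N2}" -f ($sum / 1GB)
    }
}

# Usage: one line per server from a list
# Get-Content .\Servers.txt | ForEach-Object { Get-FolderInfo -Path "\\$_\Y$\Server1" }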
Here's a true one-liner with some formatting.
Get-ChildItem -Directory -Force | ForEach { "{0,-30} {1,-30} {2:N2}MB" -f $_.Name, $_.LastWriteTime, ((Get-ChildItem $_.FullName -Recurse | Measure-Object -Property Length -Sum -ErrorAction Stop).Sum / 1MB) }
Result:

powershell exporting to text file

I'm working on a script that checks the folders in a specific directory. For example, when I run the script for the first time, it generates a txt file containing the folders in the directory.
I need the script to add any new directories that are found to the previously created txt file when the script is run again.
Does anyone have any suggestions on how to make that happen?
Here is my code so far:
$LogFolders = Get-ChildItem -Directory mydirectory ;
If (-Not (Test-Path -Path "txtfilelocated"))
{
    Add-Content txtfilelocated -Value $LogFolders
    break;
}else{
    $File = Get-Content "txtfilelocated"
    $File | ForEach-Object {
        $_ -match $LogFolders
    }
}
$File
Something like this?
You can specify which directory to check by adding a path to the Get-ChildItem cmdlet in the first line:
$a = get-childitem | ? { $_.psiscontainer } | select -expand fullname #for V2.0 and above
$a = get-childitem -Directory | select -expand fullname #for V3.0 and above
if ( test-path .\list.txt )
{
    compare-object (gc list.txt) ($a) -PassThru | Add-Content .\list.txt
}
else
{
    $a | set-content .\list.txt
}
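One caveat with Compare-Object -PassThru above: it returns differences from both sides, so a folder that is listed in list.txt but no longer exists on disk would be appended to the file again. If that matters, a variation (a sketch, keeping the same list.txt name) appends only the folders that are new on disk:

# Sketch: append only folders present on disk ($a) but not yet in list.txt.
if (Test-Path .\list.txt)
{
    Compare-Object -ReferenceObject (Get-Content .\list.txt) -DifferenceObject $a |
        Where-Object { $_.SideIndicator -eq '=>' } |
        Select-Object -ExpandProperty InputObject |
        Add-Content .\list.txt
}
else
{
    $a | Set-Content .\list.txt
}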
