How to compress log files older than 30 days in Windows?

I wrote the below PowerShell script to compress logs older than 30 days:
$LastWrite = (Get-Date).AddDays(-30).ToString("MM/dd/yyyy")
Get-ChildItem -Filter "server.log*" -Recurse -File | Where-Object {$_.LastWriteTime -le $LastWrite}
Now I can't find a compress command in PowerShell with which I can compress (zip/tar) the server.log* files older than 30 days.
I'm expecting a single command that I can attach to the above command with a pipe sign.

You can use the Compress-Archive cmdlet to zip files if you have PowerShell version 5 or above:
$LastWrite = (get-date).AddDays(-30)
$Files = Get-ChildItem -Filter "server.log*" -Recurse -File | Where-Object {$_.LastWriteTime -le $LastWrite}
ForEach ($File in $Files) {
    $File | Compress-Archive -DestinationPath "$($File.FullName).zip"
}
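If you'd rather have the single piped command the question asks for, the same cmdlet can append everything to one archive with its -Update parameter (a sketch assuming PowerShell 5+; logs-archive.zip is a placeholder name):
$LastWrite = (Get-Date).AddDays(-30)
Get-ChildItem -Filter "server.log*" -Recurse -File |
    Where-Object { $_.LastWriteTime -le $LastWrite } |
    ForEach-Object { Compress-Archive -Path $_.FullName -DestinationPath "logs-archive.zip" -Update }
Be aware that entries are stored at the archive root, so same-named logs from different folders will replace each other per -Update's semantics.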

If you have an older version of PowerShell you can use the ZipFileExtensions class's CreateEntryFromFile method, but there are a lot of considerations if you want a robust script that runs unattended.
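For reference, here's a bare-bones sketch of that approach (my own illustration, assuming .NET 4.5+ is available and using placeholder paths) - it's exactly the kind of naive version that the considerations below complicate:
Add-Type -AssemblyName System.IO.Compression, System.IO.Compression.FileSystem
$zip = [System.IO.Compression.ZipFile]::Open("C:\logs\archive.zip", 'Update') #opens or creates the archive
try {
    Get-ChildItem -Filter "server.log*" -Recurse -File |
        Where-Object { $_.LastWriteTime -le (Get-Date).AddDays(-30) } |
        ForEach-Object { [void][System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($zip, $_.FullName, $_.Name) }
} finally {
    $zip.Dispose() #finalize the archive even if something fails
}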
In months of testing a script developed for this purpose, I encountered some issues that have made this small problem more complicated:
Will any of the files be locked? CreateEntryFromFile may fail if so.
Did you know that you can have multiple copies of the same file in a Zip archive? It's harder to extract them because you can't put them in the same folder. My script checks the file path and the archived file time stamp (+/- 2 seconds due to the lost date precision in Zip format) to determine if it's been already archived, and doesn't create a duplicate.
Are the files created in a time zone with Daylight Saving Time? The Zip format doesn't preserve that attribute, and a file may lose or gain an hour when uncompressed.
Do you want to delete the original if it was successfully archived?
If unsuccessful due to a locked/missing file or very long path, should the process continue?
Will any error leave you with an unusable zip file? You need to Dispose() the archive to finalize it.
How many archives do you want to keep? I prefer one per run-month, adding new entries to an existing zip.
Do you want to preserve the relative path? Doing so will partially eliminate the problem of duplicates inside the zip file.
Mark Wragg's script should work if you don't care about these issues and you have PowerShell 5, but it creates a zip for every log, which may not be what you want.
Here's the current version of the script - in case GitHub ever becomes unavailable:
#Sends $FileSpecs files to a zip archive if they match $Filter - deleting the original if $DeleteAfterArchiving is true.
#Files that have already been archived will be ignored.
param (
    [string] $ParentFolder = "$PSScriptRoot", #files will be stored in the zip with a path relative to this folder
    [string[]] $FileSpecs = @("*.log","*.txt","*.svclog","*.log.*"),
    $Filter = { $_.LastWriteTime -lt (Get-Date).AddDays(-7) }, #a Where-Object scriptblock - default = older than 7 days
    [string] $ZipPath = "$PSScriptRoot\archive-$(Get-Date -f yyyy-MM).zip", #create one archive per run-month - it may contain older files
    [System.IO.Compression.CompressionLevel] $CompressionLevel = [System.IO.Compression.CompressionLevel]::Optimal,
    [switch] $DeleteAfterArchiving = $true,
    [switch] $Verbose = $true,
    [switch] $Recurse = $true
)
@('System.IO.Compression','System.IO.Compression.FileSystem') | % { [void][System.Reflection.Assembly]::LoadWithPartialName($_) }
Push-Location $ParentFolder #change to the folder so we can get relative paths
$FileList = (Get-ChildItem $FileSpecs -File -Recurse:$Recurse | Where-Object $Filter) #CreateEntryFromFile raises UnauthorizedAccessException if the item is a directory
$totalcount = $FileList.Count
$countdown = $totalcount
$skipped = @()
Try {
    $WriteArchive = [IO.Compression.ZipFile]::Open($ZipPath, [System.IO.Compression.ZipArchiveMode]::Update)
    ForEach ($File in $FileList) {
        Write-Progress -Activity "Archiving files" -Status "Archiving file $($totalcount - $countdown) of $totalcount : $($File.Name)" -PercentComplete (($totalcount - $countdown)/$totalcount * 100)
        $ArchivedFile = $null
        $RelativePath = (Resolve-Path -LiteralPath "$($File.FullName)" -Relative) -replace '^.\\'
        $AlreadyArchivedFile = ($WriteArchive.Entries | Where-Object { #zip will store multiple copies of the exact same file - prevent this by checking if it's already archived
            (($_.FullName -eq $RelativePath) -and ($_.Length -eq $File.Length)) -and
            ([math]::Abs(($_.LastWriteTime.UtcDateTime - $File.LastWriteTimeUtc).TotalSeconds) -le 2) #Zip timestamps are only precise to within 2 seconds
        })
        If ($AlreadyArchivedFile -eq $null) {
            If ($Verbose) { Write-Host "Archiving $RelativePath $($File.LastWriteTimeUtc.ToString("yyyyMMdd-HHmmss")) $($File.Length)" }
            Try {
                $ArchivedFile = [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($WriteArchive, $File.FullName, $RelativePath, $CompressionLevel)
            } Catch {
                Write-Warning "$($File.FullName) could not be archived. `n $($_.Exception.Message)"
                $skipped += [psobject]@{Path=$File.FullName; Reason=$_.Exception.Message}
            }
            If ($File.LastWriteTime.IsDaylightSavingTime() -and $ArchivedFile) { #HACK: fix for buggy date - adds an hour inside the archive when the zipped file was created during PDT (files created during PST are not affected). Not sure how to introduce a DST attribute to the file date in the archive.
                $entry = $WriteArchive.GetEntry($RelativePath)
                $entry.LastWriteTime = ($File.LastWriteTime.ToLocalTime() - (New-TimeSpan -Hours 1)) #TODO: this is better, but maybe not fully correct. Does it work in all time zones?
            }
        } Else { #Write-Warning "$($File.FullName) is already archived$(If($DeleteAfterArchiving){' and will be deleted.'}Else{'. No action taken.'})"
            Write-Warning "$($File.FullName) is already archived - no action taken."
            $skipped += [psobject]@{Path=$File.FullName; Reason="Already archived"}
        }
        If ((($ArchivedFile -ne $null) -and ($ArchivedFile.FullName -eq $RelativePath)) -and $DeleteAfterArchiving) { #delete the original only if it's been successfully archived
            Try {
                Remove-Item $File.FullName -Verbose:$Verbose
            } Catch {
                Write-Warning "$($File.FullName) could not be deleted. `n $($_.Exception.Message)"
            }
        }
        $countdown = $countdown - 1
    }
} Catch [Exception] {
    Write-Error $_.Exception
} Finally {
    $WriteArchive.Dispose() #close the zip file so it can be read later
    Write-Host "Sent $($totalcount - $countdown - $($skipped.Count)) of $totalcount files to archive: $ZipPath"
    $skipped | Format-Table -AutoSize -Wrap
}
Pop-Location
Here's a command line that will compress all server.log* files older than 30 days under the current folder:
.\ArchiveOldLogs.ps1 -FileSpecs @("server.log*") -Filter { $_.LastWriteTime -lt (Get-Date).AddDays(-30) }
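Since $DeleteAfterArchiving defaults to $true, originals are deleted after a successful archive; to keep them, pass the switch an explicit $false (standard PowerShell switch syntax):
.\ArchiveOldLogs.ps1 -FileSpecs @("server.log*") -Filter { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } -DeleteAfterArchiving:$false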

Related

PS - Find Folders that Haven't Had their Files Modified in Some Time

We're migrating our FTP and I would like to only migrate folders that have had files in them that have been used/written in the last 6 months. I would think this would be something I'd find all over the place with Google, but all the scripts I've found have the same fatal flaw.
It seems everything I find depends on the "Date modified" of the folder. The problem with that is, I have PLENTY of folders that show a "Date Modified" of years ago, yet when you dig into them, there are files that are being created and written as recently as today.
Example:
D:\Weblogs may show a date modified of 01/01/2018; however, when you dig into it, there is some folder, called "Log6" let's say, and THAT folder has a log file in it that was modified as recently as yesterday.
All these scripts I'm seeing pull the date modified of the top folder, which just doesn't seem to be accurate.
Is there any way around this? I would expect something like:
Get all folders at some top level, then foreach through the CONTENTS of those folders looking for files with a datemodified -lt adddays(-180) filter. If stuff that's "new" is found, then don't add the overarching directory to the array, but if not, then list the directory.
Any ideas?
Edit: I've tried this
$path = <some dir>
gci -Path $path -Directory | Where-Object { $_.LastWriteTime -lt (Get-Date).AddMonths(-6) }
and
$filter = {$_.LastWriteTime -lt (Get-Date).AddDays(-180)}
#$OldStuff = gci "D:\FTP\BELAMIINC" -file | Where-Object $filter
$OldFolders = gci "D:\FTP\BELAMIINC" -Directory | ForEach-Object {
    gci "D:\FTP\BELAMIINC" -file | Where-Object $filter
}
Write-Host $OldFolders
Give this a try; I added comments for you to follow along with the thought process.
The use of -Force is mainly to find hidden files and folders.
$path = '/path/to/parentFolder'
$limit = [datetime]::Now.AddMonths(-6)
# Get all the child folders recursively
$folders = Get-ChildItem $path -Recurse -Directory -Force
# Loop over the folders to see if there is at least one file
# that has been modified or accessed after the limit date
$result = foreach($folder in $folders)
{
    :inner foreach($file in Get-ChildItem $folder -File -Force)
    {
        if($file.LastAccessTime -gt $limit -or $file.LastWriteTime -gt $limit)
        {
            # If this condition is true at least once, break this loop and return
            # the folder's FullName, the File and the File's dates for reference
            [pscustomobject]@{
                FullName = $folder.FullName
                File = $file.Name
                LastAccessTime = $file.LastAccessTime
                LastWriteTime = $file.LastWriteTime
            }
            break inner
        }
    }
}
$result | Out-GridView
If you need to find the folders that don't have files recently modified you can use the $folders array and compare it against $result:
$folders.where({$_.FullName -notin $result.FullName}).FullName

Folder's "Date modified" incorrect (Windows 7)

Something very odd is going on with the "date modified" field of several folders on an exFAT external drive I have. A folder in which several files were recently added still shows its date modified as its creation date. Even worse, another folder with recently added files shows a date that precedes its creation date! Has anyone observed this, and does anyone know what might be going on? I have checked online and found nothing useful/relevant regarding this. The same information shows up in both Explorer and a command prompt, so it's not specific to Explorer.
Run this PowerShell script. Close Explorer before running to avoid file-locking.
# ------- give each folder the highest modified date of its files -------
function OneDir($dir)
{
    # process one folder, with the given name
    Set-Location -Path $dir.FullName
    $maxd = [datetime]::MinValue #earliest possible date, as the initial maximum
    $files = Get-ChildItem -Recurse -Filter *.* | Where-Object { $_.PsIsContainer -eq $false }
    for ($i = 0; $i -lt $files.Count; $i++)
    {
        $file = $files[$i]
        $cd = [datetime]($file.LastWriteTime)
        If ($cd -gt $maxd)
            { $maxd = $cd }
    }
    If ($files.Count -gt 0)
        { $dir.LastWriteTime = $maxd }
    Write-Host "$($dir.FullName) $($dir.LastWriteTime)"
}
#------------------------- main ------------------------------------
$startDir = Read-Host 'Folder name to start with'
Set-Location -Path $startDir
$t = Get-ItemProperty $startDir
OneDir $t
$dirs = Get-ChildItem -Recurse -Filter *.* | Where-Object { $_.PSIsContainer }
for ($d = 0; $d -lt $dirs.Count; $d++)
{
    OneDir $dirs[$d]
}
Write-Host "Finished. Press Enter"
cmd /c pause

PowerShell: Script won't count files older than 30 days from last modified date

Hello, all.
I'm stuck. I have a PowerShell script which looks to a specific folder for files which are older than 30 days from the last modified date (additionally, it'll create the folder if it doesn't exist). It creates the folder, it gives me the total files, it'll list all of the files in a test query, but it won't actually count the number of 30+ day old files. I've tried several methods to get this count (some derived from other solutions for deleting old files on this site), but PowerShell just doesn't want to do it.
Here's my code so far...
$HomePath = $env:USERPROFILE
$CompanyFolder = "\Company"
$TimeSensativeFolder = "\TimeSensative"
$TimeSensativePath = $HomePath+$CompanyFolder+$TimeSensativeFolder
$OldFilesAmount = 0
$TotalFilesAmount = 0
$TimeLimit = (Get-Date).AddDays(-30)
$StatusOK = "No old files were found in the time sensative folder."
$StatusCreated = "The time sensative folder was created."
$StatusError1 = "There were old files found in the time sensative folder!"
$StatusError2 = "Unable to create the time sensative folder!"
function MakeTimeSensativeFolder ($TimeSensativePath) {
    try {
        md $TimeSensativePath -Force -ErrorAction Stop
        Write-Host $StatusCreated
    }
    catch {
        Write-Host $StatusError2
    }
}
function CountOldFiles () {
    $OldFilesAmount = $OldFilesAmount + 1
}
if(!(Test-Path $TimeSensativePath -PathType Container)) {
    MakePHIFolder $TimeSensativePath
}
else {
}
try {
    $TotalFilesAmount = (Get-ChildItem $PHIPath -Recurse -File | Measure-Object).Count
    # I've tried this...
    Get-Item $PHIPath | Foreach {$_.LastWriteTime} -ErrorAction Stop
    if (Get-Content $_.LastWriteTime | Where-Object {$_ -gt $TimeLimit}) {
        CountOldFiles
    }
    # And I've tried this...
    Get-ChildItem -Path $PHIPath -Recurse -File | Foreach-Object {
        if (Get-Content $_.LastWriteTime | Where-Object {$_ -gt $TimeLimit}) {
            CountOldFiles
        }
    }
    # I've even tried this...
    Get-ChildItem $PHIPath -Recurse -File | ? {
        -not $_.PSIsContainer -and $_.LastWriteTime -lt $TimeLimit
    } | CountOldFiles
    # And this, as well...
    Get-ChildItem -Path $PHIPath -Recurse -File | Where-Object {$_.LastWriteTime -gt $TimeLimit} | CountOldFiles
}
catch {
    MakeTimeSensativeFolder $TimeSensativePath
}
# Used for testing.
<#
Get-ChildItem $TimeSensativePath -Recurse -File
Write-Host "TimeSensative folder exists:" $TimeSensativePathExists
Write-Host "Home TimeSensative path:" $TimeSensativePath
Write-Host "Old files found:" $OldFilesAmount
Write-Host "Total files found:" $TotalFilesAmount
Exit
#>
# Determining proper grammar for status message based on old file count.
if ($OldFilesAmount -eq 1) {
    $StatusError1 = "There was "+$OldFilesAmount+" old file of "+$TotalFilesAmount+" total found in the PHI folder!"
}
if ($OldFilesAmount -gt 1) {
    $StatusError1 = "There were "+$OldFilesAmount+" old files of "+$TotalFilesAmount+" total found in the PHI folder!"
}
# Give statuses.
if ($OldFilesAmount -gt 0) {
    Write-Host $StatusError1
}
else {
    Write-Host $StatusOK
}
Depending on which I tried, I would get no result or I'd get something like this:
Get-Content : Cannot find drive. A drive with the name '12/22/2016 17' does not exist.
At C:\Users\nobody\Scripts\PS1\ts_file_age.ps1:54 char:14
+ if (Get-Content $_.LastWriteTime | Where-Object {$_ -gt $Tim ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (12/22/2016 17:String) [Get-Content], DriveNotFoundException
+ FullyQualifiedErrorId : DriveNotFound,Microsoft.PowerShell.Commands.GetContentCommand
In any instance, there's no old file count like I'm trying to get.
It's been a bit of a head scratcher. Any advice?
Thanks so much in advance!
Filtering files by last write time is easy enough, like so:
$allFiles = gci
$d = (Get-Date).AddDays(-30)
$newFiles = @()
$oldFiles = @()
$allFiles | % { if ($_.LastWriteTime -ge $d) { $newFiles += $_ } else { $oldFiles += $_ } }
What's done here is that first all the files are put into a collection. This isn't mandatory, but it lets you browse the collection to check that it's been populated properly, which is useful in cases where one has complex paths or exclusion filters.
The second step is just to get a DateTime that is used later to divide the files into old and new ones, just like the sample did, so nothing interesting here. Actually, there's one little thing: the date is -30 days, but the hours, minutes and seconds are based on the current time. So if there's a really tight limit, consider using midnight: ([datetime]::Today).AddDays(-30)
The third step is to declare two empty collections for new and old files.
The last step is to iterate through $allFiles and check the last write time. If it's greater than or equal to the cut-off date, add the file to $newFiles, otherwise to $oldFiles.
After the last step, further processing should be simple enough - see the sketch below.
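For instance, since the original goal was a count of old files, the count now falls straight out of the two collections (a minimal sketch reusing the variables above):
$oldFilesCount = $oldFiles.Count #files older than 30 days
$newFilesCount = $newFiles.Count #files modified within the last 30 days
Write-Host "Old files found: $oldFilesCount of $($allFiles.Count) total"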
This is what I do to get (delete in this case) files older than X days:
$Days = 5
$limit = (Get-Date).AddDays(-$Days)
$Workdir = "C:\path\to\folder" #placeholder - set this to the folder you want to clean
# This will delete all files older than 5 days
Get-ChildItem -Path $Workdir -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.LastWriteTime -lt $limit } | Remove-Item -Force

Deleting files with PowerShell script

I have a script in PowerShell that scans a directory of folders that are named with the following convention: yyyymmdd. It scans the directory and finds all the folders that are current and up to one week old, then copies them over to another directory. After it copies them over to the other directory, I would like to have it delete the folders in the new directory which are named the same way and that are older than 18 months old. Would there be an easy way to do this? I have pasted the script below.
$targetdirectory = "\\DPR320-W12-1600\PRTG"
$sourcedirectory = "C:\Users\Public\Documents\PRTG Traffic Grapher"
$todaysdate=get-date
$minusoneweek=$todaysdate.adddays(-7)
$minusdate=($minusoneweek.Month).tostring(),($minusoneweek.day).tostring(),($minusoneweek.year).tostring()
$todaysdatestring=($todaysdate.Month).tostring(),($todaysdate.day).tostring(),($todaysdate.year).tostring()
$oldfilename=$minusdate[0]+$minusdate[1]+$minusdate[2]+" backup"
$newfilename=$todaysdatestring[0]+$todaysdatestring[1]+$todaysdatestring[2]+" backup"
Get-ChildItem $sourcedirectory\config | Where-Object {
    $_.PsIsContainer -and
    $_.BaseName -match '\d{6}' -and
    ([DateTime]::ParseExact($_.BaseName, 'yyyyMMdd', $null) -gt (Get-Date).AddDays(-7))
} | Copy-Item -Recurse -Force -Destination $targetdirectory\$oldfilename\config
Copy-Item -Force $sourcedirectory\config.csv -Destination $targetdirectory\$oldfilename
Copy-Item -Force $sourcedirectory\config.prtg -Destination $targetdirectory\$oldfilename
rename-item $targetdirectory\$oldfilename $newfilename
Based on your discussion with FoxDeploy:
This will loop through $YourDirectory and check whether each name represents a date older than 18 months.
I used this for my own cleanup, except I have names in a dotted format.
$CleanupList = Get-ChildItem $YourDirectory
$Threshold = (Get-Date).AddMonths(-18)
foreach ($DirName in $CleanupList)
{
    If (([datetime]::ParseExact($DirName.BaseName,'yyyyMMdd',$null)) -lt $Threshold)
    {
        Remove-Item $DirName.FullName -Force -Recurse
    }
}
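One caveat: if anything in $YourDirectory doesn't follow the yyyyMMdd convention, ParseExact will throw. A variant of the same loop (my addition, not from the original answer) that uses [datetime]::TryParseExact to skip non-matching names:
$CleanupList = Get-ChildItem $YourDirectory -Directory
$Threshold = (Get-Date).AddMonths(-18)
foreach ($DirName in $CleanupList)
{
    $parsed = [datetime]::MinValue
    if ([datetime]::TryParseExact($DirName.BaseName, 'yyyyMMdd', [cultureinfo]::InvariantCulture, [System.Globalization.DateTimeStyles]::None, [ref]$parsed) -and ($parsed -lt $Threshold))
    {
        Remove-Item $DirName.FullName -Force -Recurse -WhatIf #drop -WhatIf once the output looks right
    }
}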
Assuming $targetDirectory contains the files you'd like to remove (Those older than 18 months), just add this to the end of your script:
#resolve this day, 18 months ago
$18Months = (get-date).AddMonths(-18)
#get the name in the right format (i.e. 20141224)
$18MonthString = get-date -Date $18Months -UFormat "%Y%m%d"
#find the files with a name like this and delete them
$OldPrtgFiles = dir $targetDirectory | Where Name -like "$18MonthString*"
$OldPrtgFiles | remove-item -whatif
The first execution will display the -WhatIf view, to show you which files would be removed. If you're happy with this, remove -WhatIf.

Copy files modified in last 2 days

I would like to copy files between folders - just the ones modified (CSV files with new entries) on the current day and the day before.
Here is my code:
foreach ($file in (Get-ChildItem "D:\Shares\WinCAP Data\DAYPROT\OFS-222_2")) {
    if ($file.LastWriteTime = (Get-Date).AddDays(-1)) {
        Copy-Item -Path "D:\Shares\WinCAP Data\DAYPROT\OFS-222_2\*.csv" -Destination "\\Oracle\MP"
        "copying $file"
    } else {
        "not copying $file"
    }
}
What is wrong - any suggestions?
You need to compare the date with -gt; the = in your if is an assignment, not a comparison, and even -eq would only match files modified at that EXACT time.
Note that doing (Get-Date).AddDays(-1) is perfectly valid but will give you anything modified in the last 24 hours.
$DestinationFolder = "\\Oracle\MP\"
$EarliestModifiedTime = (Get-Date).AddDays(-1)
$Files = Get-ChildItem "D:\Shares\WinCAP Data\DAYPROT\OFS-222_2\*.csv"
foreach ($File in $Files) {
    if ($File.LastWriteTime -gt $EarliestModifiedTime)
    {
        Copy-Item $File -Destination $DestinationFolder
        Write-Host "Copying $File"
    }
    else
    {
        Write-Host "Not copying $File"
    }
}
If you didn't want to write out the "Copying ..." and "Not Copying ..." then you could simplify this quite a bit.
$DestinationFolder = "\\Oracle\MP\"
$EarliestModifiedTime = (Get-Date).AddDays(-1)
Get-ChildItem "D:\Shares\WinCAP Data\DAYPROT\OFS-222_2\*.csv" -File | ? { $_.LastWriteTime -gt $EarliestModifiedTime } | Copy-Item -Destination $DestinationFolder
Finally, if you want to copy anything modified since the beginning of (e.g. midnight at the start of) yesterday, then change the following line:
$EarliestModifiedTime = (Get-date).AddDays(-1).Date
@Mr Tree I have one more related question.
A few times per day I get a new file at location D:\Shares\WinCAP Data\DAYPROT\OFS-HT (location 1) with the fixed name abcDD.MM.YYYY.csv (e.g. abc03.09.2015.csv), and I have a service which calls my PowerShell script below every 10 minutes. I did as you suggested in the posts above. My goal is: 1. to check if there is a new file named abcDD.MM.YYYY.csv; 2. to rename it to abcDD.MM.YYYYHT.csv and move it to the "\\Oracle\MP\PURO\" (location 2) folder, where it needs to overwrite the existing file for the current day.
The problem is that if the file already exists at location 2, the script won't move it and overwrite it. Thanks for any hints.
$DestinationFolder = "\\Oracle\MP\PURO\"
$EarliestModifiedTime = (Get-Date).AddDays(-1)
Get-ChildItem "D:\Shares\WinCAP Data\DAYPROT\OFS-HT\*.csv" | ? { !($_.FullName -match "HT\.csv") } | Rename-Item -NewName { $_.Name -replace "\.csv", "HT.csv" }
$Files = Get-ChildItem "D:\Shares\WinCAP Data\DAYPROT\OFS-HT\*.csv" -File
foreach ($File in $Files) {
    if ($File.LastWriteTime -gt $EarliestModifiedTime)
    {
        Move-Item $File -Destination $DestinationFolder
        Write-Host "Moving $File"
    }
    else
    {
        Write-Host "Not moving $File"
    }
}
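No reply to that follow-up is captured here, but a likely cause (an inference from Move-Item's documented behavior, not from the original thread) is that Move-Item refuses to overwrite an existing destination file unless you add -Force:
Move-Item $File -Destination $DestinationFolder -Force #-Force permits overwriting an existing file at the destination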
