I have a read-only file, say samp.txt, and I run the following in PowerShell:
> $file = Get-Item .\samp.txt
> $file.LastAccessTime = (get-date)
we get: "Access to the path 'G:\Study_Material\Coding\samp.txt' is denied."
Now before we proceed, look at the access time:
> $file.LastAccessTime
Sunday, December 30, 2018 11:02:49 PM
Now we open WSL and do: $ touch samp.txt
Back to PowerShell we do:
> $file = Get-Item .\samp.txt
> $file.LastAccessTime
we get:
Sunday, December 30, 2018 11:19:16 PM
Thus the timestamp has been modified with no elevated privileges.
Now my question: how can I mimic this action in PowerShell alone, without removing the ReadOnly flag by modifying $file.Attributes?
When dealing with ReadOnly files, you cannot simply change the LastAccessTime (see the comments by eryksun).
To make this work in PowerShell, you need to first remove the ReadOnly flag from the file's attributes, make the change, and then reset the ReadOnly flag, like so:
$file = Get-Item .\samp.txt -Force
# test if the ReadOnly flag on the file is set
if ($file.Attributes -band 1) {
    # remove the ReadOnly flag from the file (FILE_ATTRIBUTE_READONLY = 1)
    $file.Attributes = $file.Attributes -bxor 1
    # or use: $file | Set-ItemProperty -Name IsReadOnly -Value $false
    $file.LastAccessTime = (Get-Date)
    # reset the ReadOnly flag
    $file.Attributes = $file.Attributes -bxor 1
    # or use: $file | Set-ItemProperty -Name IsReadOnly -Value $true
}
else {
    # the file is not ReadOnly, so just do the 'touch' on the LastAccessTime
    $file.LastAccessTime = (Get-Date)
}
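Alternatively, here is a more compact sketch of the same idea using the FileInfo.IsReadOnly property directly:
$file = Get-Item .\samp.txt -Force
$wasReadOnly = $file.IsReadOnly
if ($wasReadOnly) { $file.IsReadOnly = $false }   # clear the ReadOnly bit
$file.LastAccessTime = Get-Date                   # the update now succeeds
if ($wasReadOnly) { $file.IsReadOnly = $true }    # restore the original state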
You can read all about file attributes and their numeric values in the System.IO.FileAttributes documentation.
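As a quick illustration of those numeric values, you can inspect the System.IO.FileAttributes enum from PowerShell:
[int][System.IO.FileAttributes]::ReadOnly    # 1
[int][System.IO.FileAttributes]::Hidden      # 2
[int][System.IO.FileAttributes]::System      # 4
[int][System.IO.FileAttributes]::Archive     # 32
# which is why ($file.Attributes -band 1) tests the ReadOnly bit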
Here is a simple script:
$srcpth = "C:\Users\Mark\Desktop\dummy\"
$files = Get-ChildItem -Path $srcpth -File -Recurse
foreach ($f in $files) {
    $filen = $f.Name
    $filesize = $f.Length
    Write-Output "$filen $filesize"
}
This will correctly loop through all subfolders in C:\Users\Mark\Desktop\dummy and output each file name with its file size, but it does not show the relative path. How do I get the relative path?
EDIT: added below for clarification of desired output:
For example, under C:\Users\Mark\Desktop\dummy are subfolders with files
C:\Users\Mark\Desktop\dummy\file00.txt
C:\Users\Mark\Desktop\dummy\folder01\file01_01.txt
C:\Users\Mark\Desktop\dummy\folder01\file01_02.txt
C:\Users\Mark\Desktop\dummy\folder01\file01_03.txt
C:\Users\Mark\Desktop\dummy\folder02\file02_01.txt
C:\Users\Mark\Desktop\dummy\folder02\file02_01.txt
C:\Users\Mark\Desktop\dummy\folder03\file03_01.txt
C:\Users\Mark\Desktop\dummy\folder03\file03_02.txt
C:\Users\Mark\Desktop\dummy\folder03\file03_03.txt
C:\Users\Mark\Desktop\dummy\folder03\file03_04.txt
Output with above code produces:
file00.txt 9
file01_01.txt 10
file01_02.txt 12
file01_03.txt 12
file02_01.txt 15
file02_01.txt 14
file03_01.txt 11
file03_02.txt 15
file03_03.txt 13
file03_04.txt 12
But what I want is:
file00.txt 9
\folder01\file01_01.txt 10
\folder01\file01_02.txt 12
\folder01\file01_03.txt 12
\folder02\file02_01.txt 15
\folder02\file02_01.txt 14
\folder03\file03_01.txt 11
\folder03\file03_02.txt 15
\folder03\file03_03.txt 13
\folder03\file03_04.txt 12
A preceding \, no slash, or .\ are all fine.
Here you go:
$srcpth = "C:\Users\Mark\Desktop\dummy\"
$files = Get-ChildItem -Path $srcpth -File -Recurse
foreach ($f in $files) {
    $filen = $f.Name
    $filesize = $f.Length
    $relativePath = $f.FullName.Remove(0, $srcpth.Length)
    Write-Output "$filen $filesize $relativePath"
}
There isn't an object property with the value you're looking for, but you can calculate it as above. It's always useful to look at the members of an object when you're trying to figure something like this out:
$files[0] | get-member
This will give you a better idea of what you can work with, what properties you can use, and what methods are available.
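For instance, to see just the properties and their types, you can narrow the Get-Member output down:
$files[0] | Get-Member -MemberType Properties | Select-Object Name, Definition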
I would recommend outputting objects instead of strings as you're doing right now. In any case, you can get the relative paths either by using .Substring(..):
foreach ($f in Get-ChildItem -Path $srcpth -File -Recurse) {
    [pscustomobject]@{
        FileName     = $f.Name
        FileSize     = $f.Length
        # TrimEnd guards against a trailing backslash in the base path
        RelativePath = $f.FullName.Substring($srcpth.TrimEnd('\').Length + 1)
    }
}
Or if you're using PowerShell Core, you can access the .NET API Path.GetRelativePath(String, String):
foreach ($f in Get-ChildItem -Path $srcpth -File -Recurse) {
    [pscustomobject]@{
        FileName     = $f.Name
        FileSize     = $f.Length
        RelativePath = [IO.Path]::GetRelativePath($srcpth, $f.FullName)
    }
}
There is also the PathIntrinsics.NormalizeRelativePath(String, String) method, available to both Windows PowerShell and PowerShell Core, though this seems like overkill:
$ExecutionContext.SessionState.Path.NormalizeRelativePath($f.FullName, $srcpth)
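For symmetry with the loops above, here is a sketch using it the same way:
foreach ($f in Get-ChildItem -Path $srcpth -File -Recurse) {
    [pscustomobject]@{
        FileName     = $f.Name
        FileSize     = $f.Length
        RelativePath = $ExecutionContext.SessionState.Path.NormalizeRelativePath($f.FullName, $srcpth)
    }
}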
While the String.Substring() / .Remove() and [IO.Path]::GetRelativePath() solutions are sufficient when working with only absolute native paths, they fail when the -Path argument for Get-ChildItem is a relative path or a PowerShell-only path (see examples at the end of this answer for how they can fail).
For a solution that additionally supports PowerShell paths and relative paths, I recommend using Resolve-Path -Relative:
# For this demo, create a PowerShell-only path.
$null = New-PSDrive -Name TempDrv -Root ([IO.Path]::GetTempPath()) -PSProvider FileSystem
$srcpth = 'TempDrv:\RelativePathTest'
$null = New-Item "$srcpth\subdir\test.txt" -Force

# Set base path for Get-ChildItem and Resolve-Path. This is necessary because
# Resolve-Path -Relative resolves paths relative to the current directory.
Push-Location $srcpth
try {
    foreach ($f in Get-ChildItem -File -Recurse) {
        [pscustomobject]@{
            FileName     = $f.Name
            FileSize     = $f.Length
            RelativePath = Resolve-Path $f.FullName -Relative
            # Alternative to remove ".\" or "./" prefix from the path:
            # RelativePath = (Resolve-Path $f.FullName -Relative) -replace '^\.[\\/]'
        }
    }
}
finally {
    # Restore the current directory even in case of a script-terminating error
    Pop-Location
}
Output:
FileName FileSize RelativePath
-------- -------- ------------
test.txt 0 .\subdir\test.txt
Modes of failure:
This is how the String.Substring() method fails for the PowerShell path of the sample above, on my system (you may see a different outcome depending on the location of your temp directory):
FileName FileSize RelativePath
-------- -------- ------------
test.txt 0 ubdir\test.txt
And this is how [IO.Path]::GetRelativePath() fails:
FileName FileSize RelativePath
-------- -------- ------------
test.txt 0 ..\..\..\..\temp\RelativePathTest\subdir\test.txt
I am writing a function in PowerShell 7 that flattens a directory.
It's ideally supposed to:
Copy / Move everything to a temp directory (Depending on whether a destination was supplied)
Rename all files that have identical filenames with a _XX numerical suffix (Padding controlled by a parameter)
Move everything back to the root of the original directory, or the destination directory supplied.
Here is the gist.
Here is the relevant code, with the documentation stripped to save space, as it's a long one:
function Merge-FlattenDirectory {
    [CmdletBinding(SupportsShouldProcess)]
    param (
        [Parameter(Mandatory, Position = 0, ValueFromPipeline)]
        [ValidateScript({
            if (!(Test-Path -LiteralPath $_)) {
                throw [System.ArgumentException] "Path does not exist."
            }
            if ((Test-IsSensitiveWindowsPath -Path $_ -Strict).IsSensitive) {
                throw [System.ArgumentException] "Path supplied is a protected OS directory."
            }
            return $true
        })]
        [Alias("source", "input", "i")]
        [string]
        $SourcePath,

        [Parameter(Mandatory = $false, Position = 1, ValueFromPipelineByPropertyName)]
        [Alias("destination", "dest", "output", "o")]
        [string]
        $DestinationPath = $null,

        [Parameter(Mandatory = $false)]
        [Switch]
        $Force,

        [Parameter(Mandatory = $false, ValueFromPipelineByPropertyName)]
        [ValidateSet(1, 2, 3, 4, 5)]
        [int32]
        $DuplicatePadding = 2
    )
    begin {
        # Trim trailing backslashes and initialize a new temporary directory.
        $SourcePath = $SourcePath.TrimEnd('\')
        $DestinationPath = $DestinationPath.TrimEnd('\')
        $TempPath = (New-TempDirectory).FullName
        New-Item -ItemType Directory -Force -Path $TempPath

        # Escape $SourcePath so we can use wildcards.
        $Source = [WildcardPattern]::Escape($SourcePath)

        # If there is no $DestinationPath supplied, we flatten only the SourcePath.
        # Thus, set DestinationPath to be the same as the SourcePath.
        if (!$DestinationPath) {
            $DestinationPath = $SourcePath
            # Since there is no destination supplied, we move everything to a temporary
            # directory for further processing.
            Move-Item -Path $Source'\*' -Destination $TempPath -Force
        } else {
            # We need to perform some parameter validation on DestinationPath:
            # Make sure the passed Destination is not a file.
            if (Test-Path -LiteralPath $DestinationPath -PathType Leaf) {
                throw [System.IO.IOException] "Please provide a valid directory, not a file."
            }
            # Make sure the passed Destination is a validly formed Windows path.
            if (!(Confirm-ValidWindowsPath -Path $DestinationPath -Container)) {
                throw [System.IO.IOException] "Invalid Destination Path. Please provide a valid directory."
            }
            # Make sure the passed Destination is not in a protected or sensitive OS location.
            if ((Test-IsSensitiveWindowsPath -Path $DestinationPath -Strict).IsSensitive) {
                throw [System.IO.IOException] "The destination path is, or resides in, a protected operating system directory."
            }
            # Since a destination was supplied, we copy everything to a new temp directory
            # instead of moving everything. We want the source directory to remain untouched.
            # Robocopy seems to be the most performant here.
            # Robocopy on Large Dataset:  ~789ms - ~810ms
            # Copy-Item on Large Dataset: ~1203ms - ~1280ms
            #
            # Copy-Item -Path $Source'\*' -Destination $TempPath -Force -Recurse
            Robocopy $Source $TempPath /COPYALL /B /E /R:0 /W:0 /NFL /NDL /NC /NS /NP /MT:48

            # Create the destination directory now, ready for population in the process block.
            New-Item -ItemType Directory -Force -Path $DestinationPath
        }

        # Grab all files as an array of FileInfo objects.
        $AllFiles = [IO.DirectoryInfo]::new($TempPath).GetFiles('*', 'AllDirectories')
        # Initialize a hashtable to store duplicate files.
        $Duplicates = @{}
    }
    process {
        ##
        # $Stopwatch = [System.Diagnostics.Stopwatch]::StartNew()
        #
        # Iterate over all files.
        foreach ($File in $AllFiles) {
            # If our $Duplicates hashtable already contains the current filename, we have a duplicate.
            if ($Duplicates.Contains($File.Name)) {
                # Rename the duplicate file by appending a numerical index to the end of the file.
                $PathTemp = Get-ItemProperty -LiteralPath $File
                $RenamedFile = Rename-Item -LiteralPath $PathTemp.PSPath -PassThru -NewName ('{0}_{1}{2}' -f @(
                    $File.BaseName
                    $Duplicates[$File.Name].ToString().PadLeft($DuplicatePadding, '0')
                    $File.Extension
                ))
                # Increment the duplicate counter and pass $File down to be moved.
                $Duplicates[$File.Name]++
                $File = $RenamedFile
            } else {
                # No duplicates were detected. Add a value of 1 to the duplicates
                # hashtable to represent the current file. Pass $File down to be moved.
                $PathTemp = Get-ItemProperty -LiteralPath $File
                $Duplicates[$File.Name] = 1
                $File = $PathTemp
            }
            # If Force is specified, we don't have to worry about duplicate files,
            # as the operation will overwrite every file with a duplicate filename.
            if ($Force) {
                # Move the file to its appropriate destination. (Force)
                Move-Item -LiteralPath $File -Destination $DestinationPath -Force
            } else {
                try {
                    # Move the file to its appropriate destination. (Non-Force)
                    Move-Item -LiteralPath $File -Destination $DestinationPath -ErrorAction Stop
                } catch {
                    # Warn the user that files were skipped because of duplicate filenames.
                    Write-Warning "File already exists in the destination folder. Skipping this file."
                }
            }
            # Return each file to the pipeline.
            # $File
        }
        # $Stopwatch.Stop()
        # Write-Host "`$Stopwatch.Elapsed:             " $Stopwatch.Elapsed -ForegroundColor Green
        # Write-Host "`$Stopwatch.ElapsedMilliseconds: " $Stopwatch.ElapsedMilliseconds -ForegroundColor Green
        # Write-Host "`$Stopwatch.ElapsedTicks:        " $Stopwatch.ElapsedTicks -ForegroundColor Green
    }
    end {
    }
}
# Merge-FlattenDirectory "C:\Users\username\Desktop\Testing\Test" "C:\Users\username\Desktop\Testing\TestFlat" -Force
The function works great for the most part, but there is a major problem I didn't anticipate. The code is vulnerable to naming collisions. Here's a problematic directory structure:
(Root directory to be flattened is C:\Users\username\Desktop\Testing\Test)
Directory: C:\Users\username\Desktop\Testing\Test
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a--- 11/4/2021 10:03 PM 1552565 1088_p_01.jpg
-a--- 11/4/2021 10:03 PM 1552565 1088_p_02.jpg
-a--- 11/4/2021 10:03 PM 1552565 1088_p_03.jpg
Directory: C:\Users\username\Desktop\Testing\Test\Folder
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a--- 11/4/2021 10:03 PM 1552565 1088_p_03.jpg
-a--- 11/4/2021 10:03 PM 1552565 1088_p.jpg
Directory: C:\Users\username\Desktop\Testing\Test\Testing
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a--- 11/4/2021 10:03 PM 1552565 1088_p_01.jpg
-a--- 11/4/2021 10:03 PM 1552565 1088_p.jpg
If I run the function to flatten C:\Users\username\Desktop\Testing\Test I get only six files instead of seven in the destination folder. The folder is missing the second 1088_p.jpg. I can verify this by going to my temp directory and looking at what's left:
C:\Users\username\AppData\Local\Temp\DdtElMvSoXbJf\Testing\1088_p.jpg is still in temp. (The rename suffix _01 can collide with an original 1088_p_01.jpg that was already moved, so the non-Force Move-Item skips the file.)
Anyway, if you're still with me after all this, thank you for reading.
I need to refactor the function in a way that accounts for this edge case, and I can't figure out how to do it gracefully. I could use some help or guidance from someone who can point me in the right direction. I've been working on this function for a while now and I'd really like to wrap it up.
Many, many thanks.
Edit:
I have a working solution now. I added an additional layer of duplication checks, and moved the actual renaming of the file further down.
Here's the revised code (Only relevant portion included):
# Iterate over all files.
foreach ($File in $AllFiles) {
    # If our $Duplicates hashtable already contains the current filename, we have a duplicate.
    if ($Duplicates.Contains($File.Name)) {
        # Create a new name for the file by appending a numerical index to the end of the filename.
        $PathTemp = Get-ItemProperty -LiteralPath $File
        $NewName = ('{0}_{1}{2}' -f @(
            $File.BaseName
            $Duplicates[$File.Name].ToString().PadLeft($DuplicatePadding, '0')
            $File.Extension
        ))
        # Check if our new name collides with any other filenames in $Duplicates. If so, create
        # another new name by appending an additional numeric index to the end of the filename.
        $DuplicateCount = 1
        while ($Duplicates[$NewName]) {
            $NewName = ('{0}_{1}{2}' -f @(
                [System.IO.Path]::GetFileNameWithoutExtension($NewName)
                $DuplicateCount.ToString().PadLeft($DuplicatePadding, '0')
                [System.IO.Path]::GetExtension($NewName)
            ))
            Write-Warning $DuplicateCount.ToString().PadLeft($DuplicatePadding, '0')
            $DuplicateCount++
            # If we're at a depth of 8, throw. Something is obviously wrong.
            if ($DuplicateCount -ge 8) {
                throw [System.Exception] "Duplicate count reached limit."
            }
        }
        # Finally, rename the file with our new name.
        $RenamedFile = Rename-Item -LiteralPath $PathTemp.PSPath -PassThru -NewName $NewName
        # Increment the duplicate counters and pass $File down to be moved.
        $Duplicates[$File.Name]++
        $Duplicates[$NewName]++
        $File = $RenamedFile
    } else {
        # No duplicates were detected. Add a value of 1 to the duplicates
        # hashtable to represent the current file. Pass $File down to be moved.
        $PathTemp = Get-ItemProperty -LiteralPath $File
        $Duplicates[$File.Name] = 1
        $File = $PathTemp
    }
    # If Force is specified, we don't have to worry about duplicate files,
    # as the operation will overwrite every file with a duplicate filename.
    if ($Force) {
        # Move the file to its appropriate destination. (Force)
        Move-Item -LiteralPath $File -Destination $DestinationPath -Force
    } else {
        try {
            # Move the file to its appropriate destination. (Non-Force)
            Move-Item -LiteralPath $File -Destination $DestinationPath -ErrorAction Stop
        } catch {
            # Warn the user that files were skipped because of duplicate filenames.
            Write-Warning "File already exists in the destination folder. Skipping this file."
        }
    }
    # Return each file to the pipeline.
    $File
}
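For anyone who wants to test the collision-safe suffix logic in isolation, here is a minimal, standalone sketch; the $taken table and the file names are hypothetical stand-ins for $Duplicates and the files from the tree above:
# Hypothetical, pre-populated stand-in for the $Duplicates tracking table.
$taken = @{ '1088_p.jpg' = 1; '1088_p_01.jpg' = 1 }
$baseName = '1088_p'; $ext = '.jpg'; $pad = 2
# Keep bumping the numeric suffix until the candidate name is unused.
$i = 1
$candidate = '{0}_{1}{2}' -f $baseName, $i.ToString().PadLeft($pad, '0'), $ext
while ($taken.Contains($candidate)) {
    $i++
    $candidate = '{0}_{1}{2}' -f $baseName, $i.ToString().PadLeft($pad, '0'), $ext
}
$candidate   # -> 1088_p_02.jpg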
Something very odd is going on with the "date modified" field of several folders on an exFAT external drive I have. A folder in which several files were recently added still shows its date modified as its creation date. Even worse, another folder with recently added files shows a date that precedes its creation date! Has anyone observed this, and does anyone know what might be going on? I have checked online and found nothing useful or relevant. The same information shows up in both Explorer and in a command prompt, so it's not specific to Explorer.
Run this PowerShell script. Close Explorer before running to avoid file-locking.
# ------- give each folder the highest modified date of its files -------
function OneDir($dir)
{
    # process one folder, with the given name
    Set-Location -Path $dir.FullName
    $maxd = [datetime]::MinValue
    $files = Get-ChildItem -Recurse -Filter *.* | Where-Object { $_.PsIsContainer -eq $false }
    for ($i = 0; $i -lt $files.Count; $i++)
    {
        $file = $files[$i]
        $cd = [datetime]($file.LastWriteTime)
        If ($cd -Gt $maxd)
            { $maxd = $cd }
    }
    If ($files.Count -Gt 0)
        { $dir.LastWriteTime = $maxd }
    Write-Host "$($dir.FullName) $($dir.LastWriteTime)"
}

#------------------------- main ------------------------------------
$startDir = Read-Host 'Foldername to start with'
Set-Location -Path $startDir
$t = Get-ItemProperty $startDir
OneDir $t
$dirs = Get-ChildItem -Recurse -Filter *.* | Where-Object { $_.PSIsContainer }
for ($d = 0; $d -lt $dirs.Count; $d++)
{
    OneDir $dirs[$d]
}
Write-Host "Finished. Press Enter"
cmd /c pause
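To use it, save the script under any name (Fix-FolderDates.ps1 below is just a hypothetical example) and run it from a PowerShell prompt; it asks for the starting folder and then processes every subfolder:
PS> .\Fix-FolderDates.ps1
Foldername to start with: E:\Photos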
I wrote the below PowerShell script to compress logs older than 30 days:
$LastWrite = (Get-Date).AddDays(-30).ToString("MM/dd/yyyy")
Get-ChildItem -Filter "server.log*" -Recurse -File | Where-Object {$_.LastWriteTime -le $LastWrite}
Now, I am unable to find a compress command in PowerShell with which I can compress (zip/tar) the server.log* files older than 30 days.
I am expecting a single command that I can use by adding a pipe to the above command.
You can use the Compress-Archive cmdlet to zip files if you have PowerShell version 5 or above:
$LastWrite = (get-date).AddDays(-30)
$Files = Get-ChildItem -Filter "server.log*" -Recurse -File | Where-Object {$_.LastWriteTime -le $LastWrite}
ForEach ($File in $Files) {
    $File | Compress-Archive -DestinationPath "$($File.FullName).zip"
}
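If you specifically want a single pipeline, as asked, the same logic can be collapsed into one command; this is a sketch using exactly the cmdlets above:
Get-ChildItem -Filter "server.log*" -Recurse -File |
    Where-Object { $_.LastWriteTime -le (Get-Date).AddDays(-30) } |
    ForEach-Object { $_ | Compress-Archive -DestinationPath "$($_.FullName).zip" }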
If you have an older version of PowerShell you can use ZipFileExtensions' CreateEntryFromFile method, but there are a lot of considerations if you want a robust script that runs unattended.
In months of testing a script developed for this purpose, I encountered some issues that have made this small problem more complicated:
Will any of the files be locked? CreateEntryFromFile may fail if so.
Did you know that you can have multiple copies of the same file in a Zip archive? It's harder to extract them because you can't put them in the same folder. My script checks the file path and the archived file time stamp (+/- 2 seconds due to the lost date precision in Zip format) to determine if it's been already archived, and doesn't create a duplicate.
Are the files created in a time zone with Daylight Savings? Zip format doesn't preserve that attribute, and may lose or gain an hour when uncompressed.
Do you want to delete the original if it was successfully archived?
If unsuccessful due to a locked/missing file or very long path, should the process continue?
Will any error leave you with an unusable zip file? You need to Dispose() the archive to finalize it.
How many archives do you want to keep? I prefer one per run-month, adding new entries to an existing zip.
Do you want to preserve the relative path? Doing so will partially eliminate the problem of duplicates inside the zip file.
Mark Wragg's script should work if you don't care about these issues and you have PowerShell 5, but it creates a zip for every log, which may not be what you want.
Here's the current version of the script - in case GitHub ever becomes unavailable:
# Sends $FileSpecs files to a zip archive if they match $Filter - deleting the original if $DeleteAfterArchiving is true.
# Files that have already been archived will be ignored.
param (
    [string] $ParentFolder = "$PSScriptRoot", # Files will be stored in the zip with path relative to this folder
    [string[]] $FileSpecs = @("*.log","*.txt","*.svclog","*.log.*"),
    $Filter = { $_.LastWriteTime -lt (Get-Date).AddDays(-7) }, # a Where-Object function - default = older than 7 days
    [string] $ZipPath = "$PSScriptRoot\archive-$(Get-Date -f yyyy-MM).zip", # create one archive per run-month - it may contain older files
    [System.IO.Compression.CompressionLevel] $CompressionLevel = [System.IO.Compression.CompressionLevel]::Optimal,
    [switch] $DeleteAfterArchiving = $true,
    [switch] $Verbose = $true,
    [switch] $Recurse = $true
)
@( 'System.IO.Compression','System.IO.Compression.FileSystem') | % { [void][System.Reflection.Assembly]::LoadWithPartialName($_) }
Push-Location $ParentFolder # change to the folder so we can get relative paths
$FileList = (Get-ChildItem $FileSpecs -File -Recurse:$Recurse | Where-Object $Filter) # CreateEntryFromFile raises UnauthorizedAccessException if item is a directory
$totalcount = $FileList.Count
$countdown = $totalcount
$skipped = @()
Try {
    $WriteArchive = [IO.Compression.ZipFile]::Open($ZipPath, [System.IO.Compression.ZipArchiveMode]::Update)
    ForEach ($File in $FileList) {
        Write-Progress -Activity "Archiving files" -Status "Archiving file $($totalcount - $countdown) of $totalcount : $($File.Name)" -PercentComplete (($totalcount - $countdown)/$totalcount * 100)
        $ArchivedFile = $null
        $RelativePath = (Resolve-Path -LiteralPath "$($File.FullName)" -Relative) -replace '^.\\'
        $AlreadyArchivedFile = ($WriteArchive.Entries | Where-Object { # zip will store multiple copies of the exact same file - prevent this by checking if it's already archived.
            (($_.FullName -eq $RelativePath) -and ($_.Length -eq $File.Length)) -and
            ([math]::Abs(($_.LastWriteTime.UtcDateTime - $File.LastWriteTimeUtc).Seconds) -le 2) # ZipFileExtensions timestamps are only precise within 2 seconds.
        })
        If ($AlreadyArchivedFile -eq $null) {
            If ($Verbose) { Write-Host "Archiving $RelativePath $($File.LastWriteTimeUtc -f "yyyyMMdd-HHmmss") $($File.Length)" }
            Try {
                $ArchivedFile = [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($WriteArchive, $File.FullName, $RelativePath, $CompressionLevel)
            } Catch {
                Write-Warning "$($File.FullName) could not be archived. `n $($_.Exception.Message)"
                $skipped += [psobject]@{Path=$file.FullName; Reason=$_.Exception.Message}
            }
            If ($File.LastWriteTime.IsDaylightSavingTime() -and $ArchivedFile) { # HACK: fix for buggy date - adds an hour inside archive when the zipped file was created during PDT (files created during PST are not affected). Not sure how to introduce DST attribute to file date in the archive.
                $entry = $WriteArchive.GetEntry($RelativePath)
                $entry.LastWriteTime = ($File.LastWriteTime.ToLocalTime() - (New-TimeSpan -Hours 1)) # TODO: This is better, but maybe not fully correct. Does it work in all time zones?
            }
        } Else { # Write-Warning "$($File.FullName) is already archived$(If($DeleteAfterArchiving){' and will be deleted.'}Else{'. No action taken.'})"
            Write-Warning "$($File.FullName) is already archived - No action taken."
            $skipped += [psobject]@{Path=$file.FullName; Reason="Already archived"}
        }
        If ((($ArchivedFile -ne $null) -and ($ArchivedFile.FullName -eq $RelativePath)) -and $DeleteAfterArchiving) { # delete original if it's been successfully archived.
            Try {
                Remove-Item $File.FullName -Verbose:$Verbose
            } Catch {
                Write-Warning "$($File.FullName) could not be deleted. `n $($_.Exception.Message)"
            }
        }
        $countdown = $countdown - 1
    }
} Catch [Exception] {
    Write-Error $_.Exception
} Finally {
    $WriteArchive.Dispose() # close the zip file so it can be read later
    Write-Host "Sent $($totalcount - $countdown - $($skipped.Count)) of $totalcount files to archive: $ZipPath"
    $skipped | Format-Table -AutoSize -Wrap
}
Pop-Location
Here's a command line that will compress all server.log* files older than 30 days under the current folder:
.\ArchiveOldLogs.ps1 -FileSpecs @("server.log*") -Filter { $_.LastWriteTime -lt (Get-Date).AddDays(-30) }
I have a PowerShell script that looks for and compresses log files into a new .7z file; however, it currently has trouble compressing folders with dates in their names, e.g. application_logs_2016-07-14.
1: The script works fine for folders with plain character names; however, if the containing folder has a date in its name, e.g. application_logs_2016-07-14, nothing is archived.
2: I need to zip log files that are older than 5 days: dump.log.341.log, dump.log.342.log and dump.log.343.log should be converted to dump.log.341.zip, dump.log.342.zip and dump.log.343.zip.
Here is the current code; if any PowerShell gurus could advise, I'd be very happy. Thanks in advance.
# See if 7-Zip is installed
if (-not (Test-Path "$env:ProgramFiles\7-Zip\7z.exe")) { throw "$env:ProgramFiles\7-Zip\7z.exe needed" }
Set-Alias sz "$env:ProgramFiles\7-Zip\7z.exe"

# Define log location and target directory
$Logfile = "D:\stuff\Software\dump files\Archive.log"
$newdirectory = "D:\stuff\compressedlogs"

# Write to log file - start of archive
$now = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
Add-Content $Logfile -Value "Archive started $now"

# Import what we want to back up from a list file
$List = Import-CSV "D:\stuff\Software\dump files\Compressionlist.txt"
ForEach ($Entry in $List) {
    $filepath = $($Entry.FilePath)
    $Extension = $($Entry.Extension)
    $Include = $($Entry.Include)
    $Horizon = $($Entry.Horizon)
    $Ext2Chg = $($Entry.Ext2Chg)
    # Extract the list of files to process
    $log = Get-ChildItem -Recurse -Path $filepath -Filter $Extension -Include $Include | Where-Object {$_.LastWriteTime -lt (((Get-Date).AddDays($Horizon)).Date)}
    # Archive each file found
    ForEach ($file in $log) {
        if ($file -ne $null) {
            $name = $file.Name
            $newdate = $file.CreationTime | Get-Date -f "yyyy-MM-dd"
            $newname = $file.BaseName + "___" + $newdate + $file.Extension
            $directory = $file.DirectoryName
            $zipfile = $newname.Replace($Ext2Chg, ".7z")
            sz a -t7z "$newdirectory\$zipfile" "$directory\$name"
            $now = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
            Add-Content $Logfile -Value "File $directory\$name archived to folder\new filename $newdirectory\$newname at $now"
            Remove-Item $file
        }
    }
}

# Write to log file - end of archive
$now = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
Add-Content $Logfile -Value "Archive completed $now"
# End
The script looks at a txt document to find what to archive
COMPRESSION LIST
Filepath,Extension,Include,Horizon,Ext2Chg
E:\APPLICATION\DUMP_logs_*,*.log,APP.dumps-currentlog.messages*,-5,.log
===============================================================
Example folder structure:
D:\application\server_log
(which contains a log, e.g. server_log_2016-07-14_00-00-00_0001.log) - this archives fine.
D:\application\application_log_2016-07-14
(which contains a log, e.g. APP.dumps-currentlog.messages.log) - this will NOT archive.
Hope that makes sense.