Can someone help me understand why this scheduled script doesn't run as intended? The file is not reliably deleted at the start of the next day (it works correctly about 3-4 times out of 5, whether the machine is powered on or waking from sleep mode (S3)). How do I get it to always delete the previous day's file?
$Path = "H:\foobar.txt"
if (Test-Path $Path) {
    Get-ChildItem $Path |
        Where-Object { ($_.LastWriteTime).ToString() -lt ((Get-Date).AddDays(-1)).ToString() } |
        Remove-Item -Force
}
('{0:MMMM dd, yyyy h:mm:ss tt}' -f (Get-Date)) | Add-Content -Path $Path
EXIT
Snippet from the output showing carryover into the next day:
April 27, 2020 9:31:18 PM
April 28, 2020 7:16:37 AM
April 28, 2020 5:31:45 PM
April 29, 2020 7:16:37 AM
April 29, 2020 11:17:02 PM
April 30, 2020 6:02:06 AM
April 30, 2020 6:17:02 PM
May 01, 2020 6:27:28 AM
May 01, 2020 7:17:02 AM
The delete only works intermittently because your code compares the dates as strings: both sides are converted with .ToString(), so -lt performs a lexical string comparison rather than a date comparison. Seeing that the variable $Path contains the path to a single file, you could also shorten and fix the code like this:
$Path = "H:\foobar.txt"
# try and get a FileInfo object from the file in the path
$file = Get-Item -Path $Path -ErrorAction SilentlyContinue
if ($file) {
    # the file was found, check if it is older than yesterday (midnight)
    if ($file.LastWriteTime -lt (Get-Date).AddDays(-1).Date) {
        $file | Remove-Item -Force
    }
}
# Add-Content will create the file if it does not yet exist
('{0:MMMM dd, yyyy h:mm:ss tt}' -f (Get-Date)) | Add-Content -Path $Path
Of course, the user running the scheduled task needs write access to the H:\ path.
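For completeness, here is a minimal sketch of how such a script could be registered to run every morning with the ScheduledTasks cmdlets; the task name, run time and script path below are illustrative placeholders, not taken from the question:
# register a daily task that runs the stamping script each morning
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\StampLog.ps1'   # hypothetical script path
$trigger = New-ScheduledTaskTrigger -Daily -At '6:00 AM'
Register-ScheduledTask -TaskName 'StampLog' -Action $action -Trigger $trigger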
Related
I have a read-only file, say samp.txt, and I run the following in PowerShell:
> $file = Get-Item .\samp.txt
> $file.LastAccessTime = (get-date)
we get: "Access to the path 'G:\Study_Material\Coding\samp.txt' is denied."
Now, before we proceed, look at the access time:
> $file.LastAccessTime
Sunday, December 30, 2018 11:02:49 PM
Now we open WSL and do: $ touch samp.txt
Back to PowerShell we do:
> $file = Get-Item .\samp.txt
> $file.LastAccessTime
we get:
Sunday, December 30, 2018 11:19:16 PM
Thus it has been modified with no elevated privileges.
Now my question: how is it possible to mimic this action in PowerShell alone, without removing the ReadOnly flag by modifying $file.Attributes?
When dealing with ReadOnly files, you cannot simply change the LastAccessTime (see the comments by eryksun).
To make this work in PowerShell, you need to first remove the ReadOnly flag from the file's attributes, make the change, and then restore the ReadOnly flag, like so:
$file = Get-Item .\samp.txt -Force
# test if the ReadOnly flag on the file is set (FILE_ATTRIBUTE_READONLY = 1)
if ($file.Attributes -band 1) {
    # remove the ReadOnly flag from the file
    $file.Attributes = $file.Attributes -bxor 1
    # or use: $file | Set-ItemProperty -Name IsReadOnly -Value $false
    $file.LastAccessTime = (Get-Date)
    # reset the ReadOnly flag
    $file.Attributes = $file.Attributes -bxor 1
    # or use: $file | Set-ItemProperty -Name IsReadOnly -Value $true
}
else {
    # the file is not ReadOnly, so just do the 'touch' on the LastAccessTime
    $file.LastAccessTime = (Get-Date)
}
You can read all about file attributes and their numeric values here.
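If you need this "touch" more than once, the same logic can be wrapped in a small helper using the IsReadOnly property; the function name Touch-LastAccessTime below is just an illustration, not part of the original answer:
function Touch-LastAccessTime {
    param([string] $Path)
    $file = Get-Item -Path $Path -Force
    $wasReadOnly = $file.IsReadOnly
    if ($wasReadOnly) { $file.IsReadOnly = $false }   # temporarily clear the ReadOnly flag
    $file.LastAccessTime = Get-Date                   # do the 'touch'
    if ($wasReadOnly) { $file.IsReadOnly = $true }    # restore the flag
}
Touch-LastAccessTime .\samp.txt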
I wrote the below PowerShell script to compress logs older than 30 days:
$LastWrite = (Get-Date).AddDays(-30).ToString("MM/dd/yyyy")
Get-ChildItem -Filter "server.log*" -Recurse -File |
    Where-Object {$_.LastWriteTime -le $LastWrite}
Now, I am unable to find a compress command in PowerShell with which I can compress (zip/tar) the server.log* files older than 30 days.
I'm expecting a single command that I can append to the above pipeline with a pipe sign.
You can use the Compress-Archive cmdlet to zip files if you have PowerShell version 5 or above:
$LastWrite = (get-date).AddDays(-30)
$Files = Get-ChildItem -Filter "server.log*" -Recurse -File | Where-Object {$_.LastWriteTime -le $LastWrite}
ForEach ($File in $Files) {
    $File | Compress-Archive -DestinationPath "$($File.fullname).zip"
}
If you have an older version of PowerShell, you can use ZipFileExtensions' CreateEntryFromFile method, but there are a lot of considerations if you want a robust script that runs unattended.
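For orientation, here is a bare-bones sketch of that CreateEntryFromFile approach, with the archive path as a placeholder and none of the duplicate checking, locking or error handling that the considerations below and the full script further down deal with:
Add-Type -AssemblyName System.IO.Compression.FileSystem
# open (or create) the archive in Update mode so new entries can be appended
$zip = [System.IO.Compression.ZipFile]::Open('C:\logs\archive.zip', [System.IO.Compression.ZipArchiveMode]::Update)
try {
    Get-ChildItem -Filter 'server.log*' -Recurse -File |
        Where-Object { $_.LastWriteTime -le (Get-Date).AddDays(-30) } |
        ForEach-Object {
            # store each file under its own name inside the zip
            [void][System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($zip, $_.FullName, $_.Name)
        }
}
finally {
    $zip.Dispose()  # finalize the archive; without Dispose() the zip can be left unusable
}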
In months of testing a script developed for this purpose, I encountered some issues that have made this small problem more complicated:
Will any of the files be locked? CreateEntryFromFile may fail if so.
Did you know that you can have multiple copies of the same file in a Zip archive? It's harder to extract them because you can't put them in the same folder. My script checks the file path and the archived file time stamp (+/- 2 seconds due to the lost date precision in Zip format) to determine if it's been already archived, and doesn't create a duplicate.
Are the files created in a time zone with Daylight Savings? Zip format doesn't preserve that attribute, and may lose or gain an hour when uncompressed.
Do you want to delete the original if it was successfully archived?
If unsuccessful due to a locked/missing file or very long path, should the process continue?
Will any error leave you with an unusable zip file? You need to Dispose() the archive to finalize it.
How many archives do you want to keep? I prefer one per run-month, adding new entries to an existing zip.
Do you want to preserve the relative path? Doing so will partially eliminate the problem of duplicates inside the zip file.
Mark Wragg's script should work if you don't care about these issues and you have PowerShell 5, but it creates a zip for every log, which may not be what you want.
Here's the current version of the script - in case GitHub ever becomes unavailable:
#Sends $FileSpecs files to a zip archive if they match $Filter - deleting the original if $DeleteAfterArchiving is true.
#Files that have already been archived will be ignored.
param (
[string] $ParentFolder = "$PSScriptRoot", #Files will be stored in the zip with path relative to this folder
[string[]] $FileSpecs = @("*.log","*.txt","*.svclog","*.log.*"),
$Filter = { $_.LastWriteTime -lt (Get-Date).AddDays(-7)}, #a Where-Object function - default = older than 7 days
[string] $ZipPath = "$PSScriptRoot\archive-$(get-date -f yyyy-MM).zip", #create one archive per run-month - it may contain older files
[System.IO.Compression.CompressionLevel]$CompressionLevel = [System.IO.Compression.CompressionLevel]::Optimal,
[switch] $DeleteAfterArchiving = $true,
[switch] $Verbose = $true,
[switch] $Recurse = $true
)
@( 'System.IO.Compression','System.IO.Compression.FileSystem') | % { [void][System.Reflection.Assembly]::LoadWithPartialName($_) }
Push-Location $ParentFolder #change to the folder so we can get relative path
$FileList = (Get-ChildItem $FileSpecs -File -Recurse:$Recurse | Where-Object $Filter) #CreateEntryFromFile raises UnauthorizedAccessException if item is a directory
$totalcount = $FileList.Count
$countdown = $totalcount
$skipped = @()
Try{
$WriteArchive = [IO.Compression.ZipFile]::Open( $ZipPath, [System.IO.Compression.ZipArchiveMode]::Update)
ForEach ($File in $FileList){
Write-Progress -Activity "Archiving files" -Status "Archiving file $($totalcount - $countdown) of $totalcount : $($File.Name)" -PercentComplete (($totalcount - $countdown)/$totalcount * 100)
$ArchivedFile = $null
$RelativePath = (Resolve-Path -LiteralPath "$($File.FullName)" -Relative) -replace '^.\\'
$AlreadyArchivedFile = ($WriteArchive.Entries | Where-Object {#zip will store multiple copies of the exact same file - prevent this by checking if already archived.
(($_.FullName -eq $RelativePath) -and ($_.Length -eq $File.Length) ) -and
([math]::Abs(($_.LastWriteTime.UtcDateTime - $File.LastWriteTimeUtc).Seconds) -le 2) #ZipFileExtensions timestamps are only precise within 2 seconds.
})
If($AlreadyArchivedFile -eq $null){
If($Verbose){Write-Host "Archiving $RelativePath $($File.LastWriteTimeUtc -f "yyyyMMdd-HHmmss") $($File.Length)" }
Try{
$ArchivedFile = [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($WriteArchive, $File.FullName, $RelativePath, $CompressionLevel)
}Catch{
Write-Warning "$($File.FullName) could not be archived. `n $($_.Exception.Message)"
$skipped += [psobject]@{Path=$file.FullName; Reason=$_.Exception.Message}
}
If($File.LastWriteTime.IsDaylightSavingTime() -and $ArchivedFile){#HACK: fix for buggy date - adds an hour inside archive when the zipped file was created during PDT (files created during PST are not affected). Not sure how to introduce DST attribute to file date in the archive.
$entry = $WriteArchive.GetEntry($RelativePath)
$entry.LastWriteTime = ($File.LastWriteTime.ToLocalTime() - (New-TimeSpan -Hours 1)) #TODO: This is better, but maybe not fully correct. Does it work in all time zones?
}
}Else{#Write-Warning "$($File.FullName) is already archived$(If($DeleteAfterArchiving){' and will be deleted.'}Else{'. No action taken.'})"
Write-Warning "$($File.FullName) is already archived - No action taken."
$skipped += [psobject]@{Path=$file.FullName; Reason="Already archived"}
}
If((($ArchivedFile -ne $null) -and ($ArchivedFile.FullName -eq $RelativePath)) -and $DeleteAfterArchiving) { #delete original if it's been successfully archived.
Try {
Remove-Item $File.FullName -Verbose:$Verbose
}Catch{
Write-Warning "$($File.FullName) could not be deleted. `n $($_.Exception.Message)"
}
}
$countdown = $countdown -1
}
}Catch [Exception]{
Write-Error $_.Exception
}Finally{
$WriteArchive.Dispose() #close the zip file so it can be read later
Write-Host "Sent $($totalcount - $countdown - $($skipped.Count)) of $totalcount files to archive: $ZipPath"
$skipped | Format-Table -Autosize -Wrap
}
Pop-Location
Here's a command line that will compress all server.log* files older than 30 days under the current folder:
.\ArchiveOldLogs.ps1 -FileSpecs @("server.log*") -Filter { $_.LastWriteTime -lt (Get-Date).AddDays(-30)}
I'm having some trouble using PowerShell in Windows 10 to get specific scheduled tasks. I need to get a list of scheduled tasks that run between 9:00 PM and midnight. I couldn't figure out how to use the Get-ScheduledTask and Get-ScheduledTaskInfo cmdlets properly.
I would be so grateful if someone could help me write the script the right way!
I think this is what you need:
Get-ScheduledTask | ForEach-Object {
    $NextRunTimeHour = ($_ | Get-ScheduledTaskInfo).NextRunTime.Hour
    If ($NextRunTimeHour -in 21..23) { $_ }
}
This gets the scheduled tasks, then iterates through them with ForEach-Object, piping each to Get-ScheduledTaskInfo to get the .NextRunTime property and its .Hour sub-property, and then returns the scheduled task if the hour is 21, 22 or 23.
Another method, which gives you all the necessary info:
Get-ScheduledTask| %{$taskName=$_.TaskName; $_.Triggers |
where {$_ -ne $null -and $_.Enabled -eq $true -and $_.StartBoundary -ne $null -and ([System.DateTime]$_.StartBoundary).Hour -in 21..23} | %{
[pscustomobject]@{
Name=$taskName;
trigger=$_
Enabled=$_.Enabled
EndBoundary=$_.EndBoundary
ExecutionTimeLimit=$_.ExecutionTimeLimit
Id=$_.Id
Repetition=$_.Repetition
StartBoundary=$_.StartBoundary
DaysInterval=$_.DaysInterval
RandomDelay=$_.RandomDelay
PSComputerName=$_.PSComputerName
}
}
}
The LastWriteTime of a directory only updates when a new file is created in it, not when an existing file is modified:
PS>mkdir foo
PS>Get-Item ".\foo" | select -ExpandProperty lastwritetime
Friday, February 12, 2016 1:56:18 PM
PS>"hello" > .\foo\hello.txt
PS>Get-Item ".\foo" | select -ExpandProperty lastwritetime
Friday, February 12, 2016 1:56:53 PM
PS>"line 2" >> .\foo\hello.txt
PS>Get-Item ".\foo" | select -ExpandProperty lastwritetime
Friday, February 12, 2016 1:56:53 PM
PS>"a new file" >> .\foo\hello1.txt
PS>Get-Item ".\foo" | select -ExpandProperty lastwritetime
Friday, February 12, 2016 1:57:55 PM
How do I get the time when a directory changed?
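One workaround (a minimal sketch, assuming "changed" means any file under the directory was written to) is to take the newest LastWriteTime among the directory's contents instead of relying on the directory's own timestamp:
# newest write time of any file under .\foo
Get-ChildItem .\foo -Recurse -File |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1 -ExpandProperty LastWriteTime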
I wrote a PowerShell script that exports a bunch of files from a Dynamics NAV instance. It calls a Perl script, which I also wrote, that splits all of the files into individual objects and puts them in subdirectories under a directory I create in Perl. Then the PowerShell script attempts to copy the files to a different directory, and fails.
PowerShell generates the directory name:
$datestamp = get-date -f MM-dd-yyyy_HH_mm_ss
$dumpdir = "\temp\nav_export\" + $env:username + "\" + $servicetier + "~" + $database + "~" + $datestamp;
Then PowerShell does a bunch of stuff that works fine, and calls the Perl script ($servicetier and $database are defined earlier in the script):
& c:\navgit\split-objects.pl $servicetier $database $datestamp
Perl proceeds to create the directory and split the files correctly:
use strict;
use warnings;
use File::Path qw(make_path remove_tree);
my $username = getlogin || getpwuid($<);
my $servicetier = $ARGV[0];
my $database = $ARGV[1];
my $datestamp = $ARGV[2];
undef @ARGV;
my $work_dir = "/temp/nav_export";
my $objects_dir = "$work_dir/$username/objects";
my $export_dir = "$work_dir/$username/$servicetier~$database~$datestamp";
print "Objects from $servicetier~$database being exported to $export_dir\n";
make_path("$export_dir/Page", "$export_dir/Codeunit", "$export_dir/MenuSuite", "$export_dir/Query", "$export_dir/Report", "$export_dir/Table", "$export_dir/XMLport");
chdir $objects_dir or die "Could not change to $objects_dir: $!";
<does all of the filehandling and parsing>
Control returns to the PowerShell script, which tries to finish with:
Copy-Item -Path $dumpdir -Destination $cwd -Force -Recurse
But that throws the error:
Copy-Item : Cannot find path 'C:\temp\nav_export\danielj\cen-dev-erp-st1~JustFoodERP-PROTO~01-20-2015_19_26_50' because it does not exist.
At C:\navgit\nav-export.ps1:175 char:1
+ Copy-Item -Path $dumpdir -Destination $cwd -Force -Recurse
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (C:\temp\nav_exp...0-2015_19_26_50:String) [Copy-Item], ItemNotFoundException
+ FullyQualifiedErrorId : PathNotFound,Microsoft.PowerShell.Commands.CopyItemCommand
The directory I'm trying to copy from does exist, but PowerShell doesn't see it! I added some code to list the contents of the parent directory:
Copy-Item -Path $dumpdir -Destination $cwd -Force -Recurse
Write-Host "Copy-Item -Path $dumpdir -Destination $cwd -Force -Recurse"
$test = "C:\temp\nav_export\$env:username"
Get-ChildItem $test -Force
Copy-Item -Path \temp\nav_export\danielj\cen-dev-erp-st1~JustFoodERP-PROTO~01-20-2015_19_26_50 -Destination C:\Users\danielj\erp\ -Force -Recurse
Directory: C:\temp\nav_export\danielj
Mode LastWriteTime Length Name
---- ------------- ------ ----
d---- 1/20/2015 6:32 PM cen-dev-erp-st1~JustFoodERP-PROTO~01-20-2015_18_32_33
d---- 1/20/2015 7:08 PM cen-dev-erp-st1~JustFoodERP-PROTO~01-20-2015_19_08_49
d---- 1/19/2015 1:07 PM cen-dev-erp-st1~JustFoodERP-PROTO~20150119-130747
d---- 1/20/2015 7:26 PM logs
d---- 1/20/2015 7:26 PM objects
-a--- 1/20/2015 7:26 PM 309 objects.bat
-a--- 1/20/2015 1:41 PM 436 soap_envelope.txt
If I do a directory listing from outside the script, there it is:
PS C:\Users\danielj\erp> $test = "C:\temp\nav_export\$env:username"
Get-ChildItem $test -Force
Directory: C:\temp\nav_export\danielj
Mode LastWriteTime Length Name
---- ------------- ------ ----
d---- 1/20/2015 6:32 PM cen-dev-erp-st1~JustFoodERP-PROTO~01-20-2015_18_32_33
d---- 1/20/2015 7:08 PM cen-dev-erp-st1~JustFoodERP-PROTO~01-20-2015_19_08_49
d---- 1/20/2015 7:26 PM cen-dev-erp-st1~JustFoodERP-PROTO~01-20-2015_19_26_50
d---- 1/19/2015 1:07 PM cen-dev-erp-st1~JustFoodERP-PROTO~20150119-130747
d---- 1/20/2015 7:26 PM logs
d---- 1/20/2015 7:26 PM objects
-a--- 1/20/2015 7:26 PM 309 objects.bat
-a--- 1/20/2015 1:41 PM 436 soap_envelope.txt
I've tried calling an external script from the main PowerShell script after Perl has finished, and the results are the same.
Why would PowerShell not see the directory or files that were created by the Perl script? And more importantly, how can I get it to do so?
Are you sure that everything proceeds in the exact order you assume it does?
Try starting your Perl script like this:
& c:\navgit\split-objects.pl $servicetier $database $datestamp | Out-Null
This makes sure that the Perl part has terminated before PowerShell executes the rest of your script.
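An alternative, if piping to Out-Null feels too implicit, is to launch the interpreter explicitly and block until it exits; this sketch assumes perl.exe is on the PATH:
# run the Perl script as a child process and wait for it to finish
Start-Process -FilePath 'perl.exe' -ArgumentList 'c:\navgit\split-objects.pl', $servicetier, $database, $datestamp -NoNewWindow -Wait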
I ended up just creating the directory in PowerShell in the first place. There's no reason it had to be created in the Perl script.
It'd still be nice to know why it didn't work the way I'd have expected though.