PowerShell asynchronous timer events not working outside of testing console

I have a PowerShell script that uses an Asynchronous Timer Event (background process) to measure how long a certain condition has been occurring before taking appropriate action.
This works perfectly fine when I run the script inside PowerGUI, but when I run it via dot-sourcing or from a batch file, the timer event actions do not fire.
Here is a code snippet.
$timer = New-Object System.Timers.Timer
$timer.Interval = 10000
$timer.AutoReset = $true
$timeout = 0
$action = {
    "timeout: $timeout" | Add-Content $loglocation
    # <more stuff here>
    $timer.Stop()
}
$start = Register-ObjectEvent -InputObject $timer -SourceIdentifier TimerElapsed -EventName Elapsed -Action $action
$timer.start()
while (1)
{
    # <do some testing here>
}
So when it works, I see the "timeout: XX" output written to the log every 10 seconds. But this only happens when the script runs inside the editor. When I run it via a batch file, nothing happens (although I can confirm the while loop is processing fine).
So my question is: why does the script behave differently inside PowerGUI than it does from the command line? My thought is that there might be an issue with scoping or parallel threads, but I'm not exactly sure. Note that I am not registering these events inside any functions or loops.

When you call the script file normally, the $action script block is executed in the scope of the caller (the parent scope), not the script file's scope (a child scope). Therefore, variables defined within the script file are not available inside the $action script block unless they are defined in the global scope, or the script is dot-sourced (which makes its variables available in the caller's global scope). See this wonderful article for more information.
Assume the below code is contained within a file named test.ps1.
$timer = New-Object System.Timers.Timer
$timer.Interval = 10000
$timer.AutoReset = $false
$timeout = 100
$location = 'SomeLocation'
$sourceIdentifier = 'SomeIdentifier'
$action = {
    Write-Host "Timer Event Elapsed. Timeout: $timeout, Location: $location, SourceIdentifier: $sourceIdentifier"
    $timer.Stop()
    Unregister-Event $sourceIdentifier
}
$start = Register-ObjectEvent -InputObject $timer -SourceIdentifier $sourceIdentifier -EventName Elapsed -Action $action
$timer.start()
while (1)
{
    Write-Host "Looping..."
    Start-Sleep -s 5
}
When the script is called from the PowerShell console, the variables used by the $action script block will have no values when it executes:
./test.ps1
Timer Event Elapsed. Timeout: , Location: , SourceIdentifier:
If you define the variables used in the $action script block before you call the script, the values will be available when the action executes:
$timeout = 5; $location = "SomeLocation"; $sourceIdentifier = "SomeSI"
./test.ps1
Timer Event Elapsed. Timeout: 5, Location: SomeLocation, SourceIdentifier: SomeSI
If you dot-source the script, the variables defined within the script will become available in the current scope, so when the action executes, the values will be available:
. ./test.ps1
Timer Event Elapsed. Timeout: 100, Location: SomeLocation, SourceIdentifier: SomeIdentifier
If the variables had instead been declared in the global scope within the script file:
$global:timeout = 100
$global:location = 'SomeLocation'
$global:sourceIdentifier = 'SomeIdentifier'
Then when the $action script block executes in the parent scope, the values will be available:
./test.ps1
Timer Event Elapsed. Timeout: 100, Location: SomeLocation, SourceIdentifier: SomeIdentifier

Like dugas' answer, but if you don't want to clutter up your PowerShell instance with extra variables or do any dot-sourcing, you can put it in a function. This also has the benefit of letting you use named parameters and makes it more modular if you want to re-use it in the future.
function Start-Timer
{
    param($timeout = 5, $location = "SomeLocation", $sourceIdentifier = "SomeSI")

    $timer = [System.Timers.Timer]::new()
    $timer.Interval = $timeout
    $timer.AutoReset = $False

    $action =
    {
        # The registration's -MessageData is surfaced inside the action as $Event.MessageData
        $myArgs = $event.MessageData
        $timeout = $myArgs.timeout
        $location = $myArgs.location
        $sourceIdentifier = $myArgs.sourceIdentifier
        $timer = $myArgs.timer
        Write-Host "Timer Event Elapsed. Timeout: $timeout, Location: $location, SourceIdentifier: $sourceIdentifier"
        $timer.Stop()
        Unregister-Event $sourceIdentifier
    }

    # You have to pass the data this way
    $passThru = @{
        timeout = $timeout
        location = $location
        sourceIdentifier = $sourceIdentifier
        timer = $timer
    }

    # Register with the same identifier the action later unregisters
    Register-ObjectEvent -InputObject $timer -EventName Elapsed -SourceIdentifier $sourceIdentifier -Action $action -MessageData $passThru | Out-Null
    $timer.Start()
}
Then you can call it with named parameters:
Start-Timer -location "NewLocation"
A disadvantage of purely using this approach is that if the handler uses a large number of variables from the containing scope, the pass-through code gets messy.


How to trigger a PowerShell script when a new file is created inside a folder?

I am new to PowerShell and am trying to use System.IO.FileSystemWatcher to monitor a specified folder for the presence of a file. However, as soon as the file is detected I want to stop monitoring this folder immediately and stop the FileSystemWatcher. The plan is to incorporate the PowerShell script into a SQL Agent job to enable users to restore their own databases. Basically, I need to know the command that stops the FileSystemWatcher from monitoring as soon as one file is found. Here is the script so far.
### SET FOLDER TO WATCH + FILES TO WATCH + SUBFOLDERS YES/NO
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = "C:\TriggerBatch"
$watcher.Filter = "*.*"
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
### DEFINE ACTIONS AFTER AN EVENT IS DETECTED
$action = {
    $path = $Event.SourceEventArgs.FullPath
    $changeType = $Event.SourceEventArgs.ChangeType
    $logline = "$(Get-Date), $changeType, $path"
    Add-Content "C:\log2.txt" -Value $logline
}
### DECIDE WHICH EVENTS SHOULD BE WATCHED + SET CHECK FREQUENCY
$created = Register-ObjectEvent $watcher Created -Action $action
while ($true) {sleep 1}
## Unregister-Event Created ??
##Stop-ScheduledTask ??
Unregister-Event -SourceIdentifier $created.Name
This will unregister the event ($created.Name holds the auto-generated source identifier of the subscription). You will probably want to add this to the $action.
Do note that if there are events in the queue they will still be fired.
This might help too.
Scriptblocks that are run as an action on a subscribed event have access to the $Args, $Event, $EventArgs and $EventSubscriber automatic variables.
Just add the Unregister-Event command to the end of your scriptblock, like so:
### DEFINE ACTIONS AFTER AN EVENT IS DETECTED
$action = {
    $path = $Event.SourceEventArgs.FullPath
    $changeType = $Event.SourceEventArgs.ChangeType
    $logline = "$(Get-Date), $changeType, $path"
    Add-Content "C:\log2.txt" -Value $logline
    Unregister-Event -SubscriptionId $EventSubscriber.SubscriptionId
}
This is the pattern for an event that only performs an action once and then cleans itself up.
It's difficult to effectively explore these automatic variables since they are within the scope of a Job, but you can futz with them by assigning them to global variables while you are sketching out your code. You may also get some joy with Wait-Debugger and Debug-Runspace. In the case of the $EventSubscriber variable, it returns the exact object you get if you run Get-EventSubscriber (having created a single subscription already). That's how I found the SubscriptionId property.
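As a minimal sketch of that global-variable trick (the short-interval timer here is only a stand-in event source, and the `dbg`-prefixed names are made up for illustration):

```powershell
# Sketch: copy the action's job-scoped automatic variables into globals
# so they can be inspected after the event has fired.
$timer = New-Object System.Timers.Timer
$timer.Interval = 200
$timer.AutoReset = $false

Register-ObjectEvent -InputObject $timer -EventName Elapsed -SourceIdentifier DebugTick -Action {
    # These variables only exist inside the action's job scope...
    $global:dbgEvent           = $Event
    $global:dbgEventSubscriber = $EventSubscriber
} | Out-Null

$timer.Start()
Start-Sleep -Milliseconds 1500   # give the event time to fire

# ...but the captured copies can now be explored interactively:
$dbgEvent.SourceIdentifier           # the identifier the event was raised under
$dbgEventSubscriber.SubscriptionId   # the id Get-EventSubscriber would show

Unregister-Event -SourceIdentifier DebugTick
```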
If you want to stop/unregister all registered events you can call
Get-EventSubscriber | Unregister-Event
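As a quick illustration of that one-liner (the two long-interval timers are just throwaway stand-ins for real watchers, and this assumes a session with no other subscriptions):

```powershell
# Register two throwaway subscriptions that will never fire on their own
$t1 = New-Object System.Timers.Timer -Property @{ Interval = 60000 }
$t2 = New-Object System.Timers.Timer -Property @{ Interval = 60000 }
Register-ObjectEvent -InputObject $t1 -EventName Elapsed -SourceIdentifier TickA -Action {} | Out-Null
Register-ObjectEvent -InputObject $t2 -EventName Elapsed -SourceIdentifier TickB -Action {} | Out-Null

(Get-EventSubscriber).Count       # both subscriptions are listed

# One pipeline removes every subscription in the session
Get-EventSubscriber | Unregister-Event

(Get-EventSubscriber).Count       # back to 0
```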

No parallelization despite the use of a runspace pool with PowerShell 5.1

We are working on a PowerShell script that, among other things, performs a job import of multiple computers via a REST API. The normal job import works flawlessly and is passed an XML file with all necessary information as a parameter.
Now we want to parallelize this job import, so that several imports can take place at the same time and reduce the total import time when the number of computers is high.
For this purpose, we use a runspace pool and pass a worker (which contains the code for the job import) as well as all necessary parameters to the respective PowerShell instance. Unfortunately, this doesn't seem to work: even after measuring the import time, we couldn't see any speedup from parallelization. The measured time is always about the same as if we performed the job import sequentially, i.e. without parallelization.
Here is the relevant code snippet:
function changeApplicationSequenceFromComputer {
param (
[Parameter(Mandatory=$True )]
[string]$tenant = $(throw "Parameter tenant is missing"),
[Parameter(Mandatory=$True)]
[string]$newSequenceName = $(throw "Parameter newSequenceName is missing")
)
# Other things before parallelization
# Passing all local functions and imported modules in runspace pool to call it from worker
$InitialSessionState = [initialsessionstate]::CreateDefault()
Get-ChildItem function:/ | Where-Object Source -like "" | ForEach-Object {
$functionDefinition = Get-Content "Function:\$($_.Name)"
$sessionStateFunction = New-Object System.Management.Automation.Runspaces.SessionStateFunctionEntry -ArgumentList $_.Name, $functionDefinition
$InitialSessionState.Commands.Add($sessionStateFunction)
}
# Using a synchronized Hashtable to pass necessary global variables for logging purpose
$Configuration = [hashtable]::Synchronized(@{})
$Configuration.ScriptPath = $global:ScriptPath
$Configuration.LogPath = $global:LogPath
$Configuration.LogFileName = $global:LogFileName
$InitialSessionState.ImportPSModule(@("$global:ScriptPath\lib\MigrationFuncLib.psm1"))
# Worker for parallelized job-import in for-each loop below
$Worker = {
param($currentComputerObjectTenant, $currentComputerObjectDisplayName, $newSequenceName, $Credentials, $Configuration)
$global:ScriptPath = $Configuration.ScriptPath
$global:LogPath = $Configuration.LogPath
$global:LogFileName = $Configuration.LogFileName
try {
# Function handleComputerSoftwareSequencesXml creates the xml that has to be uploaded for each computer
# We already tried to create the xml outside of the worker and pass it as an argument, so that the worker just imports it. Same result.
$importXml = handleComputerSoftwareSequencesXml -tenant $currentComputerObjectTenant -computerName $currentComputerObjectDisplayName -newSequence $newSequenceName -Credentials $Credentials
$Result = job-import $importXml -Server localhost -Credentials $Credentials
# sleep 1 just for testing purpose
Log "Result from Worker: $Result"
} catch {
$Result = $_.Exception.Message
}
}
# Preparatory work for parallelization
$cred = $Credentials
$MaxRunspacesProcessors = ($env:NUMBER_OF_PROCESSORS) * $multiplier # we tried it with just the number of processors as well as with a multiplied version.
Log "Number of Processors: $MaxRunspacesProcessors"
$RunspacePool = [runspacefactory]::CreateRunspacePool(1, $MaxRunspacesProcessors, $InitialSessionState, $Host)
$RunspacePool.Open()
$Jobs = New-Object System.Collections.ArrayList
foreach ($computer in $computerWithOldApplicationSequence) {
# Different things to do before parallelization, i.e. define some variables
# Parallelized job-import
Log "Creating or reusing runspace for computer '$currentComputerObjectDisplayName'"
$PowerShell = [powershell]::Create()
$PowerShell.RunspacePool = $RunspacePool
Log "Before worker"
$PowerShell.AddScript($Worker).AddArgument($currentComputerObjectTenant).AddArgument($currentComputerObjectDisplayName).AddArgument($newSequenceName).AddArgument($cred).AddArgument($Configuration) | Out-Null
Log "After worker"
$JobObj = New-Object -TypeName PSObject -Property @{
    Runspace = $PowerShell.BeginInvoke()
    PowerShell = $PowerShell
}
$Jobs.Add($JobObj) | Out-Null
# For logging in Worker
$JobIndex = $Jobs.IndexOf($JobObj)
Log "$($Jobs[$JobIndex].PowerShell.EndInvoke($Jobs[$JobIndex].Runspace))"
}
<#
while ($Jobs.Runspace.IsCompleted -contains $false) {
Log "Still running..."
Start-Sleep 1
}
#>
# Closing/Disposing pool
} # End of the function
The rest of the script looks like this (simplified):
# Parameter passed when calling the script
param (
[Parameter(Mandatory=$True)]
[string]$newSequenceName = $(throw "Parameter target is missing"),
[Parameter(Mandatory=$True)]
[float]$multiplier= $(throw "Parameter multiplier is missing")
)
# 'main' block
$timeToRun = (Measure-Command{
changeApplicationSequenceFromComputer -tenant "testTenant" -newSequenceName $newSequenceName
}).TotalSeconds
Log "Total time to run with multiplier $($multiplier) is $timeToRun"
Any ideas why the job import is obviously only executed sequentially despite runspace pool and corresponding parallelization?
We have found the error. The foreach contained the following code block:
# For logging in Worker
$JobIndex = $Jobs.IndexOf($JobObj)
Log "$($Jobs[$JobIndex].PowerShell.EndInvoke($Jobs[$JobIndex].Runspace))"
This had to be created outside the foreach so that the code looks like this:
function changeApplicationSequenceFromComputer {
param (
[Parameter(Mandatory=$True )]
[string]$tenant = $(throw "Parameter tenant is missing"),
[Parameter(Mandatory=$True)]
[string]$newSequenceName = $(throw "Parameter newSequenceName is missing")
)
# ... Everything as before
$Jobs.Add($JobObj) | Out-Null
} #end of foreach
$Results = @()
foreach($Job in $Jobs ){
$Results += $Job.PowerShell.EndInvoke($Job.Runspace)
}
So the EndInvoke() has to be called outside the foreach.
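The difference is easy to demonstrate with a stripped-down version of the pattern: queue every BeginInvoke first, then collect the results with EndInvoke in a second loop. The worker body and pool size below are illustrative stand-ins for the real job import:

```powershell
$elapsed = (Measure-Command {
    $pool = [runspacefactory]::CreateRunspacePool(1, 4)
    $pool.Open()

    $worker = { param($n) Start-Sleep -Seconds 1; $n }   # stand-in for the job import

    $jobs = foreach ($n in 1..4) {
        $ps = [powershell]::Create()
        $ps.RunspacePool = $pool
        $ps.AddScript($worker).AddArgument($n) | Out-Null
        # Queue only - calling EndInvoke here would serialize the workers
        [pscustomobject]@{ PowerShell = $ps; Handle = $ps.BeginInvoke() }
    }

    # Collect results AFTER everything has been queued; this is where the wait happens
    $results = foreach ($j in $jobs) { $j.PowerShell.EndInvoke($j.Handle) }

    $jobs | ForEach-Object { $_.PowerShell.Dispose() }
    $pool.Close()
}).TotalSeconds

"4 one-second workers finished in $elapsed s"   # roughly 1 s in parallel; ~4 s with EndInvoke inside the loop
```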

How to subscribe to Windows ForwardedEvents Log with Powershell?

I want to trigger an action every time a new event occurs in the ForwardedEvents log in Windows with my PowerShell script.
I found the following code to subscribe to the Application log, but it does not work for the ForwardedEvents log:
$Name = 'Application'
# get an instance
$Log = [System.Diagnostics.EventLog]$Name
# determine what to do when an event occurs
$Action = {
# do something when a new event occurs
}
# subscribe to its "EntryWritten" event
$job = Register-ObjectEvent -InputObject $log -EventName EntryWritten -SourceIdentifier 'NewEventHandler' -Action $Action
The error I get when trying this code with "ForwardedEvents" as name is:
Register-ObjectEvent : The event log 'ForwardedEvents' on computer '.' does not exist.
Thank you for any help!
EDIT:
My ForwardedEvents log is active and filled with events. I am using Windows 10.

Dismiss a Powershell form controlled by a start-job task

I've been tasked with building a PowerShell script with a GUI that enables users to install network printers. I've successfully managed to do so, but I cannot meet the requirement that the user be shown a 'please wait' window while the printers install. If I show the window from the main thread, the GUI hangs. If I move showing the window to a separate job, I'm never able to close the window again. Here's my attempt:
$waitForm = New-Object 'System.Windows.Forms.Form'
$CloseButton_Click = {
    # open "please wait" form
    Start-Job -Name waitJob -ScriptBlock $callWork -ArgumentList $waitForm
    # perform long-running (duration unknown) task of adding several network printers here
    $max = 5
    foreach ($i in $(1..$max)) {
        sleep 1 # lock up the thread for a second at a time
    }
    # close the wait form - doesn't work. neither does Remove-Job
    $waitForm.Close()
    Remove-Job -Name waitJob -Force
}
$callWork = {
    param($waitForm)
    [void][reflection.assembly]::Load("System.Windows.Forms, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089")
    $waitForm = New-Object 'System.Windows.Forms.Form'
    $labelInstallingPrintersPl = New-Object 'System.Windows.Forms.Label'
    $waitForm.Controls.Add($labelInstallingPrintersPl)
    $waitForm.ClientSize = '502, 103'
    $labelInstallingPrintersPl.Location = '25, 28'
    $labelInstallingPrintersPl.Text = "Installing printers - please wait..."
    $waitForm.ShowDialog($this)
}
Does anyone know how I can dismiss the $waitForm window when the long-running task has concluded?
You could try to run the Windows Forms dialog on the main thread and do the actual work in a background job:
Add-Type -Assembly System.Windows.Forms

$addPrinters = {
    $max = 5
    foreach ($i in $(1..$max)) {
        sleep 1 # lock up the thread for a second at a time
    }
}

$waitForm = New-Object 'System.Windows.Forms.Form'
$labelInstallingPrintersPl = New-Object 'System.Windows.Forms.Label'
$waitForm.Controls.Add($labelInstallingPrintersPl)
$waitForm.ClientSize = '502, 103'
$labelInstallingPrintersPl.Location = '25, 28'
$labelInstallingPrintersPl.Text = "Installing printers - please wait..."
$waitForm.ShowDialog($this)
Start-Job -ScriptBlock $addPrinters | Wait-Job
$waitForm.Close()
The first answer was on the right track: create the form on the main thread and perform the long-running task on a separate thread. The reason the main code doesn't execute until after the form is dismissed is that you're using the form's ShowDialog method, which halts subsequent code execution until the form is closed.
Instead, use the Show method so code execution continues. You should also include an event handler to dispose of the form:
Add-Type -Assembly System.Windows.Forms

$addPrinters = {
    $max = 5
    foreach ($i in $(1..$max)) {
        sleep 1 # lock up the thread for a second at a time
    }
}

$waitForm = New-Object 'System.Windows.Forms.Form'
$labelInstallingPrintersPl = New-Object 'System.Windows.Forms.Label'
$waitForm.Controls.Add($labelInstallingPrintersPl)
$waitForm.ClientSize = '502, 103'
$labelInstallingPrintersPl.Location = '25, 28'
$labelInstallingPrintersPl.Text = "Installing printers - please wait..."
$waitForm.Add_FormClosed({
    $labelInstallingPrintersPl.Dispose()
    $waitForm.Dispose()
})
$waitForm.Show($this)
Start-Job -ScriptBlock $addPrinters | Wait-Job
$waitForm.Close()
How about adding a Windows.Forms.ProgressBar to the main GUI window? Update its value step by step as the printers are added, so users can see that the application is working.

Using Windows PowerShell 1.0 or 2.0 to evaluate performance of executable files

I am writing a simple script on Windows PowerShell in order to evaluate performance of executable files.
The important hypothesis is the following: I have an executable file; it can be an application written in any possible language (.NET or not: Visual Prolog, C++, C, anything that can be compiled as an .exe file). I want to profile it by getting execution times.
I did this:
Function Time-It {
    Param ([string]$ProgramPath, [string]$Arguments)
    $Watch = New-Object System.Diagnostics.Stopwatch
    $NsecPerTick = (1000 * 1000 * 1000) / [System.Diagnostics.Stopwatch]::Frequency
    Write-Output "Stopwatch created! NsecPerTick = $NsecPerTick"
    $Watch.Start() # Starts the timer
    [System.Diagnostics.Process]::Start($ProgramPath, $Arguments)
    $Watch.Stop() # Stops the timer
    # Collecting timings
    $Ticks = $Watch.ElapsedTicks
    $NSecs = $Watch.ElapsedTicks * $NsecPerTick
    Write-Output "Program executed: time is: $NSecs ns ($Ticks ticks)"
}
This function uses a Stopwatch.
The function accepts a program path; the stopwatch is started, the program runs, and the stopwatch is then stopped. Problem: System.Diagnostics.Process.Start is asynchronous, so the next instruction (stopping the watch) does not wait for the application to finish. A new process is created...
I need to stop the timer once the program ends.
I thought the Process class might hold some info regarding execution times... no luck...
How can I solve this?
You can use Process.WaitForExit()
$proc = new-object "System.Diagnostics.Process"
$proc.StartInfo.FileName = "notepad.exe"
$proc.StartInfo.UseShellExecute = $false
$proc.Start()
$proc.WaitForExit()
Here's kprobst's answer, combined with the Measure-Command cmdlet, for a complete solution:
$proc = New-Object System.Diagnostics.Process
$proc.StartInfo.FileName = "notepad.exe"
$proc.StartInfo.UseShellExecute = $false
$timeSpan = Measure-Command {
    $proc.Start()
    $proc.WaitForExit()
}
"Program executed: Time is {0} seconds" -f $timeSpan.TotalSeconds
