I have automated some of my backup routines using PowerShell scripts. What can I do to make sure the console stays open for maybe 5 seconds after execution completes, so that I can read whatever feedback is provided and verify that there weren't any unexpected errors?
The following will work in most cases, but, as mentioned in this answer, it does not work in Windows PowerShell ISE.
function Pause {
    Write-Host -NoNewLine 'Press any key to continue . . .'
    $null = $Host.UI.RawUI.ReadKey('NoEcho,IncludeKeyDown')
}
This closely mimics the PAUSE command in the Windows command prompt. Simply place the following line wherever you want the code to pause for the user to view information.
Pause
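If, as in the question, a fixed delay is enough and no keypress is needed, a minimal alternative is to end the script with Start-Sleep (5 seconds here, matching the question):
# Give yourself a few seconds to read the output before the console closes.
Write-Host 'Backup finished. Closing in 5 seconds...'
Start-Sleep -Seconds 5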
There are several factors in play with this question, so here they are:
SailPoint 8.2 and IQService 8.2
Windows Server 2016
A service account (Domain Admin)
An interactive user account (Domain Admin)
PowerShell 5.1 build 14393 revision 4583
What we have is this: SailPoint executes a rule on its end, sending over some information to IQService, and IQService executes the PowerShell scripts as the service account. In one of the PowerShell scripts, we have the following command:
LogToFile("calling start job")
$j = Start-Job -ScriptBlock { C:/SailPoint/Scripts/PowershellContainerAfterCreateRetry.ps1 -sAMAccountName $args[0] -company $args[1] } -ArgumentList $sAMAccountName, $company -Name 'PowershellContainerAfterCreateRetry'
LogToFile($j | Select-Object -Property *)
LogToFile("finished start-job")
and this is where things get interesting, because we can log this command's output to file, which is as follows:
calling start job
@{
State=Running; HasMoreData=True;
StatusMessage=;
Location=localhost;
Command= C:/SailPoint/Scripts/PowershellContainerAfterCreateRetry.ps1 -sAMAccountName $args[0] -company $args[1] ;
JobStateInfo=Running;
Finished=System.Threading.ManualResetEvent;
InstanceId=aa889c06-7a8a-402e-807a-880d02465bdd; Id=1;
Name=PowershellContainerAfterCreateRetry;
ChildJobs=System.Collections.Generic.List`1[System.Management.Automation.Job];
PSBeginTime=10/15/2021 21:14:22; PSEndTime=;
PSJobTypeName=BackgroundJob;
Output=System.Management.Automation.PSDataCollection`1[System.Management.Automation.PSObject];
Error=System.Management.Automation.PSDataCollection`1[System.Management.Automation.ErrorRecord];
Progress=System.Management.Automation.PSDataCollection`1[System.Management.Automation.ProgressRecord];
Verbose=System.Management.Automation.PSDataCollection`1[System.Management.Automation.VerboseRecord];
Debug=System.Management.Automation.PSDataCollection`1[System.Management.Automation.DebugRecord];
Warning=System.Management.Automation.PSDataCollection`1[System.Management.Automation.WarningRecord];
Information=System.Management.Automation.PSDataCollection`1[System.Management.Automation.InformationRecord]}
finished start-job
When I execute this command either by itself OR within this script using Windows PowerShell ISE, it completes with no issue and calls the script in question, and everything works perfectly! (whether I am using my interactive account OR the service account)
When this script executes via IQService, something "else" is happening - I say something "else" because I don't have any log files or errors; it just seems to disappear into the ether. (I have a log write-out five lines into the PowerShell script, so one would think I would at least get SOMETHING!?!?) I am out of ideas... thoughts?
As a minor note, I ran an experiment with a setup that should have succeeded without issue, and it showed me there is something strange going on: like the above, it appears to execute (because I can see the same information as above showing that the job has started). Still, just like the above, it never actually "appears" to complete or error out. The only thing I can think of is that the primary script closing out is somehow causing this to close out as well - but I would think it would still manage to get a couple of log lines written if that were the case? Anyway... thanks for reading!
$doit = {
    "test" | Out-File -FilePath "c:\test.txt" -Append
}
Start-Job -ScriptBlock $doit
I think Start-Job is the problem here, as IQService will launch a PowerShell script process, and that may not support the background-job aspect you are trying to use.
If you need to have something retry, or wait and loop, you'll need to use another IdentityIQ/IQService mechanism (a workflow in IIQ, perhaps, that calls down to AD when conditions are met, a timer is hit, etc.) beyond Start-Job inside of an IQService PowerShell script.
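If the child script only needs to finish before the IQService-launched script returns, a minimal workaround sketch (plain PowerShell, not IIQ-specific) is to run it synchronously, or to block on the job before the parent exits:
# Option A: run the child script synchronously in the same process,
# so the parent cannot exit before it completes.
& 'C:/SailPoint/Scripts/PowershellContainerAfterCreateRetry.ps1' -sAMAccountName $sAMAccountName -company $company

# Option B: keep Start-Job, but wait for it and collect its output/errors
# before the parent script ends.
$j = Start-Job -ScriptBlock {
    param($sam, $co)
    & 'C:/SailPoint/Scripts/PowershellContainerAfterCreateRetry.ps1' -sAMAccountName $sam -company $co
} -ArgumentList $sAMAccountName, $company
$j | Wait-Job | Receive-Job   # child script output (and errors) surface here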
This script works fine when executed from the PowerShell console...
but does not work when executed with powershell.exe from cmd.exe...
(powershell.exe -file script.ps1, using PowerShell 5.1.17763.771)
# display Windows Shell folder properties
$App = New-Object -ComObject Shell.Application;
$AppNS = $App.NameSpace( "c:\windows" );
$AppNS.Self.InvokeVerb( "Properties" );
I tested other GUI objects (Winforms & WPF)
and they work fine...
Any ideas?...
The problem is that the in-process COM object you're creating goes out of scope when the calling process exits, which in your case, when called from cmd.exe via PowerShell's CLI, means that the window typically never even gets a chance to display or is automatically closed after a very brief appearance.
In an interactive PowerShell session, the process lives on after exiting the script - that's why your code works there.
When you invoke a script via PowerShell's CLI (powershell.exe for Windows PowerShell, pwsh for PowerShell Core, without the -NoExit switch to keep the process alive indefinitely), the PowerShell process exits when the script terminates.
Use of -NoExit would be a stopgap at best, because it would keep the PowerShell process around indefinitely, even though you presumably want it to live only for as long as the Properties dialog window is open - whenever the user chooses to close it.
Therefore, you need to synchronously (a) wait for the Properties dialog window to open and then (b) wait for it to close before exiting the script.
You can do this with the help of the .NET UI Automation library as follows; note that the code uses PowerShell v5+ syntax:
using namespace System.Windows.Automation
# Load the UI Automation client assemblies.
# Requires Windows PowerShell or PowerShell Core v7+ (on Windows only).
Add-Type -AssemblyName UIAutomationClient; Add-Type -AssemblyName UIAutomationTypes
# Initiate display of the Windows folder's Properties dialog.
$App = New-Object -ComObject Shell.Application
$AppNS = $App.NameSpace('c:\windows')
$AppNS.Self.InvokeVerb('Properties')
# Comment out this line to suppress the verbose messages.
$VerbosePreference = 'Continue'
Write-Verbose 'Waiting for the window''s creation...'
do {
    # Search among the current process's top-level windows for a window
    # with class name '#32770', which is what the Properties dialog windows
    # use (don't know why, but it has been stable over time).
    $w = [AutomationElement]::RootElement.FindFirst([TreeScope]::Children,
        [AndCondition]::new(
            [PropertyCondition]::new([AutomationElement]::ClassNameProperty, '#32770'),
            [PropertyCondition]::new([AutomationElement]::ProcessIdProperty, $PID)
        )
    )
    Start-Sleep -Milliseconds 100
} while (-not $w)
Write-Verbose 'Window has appeared, waiting for it to close...'
while ($w.Current.ProcessId) {
    Start-Sleep -Milliseconds 100
}
Write-Verbose 'Window is now closed, moving on.'
# At this point, if the script was invoked via PowerShell's CLI (powershell.exe -file ...)
# the PowerShell process terminates.
Now, invoking your PowerShell script as follows from your batch file will pop up the Properties dialog and wait for it to close before continuing:
@echo off
:: # ... your batch file
:: # Pop up the Properties dialog and *wait for it to close*.
powershell.exe -file script.ps1
:: # ...
If, by contrast, you simply want to launch the Properties dialog while continuing to run your batch file (be sure to disable the verbose messages first):
:: # Only *initiate* display of the Properties dialog and *continue execution*.
start /B powershell.exe -file script.ps1
Seems like it has to wait for the graphics to finish. "get-childitem | out-gridview" does a similar thing. Or add "sleep 120" to the end of the script, or find some other way to wait. Killing the script kills the window.
powershell -noexit .\explorer.ps1
I seem to have run into a problem/bug when trying to capture transcript output when running a PowerShell script from Control-M.
The output file shows the headers and footers from the Start-Transcript and Stop-Transcript commands, but it does not show anything else I try to capture. Specifically, if my script issues a Write-Host command somewhere, that information is not captured in the output (transcript) file.
Here is a super simple script that illustrates my problem:
start-transcript -path "C:\Powershell\transcript.log"
write-host "test message"
#do stuff...
stop-transcript
Here is an example of my current output when running the script through Control-M:
**********************
Windows PowerShell Transcript Start
Start time: 20140212002005
Username : mydomain\SYSTEM
Machine : myserver (Microsoft Windows NT 6.1.7601 Service Pack 1)
**********************
**********************
Windows PowerShell Transcript End
End time: 20140212002008
**********************
Note that my test message does not show up! This only happens when I run the script via Control-M. When I run my script manually, my "test message" does show up in the transcript output.
My first suspicion was file permissions, but those look good to me. The Control-M agent uses system-level access, so it should have all the permissions it needs anyway. If it were a file permission issue, I don't believe I would even get the header/footer messages.
I'm on PS v2.0. My server is running 2008r2.
Any thoughts appreciated...
Write-Host writes to the console window, which is not what's being "watched" by Control-M. Try Write-Output instead. Write-Host is usually not what you want for producing output.
See http://windowsitpro.com/blog/what-do-not-do-powershell-part-1 and http://powershell.com/cs/blogs/donjones/archive/2012/04/06/2012-scripting-games-commentary-stop-using-write-host.aspx
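For example, a minimal variant of the test script above using Write-Output (assuming Control-M and the transcript pick up the script's output stream):
Start-Transcript -Path "C:\Powershell\transcript.log"
Write-Output "test message"   # goes to the output stream instead of straight to the host
#do stuff...
Stop-Transcript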
I have a list of Windows packages that I'm installing via powershell using the following command:
& mypatch.exe /passive /norestart
mypatch.exe is being passed from a list and it doesn't wait for the prior install to finish - it just keeps going. It builds up a huge window of installs that are pending installation. Also, I can't use $LASTEXITCODE to determine if the install succeeded or failed.
Is there any way to make each install wait before starting the next?
Start-Process <path to exe> -Wait
JensG is correct in using Start-Process; however, as the question shows arguments being passed, the line should be:
Start-Process "mypatch.exe" -argumentlist "/passive /norestart" -wait
The OP also mentioned determining if the install succeeded or failed. I find that using a try/catch/throw to pick up on error states works well in this scenario:
try {
    # -ErrorAction Stop turns a failure to launch the process into a terminating
    # error that the catch block can pick up. Note that a non-zero exit code from
    # mypatch.exe itself does NOT throw - check the exit code separately (see below).
    Start-Process "mypatch.exe" -ArgumentList "/passive /norestart" -Wait -ErrorAction Stop
} catch {
    # You can do anything you like in this block to deal with the error, examples below:
    # $_ contains the error details
    # This will just write the error
    Write-Host "mypatch.exe returned the following error $_"
    # If you want to pass the error upwards as a terminating error and abort your PowerShell script or function
    Throw "Aborted mypatch.exe returned $_"
}
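Since Start-Process does not set $LASTEXITCODE, here is a sketch of checking the installer's own exit code instead (assuming mypatch.exe signals failure with a non-zero exit code):
# -PassThru returns the process object so its exit code can be inspected after -Wait.
$proc = Start-Process "mypatch.exe" -ArgumentList "/passive /norestart" -Wait -PassThru
if ($proc.ExitCode -ne 0) {
    Write-Host "mypatch.exe failed with exit code $($proc.ExitCode)"
}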
Sure, write a one line batch script that runs the installer. The batch script will wait for the installer to finish before returning. Call the script from PowerShell which will in turn wait for the batch script to finish.
If you have access to how mypatch is written, you could have it create a flag file when it completes; PowerShell can then check for that file's existence in a while loop and sleep while the file doesn't exist (see the sketch below).
If you don't, you could also have that batch script create a dummy file when the installer completes.
Yet another way, though probably the worst of all of these, is to just hard-code a sleep timer (Start-Sleep) once you call the installer.
EDIT just saw JensG's answer. Didn't know about that one. Nice
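For reference, a rough sketch of the flag-file idea described above (the file path is just an illustration):
# Hypothetical flag file that mypatch.exe (or a wrapper batch script) creates when it finishes.
$flagFile = 'C:\Temp\mypatch.done'

& mypatch.exe /passive /norestart

# Poll until the flag file appears, then clean it up and continue.
while (-not (Test-Path $flagFile)) {
    Start-Sleep -Seconds 5
}
Remove-Item $flagFile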
I'm using Register-ScheduledJob to register a job in PowerShell; in the background job I execute a script. This script contains commands like Get-Process and Write-Host.
Although every command is executed, in the results I don't see the output from the Write-Host calls (Get-Process is OK).
Does anyone know why?
Write-Host writes to the host, which is the app that the script is running in (PowerShell.exe for instance), so it is output explicitly to the screen, and DOES NOTHING when you're running in a non-interactive environment. You should never use that to output data that you want to collect, only for lightweight debugging or for printing to the screen in interactive scripts.
You should generally use write-output for the data that you want to collect as output.
Although you can also use the debug/warning/error output (those are collected by the job, but not shown in the regular output).
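A minimal sketch of the difference (the job name is just an illustration):
# The script block's output stream is what the scheduled job collects.
Register-ScheduledJob -Name DemoJob -ScriptBlock {
    Get-Process | Select-Object -First 3        # output objects: collected
    Write-Output 'this line is collected'       # output stream: collected
    Write-Host   'this line goes to the host'   # host only: NOT collected
}

# After the job has run (via its trigger, or Start-Job -DefinitionName DemoJob),
# retrieve the collected results:
Get-Job -Name DemoJob | Receive-Job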
Thank you very much. Write-Output helped.
Additionally, something I discovered during the last few days that could be helpful for others: if you start a scheduled background job and in this job start a PowerShell script like this:
Register-ScheduledJob -Name temp -ScheduledJobOption $option -ScriptBlock {
    D:\scprit.ps1
}
Your job will never end, because after the script finishes the PowerShell window is still open. So additionally you have to add an exit in your script block:
Register-ScheduledJob -Name temp -ScheduledJobOption $option -ScriptBlock {
    D:\scprit.ps1
    Exit-PSSession
}