I want my PowerShell script to stop when any of the commands I run fail (like set -e in bash). I'm using both PowerShell commands (New-Object System.Net.WebClient) and external programs (.\setup.exe).
$ErrorActionPreference = "Stop" will get you part of the way there (i.e. this works great for cmdlets).
However, for EXEs you're going to need to check $LastExitCode yourself after every invocation and determine whether it failed or not. Unfortunately I don't think PowerShell can help here, because on Windows EXEs aren't terribly consistent about what constitutes a "success" or "failure" exit code. Most follow the UNIX convention of 0 indicating success, but not all do. Check out the CheckLastExitCode function in this blog post. You might find it useful.
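A minimal sketch of that kind of helper (my own illustration, not the blog's exact code) might look like this:
function CheckLastExitCode {
    # $LastExitCode holds the exit code of the most recent native program
    if ($LastExitCode -ne 0) {
        throw "Command failed with exit code $LastExitCode"
    }
}
.\setup.exe
CheckLastExitCode   # throws, and therefore stops the script, on a non-zero exit code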
You should be able to accomplish this by using the statement $ErrorActionPreference = "Stop" at the beginning of your scripts.
The default setting of $ErrorActionPreference is Continue, which is why you are seeing your scripts keep going after errors occur.
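For example, a minimal sketch (the nonexistent path is just something that triggers a cmdlet error):
$ErrorActionPreference = "Stop"
Get-Item "C:\does\not\exist"          # a cmdlet error now terminates the script
Write-Host "This line is never reached"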
Sadly, due to buggy cmdlets like New-RegKey and Clear-Disk, none of these answers are enough. I've currently settled on the following code in a file called ps_support.ps1:
Set-StrictMode -Version Latest
$ErrorActionPreference = "Stop"
$PSDefaultParameterValues['*:ErrorAction']='Stop'
function ThrowOnNativeFailure {
    if (-not $?)
    {
        throw 'Native Failure'
    }
}
Then in any PowerShell file, after the CmdletBinding and Param declarations for the file (if present), I have the following:
$ErrorActionPreference = "Stop"
. "$PSScriptRoot\ps_support.ps1"
The duplicated ErrorActionPreference = "Stop" line is intentional. If I've goofed and somehow gotten the path to ps_support.ps1 wrong, that needs to not silently fail!
I keep ps_support.ps1 in a common location for my repo/workspace, so the path to it for the dot-sourcing may change depending on where the current .ps1 file is.
Any native call gets this treatment:
native_call.exe
ThrowOnNativeFailure
Having that file to dot-source has helped me maintain my sanity while writing powershell scripts. :-)
A slight modification to the answer from @alastairtree:
function Invoke-Call {
    param (
        [scriptblock]$ScriptBlock,
        [string]$ErrorAction = $ErrorActionPreference
    )
    & $ScriptBlock
    if (($LastExitCode -ne 0) -and ($ErrorAction -eq "Stop")) {
        exit $LastExitCode
    }
}
Invoke-Call -ScriptBlock { dotnet build . } -ErrorAction Stop
The key differences here are:
it uses the Verb-Noun convention (mimicking Invoke-Command)
implies that it uses the call operator under the covers
mimics -ErrorAction behavior from built in cmdlets
exits with same exit code rather than throwing exception with new message
You need slightly different error handling for PowerShell functions and for calling EXEs, and you need to be sure to tell the caller of your script that it has failed. Building on top of Exec from the Psake library, a script with the structure below will stop on all errors and is usable as a base template for most scripts.
Set-StrictMode -Version latest
$ErrorActionPreference = "Stop"
# Taken from psake https://github.com/psake/psake
<#
.SYNOPSIS
This is a helper function that runs a scriptblock and checks the PS variable $lastexitcode
to see if an error occurred. If an error is detected then an exception is thrown.
This function allows you to run command-line programs without having to
explicitly check the $lastexitcode variable.
.EXAMPLE
exec { svn info $repository_trunk } "Error executing SVN. Please verify SVN command-line client is installed"
#>
function Exec
{
    [CmdletBinding()]
    param(
        [Parameter(Position=0,Mandatory=1)][scriptblock]$cmd,
        [Parameter(Position=1,Mandatory=0)][string]$errorMessage = ("Error executing command {0}" -f $cmd)
    )
    & $cmd
    if ($lastexitcode -ne 0) {
        throw ("Exec: " + $errorMessage)
    }
}
Try {
    # Put all your stuff inside here!

    # powershell functions called as normal and try..catch reports errors
    New-Object System.Net.WebClient

    # call exe's and check their exit code using Exec
    Exec { setup.exe }
} Catch {
    # tell the caller it has all gone wrong
    $host.SetShouldExit(-1)
    throw
}
I'm new to PowerShell, but this seems to be the most effective approach:
doSomething -arg myArg
if (-not $?) {throw "Failed to doSomething"}
As far as I know, PowerShell does not have any automatic handling of non-zero exit codes returned by sub-programs it invokes.
The only solution I know of so far to mimic the behavior of bash -e is to add this check after every call to an external command:
if(!$?) { Exit $LASTEXITCODE }
I came here looking for the same thing. $ErrorActionPreference="Stop" kills my shell immediately when I'd rather see the error message (pause) before it terminates. Falling back on my batch sensibilities:
IF %ERRORLEVEL% NEQ 0 pause & GOTO EOF
I found that this works pretty much the same for my particular ps1 script:
Import-PSSession $Session
If (-not $?) {Pause; Exit}
It seems like a simple rethrow does the trick.
param ([string] $Path, [string] $Find, [string] $Replace)

try {
    ((Get-Content -Path $Path -Raw) -replace $Find, $Replace) | Set-Content -Path $Path
    Write-Output "Completed."
} catch {
    # Without the try/catch block, errors don't interrupt program flow.
    throw
}
Now output Completed appears only after successful execution.
For people coming here in 2021: this is my solution that covers both cmdlets and programs.
function CheckLastExitCode {
    param ([int[]]$SuccessCodes = @(0))

    if (!$?) {
        Write-Host "Last CMD failed" -ForegroundColor Red
        # GoToWrapperDirectory - in my code I go back to the original directory that launched the script
        exit
    }

    if ($SuccessCodes -notcontains $LastExitCode) {
        Write-Host "EXE RETURNED EXIT CODE $LastExitCode" -ForegroundColor Red
        # GoToWrapperDirectory - in my code I go back to the original directory that launched the script
        exit
    }
}
You can use it like this:
cd NonExistingpath
CheckLastExitCode
Redirecting stderr to stdout also seems to do the trick without any other commands or scriptblock wrappers, although I can't find an explanation of why it works that way (most likely because the 2>&1 redirection makes PowerShell wrap the native command's stderr output in error records, which then honor $ErrorActionPreference = "Stop").
# test.ps1
$ErrorActionPreference = "Stop"
aws s3 ls s3://xxx
echo "==> pass"
aws s3 ls s3://xxx 2>&1
echo "shouldn't be here"
This will output the following as expected (the command aws s3 ... returns $LASTEXITCODE = 255)
PS> .\test.ps1
An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied
==> pass
I have a PowerShell script that needs to run 24/7.
To make sure it does this, I have created two (almost) identical tasks listed in task scheduler. One starts the task every day at midnight, the other is set to run with the trigger 'At System startup'. The script is set to exit at a minute to midnight.
So far so good, everything works fine. All my bases are covered. The scheduled task takes care of the script 99% of the time, and the 'on startup' task covers the occasional power failure.
However, I've noticed a subtle difference when I look at the process details.
If I open a powershell session and check the pid for the task that started at midnight using this -
PS C:\Users\Elvis> get-wmiobject win32_process | where{$_.ProcessId -eq nnnn}
(where nnnn is the PID) I see lots of details listed, including this....
CommandLine : "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe" -NoExit -command "&c:\myDir\myScript.ps1"
This makes sense, it's exactly what I put into the task definition.
If I do a similar thing with the task that starts on boot-up, then instead of seeing the full command line I just get
CommandLine :
This may not seem important, but I want to check that no other versions of the script are running when I start a new copy. I do this by including this line in the script (basically it checks for other PowerShell processes running the same script name but with a different PID):
get-wmiobject win32_process | where{$_.processname -eq 'powershell.exe' -and $_.ProcessId -ne $pid -and $_.commandline -match 'myScript'}
I need to be able to either persuade the task scheduler to include the script name in the process details, or find another way to check if there's another copy of the script already running.
Use what I call a "PID lockfile". Write the PID to a known file path, if the file already exists, check for the PID. If it's already running, throw an error or otherwise exit. When the script exits have it delete that file.
$lockFilePath = "\path\to\script.pid"
try {
    if( Test-Path -PathType Leaf $lockFilePath ) {
        $oldPid = ( Get-Content -Raw $lockFilePath ).Trim()
        if( Get-Process -Id $oldPid -EA SilentlyContinue ) {
            throw "Only one instance of this script can run at a time"
        }
    }
    $PID > $lockFilePath

    # Rest of your script goes within this try block
} finally {
    # Add a catch block if you like but this finally code
    # guarantees a deletion attempt will be made on the
    # PID file whether the try block succeeds or errors
    if( Test-Path -PathType Leaf $lockFilePath ) {
        Remove-Item $lockFilePath -Force -EA Continue
    }
}
I essentially need functionality in PowerShell that executes a given string (it can be a CMD/PowerShell command, a Perl/Python/PowerShell script with arguments, an exe with arguments, etc.) and captures its exit value.
In Perl I would pass the string to system() and use the $CHILD_ERROR variable, shifting it to access the exit code.
In PowerShell I am clueless.
I tried using Invoke-Expression, but even if the expression passed to Invoke-Expression fails, the Invoke-Expression call itself will have succeeded.
You can use $LASTEXITCODE to get the exit code from an external program or the Boolean $? to check if the last operation succeeded or failed. Run Get-Help about_Automatic_Variables -ShowWindow from a PowerShell console to see more details.
You may want to check out the & (call) operator as an alternative to Invoke-Expression when running external programs. Run Get-Help about_Operators -ShowWindow from a PowerShell console for details.
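As a rough sketch (ping.exe is just a stand-in for any external program):
& ping.exe 127.0.0.1 -n 1             # run the program via the call operator
if ($LASTEXITCODE -ne 0) {
    Write-Warning "ping.exe failed with exit code $LASTEXITCODE"
}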
Also remember you may be able to just call the external program without using one of the commands above. See the example below:
param($Hostname="127.0.0.1", $Tries=1, $Wait=1000)
$output = ping.exe $Hostname -n $Tries -w $Wait # captures anything written to stdout
$output|? {$_ -match 'Request timed out'}|Write-Warning
$LASTEXITCODE # returns the exit code from ping.exe
You can copy it to a test.ps1 file and run it from a PowerShell console window (.\test.ps1 8.8.8.8 for instance) to see how it works.
I have a batch file which calls a SQL script to send an email. This batch file is called from a PowerShell script. How do I return a success or failure code from the batch file to PowerShell? I have tried the following:
In Batch file
exit /b %ERRORLEVEL%
In powershell
cmd.exe /c 'C:\scripts\send_email_sp_prd.bat F'
if($LastExitCode -eq -0){
write-host "Success"
}
else
{
write-host "Failure"
}
Even if the SQL script fails, the batch file is returning a value of 0, which is not what I want. Is there a better/correct way to do this?
Personally, I would eliminate the batch file and do this all through PowerShell; this will be much easier in terms of manipulating or validating command output. In the SQL Management Studio installer, select "Management Tools" on the feature selection page to obtain the PowerShell module. Once the module is installed, you could create a script to execute your job. Here is something I created quickly, but you could probably use this or something similar:
Function Invoke-Sqljob {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory=$true)]
        [String[]] $ServerInstance,
        [Parameter(Mandatory=$true)]
        [String[]] $DatabaseName,
        [Parameter(Mandatory=$true)]
        [String[]] $Query
    )

    Import-Module SQLPS
    Invoke-Sqlcmd -ServerInstance "$ServerInstance" -Database "$DatabaseName" -Query "$Query"
}
You would execute Invoke-Sqljob as follows:
Invoke-Sqljob -ServerInstance "DB_Server\SQL_Instance" -Database "DB_Name_Here" -Query "EXEC send_mail_sp_prd"
If you are unable to or would rather use sql authentication you could utilize the -Username and -Password parameters of the Invoke-Sqlcmd cmdlet.
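For example (the credentials below are placeholders, shown only to illustrate the parameters):
Invoke-Sqlcmd -ServerInstance "DB_Server\SQL_Instance" -Database "DB_Name_Here" -Query "EXEC send_mail_sp_prd" -Username "sql_login" -Password "sql_password"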
Hope this helps solve your problem.
I am not sure if your post is the script, verbatim, but you have a - before the 0. Try changing the first line of the if block to $LastExitCode -eq 0
if($LastExitCode -eq 0){
write-host "Success"
}
else
{
write-host "Failure"
}
I use PowerShell as my shell in Windows. When I try to launch an application whose DLL dependencies are missing from the PATH environment variable, nothing happens; PowerShell just silently returns with a new command prompt.
Is there a way to make PowerShell fail louder, telling me exactly what is missing, like the default cmd shell does?
I was having this same problem. PowerShell was setting $LASTEXITCODE to -1073741515 (0xC0000135, STATUS_DLL_NOT_FOUND) with no output explaining what was actually wrong. When running it via cmd.exe I would get a popup with something like:
The code execution cannot proceed because some.dll was not found. Reinstalling the program may fix this problem.
Cygwin bash writes DLL-not-found errors to stderr, and if you run the same thing via bash from PowerShell then you can see the error:
> & 'C:\tools\cygwin\bin\bash.exe' '-c' '"C:/Users/xxx/dir/main.exe"'
C:/Users/xxx/dir/main.exe: error while loading shared libraries: another.dll: cannot open shared object file: No such file or directory
This works with git bash also:
> & 'C:\Program Files\Git\bin\bash.exe' '-c' '"C:/Users/xxx/dir/main.exe"'
C:/Users/xxx/dir/main.exe: error while loading shared libraries: another.dll: cannot open shared object file: No such file or directory
Quite a hack but better than nothing.
You could look at the $Error automatic variable, which accumulates errors until the PowerShell session is closed.
Update: In newer PowerShell versions (7+), you can also use the Get-Error command for a detailed view of the most recent error.
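For example, a quick sketch of inspecting the most recent error:
$Error[0] | Format-List * -Force   # details of the most recent error, if one was recorded
Get-Error                          # PowerShell 7+ only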
Another way would be to use Dependency Walker; if you can drive it with its command-line options, you should be able to use it from PowerShell as well.
I am afraid there is no way to get that info... But try to read
An Introduction to Error Handling in PowerShell http://blogs.msdn.com/b/kebab/archive/2013/06/09/an-introduction-to-error-handling-in-powershell.aspx
or
PowerShell Tutorial – Try Catch Finally and error handling in PowerShell
http://www.vexasoft.com/blogs/powershell/7255220-powershell-tutorial-try-catch-finally-and-error-handling-in-powershell
Try
{
    $AuthorizedUsers = Get-Content \\FileServer\HRShare\UserList.txt -ErrorAction Stop
}
Catch [System.OutOfMemoryException]
{
    Restart-Computer localhost
}
Catch
{
    $ErrorMessage = $_.Exception.Message
    $FailedItem = $_.Exception.ItemName
    Send-MailMessage -From ExpensesBot@MyCompany.Com -To WinAdmin@MyCompany.Com -Subject "HR File Read Failed!" -SmtpServer EXCH01.AD.MyCompany.Com -Body "We failed to read file $FailedItem. The error message was $ErrorMessage"
    Break
}
Finally
{
    $Time=Get-Date
    "This script made a read attempt at $Time" | out-file c:\logs\ExpensesScript.log -append
}
I am writing a batch script in PowerShell v1 that will get scheduled to run let's say once every minute. Inevitably, there will come a time when the job needs more than 1 minute to complete and now we have two instances of the script running, and then possibly 3, etc...
I want to avoid this by having the script itself check if there is an instance of itself already running and if so, the script exits.
I've done this in other languages on Linux but never done this on Windows with PowerShell.
For example in PHP I can do something like:
exec("ps auxwww|grep mybatchscript.php|grep -v grep", $output);
if($output){exit;}
Is there anything like this in PowerShell v1? I haven't come across anything like this yet.
Out of these common patterns, which one makes the most sense with a PowerShell script running frequently?
Lock File
OS Task Scheduler
Infinite loop with a sleep interval
Here's my solution. It uses the command line and process ID, so there's nothing to create and track, and it doesn't care how you launched either instance of your script.
The following should just run as-is:
Function Test-IfAlreadyRunning {
<#
.SYNOPSIS
Kills CURRENT instance if this script already running.
.DESCRIPTION
Kills CURRENT instance if this script already running.
Call this function VERY early in your script.
If it sees itself already running, it exits.
Uses WMI rather than other methods because we need the commandline
.PARAMETER ScriptName
Name of this script
Use the following line *OUTSIDE* of this function to get it automatically
$ScriptName = $MyInvocation.MyCommand.Name
.EXAMPLE
$ScriptName = $MyInvocation.MyCommand.Name
Test-IfAlreadyRunning -ScriptName $ScriptName
.NOTES
$PID is a Built-in Variable for the current script's Process ID number
.LINK
#>
[CmdletBinding()]
Param (
[Parameter(Mandatory=$true)]
[ValidateNotNullorEmpty()]
[String]$ScriptName
)
#Get array of all powershell scripts currently running
$PsScriptsRunning = get-wmiobject win32_process | where{$_.processname -eq 'powershell.exe'} | select-object commandline,ProcessId
#Get name of current script
#$ScriptName = $MyInvocation.MyCommand.Name #NO! This gets name of *THIS FUNCTION*
#enumerate each element of array and compare
ForEach ($PsCmdLine in $PsScriptsRunning){
[Int32]$OtherPID = $PsCmdLine.ProcessId
[String]$OtherCmdLine = $PsCmdLine.commandline
#Are other instances of this script already running?
If (($OtherCmdLine -match $ScriptName) -And ($OtherPID -ne $PID) ){
Write-host "PID [$OtherPID] is already running this script [$ScriptName]"
Write-host "Exiting this instance. (PID=[$PID])..."
Start-Sleep -Seconds 7
Exit
}
}
} #Function Test-IfAlreadyRunning
#Main
#Get name of current script
$ScriptName = $MyInvocation.MyCommand.Name
Test-IfAlreadyRunning -ScriptName $ScriptName
write-host "(PID=[$PID]) This is the 1st and only instance allowed to run" #this only shows in one instance
read-host 'Press ENTER to continue...' # aka Pause
#Put the rest of your script here
If the script was launched using the powershell.exe -File switch, you can detect all powershell instances that have the script name present in the process commandline property:
Get-WmiObject Win32_Process -Filter "Name='powershell.exe' AND CommandLine LIKE '%script.ps1%'"
Loading up an instance of PowerShell is not trivial, and doing it every minute is going to impose a lot of overhead on the system. I'd just schedule one instance, and write the script to run in a process-sleep-process loop. Normally I'd use a stopwatch timer, but I don't think those were added until V2.
$interval = 1
while ($true)
{
    $now = Get-Date
    $next = (Get-Date).AddMinutes($interval)
    do-stuff
    if ((Get-Date) -lt $next)
    {
        Start-Sleep -Seconds (($next - (Get-Date)).TotalSeconds)
    }
}
This is the classic method typically used by Win32 applications. It is done by trying to create a named event object. In .NET there exists a wrapper class EventWaitHandle, which makes this easy to use from PowerShell too.
$AppId = 'Put-Your-Own-GUID-Here!'
$CreatedNew = $false
$script:SingleInstanceEvent = New-Object Threading.EventWaitHandle $true, ([Threading.EventResetMode]::ManualReset), "Global\$AppID", ([ref] $CreatedNew)
if( -not $CreatedNew ) {
throw "An instance of this script is already running."
}
Notes:
Make sure $AppId is truly unique, which is fulfilled when you use a random GUID for it.
The variable $SingleInstanceEvent should exist as long as the script is running. Putting it in the script scope as I did above, should normally be sufficient.
The event object is created in the "Global" kernel namespace, meaning it blocks execution even if the script is already running in another client session (e.g. when multiple users are logged onto the same machine). Replace "Global\$AppID" with "Local\$AppID" if you want to prevent multiple instances from running within the current client session only.
This doesn't have a race condition like the WMI (commandline) solution, because the OS kernel makes sure that only one instance of an event object with the same name can be created across all processes.
I'm not aware of a way to do what you want directly. You could consider using an external lock instead. When the script starts, it changes a registry key, creates a file, changes file contents, or something similar; when the script is done, it reverses the lock. Also, at the top of the script, before the lock is set, there needs to be a check of the lock's status. If it is locked, the script exits.
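A rough sketch of that idea with a plain lock file (the path and logic here are illustrative only):
$lockFile = Join-Path $env:TEMP 'mybatchscript.lock'
if (Test-Path $lockFile) { exit }               # another instance appears to be running
New-Item -ItemType File -Path $lockFile | Out-Null
try {
    # do the real work here
}
finally {
    Remove-Item $lockFile -Force                # release the lock even if the work failed
}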
$otherScriptInstances=get-wmiobject win32_process | where{$_.processname -eq 'powershell.exe' -and $_.ProcessId -ne $pid -and $_.commandline -match $($MyInvocation.MyCommand.Path)}
if ($otherScriptInstances -ne $null)
{
"Already running"
cmd /c pause
}else
{
"Not yet running"
cmd /c pause
}
You may want to replace
$MyInvocation.MyCommand.Path (FullPathName)
with
$MyInvocation.MyCommand.Name (Scriptname)
It's "always" best to let the "highest process" handle such situations; the process should check this before it runs the second instance. So my advice is to use Task Scheduler to do the job for you. This will also eliminate possible problems with permissions (saving a file without having permissions), and it will keep your script clean.
When configuring the task in Task Scheduler, you have an option under Settings:
If the task is already running, then the following rule applies:
Do not start a new instance