Getting exit status of somecommand after "if ! somecommand" - bash

I can't seem to get the exit code of a command executed inside a Bash if condition:
#!/bin/bash
set -eu
if ! curl -sS --fail http://not-there; then
    echo "ERROR: curl failed with exit code of $?" >&2
fi
But $? is always zero, even though my curl exits with a non-zero status.
If I run the curl command outside of the if condition, $? is set correctly.
Am I missing something here?

In your original code, $? is returning the exit status not of curl, but of ! curl.
To preserve the original value, choose a control structure that doesn't require that inversion:
curl -sS --fail http://not-there || {
    echo "ERROR: curl failed with exit code of $?" >&2
    exit 1
}
...or something akin to:
if curl -sS --fail http://not-there; then
    : "Unexpected success"
else
    echo "ERROR: curl failed with exit status of $?" >&2
fi

Another way to achieve what you want is to collect the return code first, then perform the if statement on it.
#!/bin/bash
set -eu
status=0
curl -sS --fail http://not-there || status=$?
if ((status)); then
    echo "ERROR: curl failed with exit code of $status" >&2
fi
I find this method especially convenient when checking several commands for failure, where you want to return an error code at the end of your script or function if any of them failed (see the sketch below).
Please note that in the above I use an arithmetic test, which returns true (0) if the value inside is non-zero, and false (non-zero) otherwise. It is shorter (and, to my taste, more readable) than something like [[ $status != 0 ]].
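For instance, a minimal sketch of that several-commands pattern (the second URL is a made-up placeholder):
#!/bin/bash
status=0
# each command that fails overwrites status with its exit code;
# a later success does not reset it back to 0
curl -sS --fail http://not-there || status=$?
curl -sS --fail http://also-not-there || status=$?
exit "$status"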

Related

Check return value of command in if bash statement

I have a simple script that tries to curl a URL and echo a string indicating whether it failed or succeeded. But depending on how I quote the variable in the if statement below, I get the following warnings:
: -ne: unary operator expected
: integer expression expected
With the alternative check (as comment), I get the following error
((: != 0 : syntax error: operand expected (error token is "!= 0 ")
The script:
c=`curl -s -m 10 https://example.com` || ce=$?
#if (( ${ce} != 0 )); then
if [ ${ce} -ne 0 ]; then
    echo "Failed"
else
    echo "Succeeded"
fi
How do I correctly check the return value of the curl command in a bash if-statement?
The problem is that you only set the exit status variable when the curl command fails.
If the command succeeds, the variable ce is never set, so the (unquoted) expansion is empty and the test executed is if [ -ne 0 ]; then, which prints the error message.
Quoting the variable alone wouldn't help in this case; you would just get a different error message.
To fix this, set variable ce after the curl command no matter what the exit status of the curl command is:
c=$(curl -s -m 10 https://example.com)
ce=$?
if [ "$ce" -ne 0 ]; then
    echo "Failed"
else
    echo "Succeeded"
fi
Or shorter without exit status variable:
c=$(curl -s -m 10 https://example.com)
if [ $? -ne 0 ]; then
    echo "Failed"
else
    echo "Succeeded"
fi
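Since if tests a command's exit status directly, you can also fold the capture into the condition itself (a sketch):
if c=$(curl -s -m 10 https://example.com); then
    echo "Succeeded"
else
    echo "Failed"
fi
The exit status of an assignment containing a command substitution is the exit status of the substituted command, so this tests curl while still storing its output in c.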

How to execute a bash script line by line? [duplicate]

#Example Script
wget http://file1.com
cd /dir
wget http://file2.com
wget http://file3.com
I want to execute the bash script line by line and test the exit code ($?) of each execution and determine whether to proceed or not:
It basically means I need to add the following script below every line in the original script:
if test $? -eq 0
then
    echo "No error"
else
    echo "ERROR"
    exit
fi
and the original script becomes:
#Example Script
wget http://file1.com
if test $? -eq 0
then
    echo "No error"
else
    echo "ERROR"
    exit
fi
cd /dir
if test $? -eq 0
then
    echo "No error"
else
    echo "ERROR"
    exit
fi
wget http://file2.com
if test $? -eq 0
then
    echo "No error"
else
    echo "ERROR"
    exit
fi
wget http://file3.com
if test $? -eq 0
then
    echo "No error"
else
    echo "ERROR"
    exit
fi
But the script becomes bloated.
Is there a better method?
One can use set -e, but it's not without its own pitfalls. Alternatively, one can bail out on errors:
command || exit 1
And your if statement can be written less verbosely:
if command; then
The above is the same as:
command
if test "$?" -eq 0; then
set -e makes the script fail on non-zero exit status of any command. set +e removes the setting.
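A minimal sketch applying set -e to the original example script:
#!/bin/bash
set -e    # abort as soon as any command exits non-zero
wget http://file1.com
cd /dir
wget http://file2.com
wget http://file3.com
# if any command above fails, nothing after it runs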
There are many ways to do that.
For example, you can use set to stop automatically on a "bad" return code, simply by putting
set -e
on top of your script. Alternatively, you could write a "check_rc" function; see here for some starting points.
Or, you start with this:
check_error () {
    if [ "$RET" -eq 0 ]; then
        echo "DONE"
        echo ""
    else
        echo "ERROR"
        exit 1
    fi
}
To be used with:
echo "some example command"
RET=$? ; check_error
As said, there are many ways to do this.
Best bet is to use set -e to terminate the script as soon as any non-zero return code is observed. Alternatively, you can set a trap on ERR that calls an error-handling function; this removes the repeated if...else blocks, and you can print any message before exiting.
function errorsRead() {
    echo "Some non-zero return code observed.."
    exit 1
}
trap errorsRead ERR

somecommand    # command of your need; if it exits non-zero, the ERR trap runs errorsRead automatically
You can do this contraption:
wget http://file1.com || exit 1
This will terminate the script with exit code 1 if the command returns a non-zero (failed) result.
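Applied to the original script, that looks like:
#Example Script
wget http://file1.com || exit 1
cd /dir || exit 1
wget http://file2.com || exit 1
wget http://file3.com || exit 1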

Get the exit code for a command in Bash and KornShell (ksh)

I want to write code like this:
command="some command"
safeRunCommand $command
safeRunCommand() {
    cmnd=$1
    $($cmnd)
    if [ $? != 0 ]; then
        printf "Error when executing command: '$command'"
        exit $ERROR_CODE
    fi
}
But this code does not work the way I want. Where did I make the mistake?
Below is the fixed code:
#!/bin/ksh
safeRunCommand() {
    typeset cmnd="$*"
    typeset ret_code

    echo cmnd=$cmnd
    eval $cmnd
    ret_code=$?
    if [ $ret_code != 0 ]; then
        printf "Error: [%d] when executing command: '$cmnd'" $ret_code
        exit $ret_code
    fi
}

command="ls -l | grep p"
safeRunCommand "$command"
Now if you look into this code, the few things that I changed are:
The use of typeset is not necessary, but it is good practice: it makes cmnd and ret_code local to safeRunCommand.
The use of ret_code is not necessary either, but it is good practice to store the return code in a variable (and to store it as soon as possible), so that you can use it later, as I did in printf "Error: [%d] when executing command: '$cmnd'" $ret_code.
Pass the command with quotes surrounding it, as in safeRunCommand "$command". If you don't, cmnd will get only the value ls and not ls -l. This is even more important if your command contains pipes.
You can use typeset cmnd="$*" instead of typeset cmnd="$1" if you want to keep the spaces. Try both, depending on how complex your command argument is.
eval is used so that a command containing pipes works correctly.
Note: do remember that some commands, like grep, return 1 even though there is no error: grep returns 0 if it found something, and 1 otherwise.
I tested this with KornShell and Bash, and it worked fine. Let me know if you face issues running it.
Try
safeRunCommand() {
    "$@"
    if [ $? != 0 ]; then
        printf "Error when executing command: '$1'"
        exit $ERROR_CODE
    fi
}
It should be $cmnd instead of $($cmnd). It works fine with that change on my box.
Your script works only for one-word commands, like ls. It will not work for "ls cpp". To make that work, replace cmnd="$1"; $cmnd with "$@". And do not run your script as command="some cmd"; safeRunCommand $command. Run it as safeRunCommand some cmd.
Also, when you have to debug your Bash scripts, execute them with the '-x' flag: bash -x s.sh.
There are several things wrong with your script.
Functions (subroutines) should be declared before attempting to call them. You probably want to return rather than exit from your subroutine, to allow the calling block to test the success or failure of a particular command (see the return-based sketch after the example below). That aside, you never set ERROR_CODE, so it is always zero (undefined).
It's good practice to surround your variable references with curly braces, too. Your code might look like:
#!/bin/sh

command="/bin/date -u"      #...example only

safeRunCommand() {
    cmnd="$*"               #...ensure whitespace is passed and preserved
    $cmnd
    ERROR_CODE=$?           #...so we have it for the command we want
    if [ ${ERROR_CODE} != 0 ]; then
        printf "Error when executing command: '${command}'\n"
        exit ${ERROR_CODE}  #...consider 'return' here
    fi
}

safeRunCommand $command
command="cp"
safeRunCommand $command
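For instance, a minimal sketch of the return-based variant suggested above, so the caller decides what to do on failure:
safeRunCommand() {
    "$@"
    rc=$?
    if [ "$rc" -ne 0 ]; then
        printf "Error [%d] when executing command: '%s'\n" "$rc" "$*" >&2
        return "$rc"    #...hand the failure back to the caller instead of exiting
    fi
}

safeRunCommand /bin/date -u
safeRunCommand cp || echo "caller saw exit code $?"    # cp without arguments fails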
The normal idea would be to run the command and then use $? to get the exit code. However, sometimes you have multiple cases in which you need to get the exit code. For example, you might need to hide its output, but still return the exit code, or print both the exit code and the output.
ec() {
    if [[ "$1" == "-h" ]]; then
        shift
        eval "$*" > /dev/null 2>&1   # run the command with its output suppressed
        ec=$?
        echo "$ec"                   # print the exit code instead of the output
    else
        eval "$*"                    # run the command normally
        ec=$?
    fi
}
This will give you the option to suppress the output of the command you want the exit code for. When the output is suppressed for the command, the exit code will directly be returned by the function.
I personally like to put this function in my .bashrc file.
Below I demonstrate a few ways in which you can use this:
# In this example, the output for the command will be
# normally displayed, and the exit code will be stored
# in the variable $ec.
$ ec echo test
test
$ echo $ec
0
# In this example, the exit code is output
# and the output of the command passed
# to the `ec` function is suppressed.
$ echo "Exit Code: $(ec -h echo test)"
Exit Code: 0
# In this example, the output of the command
# passed to the `ec` function is suppressed
# and the exit code is stored in `$ec`
$ ec -h echo test
$ echo $ec
0
Solution to your code using this function:
#!/bin/bash
rc="$(ec -h 'ls -l | grep p')"
if [[ "$rc" != "0" ]]; then
    echo "Error when executing command: 'grep p' [$rc]"
    exit "$rc"
fi
Note that $(...) runs ec in a subshell, so the $ec variable it sets is not visible afterwards; that is why the echoed code is captured into rc here.
You should also note that the exit code you see will be that of the grep command, since it is the last command executed in the pipeline, not that of ls.
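Also note: if you need the status of an earlier stage of a pipeline rather than the last one, bash keeps every stage's exit code in the PIPESTATUS array (a sketch; /nonexistent is a placeholder path):
ls -l /nonexistent | grep p
# PIPESTATUS must be read immediately after the pipeline finishes
echo "ls exited with ${PIPESTATUS[0]}, grep with ${PIPESTATUS[1]}"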

Get cURL response in bash

I have a simple bash script that uploads files to an FTP. I was wondering how to get a response from curl that I can record (error or success)?
eval curl -T "${xmlFolder}"/"${xmlFile}" "${mediaFTP}"
Thanks in advance
Given the command provided, this should suffice:
curl -T "$xmlFolder/$xmlFile" "$mediaFTP" ||
    printf '%s\n' "$?"
Or, if you want to discard the error message:
curl -T "$xmlFolder/$xmlFile" "$mediaFTP" >/dev/null ||
    printf '%s\n' "$?"
The $? bash variable holds the exit status of the previous command: success (value 0) or failure (non-zero value). So you could do:
eval curl -T "${xmlFolder}"/"${xmlFile}" "${mediaFTP}"
err=$?
if [ $err -ne 0 ]
then
echo "Failed with error code $err"
exit
fi
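If you also want the server's response code in addition to curl's own exit status, curl's -w/--write-out option can print it (a sketch reusing the question's variables; the %{http_code} variable holds the numeric response code of the last HTTP(S) or FTP(S) transfer):
# -o /dev/null discards the response body so only the code is captured
code=$(curl -sS -T "${xmlFolder}/${xmlFile}" -w '%{http_code}' -o /dev/null "${mediaFTP}")
echo "Server response code: $code"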

Check wget's return value

I'm writing a script to download a bunch of files, and I want it to inform me when a particular file doesn't exist.
r=`wget -q www.someurl.com`
if [ $r -ne 0 ]
then echo "Not there"
else echo "OK"
fi
But it gives the following error on execution:
./file: line 2: [: -ne: unary operator expected
What's wrong?
Others have correctly posted that you can use $? to get the most recent exit code:
wget_output=$(wget -q "$URL")
if [ $? -ne 0 ]; then
...
This lets you capture both the stdout and the exit code. If you don't actually care what it prints, you can just test it directly:
if wget -q "$URL"; then
...
And if you want to suppress the output:
if wget -q "$URL" > /dev/null; then
...
$r is the text output of wget (which you've captured with backticks). To access the return code, use the $? variable.
$r is empty, and therefore your condition becomes if [ -ne 0 ] and it seems as if -ne is used as a unary operator. Try this instead:
wget -q www.someurl.com
if [ $? -ne 0 ]
...
EDIT As Andrew explained before me, backticks return standard output, while $? returns the exit code of the last operation.
You could just do:
wget ruffingthewitness.com && echo "WE GOT IT" || echo "Failure"
-(~)----------------------------------------------------------(07:30 Tue Apr 27)
risk#DockMaster [2024] --> wget ruffingthewitness.com && echo "WE GOT IT" || echo "Failure"
--2010-04-27 07:30:56-- http://ruffingthewitness.com/
Resolving ruffingthewitness.com... 69.56.251.239
Connecting to ruffingthewitness.com|69.56.251.239|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: `index.html.1'
[ <=> ] 14,252 72.7K/s in 0.2s
2010-04-27 07:30:58 (72.7 KB/s) - `index.html.1' saved [14252]
WE GOT IT
-(~)-----------------------------------------------------------------------------------------------------------(07:30 Tue Apr 27)
risk#DockMaster [2025] --> wget ruffingthewitness.biz && echo "WE GOT IT" || echo "Failure"
--2010-04-27 07:31:05-- http://ruffingthewitness.biz/
Resolving ruffingthewitness.biz... failed: Name or service not known.
wget: unable to resolve host address `ruffingthewitness.biz'
zsh: exit 1 wget ruffingthewitness.biz
Failure
-(~)-----------------------------------------------------------------------------------------------------------(07:31 Tue Apr 27)
risk#DockMaster [2026] -->
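One caveat with the a && b || c form: it is not a true if/else. If b itself fails, c runs as well. With a plain echo that is harmless, but when in doubt an explicit if is safer:
if wget ruffingthewitness.com; then
    echo "WE GOT IT"
else
    echo "Failure"
fi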
Best way to capture the result from wget and also check the call status:
wget -O filename URL
if [[ $? -ne 0 ]]; then
    echo "wget failed"
    exit 1
fi
This way you can check the status of wget and also store the output data in a file.
If the call is successful, use the stored output; otherwise the script exits with the error "wget failed".
I've been trying all the solutions without luck.
wget executes in a non-interactive way. This means that wget works in the background and you can't catch the return code with $?.
One solution is to handle the "--server-response" option, searching for the HTTP 200 status code.
Example:
wget --server-response -q -o wgetOut http://www.someurl.com
sleep 5
_wgetHttpCode=`cat wgetOut | gawk '/HTTP/{ print $2 }'`
if [ "$_wgetHttpCode" != "200" ]; then
echo "[Error] `cat wgetOut`"
fi
Note: wget needs some time to finish its work; for that reason I put the "sleep 5". This is not the best way to do it, but it worked well enough to test the solution.
