A function that can make caller return - bash

I am writing a Bash script and I want it to have some error handling mechanism:
function f()
{
command1 || { echo "command1 failed"; return 1; }
command2 || { echo "command2 failed"; return 1; }
command3 || { echo "command3 failed"; return 1; }
command4 || { echo "command4 failed"; return 1; }
}
I want to make this repetitive structure more readable by defining some function:
function print_and_return()
{
echo "$#"
# some way to exit the caller function
}
so that I can write function f as
function f()
{
command1 || print_and_return "command1 failed"
command2 || print_and_return "command2 failed"
command3 || print_and_return "command3 failed"
command4 || print_and_return "command4 failed"
}
What's the best way to achieve this? Thanks.

The Problem
You want to:
Pretty up your code a bit.
DRY up the code a little.
Stop execution the first time a command fails.
One Possible Solution
You can refactor your code. For example:
print_and_return() {
echo "command failed: $#" >&2
return 1
}
commands=("cmd1" "cmd2" "cmd3" "cmd4")
for cmd in "${commands[#]}"; do
{ $cmd || print_and_return "$cmd"; } || break
done
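To keep the structure inside f, as in the question, a minimal sketch (reusing the question's placeholder command names) could wrap the same loop in a function:
function f()
{
    local cmd
    for cmd in command1 command2 command3 command4; do
        # print_and_return prints the diagnostic and returns 1; the bare
        # "return" then propagates that non-zero status out of f
        "$cmd" || { print_and_return "$cmd"; return; }
    done
}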

Maybe you can use set -e, though you have to be careful since it exits the shell:
function f()
{
(
set -e
command1
command2
command3
command4
)
}
As long as the commands diagnose any problems, this function will stop when the first command fails.
Test version:
function f()
{
(
set -e
true
echo true
true
echo true
false
echo false
true
echo true
)
}
echo calling function
f
echo still here
Output:
calling function
true
true
still here
NB: When I used the sequence:
echo calling function
if f
then echo still here - f passed
else echo still here - f failed
fi
Then the function behaved differently under bash 3.2.48 on Mac OS X 10.7.4:
calling function
true
true
false
true
still here - f passed
So, inventive, but not wholly reliable.

Similar to CodeGnome's solution, but without using arrays (which might not be available in every shell):
function callthem()
{
for cmd; do # short form of "for cmd in "$@"; do"
"$cmd" || { echo "$cmd failed." >&2; return 1; }
done
}
function f()
{
callthem command1 command2 command3 command4
# or maybe even
callthem command{1,2,3,4}
}

Related

Is there some command that would guarantee stop of further processing but not exit terminal?

Is there something between exit and return 1 in bash? Some command that would guarantee stop of further processing but not exit terminal?
Meaning, if I use exit in a sourced function, then any time exit is invoked it will actually quit Bash (or log me out of my SSH connection if I am on a remote host). If I use return 1, then I have to check the value in the calling function.
With return I have to write code like the following:
foo(){
if [[ "$#" -ne 1 ]]; then
echo "Unexpected number of arguments [actual=$#][expected=1]"
return 1
fi
# ... do stuff.
}
bar(){
foo
if [[ "$?" -ne 0 ]]; then
echo "Line:$LINENO failure"
return 1;
fi
# do stuff only when foo() is successful
}
I could use exit, but as described, that will quit my current Bash session if the operation is not successful:
foo(){
if [[ "$#" -ne 1 ]]; then
echo "Unexpected number of arguments [actual=$#][expected=1]"
exit
fi
# ... do stuff.
}
bar(){
foo
# do stuff only when foo() is successful
}
What I would like is something like:
foo(){
if [[ "$#" -ne 1 ]]; then
echo "Unexpected number of arguments [actual=$#][expected=1]"
# Simulate CTRL+C press (to cause everything to halt but not exit terminal)
# Like an exception throw or something?
fi
# ... do stuff.
}
bar(){
foo
# do stuff only when foo() is successful
}
With return I have to write code like the following:
bar() {
foo
if [[ "$?" -ne 0 ]]; then
echo "Line:$LINENO failure"
return 1;
fi
# do stuff only when foo() is successful
}
Explicitly checking $? can usually be avoided. You can shorten the above to:
bar() {
foo || return
# do stuff only when foo() is successful
}
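A bare return propagates the status of the last command, so bar itself fails with whatever foo returned. A minimal illustration (hypothetical foo):
foo() { return 1; }
bar() {
    foo || return                # passes foo's status (1) straight through
    echo "only reached when foo succeeds"
}
bar
echo "bar returned $?"           # prints: bar returned 1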
You can just run a command as the if expression. if checks the return code: if it is 0, the condition evaluates as true; if it is non-zero, it evaluates as false. Here is some sample code.
$ foo() { return 0; }
$ if foo
> then
> echo hello
> else
> echo good bye
> fi
hello
$ foo() { return 1; }
$ if foo
> then
> echo hello
> else
> echo good bye
> fi
good bye
You really should just check the exit value. But, if you want to return when the shell is interactive and exit otherwise, in bash you could do:
foo() {
case $- in
*i*) return 1;;
*) exit 1;;
esac
}
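For illustration (not from the original post), here is how the two branches play out:
# In a non-interactive script, $- contains no "i", so foo exits the script:
foo
echo "never reached"

# At an interactive prompt, $- contains "i", so foo merely returns:
$ foo; echo "still here, status $?"
still here, status 1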
What I was looking for is kill -INT -$$, which interrupts the current process but does not exit the current shell (unlike exit 1). This allows kill -INT -$$ to be used from an interactive command-line shell.
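As a sketch of how that might slot into the earlier example (based on the asker's finding; the effect depends on running in an interactive shell):
foo(){
    if [[ "$#" -ne 1 ]]; then
        echo "Unexpected number of arguments [actual=$#][expected=1]"
        # Interrupt the whole process group; an interactive shell aborts the
        # current command chain but stays open, unlike "exit".
        kill -INT -$$
    fi
    # ... do stuff.
}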

Persist variable in success or fail shell command

I have the following program.sh:
#!/bin/bash
(true && { echo true1; echo true2; TEST=1; } || { echo false1; echo false2; TEST=0; }) >> program.log
echo test: $TEST
Why is the output of program.sh:
test:
What is a workaround to persist value in TEST?
Using parentheses creates a subshell. Variable assignments in a subshell don't propagate back to the parent shell. Try replacing () with {}.
{ true && { echo true1; echo true2; TEST=1; } || { echo false1; echo false2; TEST=0; }; } >> program.log
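Putting that back into the script, a corrected program.sh (same commands as in the question, only the parentheses replaced) would be:
#!/bin/bash
{ true && { echo true1; echo true2; TEST=1; } || { echo false1; echo false2; TEST=0; }; } >> program.log
echo test: $TEST
Since true succeeds, TEST is set to 1 in the current shell, so the script now prints test: 1 while the echo lines still go to program.log.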

Pass bash syntax (pipe operator) correctly to function

How can the >> append operator and the stream redirection operators be passed correctly to the function try(), which catches errors and exits?
When I do this :
exitFunc() { echo "EXIIIIIIIIIIIIIIIIT" }
yell() { echo "$0: $*" >&2; }
die() { yell "$*"; exitFunc 111; }
try() { "$#" || die "cannot $*"; }
try commandWhichFails >> "logFile.log" 2>&1
When I run the above, the echo from exitFunc also ends up in the log file...
How do I need to change the above so that the try command basically does this:
try ( what ever comes here >> "logFile.log" 2>&1 )
Can this be achieved with subshells?
If you want to use stderr in yell and not have it lost by your redirection in the body of the script, then you need to preserve it at the start of the script. For example in file descriptor 5:
#!/bin/bash
exec 5>&2
yell() { echo "$0: $*" >&5; }
...
If your bash supports it you can ask it to allocate the new file descriptor for you using a new syntax:
#!/bin/bash
exec {newfd}>&2
yell() { echo "$0: $*" >&$newfd; }
...
If you need to you can close the new fd with exec {newfd}>&-.
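As a quick illustration of why this helps (false stands in for the failing command, and exitFunc is replaced with a plain exit for brevity):
#!/bin/bash
exec 5>&2
yell() { echo "$0: $*" >&5; }
die() { yell "$*"; exit 111; }
try() { "$@" || die "cannot $*"; }

# The failing command's own output goes to the log, but yell's message
# still reaches the original stderr, because fd 5 was duplicated before
# this redirection took effect.
try false >> "logFile.log" 2>&1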
If I understand you correctly, you can't achieve it with subshells.
If you want the output of commandWhichFails to be sent to logFile.log, but not the errors from try() etc., the problem with your code is that redirections are resolved before command execution, in order of appearance.
Where you've put
try false >> "logFile.log" 2>&1
(using false as a command which fails), the redirections apply to the output of try, not to its arguments (at this point, there is no way to know that try executes its arguments as a command).
There may be a better way to do this, but my instinct is to add a catch function, thus:
last_command=
exitFunc() { echo "EXIIIIIIIIIIIIIIIIT"; } #added ; here
yell() { echo "$0: $*" >&2; }
die() { yell "$*"; exitFunc 111; }
try() { last_command="$@"; "$@"; }
catch() { [ $? -eq 0 ] || die "cannot $last_command"; }
try false >> "logFile.log" 2>&1
catch
Depending on portability requirements, you can always replace last_command with a function like last_command() { history | tail -2 | sed -n '1s/^ *[0-9] *//p' ;} (bash), which requires set -o history and removes the need for the try() function. You can replace the -2 with -"$1" to get the Nth previous command.
For a more complete discussion, see BASH: echoing the last command run . I'd also recommend looking at trap for general error handling.
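Since trap was mentioned, a minimal sketch of an ERR trap for general error handling (not part of the original answer) might look like:
#!/bin/bash
# Run the handler whenever a simple command exits non-zero
# (the usual set -e exceptions apply: conditions, && / || chains, ...).
trap 'echo "a command failed with status $?" >&2' ERR

false                       # triggers the trap
echo "execution continues"  # the ERR trap alone does not stop the script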

Behavior of the 'return' statement in Bash functions

I'm having trouble understanding the behavior of the return built-in in Bash. Here is a sample script.
#!/bin/bash
dostuff() {
date | while true; do
echo returning 0
return 0
echo really-notreached
done
echo notreached
return 3
}
dostuff
echo returncode: $?
The output of this script is:
returning 0
notreached
returncode: 3
If, however, the date | is removed from line 4, the output is as I expected:
returning 0
returncode: 0
It seems like the return statement as used above is acting the way I thought the break statement ought to behave, but only when the loop is on the right hand side of a pipe. Why is this the case? I couldn't find anything to explain this behavior in the Bash man page or online. The script acts the same way in Bash 4.1.5 and Dash 0.5.5.
In the date | while ... scenario, that while loop is executed in a subshell due to the presence of the pipe. Thus, the return statement breaks the loop and the subshell ends, leaving your function to carry on.
You'll have to reconstruct the code to remove the pipeline so that no subshells are created:
dostuff() {
# redirect from a process substitution instead of a pipeline
while true; do
echo returning 0
return 0
echo really-notreached
done < <(date)
echo notreached
return 3
}
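With that change, the test harness from the question behaves as originally expected:
dostuff
echo returncode: $?
Output:
returning 0
returncode: 0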
If you return inside a function, that function will stop executing but the overall program will not exit.
If you exit inside a function, the overall program will exit.
You cannot return in the main body of a Bash script. You can only return inside a function or sourced script.
For example:
#!/usr/bin/env bash
function doSomething {
echo "a"
return
echo "b" # this will not execute because it is after 'return'
}
function doSomethingElse {
echo "d"
exit 0
echo "e" # this will not execute because the program has exited
}
doSomething
echo "c"
doSomethingElse
echo "f" # this will not execute because the program exited in 'doSomethingElse'
Running the above code will output:
a
c
d
But return should terminate a function call, not a subshell; exit is intended to terminate a (sub)shell. I think it's some undocumented bug/feature.
echo | return typed at the command line gives an error. That's correct: return should be inside a function.
f(){ echo|return; } is accepted by both Bash and Dash, but return doesn't terminate the function call.
If return terminated a subshell, it would also work outside a function. So the conclusion is: return terminates a subshell in a function, which is strange.
The thing is: the subshell is a separate process.
It doesn't really have a way to say to the parent shell: "I'm exiting because of a return"
There is no such thing in the exit status, which is the only thing the parent shell gets.
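A small illustration (hypothetical function) of what the parent shell actually sees:
f() {
    ( return 5 )                        # return only ends the subshell
    echo "subshell reported status $?"  # the parent just sees an exit status
    echo "still inside f"
}
f
Output:
subshell reported status 5
still inside f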
To cover this interesting feature of Bash:
return inside if (or any control command with an expression, like if/while/...)
return inside if with simple and less simple expressions
The subshell explanation is good. Control returns out of the current subshell, which might be the Bash function, or any of the nested control commands whose expression caused a subshell to be invoked.
For very simple expressions, e.g. true or 1 == 1, no subshell is invoked, so return behaves as normally expected.
For less simple expressions, e.g. a variable expanded and compared with something, return behaves like break.
Simple (no subshell) examples:
$ rtest () { if true; then echo one; return 2; echo two; fi; echo not simple; return 7; }
$ rtest
one
$ echo $?
2
$ rtest () { if [[ 1 == 1 ]] ; then echo one; return 2; echo two; fi; echo not simple; return 7; }
$ rtest
one
$ echo $?
2
$ rtest () { if [[ 1 =~ 1 ]] ; then echo one; return 2; echo two; fi; echo not simple; return 7; }
$ rtest
one
$ echo $?
2
$ rtest () { if $DO ; then echo one; return 2; echo two; else echo three; return 3; fi; echo not simple; return 7; }
$ rtest
one
$ echo $?
2
$ rtest () { if [[ $DO ]]; then echo one; return 2; echo two; else echo three; return 3; fi; echo not simple; return 7; }
$ rtest
three
$ echo $?
3
$ rtest () { if [[ $DO == 1 ]] ; then echo one; return 2; echo two; else echo three; return 3; echo four; fi; echo not simple; return 7; }
$ rtest; echo $?
one
2
$ DO=1; rtest; echo $?
one
2
$ DO=0; rtest; echo $?
three
3
When the expression is not simple and a subshell is presumably invoked, return behaves like break.
Not simple (subshell) example ... =~ inside [[ ]]:
$ rtest () { if [[ $DO =~ 1 ]] ; then echo one; return 2; echo two; fi; echo not simple; return 7; }
$ rtest
not simple
$ echo $?
7

Bash Script function verification

I am trying to do validation for every function I call in a script and erroring out the whole script if one function fails.
Looking for the best way to do this. I feel I knew a good way to do it at one time, but can't figure it out again.
I can brute force it, but it's nasty. This is how I can get it to work correctly:
copy_files $1
if [ "$?" -ne "0" ]; then
error "Copying files"
return -1
fi
This gets pretty ugly since I have a script that goes:
command1
command2
command3
Having to wrap all these commands with returns is not very ideal. :( Please help!
command1 || (error "Cmd1 fail"; return -1); command2 || (error "Cmd2 fail"; return -1);
etc. etc. The || operator means the next command will only execute if the previous command failed; && is the opposite. The parentheses group the commands to run.
Since you said you want to do this for every command in a script, you could insert
set -e
at the beginning. That call makes the shell exit immediately if any command returns a status code other than 0. (Exceptions: commands that are part of the test in a conditional statement, and those that are followed by && or || and further commands)
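As a quick illustration of those exceptions (hypothetical snippet):
#!/bin/bash
set -e

if false; then            # a failing test in a conditional does not abort
    echo "not reached"
fi

false || echo "handled"   # a failure handled with || does not abort either

false                     # a bare failing command aborts the script here
echo "never reached"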
You could do something like this:
good() {
return 0;
}
bad() {
return 5;
}
wrapper() {
$1
rc=$?
if [ "$rc" -ne "0" ]; then
echo "error: $1 returned $rc"
exit -1
fi
}
wrapper bad
wrapper good
or, you could pass a list of functions, like so:
wrapper() {
for f in $*; do
$f
rc=$?
if [ "$rc" -ne "0" ]; then
echo "error: $f returned ${rc}"
return -1
fi
done
return 0;
}
wrapper good bad
if [ "$?" -ne "0" ]; then
#error handling here
exit -1;
fi
