I'd like to return a string from a Bash function.
I'll write the example in Java to show what I'd like to do:
public String getSomeString() {
return "tadaa";
}
String variable = getSomeString();
The example below works in bash, but is there a better way to do this?
function getSomeString {
echo "tadaa"
}
VARIABLE=$(getSomeString)
There is no better way I know of. Bash knows only status codes (integers) and strings written to stdout.
You could have the function take a variable as the first arg and modify the variable with the string you want to return.
#!/bin/bash
set -x
function pass_back_a_string() {
eval "$1='foo bar rab oof'"
}
return_var=''
pass_back_a_string return_var
echo $return_var
Prints "foo bar rab oof".
Edit: added quoting in the appropriate place to allow whitespace in the string, to address @Luca Borrione's comment.
Edit: As a demonstration, see the following program. This is a general-purpose solution: it even allows you to receive a string into a local variable.
#!/bin/bash
set -x
function pass_back_a_string() {
eval "$1='foo bar rab oof'"
}
return_var=''
pass_back_a_string return_var
echo $return_var
function call_a_string_func() {
local lvar=''
pass_back_a_string lvar
echo "lvar='$lvar' locally"
}
call_a_string_func
echo "lvar='$lvar' globally"
This prints:
+ return_var=
+ pass_back_a_string return_var
+ eval 'return_var='\''foo bar rab oof'\'''
++ return_var='foo bar rab oof'
+ echo foo bar rab oof
foo bar rab oof
+ call_a_string_func
+ local lvar=
+ pass_back_a_string lvar
+ eval 'lvar='\''foo bar rab oof'\'''
++ lvar='foo bar rab oof'
+ echo 'lvar='\''foo bar rab oof'\'' locally'
lvar='foo bar rab oof' locally
+ echo 'lvar='\'''\'' globally'
lvar='' globally
Edit: demonstrating that the original variable's value is available in the function, contrary to what @Xichen Li incorrectly claimed in a comment.
#!/bin/bash
set -x
function pass_back_a_string() {
eval "echo in pass_back_a_string, original $1 is \$$1"
eval "$1='foo bar rab oof'"
}
return_var='original return_var'
pass_back_a_string return_var
echo $return_var
function call_a_string_func() {
local lvar='original lvar'
pass_back_a_string lvar
echo "lvar='$lvar' locally"
}
call_a_string_func
echo "lvar='$lvar' globally"
This gives output:
+ return_var='original return_var'
+ pass_back_a_string return_var
+ eval 'echo in pass_back_a_string, original return_var is $return_var'
++ echo in pass_back_a_string, original return_var is original return_var
in pass_back_a_string, original return_var is original return_var
+ eval 'return_var='\''foo bar rab oof'\'''
++ return_var='foo bar rab oof'
+ echo foo bar rab oof
foo bar rab oof
+ call_a_string_func
+ local 'lvar=original lvar'
+ pass_back_a_string lvar
+ eval 'echo in pass_back_a_string, original lvar is $lvar'
++ echo in pass_back_a_string, original lvar is original lvar
in pass_back_a_string, original lvar is original lvar
+ eval 'lvar='\''foo bar rab oof'\'''
++ lvar='foo bar rab oof'
+ echo 'lvar='\''foo bar rab oof'\'' locally'
lvar='foo bar rab oof' locally
+ echo 'lvar='\'''\'' globally'
lvar='' globally
All the answers above ignore what is stated in the bash man page.
All variables declared inside a function will be shared with the calling environment.
All variables declared local will not be shared.
Example code
#!/bin/bash
f()
{
echo function starts
local WillNotExists="It still does!"
DoesNotExists="It still does!"
echo function ends
}
echo $DoesNotExists #Should print empty line
echo $WillNotExists #Should print empty line
f #Call the function
echo $DoesNotExists #Should print It still does!
echo $WillNotExists #Should print empty line
And output
$ sh -x ./x.sh
+ echo
+ echo
+ f
+ echo function starts
function starts
+ local 'WillNotExists=It still does!'
+ DoesNotExists='It still does!'
+ echo function ends
function ends
+ echo It still 'does!'
It still does!
+ echo
Also under pdksh and ksh this script does the same!
Bash, since version 4.3 (February 2014), has explicit support for reference variables or name references (namerefs), beyond eval, with the same beneficial performance and indirection effect, which may be clearer in your scripts and also makes it harder to "forget to eval" and have to fix that error:
declare [-aAfFgilnrtux] [-p] [name[=value] ...]
typeset [-aAfFgilnrtux] [-p] [name[=value] ...]
Declare variables and/or give them attributes
...
-n Give each name the nameref attribute, making it a name reference
to another variable. That other variable is defined by the value
of name. All references and assignments to name, except for
changing the -n attribute itself, are performed on the variable
referenced by name's value. The -n attribute cannot be applied to
array variables.
...
When used in a function, declare and typeset make each name local,
as with the local command, unless the -g option is supplied...
and also:
PARAMETERS
A variable can be assigned the nameref attribute using the -n option to the
declare or local builtin commands (see the descriptions of declare and local
below) to create a nameref, or a reference to another variable. This allows
variables to be manipulated indirectly. Whenever the nameref variable is
referenced or assigned to, the operation is actually performed on the variable
specified by the nameref variable's value. A nameref is commonly used within
shell functions to refer to a variable whose name is passed as an argument to
the function. For instance, if a variable name is passed to a shell function
as its first argument, running
declare -n ref=$1
inside the function creates a nameref variable ref whose value is the variable
name passed as the first argument. References and assignments to ref are
treated as references and assignments to the variable whose name was passed as
$1. If the control variable in a for loop has the nameref attribute, the list
of words can be a list of shell variables, and a name reference will be
established for each word in the list, in turn, when the loop is executed.
Array variables cannot be given the -n attribute. However, nameref variables
can reference array variables and subscripted array variables. Namerefs can be
unset using the -n option to the unset builtin. Otherwise, if unset is executed
with the name of a nameref variable as an argument, the variable referenced by
the nameref variable will be unset.
For example (EDIT 2: thank you Ron, I have namespaced (prefixed) the function-internal variable name to minimize clashes with external variables, which should finally properly address the issue raised in the comments by Karsten):
# $1 : string; your variable to contain the return value
function return_a_string () {
declare -n ret=$1
local MYLIB_return_a_string_message="The date is "
MYLIB_return_a_string_message+=$(date)
ret=$MYLIB_return_a_string_message
}
and testing this example:
$ return_a_string result; echo $result
The date is 20160817
Note that the bash "declare" builtin, when used in a function, makes the declared variable "local" by default, and "-n" can also be used with "local".
I prefer to distinguish "important declare" variables from "boring local" variables, so using "declare" and "local" in this way acts as documentation.
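For comparison, here is a minimal sketch of the same pattern using "local -n" instead of "declare -n" (the function and variable names are only illustrative):
function return_a_string_local () {
    local -n ret=$1                          # nameref declared with "local" instead of "declare"
    local MYLIB_return_a_string_local_msg="The date is "
    MYLIB_return_a_string_local_msg+=$(date)
    ret=$MYLIB_return_a_string_local_msg
}
return_a_string_local result2; echo "$result2"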
EDIT 1 (response to the comment below by Karsten): I cannot add comments below any more, but Karsten's comment got me thinking, so I did the following test, which works fine, AFAICT. Karsten, if you read this, please provide an exact set of test steps from the command line showing the problem you assume exists, because these following steps work just fine:
$ return_a_string ret; echo $ret
The date is 20170104
(I ran this just now, after pasting the above function into a bash term - as you can see, the result works just fine.)
Like bstpierre above, I use and recommend the use of explicitly naming output variables:
function some_func() # OUTVAR ARG1
{
local _outvar=$1
local _result # Use some naming convention to avoid OUTVARs to clash
... some processing ....
eval $_outvar=\$_result # Instead of just =$_result
}
Note the use of the escaped $ in \$_result. This avoids interpreting the content of $_result as shell special characters. I have found that this is an order of magnitude faster than the result=$(some_func "arg1") idiom of capturing an echo. The speed difference seems even more notable using bash on MSYS, where stdout capturing from function calls is almost catastrophic.
It's OK to pass in a local variable, since locals are dynamically scoped in bash:
function another_func() # ARG
{
local result
some_func result "$1"
echo result is $result
}
You could also capture the function output:
#!/bin/bash
function getSomeString() {
echo "tadaa!"
}
return_var=$(getSomeString)
echo $return_var
# Alternative syntax:
return_var=`getSomeString`
echo $return_var
Looks weird, but it is better than using global variables IMHO. Passing parameters works as usual: just put them inside the parentheses or backticks.
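For instance, a small sketch combining a parameter with command substitution (the greeting function is just an illustration):
function getGreeting() {
    echo "Hello, $1!"               # $1 is the first parameter passed to the function
}
return_var=$(getGreeting "World")   # arguments go inside the parentheses as usual
echo "$return_var"                  # prints: Hello, World!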
The most straightforward and robust solution is to use command substitution, as other people wrote:
assign()
{
local x
x="Test"
echo "$x"
}
x=$(assign) # This assigns string "Test" to x
The downside is performance as this requires a separate process.
The other technique suggested in this topic, namely passing the name of a variable to assign to as an argument, has side effects, and I wouldn't recommend it in its basic form. The problem is that you will probably need some variables in the function to calculate the return value, and it may happen that the name of the variable intended to store the return value will interfere with one of them:
assign()
{
local x
x="Test"
eval "$1=\$x"
}
assign y # This assigns string "Test" to y, as expected
assign x # This will NOT assign anything to x in this scope
# because the name "x" is declared as local inside the function
You might, of course, choose not to declare the function's internal variables as local, but you really should always do so, as otherwise you may accidentally overwrite an unrelated variable from the parent scope if one happens to have the same name.
One possible workaround is an explicit declaration of the passed variable as global:
assign()
{
local x
eval declare -g $1
x="Test"
eval "$1=\$x"
}
If name "x" is passed as an argument, the second row of the function body will overwrite the previous local declaration. But the names themselves might still interfere, so if you intend to use the value previously stored in the passed variable prior to write the return value there, be aware that you must copy it into another local variable at the very beginning; otherwise the result will be unpredictable!
Besides, this will only work in Bash 4.2 or newer, where declare -g is available. More portable code might utilize explicit conditional constructs with the same effect:
assign()
{
if [[ $1 != x ]]; then
local x
fi
x="Test"
eval "$1=\$x"
}
Perhaps the most elegant solution is just to reserve one global name for function return values and
use it consistently in every function you write.
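A minimal sketch of that convention (RETVAL is an arbitrary name chosen here for illustration):
RETVAL=                      # reserved: every function stores its result here
to_upper()
{
    RETVAL=${1^^}            # Bash 4+ uppercase parameter expansion
}
to_upper "tadaa"
echo "$RETVAL"               # prints: TADAA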
As previously mentioned, the "correct" way to return a string from a function is with command substitution. In the event that the function also needs to output to console (as @Mani mentions above), create a temporary fd in the beginning of the function and redirect to console. Close the temporary fd before returning your string.
#!/bin/bash
# file: func_return_test.sh
returnString() {
exec 3>&1 >/dev/tty
local s=$1
s=${s:="some default string"}
echo "writing directly to console"
exec >&3-    # move the saved stdout back into place and close the temporary fd
echo "$s"
}
my_string=$(returnString "$*")
echo "my_string: [$my_string]"
executing script with no params produces...
# ./func_return_test.sh
writing directly to console
my_string: [some default string]
hope this helps people
-Andy
You could use a global variable:
declare globalvar='some string'
string ()
{
eval "$1='some other string'"
} # ---------- end of function string ----------
string globalvar
echo "'${globalvar}'"
This gives
'some other string'
To illustrate my comment on Andy's answer, with additional file descriptor manipulation to avoid use of /dev/tty:
#!/bin/bash
exec 3>&1
returnString() {
exec 4>&1 >&3
local s=$1
s=${s:="some default string"}
echo "writing to stdout"
echo "writing to stderr" >&2
exec >&4-
echo "$s"
}
my_string=$(returnString "$*")
echo "my_string: [$my_string]"
Still nasty, though.
The way you have it is the only way to do this without breaking scope. Bash doesn't have a concept of return types, just exit codes and file descriptors (stdin/out/err, etc.).
Addressing Vicky Ronnen's heads up, consider the following code:
function use_global
{
eval "$1='changed using a global var'"
}
function capture_output
{
echo "always changed"
}
function test_inside_a_func
{
local _myvar='local starting value'
echo "3. $_myvar"
use_global '_myvar'
echo "4. $_myvar"
_myvar=$( capture_output )
echo "5. $_myvar"
}
function only_difference
{
local _myvar='local starting value'
echo "7. $_myvar"
local use_global '_myvar'
echo "8. $_myvar"
local _myvar=$( capture_output )
echo "9. $_myvar"
}
declare myvar='global starting value'
echo "0. $myvar"
use_global 'myvar'
echo "1. $myvar"
myvar=$( capture_output )
echo "2. $myvar"
test_inside_a_func
echo "6. $_myvar" # this was local inside the above function
only_difference
will give
0. global starting value
1. changed using a global var
2. always changed
3. local starting value
4. changed using a global var
5. always changed
6.
7. local starting value
8. local starting value
9. always changed
Maybe the normal scenario is to use the syntax used in the test_inside_a_func function, so you can use both methods in the majority of cases, although capturing the output is the safer method that always works in any situation, mimicking the return value from a function that you can find in other languages, as Vicky Ronnen correctly pointed out.
The options have all been enumerated, I think. Choosing one may come down to a matter of the best style for your particular application, and in that vein, I want to offer one particular style I've found useful. In bash, variables and functions are not in the same namespace. So, treating the variable of the same name as the value of the function is a convention that I find minimizes name clashes and enhances readability, if I apply it rigorously. An example from real life:
UnGetChar=
function GetChar() {
# assume failure
GetChar=
# if someone previously "ungot" a char
if ! [ -z "$UnGetChar" ]; then
GetChar="$UnGetChar"
UnGetChar=
return 0 # success
# else, if not at EOF
elif IFS= read -N1 GetChar ; then
return 0 # success
else
return 1 # EOF
fi
}
function UnGetChar(){
UnGetChar="$1"
}
And, an example of using such functions:
function GetToken() {
# assume failure
GetToken=
# if at end of file
if ! GetChar; then
return 1 # EOF
# if start of comment
elif [[ "$GetChar" == "#" ]]; then
while [[ "$GetChar" != $'\n' ]]; do
GetToken+="$GetChar"
GetChar
done
UnGetChar "$GetChar"
# if start of quoted string
elif [ "$GetChar" == '"' ]; then
# ... et cetera
As you can see, the return status is there for you to use when you need it, or ignore if you don't. The "returned" variable can likewise be used or ignored, but of course only after the function is invoked.
Of course, this is only a convention. You are free to fail to set the associated value before returning (hence my convention of always nulling it at the start of the function) or to trample its value by calling the function again (possibly indirectly). Still, it's a convention I find very useful if I find myself making heavy use of bash functions.
As opposed to the sentiment that this is a sign one should e.g. "move to perl", my philosophy is that conventions are always important for managing the complexity of any language whatsoever.
In my programs, by convention, this is what the pre-existing $REPLY variable is for, which read uses for that exact purpose.
function getSomeString {
REPLY="tadaa"
}
getSomeString
echo $REPLY
This echoes
tadaa
But to avoid conflicts, any other global variable will do.
declare result
function getSomeString {
result="tadaa"
}
getSomeString
echo $result
If that isn’t enough, I recommend Markarian451’s solution.
The key problem of any 'named output variable' scheme where the caller can pass in the variable name (whether using eval or declare -n) is inadvertent aliasing, i.e. name clashes: From an encapsulation point of view, it's awful to not be able to add or rename a local variable in a function without checking ALL the function's callers first to make sure they're not wanting to pass that same name as the output parameter. (Or in the other direction, I don't want to have to read the source of the function I'm calling just to make sure the output parameter I intend to use is not a local in that function.)
The only way around that is to use a single dedicated output variable like REPLY (as suggested by Evi1M4chine) or a convention like the one suggested by Ron Burk.
However, it's possible to have functions use a fixed output variable internally, and then add some sugar over the top to hide this fact from the caller, as I've done with the call function in the following example. Consider this a proof of concept, but the key points are
The function always assigns the return value to REPLY, and can also return an exit code as usual
From the perspective of the caller, the return value can be assigned to any variable (local or global) including REPLY (see the wrapper example). The exit code of the function is passed through, so using them in e.g. an if or while or similar constructs works as expected.
Syntactically the function call is still a single simple statement.
The reason this works is because the call function itself has no locals and uses no variables other than REPLY, avoiding any potential for name clashes. At the point where the caller-defined output variable name is assigned, we're effectively in the caller's scope (technically in the identical scope of the call function), rather than in the scope of the function being called.
#!/bin/bash
function call() { # var=func [args ...]
REPLY=; "${1#*=}" "${#:2}"; eval "${1%%=*}=\$REPLY; return $?"
}
function greet() {
case "$1" in
us) REPLY="hello";;
nz) REPLY="kia ora";;
*) return 123;;
esac
}
function wrapper() {
call REPLY=greet "$@"
}
function main() {
local a b c d
call a=greet us
echo "a='$a' ($?)"
call b=greet nz
echo "b='$b' ($?)"
call c=greet de
echo "c='$c' ($?)"
call d=wrapper us
echo "d='$d' ($?)"
}
main
Output:
a='hello' (0)
b='kia ora' (0)
c='' (123)
d='hello' (0)
You can echo a string, but catch it by piping (|) the function to something else.
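For example (a small sketch), the function's output can be consumed directly by the next command in a pipeline:
function getSomeString {
    echo "tadaa"
}
getSomeString | tr 'a-z' 'A-Z'        # prints: TADAA
getSomeString | while read -r line; do
    echo "got: $line"                 # prints: got: tadaa
done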
You can do it with expr, though ShellCheck reports this usage as deprecated.
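A sketch of what that could look like; expr prints its result to stdout, so it combines with command substitution, although $((...)) and parameter expansion are usually preferred today:
len=$(expr "tadaa" : '.*')      # POSIX expr: number of characters matched, i.e. the string length
sum=$(expr 2 + 3)               # arithmetic with expr (the usage ShellCheck flags)
echo "$len $sum"                # prints: 5 5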
bash pattern to return both scalar and array value objects:
definition
url_parse() { # parse 'url' into: 'url_host', 'url_port', ...
local "$#" # inject caller 'url' argument in local scope
local url_host="..." url_path="..." # calculate 'url_*' components
declare -p ${!url_*} # return only 'url_*' object fields to the caller
}
invocation
main() { # invoke url parser and inject 'url_*' results in local scope
eval "$(url_parse url=http://host/path)" # parse 'url'
echo "host=$url_host path=$url_path" # use 'url_*' components
}
Although there were a lot of good answers, they all did not work the way I wanted them to. So here is my solution with these key points:
Helping the forgetful programmer
At least I would struggle to always remember error checking after something like this: var=$(myFunction)
Allows assigning values with newline chars \n
Some solutions do not allow for that as some forgot about the single quotes around the value to assign. Right way: eval "${returnVariable}='${value}'" or even better: see the next point below.
Using printf instead of eval
Just try passing something like myFunction "date && var2" to some of the supposed solutions here. eval will execute whatever is given to it. I only want to assign values, so I use printf -v "${returnVariable}" "%s" "${value}" instead.
Encapsulation and protection against variable name collision
If a different user, or at least someone with less knowledge about the function (likely me in a few months' time), is using myFunction, I do not want them to have to know that they must use a global return value name or that some variable names are forbidden. That is why I added a name check at the top of myFunction:
if [[ "${1}" = "returnVariable" ]]; then
echo "Cannot give the ouput to \"returnVariable\" as a variable with the same name is used in myFunction()!"
echo "If that is still what you want to do please do that outside of myFunction()!"
return 1
fi
Note this could also be put into a function itself if you have to check a lot of variables.
If I still want to use the same name (here: returnVariable), I just create a buffer variable, give that to myFunction, and then copy the value into returnVariable (see the sketch after the test cases below).
So here it is:
myFunction():
myFunction() {
if [[ "${1}" = "returnVariable" ]]; then
echo "Cannot give the ouput to \"returnVariable\" as a variable with the same name is used in myFunction()!"
echo "If that is still what you want to do please do that outside of myFunction()!"
return 1
fi
if [[ "${1}" = "value" ]]; then
echo "Cannot give the ouput to \"value\" as a variable with the same name is used in myFunction()!"
echo "If that is still what you want to do please do that outside of myFunction()!"
return 1
fi
local returnVariable="${1}"
local value=$'===========\nHello World\n==========='
echo "setting the returnVariable now..."
printf -v "${returnVariable}" "%s" "${value}"
}
Test cases:
var1="I'm not greeting!"
myFunction var1
[[ $? -eq 0 ]] && echo "myFunction(): SUCCESS" || echo "myFunction(): FAILURE"
printf "var1:\n%s\n" "${var1}"
# Output:
# setting the returnVariable now...
# myFunction(): SUCCESS
# var1:
# ===========
# Hello World
# ===========
returnVariable="I'm not greeting!"
myFunction returnVariable
[[ $? -eq 0 ]] && echo "myFunction(): SUCCESS" || echo "myFunction(): FAILURE"
printf "returnVariable:\n%s\n" "${returnVariable}"
# Output
# Cannot give the output to "returnVariable" as a variable with the same name is used in myFunction()!
# If that is still what you want to do please do that outside of myFunction()!
# myFunction(): FAILURE
# returnVariable:
# I'm not greeting!
var2="I'm not greeting!"
myFunction "date && var2"
[[ $? -eq 0 ]] && echo "myFunction(): SUCCESS" || echo "myFunction(): FAILURE"
printf "var2:\n%s\n" "${var2}"
# Output
# setting the returnVariable now...
# ...myFunction: line ..: printf: `date && var2': not a valid identifier
# myFunction(): FAILURE
# var2:
# I'm not greeting!
myFunction var3
[[ $? -eq 0 ]] && echo "myFunction(): SUCCESS" || echo "myFunction(): FAILURE"
printf "var3:\n%s\n" "${var3}"
# Output
# setting the returnVariable now...
# myFunction(): SUCCESS
# var3:
# ===========
# Hello World
# ===========
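The buffer-variable workaround mentioned above could look like this ("buffer" is just an illustrative name):
returnVariable="I'm not greeting!"
buffer=""
myFunction buffer                          # use a differently named buffer variable...
returnVariable="${buffer}"                 # ...and copy its value afterwards
printf "returnVariable:\n%s\n" "${returnVariable}"
# Output:
# setting the returnVariable now...
# returnVariable:
# ===========
# Hello World
# ===========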
#Implement a generic return stack for functions:
STACK=()
push() {
STACK+=( "${1}" )
}
pop() {
export $1="${STACK[${#STACK[#]}-1]}"
unset 'STACK[${#STACK[#]}-1]';
}
#Usage:
my_func() {
push "Hello world!"
push "Hello world2!"
}
my_func ; pop MESSAGE2 ; pop MESSAGE1
echo ${MESSAGE1} ${MESSAGE2}
agt@agtsoft:~/temp$ cat ./fc
#!/bin/sh
fcall='function fcall { local res p=$1; shift; fname $*; eval "$p=$res"; }; fcall'
function f1 {
res=$[($1+$2)*2];
}
function f2 {
local a;
eval ${fcall//fname/f1} a 2 3;
echo f2:$a;
}
a=3;
f2;
echo after:a=$a, res=$res
agt@agtsoft:~/temp$ ./fc
f2:10
after:a=3, res=
Related
I am working with a bash script and I want to execute a function to print a return value:
function fun1(){
return 34
}
function fun2(){
local res=$(fun1)
echo $res
}
When I execute fun2, it does not print "34". Why is this the case?
Although Bash has a return statement, the only thing you can specify with it is the function's own exit status (a value between 0 and 255, 0 meaning "success"). So return is not what you want.
You might want to convert your return statement to an echo statement - that way your function output can be captured using $() command substitution, which seems to be exactly what you want.
Here is an example:
function fun1(){
echo 34
}
function fun2(){
local res=$(fun1)
echo $res
}
Another way to get the return value (if you just want to return an integer 0-255) is $?.
function fun1(){
return 34
}
function fun2(){
fun1
local res=$?
echo $res
}
Also, note that you can use the return value to use Boolean logic - like fun1 || fun2 will only run fun2 if fun1 returns a non-0 value. The default return value is the exit value of the last statement executed within the function.
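A small sketch of using a function's exit status in control flow (the function name is only illustrative):
is_even() {
    (( $1 % 2 == 0 ))           # exit status 0 (success) for even numbers
}
is_even 4 && echo "4 is even"   # prints: 4 is even
is_even 5 || echo "5 is odd"    # prints: 5 is odd
if is_even 10; then
    echo "10 is even"           # prints: 10 is even
fi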
Functions in Bash are not functions like in other languages; they're actually commands. So functions are used as if they were binaries or scripts fetched from your path. From the perspective of your program logic, there shouldn't really be any difference.
Shell commands are connected by pipes (aka streams), and not fundamental or user-defined data types, as in "real" programming languages. There is no such thing as a return value for a command, maybe mostly because there's no real way to declare it. It could be described on the man page, or in the --help output of the command, but both are only human-readable and hence are written to the wind.
When a command wants to get input it reads it from its input stream, or the argument list. In both cases text strings have to be parsed.
When a command wants to return something, it has to echo it to its output stream. Another often practiced way is to store the return value in dedicated, global variables. Writing to the output stream is clearer and more flexible, because it can also take binary data. For example, you can return a BLOB easily:
encrypt() {
gpg -c -o- $1 # Encrypt data in filename to standard output (asks for a passphrase)
}
encrypt public.dat > private.dat # Write the function result to a file
As others have written in this thread, the caller can also use command substitution $() to capture the output.
In parallel, the function would "return" the exit code of gpg (GnuPG). Think of the exit code as a bonus that other languages don't have, or, depending on your temperament, as a "Schmutzeffekt" of shell functions. This status is, by convention, 0 on success or an integer in the range 1-255 for something else. To make this clear: return (like exit) can only take a value from 0-255, and values other than 0 are not necessarily errors, as is often asserted.
When you don't provide an explicit value with return, the status is taken from the last command in a Bash statement/function/command and so forth. So there is always a status, and return is just an easy way to provide it.
$(...) captures the text sent to standard output by the command contained within. return does not output to standard output. $? contains the result code of the last command.
fun1 (){
return 34
}
fun2 (){
fun1
local res=$?
echo $res
}
The problem with the other answers is that they either use a global, which can be overwritten when several functions are in a call chain; or echo, which means your function cannot output diagnostic information (you will forget your function does this and the "result", i.e. the return value, will contain more information than your caller expects, leading to weird bugs); or eval, which is way too heavy and hacky.
The proper way to do this is to put the top level stuff in a function and use a local with Bash's dynamic scoping rule. Example:
func1()
{
ret_val=hi
}
func2()
{
ret_val=bye
}
func3()
{
local ret_val=nothing
echo $ret_val
func1
echo $ret_val
func2
echo $ret_val
}
func3
This outputs
nothing
hi
bye
Dynamic scoping means that ret_val points to a different object, depending on the caller! This is different from lexical scoping, which is what most programming languages use. This is actually a documented feature, just easy to miss, and not very well explained. Here is the documentation for it (emphasis is mine):
Variables local to the function may be declared with the local
builtin. These variables are visible only to the function and the
commands it invokes.
For someone with a C, C++, Python, Java, C#, or JavaScript background, this is probably the biggest hurdle: functions in bash are not functions, they are commands, and behave as such: they can output to stdout/stderr, they can pipe in/out, and they can return an exit code. Basically, there isn't any difference between defining a command in a script and creating an executable that can be called from the command line.
So instead of writing your script like this:
Top-level code
Bunch of functions
More top-level code
write it like this:
# Define your main, containing all top-level code
main()
Bunch of functions
# Call main
main
where main() declares ret_val as local and all other functions return values via ret_val.
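A minimal sketch of that structure (the names here are only illustrative):
#!/bin/bash
get_greeting()
{
    ret_val="hello from get_greeting"   # lands in main's local ret_val via dynamic scoping
}
main()
{
    local ret_val                       # visible to every function that main calls
    get_greeting
    echo "$ret_val"                     # prints: hello from get_greeting
}
main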
See also the Unix & Linux question Scope of Local Variables in Shell Functions.
Another, perhaps even better solution depending on situation, is the one posted by ya.teck which uses local -n.
Another way to achieve this is name references (requires Bash 4.3+).
function example {
local -n VAR=$1
VAR=foo
}
example RESULT
echo $RESULT
The return statement sets the exit code of the function, much the same as exit will do for the entire script.
The exit code for the last command is always available in the $? variable.
function fun1(){
return 34
}
function fun2(){
local res=$(fun1)
echo $? # <-- Always echoes 0, since the 'local' command succeeds.
res=$(fun1)
echo $? #<-- Outputs 34
}
As an add-on to others' excellent posts, here's an article summarizing these techniques:
set a global variable
set a global variable, whose name you passed to the function
set the return code (and pick it up with $?)
'echo' some data (and pick it up with MYVAR=$(myfunction) )
Returning Values from Bash Functions
I like to do the following if running in a script where the function is defined:
POINTER= # Used for function return values
my_function() {
# Do stuff
POINTER="my_function_return"
}
my_other_function() {
# Do stuff
POINTER="my_other_function_return"
}
my_function
RESULT="$POINTER"
my_other_function
RESULT="$POINTER"
I like this, because I can then include echo statements in my functions if I want
my_function() {
echo "-> my_function()"
# Do stuff
POINTER="my_function_return"
echo "<- my_function. $POINTER"
}
The simplest way I can think of is to use echo in the method body like so
get_greeting() {
echo "Hello there, $1!"
}
STRING_VAR=$(get_greeting "General Kenobi")
echo $STRING_VAR
# Outputs: Hello there, General Kenobi!
Instead of calling var=$(func) with the whole function output, you can create a function that modifies the input arguments with eval,
var1="is there"
var2="anybody"
function modify_args() {
echo "Modifying first argument"
eval $1="out"
echo "Modifying second argument"
eval $2="there?"
}
modify_args var1 var2
# Prints "Modifying first argument" and "Modifying second argument"
# Sets var1 = out
# Sets var2 = there?
This might be useful in case you need to:
Print to stdout/stderr within the function scope (without returning it)
Return (set) multiple variables.
Using arrays for multiple return values (tested with Git Bash on Windows).
Bash code:
#!/bin/bash
## A 6-element array used for returning
## values from functions:
declare -a RET_ARR
RET_ARR[0]="A"
RET_ARR[1]="B"
RET_ARR[2]="C"
RET_ARR[3]="D"
RET_ARR[4]="E"
RET_ARR[5]="F"
function FN_MULTIPLE_RETURN_VALUES(){
## Give the positional arguments/inputs
## $1 and $2 some sensible names:
local out_dex_1="$1" ## Output index
local out_dex_2="$2" ## Output index
## Echo for debugging:
echo "Running: FN_MULTIPLE_RETURN_VALUES"
## Here: Calculate output values:
local op_var_1="Hello"
local op_var_2="World"
## Set the return values:
RET_ARR[ $out_dex_1 ]=$op_var_1
RET_ARR[ $out_dex_2 ]=$op_var_2
}
echo "FN_MULTIPLE_RETURN_VALUES EXAMPLES:"
echo "-------------------------------------------"
fn="FN_MULTIPLE_RETURN_VALUES"
out_dex_a=0
out_dex_b=1
eval $fn $out_dex_a $out_dex_b ## <-- Call function
a=${RET_ARR[0]} && echo "RET_ARR[0]: $a "
b=${RET_ARR[1]} && echo "RET_ARR[1]: $b "
echo
## ---------------------------------------------- ##
c="2"
d="3"
FN_MULTIPLE_RETURN_VALUES $c $d ## <--Call function
c_res=${RET_ARR[2]} && echo "RET_ARR[2]: $c_res "
d_res=${RET_ARR[3]} && echo "RET_ARR[3]: $d_res "
echo
## ---------------------------------------------- ##
FN_MULTIPLE_RETURN_VALUES 4 5 ## <--- Call function
e=${RET_ARR[4]} && echo "RET_ARR[4]: $e "
f=${RET_ARR[5]} && echo "RET_ARR[5]: $f "
echo
##----------------------------------------------##
read -p "Press Enter To Exit:"
Expected output:
FN_MULTIPLE_RETURN_VALUES EXAMPLES:
-------------------------------------------
Running: FN_MULTIPLE_RETURN_VALUES
RET_ARR[0]: Hello
RET_ARR[1]: World
Running: FN_MULTIPLE_RETURN_VALUES
RET_ARR[2]: Hello
RET_ARR[3]: World
Running: FN_MULTIPLE_RETURN_VALUES
RET_ARR[4]: Hello
RET_ARR[5]: World
Press Enter To Exit:
I hope that I can do something like this, and the output would be "hello"
#!/bin/bash
foo="hello"
dummy() {
local local_foo=`echo $foo`
echo $local_foo
}
foo=''
dummy
This question means that I would like to capture the value of some global variables at definition time (the definitions are usually loaded via source blablabla.bash), and I would like the function definition to capture the variable's current value.
The Sane Way
Functions are evaluated when they're run, not when they're defined. Since you want to capture a variable as it exists at definition time, you'll need a separate variable assigned at that time.
foo="hello"
# By convention, global variables prefixed by a function name and double underscore are for
# the exclusive use of that function.
readonly dummy__foo="$foo" # capture foo as of dummy definition time, and prevent changes
dummy() {
local local_foo=$dummy__foo # ...and refer to that captured copy
echo "$local_foo"
}
foo=""
dummy
The Insane Way
If you're willing to commit crimes against humanity, however, it is possible to do code generation to capture a value. For instance:
# usage: with_locals functionname k1=v1 [k2=v2 [...]]
with_locals() {
local func_name func_text assignments
func_name=$1; shift || return ## fail if out of arguments
(( $# )) || return ## noop if not given at least one assignment
func_text=$(declare -f "$func_name")
for arg; do
if [[ $arg = *=* ]]; then ## if we already look like an assignment, leave be
printf -v arg_q 'local %q; ' "$arg"
else ## otherwise, assume we're a bare name and run a lookup
printf -v arg_q 'local %q=%q; ' "$arg" "${!arg}"
fi
assignments+="$arg_q"
done
# suffix first instance of { in the function definition with our assignments
eval "${func_text/{/{ $assignments}"
}
...thereafter:
foo=hello
dummy() {
local local_foo="$foo"
echo "$local_foo"
}
with_locals dummy foo ## redefine dummy to always use the current value of "foo"
foo=''
dummy
Well, you can comment out or remove the foo='' line, and that will do it. The function dummy does not execute until you call it, which is after you've blanked out the foo value, so it makes sense that you would get a blank line echoed. Hope this helps.
There is no way to execute the code inside a function unless that function is called by bash. The only alternative is to call some other function that defines the function you want to call afterwards.
That is what a dynamic function definition is.
I don't believe that you want that.
An alternative is to store the value of foo the first time the function is called, and then print it alongside the new value when the function is called again after the value has changed. Something hack-ish like this:
#!/bin/bash
foo="hello"
dummy() {
${global_foo+false} &&
global_foo="$foo" ||
echo "old_foo=$global_foo new_foo=$foo"
}
dummy
foo='new'
dummy
foo="a whole new foo"
dummy
Calling it will print:
$ ./script
old_foo=hello new_foo=new
old_foo=hello new_foo=a whole new foo
As I am not sure this addresses your real problem: hope this helps.
Inspired by @CharlesDuffy, I think using eval might solve some of the problems, and the example can be modified as follows:
#!/bin/bash
foo="hello"
eval "
dummy() {
local local_foo=$foo
echo \$local_foo
}
"
foo=''
dummy
Which will give the result 'hello' instead of nothing.
@CharlesDuffy pointed out that such a solution is quite dangerous:
local local_foo=$foo is dangerously buggy: If your foo value contains
an expansion such as $(rm -rf $HOME), it'll be executed
Using eval is good for performance but bad for security, and therefore I'd suggest @CharlesDuffy's answer.
Are there any idioms for returning multiple values from a bash function within a script?
http://tldp.org/LDP/abs/html/assortedtips.html describes how to echo multiple values and process the results (e.g., example 35-17), but that gets tricky if some of the returned values are strings with spaces in them.
A more structured way to return would be to assign to global variables, like
foo () {
FOO_RV1="bob"
FOO_RV2="bill"
}
foo
echo "foo returned ${FOO_RV1} and ${FOO_RV2}"
I realize that if I need re-entrancy in a shell script I'm probably doing it wrong, but I still feel very uncomfortable throwing global variables around just to hold return values.
Is there a better way? I would prefer portability, but it's probably not a real limitation if I have to specify #!/bin/bash.
In the special case where your values never contain spaces, this read trick can be a simple solution:
get_vars () {
#...
echo "value1" "value2"
}
read var1 var2 < <(get_vars)
echo "var1='$var1', var2='$var2'"
But of course, it breaks as soon as there is a space in one of the values. You could modify IFS and use a special separator in your function's echo, but then the result is not really simpler than the other suggested solutions.
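A sketch of that IFS variant, assuming the separator character (here a pipe) never appears inside the values:
get_vars () {
    echo "first value|second value"
}
IFS='|' read -r var1 var2 < <(get_vars)
echo "var1='$var1', var2='$var2'"    # var1='first value', var2='second value'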
This question was posted 5 years ago, but I have an interesting answer to post. I have just started learning bash, and I encountered the same problem as you did. I think this trick might be helpful:
#!/bin/sh
foo=""
bar=""
my_func(){
echo 'foo="a"; bar="b"'
}
eval $(my_func)
echo $foo $bar
# result: a b
This trick is also useful for the problem that a child process cannot send a value back to its parent process.
Much as I love shell, it's probably the case that as soon as you're throwing arbitrary structured data around, Unix Bourne/POSIX shell is not the right choice.
If there are characters which do not occur inside fields, then separate with one of those. The classic example is /etc/passwd, /etc/group and various other files which use a colon as a field separator.
If using a shell which can handle a NUL character inside strings, then joining on the NUL and separating on it (via $IFS or whatever) can work well. But several common shells, including bash, break on NUL. A test would be an old .sig of mine:
foo=$'a\0b'; [ ${#foo} -eq 3 ] && echo "$0 rocks"
Even if that would work for you, you've just reached one of the warning signs that it's time to switch to a more structured language (Python, Perl, Ruby, Lua, Javascript ... pick your preferred poison). Your code is likely to become hard to maintain; even if you can, there's a smaller pool of people who'll understand it well enough to maintain it.
Yet another way:
function get_tuple()
{
echo -e "Value1\nValue2"
}
IFS=$'\n' read -d '' -ra VALUES < <(get_tuple)
echo "${VALUES[0]}" # Value1
echo "${VALUES[1]}" # Value2
In older versions of Bash which don't support nameref (introduced in Bash 4.3-alpha), I may define a helper function in which the return value is assigned to the given variable. It's sort of like using eval to do the same kind of variable assignment.
Example 1
## Add two complex numbers and returns it.
## re: real part, im: imaginary part.
##
## Helper function named by the 5th positional parameter
## have to have been defined before the function is called.
complexAdd()
{
local re1="$1" im1="$2" re2="$3" im2="$4" fnName="$5" sumRe sumIm
sumRe=$(($re1 + $re2))
sumIm=$(($im1 + $im2))
## Call the function and return 2 values.
"$fnName" "$sumRe" "$sumIm"
}
main()
{
local fooRe='101' fooIm='37' barRe='55' barIm='123' bazRe bazIm quxRe quxIm
## Define the function to receive multiple return values
## before calling complexAdd().
retValAssign() { bazRe="$1"; bazIm="$2"; }
## Call comlexAdd() for the first time.
complexAdd "$fooRe" "$fooIm" "$barRe" "$barIm" 'retValAssign'
## Redefine the function to receive multiple return values.
retValAssign() { quxRe="$1"; quxIm="$2"; }
## Call comlexAdd() for the second time.
complexAdd "$barRe" "$barIm" "$bazRe" "$bazIm" 'retValAssign'
echo "foo = $fooRe + $fooIm i"
echo "bar = $barRe + $barIm i"
echo "baz = foo + bar = $bazRe + $bazIm i"
echo "qux = bar + baz = $quxRe + $quxIm i"
}
main
Example 2
## Add two complex numbers and returns it.
## re: real part, im: imaginary part.
##
## Helper functions
## getRetRe(), getRetIm(), setRetRe() and setRetIm()
## have to have been defined before the function is called.
complexAdd()
{
local re1="$1" im1="$2" re2="$3" im2="$4"
setRetRe "$re1"
setRetRe $(($(getRetRe) + $re2))
setRetIm $(($im1 + $im2))
}
main()
{
local fooRe='101' fooIm='37' barRe='55' barIm='123' bazRe bazIm quxRe quxIm
## Define getter and setter functions before calling complexAdd().
getRetRe() { echo "$bazRe"; }
getRetIm() { echo "$bazIm"; }
setRetRe() { bazRe="$1"; }
setRetIm() { bazIm="$1"; }
## Call comlexAdd() for the first time.
complexAdd "$fooRe" "$fooIm" "$barRe" "$barIm"
## Redefine getter and setter functions.
getRetRe() { echo "$quxRe"; }
getRetIm() { echo "$quxIm"; }
setRetRe() { quxRe="$1"; }
setRetIm() { quxIm="$1"; }
## Call comlexAdd() for the second time.
complexAdd "$barRe" "$barIm" "$bazRe" "$bazIm"
echo "foo = $fooRe + $fooIm i"
echo "bar = $barRe + $barIm i"
echo "baz = foo + bar = $bazRe + $bazIm i"
echo "qux = bar + baz = $quxRe + $quxIm i"
}
main
You can make use of associative arrays if you have bash 4, e.g.
declare -A ARR
function foo(){
...
ARR["foo_return_value_1"]="VAR1"
ARR["foo_return_value_2"]="VAR2"
}
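A short usage sketch for reading those values back after the call (assuming foo has filled them in):
foo
echo "${ARR[foo_return_value_1]}"    # prints whatever foo stored under this key (VAR1 above)
echo "${ARR[foo_return_value_2]}"    # prints VAR2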
You can combine them into a single string with a separator character:
function foo(){
...
echo "$var1|$var2|$var3"
}
then whenever you need to use those return values,
ret="$(foo)"
IFS="|"
set -- $ret
echo "var1 one is: $1"
echo "var2 one is: $2"
echo "var3 one is: $3"
I would go for the solution I suggested here, but using an array variable instead. Older bash versions don't support associative arrays.
E.g.,
function some_func() # ARRVAR args...
{
local _retvar=$1 # I use underscore to avoid clashes with return variable names
local -a _out
# ... some processing ... (_out[2]=xxx etc.)
eval $_retvar='("${_out[@]}")'
}
Calling site:
function caller()
{
local -a tuple_ret # Do not use leading '_' here.
# ...
some_func tuple_ret "arg1"
printf " %s\n" "${tuple_ret[#]}" # Print tuple members on separate lines
}
Later versions of Bash support nameref. Use declare -n var_name to give var_name the nameref attribute. nameref gives your function the ability to "pass by reference", which is commonly used in C++ functions to return multiple values. According to the Bash man page:
A variable can be assigned the nameref attribute using the -n option to the declare or local builtin commands to create a nameref, or a reference to another variable. This allows variables to be manipulated indirectly. Whenever the nameref variable is referenced or assigned to, the operation is actually performed on the variable specified by the nameref variable's value. A nameref is commonly used within shell functions to refer to a variable whose name is passed as an argument to the function.
The following are some interactive command line examples.
Example 1:
$ unset xx yy
$ xx=16
$ yy=xx
$ echo "[$xx] [$yy]"
[16] [xx]
$ declare -n yy
$ echo "[$xx] [$yy]"
[16] [16]
$ xx=80
$ echo "[$xx] [$yy]"
[80] [80]
$ yy=2016
$ echo "[$xx] [$yy]"
[2016] [2016]
$ declare +n yy # Use -n to add and +n to remove nameref attribute.
$ echo "[$xx] [$yy]"
[2016] [xx]
Example 2:
$ func()
> {
> local arg1="$1" arg2="$2"
> local -n arg3ref="$3" arg4ref="$4"
>
> echo ''
> echo 'Local variables:'
> echo " arg1='$arg1'"
> echo " arg2='$arg2'"
> echo " arg3ref='$arg3ref'"
> echo " arg4ref='$arg4ref'"
> echo ''
>
> arg1='1st value of local assignment'
> arg2='2nd value of local assignment'
> arg3ref='1st return value'
> arg4ref='2nd return value'
> }
$
$ unset foo bar baz qux
$
$ foo='value of foo'
$ bar='value of bar'
$ baz='value of baz'
$ qux='value of qux'
$
$ func foo bar baz qux
Local variables:
arg1='foo'
arg2='bar'
arg3ref='value of baz'
arg4ref='value of qux'
$
$ {
> echo ''
> echo '2 values are returned after the function call:'
> echo " foo='$foo'"
> echo " bar='$bar'"
> echo " baz='$baz'"
> echo " qux='$qux'"
> }
2 values are returned after the function call:
foo='value of foo'
bar='value of bar'
baz='1st return value'
qux='2nd return value'
I am new to bash, but I found this code helpful.
function return_multiple_values() {
eval "$1='What is your name'"
eval "$2='my name is: BASH'"
}
return_var=''
res2=''
return_multiple_values return_var res2
echo $return_var
echo $res2
Shell script functions can only return the exit status of the last command executed, or an exit status specified explicitly by a return statement.
To return some string one way may be this:
function fun()
{
echo "a+b"
}
var=`fun` # Invoke the function in a new child shell and capture the results
echo $var # use the stored result
This may reduce your discomfort, although it adds the overhead of creating a new shell and hence would be marginally slower.