What is the meaning of "-" in getopts template? - bash

What is the meaning of the following template in getopts?
while getopts ':x:y-:' val;
I know that it expects two options, -x and -y, but what is the meaning of the "-" symbol at the end of the option string?

Empirical approach
I was unable to find any documentation about this "-" in the option string. So, I tried an empirical approach to see how it influences the behavior of getopts. I found that passing "--something" to the script (without spaces after "--") makes it accept "--" as an option and report "something" in OPTARG:
#!/bin/bash
xopt=
yopt=
mopt=
while getopts ':x:y-:' val
do
    case $val in
        x) xopt=1
           xval="$OPTARG";;
        y) yopt=1;;
        -) mopt=1
           mval="$OPTARG";;
        ?) echo "Usage: $0: [-x value] [-y] [--long_opt_name] args" >&2
           exit 2;;
    esac
done
[ ! -z "$xopt" ] && echo "Option -x specified with parameter '$xval'"
[ ! -z "$yopt" ] && echo "Option -y specified"
[ ! -z "$mopt" ] && echo "Option -- specified with optname '$mval'"
shift $(($OPTIND - 1))
echo "Remaining arguments are: $*"
Examples of executions:
$ t.sh --v
Option -- specified with optname 'v'
Remaining arguments are:
$ t.sh --vv other1 other2
Option -- specified with optname 'vv'
Remaining arguments are: other1 other2
$ t.sh --help -x 123 -y others
Option -x specified with parameter '123'
Option -y specified
Option -- specified with optname 'help'
Remaining arguments are: others
$ t.sh --help -x 123 -- -y others
Option -x specified with parameter '123'
Option -- specified with optname 'help'
Remaining arguments are: -y others
$ t.sh -y -x val --x -- param1 -h -j -x -y
Option -x specified with parameter 'val'
Option -y specified
Option -- specified with optname 'x'
Remaining arguments are: param1 -h -j -x -y
Is this a "hidden" feature to manage GNU-like long options without parameters (i.e. only "--long_opt_name"), or am I exploiting the side effect of a bug? In any case, relying on such undocumented behavior is not advised, as it may change after future fixes or evolutions of the command.
Nevertheless, if a space is put after the double dash, the latter keeps its usual documented role of separating the options from the additional parameters:
$ t.sh --help -y -x val -- param1 -h -j -x -y
Option -x specified with parameter 'val'
Option -y specified
Option -- specified with optname 'help'
Remaining arguments are: param1 -h -j -x -y
$ t.sh -- -v
Remaining arguments are: -v
Verification in the source code
As getopts is a bash builtin, I downloaded the bash source code (version 5.0) from here. The builtins are located in the eponymous sub-directory; the getopts source code is builtins/getopts.def. For each argument on the command line, it calls sh_getopt(argc, argv, optstr). This function is defined in builtins/getopt.c:
[...]
int
sh_getopt (argc, argv, optstring)
     int argc;
     char *const *argv;
     const char *optstring;
{
  [...]
  /* Look at and handle the next option-character. */
  c = *nextchar++; sh_charindex++;
  temp = strchr (optstring, c);

  sh_optopt = c;

  /* Increment `sh_optind' when we start to process its last character. */
  if (nextchar == 0 || *nextchar == '\0')
    {
      sh_optind++;
      nextchar = (char *)NULL;
    }

  if (sh_badopt = (temp == NULL || c == ':'))
    {
      if (sh_opterr)
        BADOPT (c);

      return '?';
    }

  if (temp[1] == ':')
    {
      if (nextchar && *nextchar)
        {
          /* This is an option that requires an argument. */
          sh_optarg = nextchar;
          /* If we end this ARGV-element by taking the rest as an arg,
             we must advance to the next element now. */
          sh_optind++;
        }
      else if (sh_optind == argc)
        {
          if (sh_opterr)
            NEEDARG (c);

          sh_optopt = c;
          sh_optarg = ""; /* Needed by getopts. */
          c = (optstring[0] == ':') ? ':' : '?';
        }
      else
        /* We already incremented `sh_optind' once;
           increment it again when taking next ARGV-elt as argument. */
        sh_optarg = argv[sh_optind++];
      nextchar = (char *)NULL;
    }
  return c;
}
In the source lines above, nextchar points to the option character in argv[] (i.e. the one located right after '-') and temp points to the matching character in the optstring (here '-'). We can see that when temp[1] == ':' (i.e. the optstring specifies "-:"), sh_optarg is set to the incremented value of nextchar, which points to the text right after "--", i.e. the long option name.
In our example, where the optstring is ":x:y-:" and we pass "--name" to the script, the above code does:
optstring = ":x:y-:"
                 ^
                 |
                 temp

argv[x] = "--name"
            ^^
           /  \
          c    nextchar (+ 1)

temp[1] == ':'  ==>  sh_optarg = nextchar = "name"
Hence, with the above algorithm in bash, when "-:" is specified in the option string, any "--name" option on the command line reports "name" in the OPTARG variable.
This is merely the output of the normal code path used when an argument is concatenated to its option character (e.g. -xfoo = option "x" with argument "foo", --foo = option "-" with argument "foo").
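For reference, here is a minimal sketch (not part of the original post; the option names are made up) of how this "-:" trick is commonly used to accept GNU-style long options, including the --name=value form:
#!/bin/bash
# Sketch: parse -x <arg>, -y, and long options of the form --name or --name=value
# by catching the "-" option and splitting OPTARG ourselves.
while getopts ':x:y-:' opt; do
    case $opt in
        x)  echo "short option -x with argument '$OPTARG'" ;;
        y)  echo "short option -y" ;;
        -)  long="${OPTARG%%=*}"       # text before the first '=' (the long option name)
            val="${OPTARG#"$long"}"    # what remains, e.g. "=value" or ""
            val="${val#=}"             # strip the leading '=' if present
            echo "long option --$long with value '$val'" ;;
        ?)  echo "Usage: $0 [-x value] [-y] [--name[=value]]" >&2
            exit 2 ;;
    esac
done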

Related

What does this line mean: $runCmd = "cmexec $node1 echo \"\" > ".$logfile;

Code snippet:
my $node = shift;
my $runCmd = "cmviewcl -v -f line -p ".$package_name." | awk -F \"[:|=]\" \'(\$1 == \"script_log_file\") { print \$2 }\'";
my $logfile = $output[0];
chomp $logfile;
@DC1_list = utils::getDC1Host($hash_ref);
@DC2_list = utils::getDC2Host($hash_ref);
foreach $node1 (@DC1_list) {
$runCmd = "cmexec $node1 echo \"\" > ".$logfile;
Please let me know what this line means:
$runCmd = "cmexec $node1 echo \"\" > ".$logfile;
it was written before as:
$runCmd = "cmexec $node1 rm -rf ".$logfile;
which probably means: forcibly (and recursively) remove the file named by the logfile variable. It was later changed to the line above, so
what is it doing?
Removing a file is different from emptying it.
The first command keeps the file but overwrites its content with "" (two double quotes, i.e. an empty string), while the second one removes the file.
Maybe your application needs the file to exist, which is why you cannot remove it.
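The difference is easy to see with a quick shell demo (a small sketch; the file name is made up):
logfile=/tmp/demo.log
echo "old content" > "$logfile"
echo "" > "$logfile"                       # the file still exists, it now contains only a newline
ls -l "$logfile"
rm -f "$logfile"                           # the file is gone
ls -l "$logfile" 2>/dev/null || echo "removed"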
If you have really copied this line verbatim, it is pretty nonsense.
Let's assume that the variables mentioned here have the following values:
runCmd has value FOO
node1 has value BAR
logfile has value BAZ
After parameter expansion and making the quoting a bit more legible, this leaves you with a line equivalent to
FOO = 'cmexec BAR echo "" >' .BAZ
This means that a command named FOO is invoked. It must either be an executable file in the PATH, or a function. This command gets three parameters:
First parameter: a lonely equals sign
Second parameter: the string cmexec BAR echo "" >
Third parameter: the string .BAZ
I don't believe that anybody would seriously write such a command; my guess is that you made a typo or an error when copying and pasting this command.

How to preserve words containing spaces while expanding variables in a bash script?

My scheme is the following:
I have a shell script that executes a command which calls a C program:
name=$1
StringWithSpaces=$name
command="someprogram.out $StringWithSpaces $otherarguments"
$command
where name is a string with spaces, such as "String With Spaces", passed to the shell from another Python script.
My problem is that when I read that argument in C, it is passed as several arguments instead of just one. I have tried "$@", "$*" and all that stuff. I have also tried to write a function in C that reassembles the several argv[i] pieces of StringWithSpaces into one, but I am a bit stuck. I wish I could read the variable in C as just a single argument, to keep the program as simple as possible.
This is the exact shell code (bash):
#!/bin/bash
#$1 Database name
#$2 $3 gps coordinates
#$4 geoDistance (radius in km)
#$5 imDownload (0: only GPS data is downloaded, 1: images and tags as well)
#$Disabled keywords (optional, comma-separated list of keywords)
#Generate the base path
BASE_DIR=`pwd`
#BASE_DIR=${BASE_DIR%/*}
EXE_DIR=$BASE_DIR/FlickrAPI/bin
DB_PATH=$BASE_DIR/data
#Export the libraries, needed for the code to work
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:${BASE_DIR}/FlickrAPI/lib
cont=1;
#General DB
name=$1
gps="$2 $3"
geoDistance=$4
numImagesDB=3751
ncores=400;
imDownload=$5
dbDir=$DB_PATH/$name
mkdir "$dbDir";
rm -rf "$dbDir"/imagesDB
rm -rf "$dbDir"/tagsDB
rm -rf "$dbDir"/gps.txt
mkdir "$dbDir"/imagesDB
mkdir "$dbDir"/tagsDB
#tidx=`seq 7 $#`;
#keywords="";
#for ((i=7;i<=$#;i++))
#do
# keywords=$keywords" "${!i};
#done
keywords=$6;
export LD_LIBRARY_PATH=/home/user/anaconda2/lib/
command="$EXE_DIR/get_GPS_bimestral $dbDir $gps z $numImagesDB $ncores $geoDistance $imDownload $keywords"
echo $command
$command
Put the command in an array with:
Command=(someprogram.out "$StringWithSpaces" "$otherarguments")
When the array is expanded in quotes using @ to request all its members, as in "${Command[@]}", bash expands each array member to a single word. So the desired strings with spaces are kept as single strings.
Here is a sample script:
#!/bin/bash -e
function ShowArguments()
{
    for argument in "$@"; do
        echo "Argument: $argument"
    done
}
Argument1="abc def"
Argument2="pdq xyz"
Command=(ShowArguments "$Argument1" "$Argument2")
echo "${Command[#]}"
"${Command[#]}"
The output from the above is:
ShowArguments abc def pdq xyz
Argument: abc def
Argument: pdq xyz
You may want "$otherarguments" to be $otherarguments. Or, if it contains a string with spaces that should be kept as a string, you should handle it the same way, as an array that is expanded with ${otherarguments[@]} in quotes. Here is an example with quoting of one of the variables used to hold arguments:
#!/bin/bash -e
function ShowArguments()
{
    for argument in "$@"; do
        echo "Argument: $argument"
    done
}
Argument1="Single argument with multiple words"
Argument2=("Multiple argument" with various "numbers of words")
Command=(ShowArguments "$Argument1" "${Argument2[@]}")
echo "${Command[@]}"
"${Command[@]}"
It produces:
ShowArguments Single argument with multiple words Multiple argument with various numbers of words
Argument: Single argument with multiple words
Argument: Multiple argument
Argument: with
Argument: various
Argument: numbers of words
After that, you might want to replace echo with a fancier command that quotes its arguments, to show the proper quoting to the user. But that is an aesthetic or user-interface choice; it will not affect the command executed.
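Applied to the script in the question, a sketch of the array approach could look like this (same variable names as above; $gps is left unquoted on purpose so the two coordinates stay separate arguments, assuming that is what the program expects):
# Build the command as an array so that "$dbDir" (which contains $name)
# survives as a single argument even when $name has spaces.
command=("$EXE_DIR/get_GPS_bimestral" "$dbDir" $gps z "$numImagesDB" "$ncores" \
         "$geoDistance" "$imDownload" "$keywords")
echo "${command[@]}"    # show what will be executed
"${command[@]}"         # execute it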
My problem is that when I read that argument in C, it is passed as several arguments instead of just one.
in your script replace
command="$EXE_DIR/get_GPS_bimestral $dbDir $gps z $numImagesDB $ncores $geoDistance $imDownload $keywords"
echo $command
$command
by
command="$EXE_DIR/get_GPS_bimestral \"$dbDir $gps z $numImagesDB $ncores $geoDistance $imDownload $keywords\""
echo $command
echo $command | bash
Explanation:
If the shell script b is:
./a.out "$* 3 4"
and c.c is
#include <stdio.h>
int main(int argc, char ** argv)
{
    printf("argc = %d, argv[1] = '%s'\n", argc, argv[1]);
}
then :
pi@raspberrypi:/tmp $ ./b 1 2
argc = 2, argv[1] = '1 2 3 4'
the C program receives only one argument, which is what we want.
But if the script is modified to be, for instance:
command="./a.out \"$* 3 4\""
echo $command
$command
that doesn't work:
pi@raspberrypi:/tmp $ ./b 1 2
./a.out "1 2 3 4"
argc = 5, argv[1] = '"1'
So if you want to store the command in a variable, echo it, and then execute it, you can do:
command="./a.out \"$* 3 4\""
echo $command
echo $command | bash
execution:
pi@raspberrypi:/tmp $ ./b 1 2
./a.out "1 2 3 4"
argc = 2, argv[1] = '1 2 3 4'
Of course, if you want 4 to be received as a separate argument, just put it outside the string:
command="./a.out \"$* 3\" 4"
echo $command
echo $command | bash
execution :
pi@raspberrypi:/tmp $ ./b 1 2
./a.out "1 2 3" 4
argc = 3, argv[1] = '1 2 3'

How to write specified arguments in tcl something like -h [duplicate]

Does anyone know a standard package for Tcl to easily parse the input arguments, or a ready-made proc? (I have only 3 flags, but something general is preferable.)
The documentation includes an example; here is a simple one:
package require cmdline
set parameters {
    {server.arg "" "Which server to search"}
    {debug         "Turn on debugging, default=off"}
}
set usage "- A simple script to demo cmdline parsing"
array set options [cmdline::getoptions ::argv $parameters $usage]
parray options
Sample runs:
$ tclsh simple.tcl
options(debug) = 0
options(server) =
$ tclsh simple.tcl -server google.com
options(debug) = 0
options(server) = google.com
$ tclsh simple.tcl -server google.com -debug
options(debug) = 1
options(server) = google.com
$ tclsh simple.tcl -help
simple - A simple script to demo cmdline parsing
-server value Which server to search <>
-debug Turn on debugging, default=off
-help Print this message
-? Print this message
while executing
"error [usage $optlist $usage]"
(procedure "cmdline::getoptions" line 15)
invoked from within
"cmdline::getoptions ::argv $parameters $usage"
invoked from within
"array set options [cmdline::getoptions ::argv $parameters $usage]"
(file "simple.tcl" line 11)
Discussion
Unlike most Linux utilities, Tcl uses a single dash instead of double dashes for command-line options.
When a flag ends with .arg, that flag expects an argument to follow, as is the case for server.arg.
The debug flag does not end with .arg, therefore it does not expect any argument.
The user defines the command-line parameters as a list of lists. Each sub-list contains 2 or 3 parts:
The flag (e.g. debug)
The default value (e.g. 0), present only if the parameter takes an argument (the flag ends with .arg)
And the help message
Invoke usage/help with -help or -?; however, the output is not pretty, see the last sample run.
Update: Help/Usage
I have been thinking about the message output when the user invokes help (see the last sample run above). To get around that, you need to trap the error yourself:
set usage "- A simple script to demo cmdline parsing"
if {[catch {array set options [cmdline::getoptions ::argv $parameters $usage]}]} {
    puts [cmdline::usage $parameters $usage]
} else {
    parray options
}
Sample run 2:
$ tclsh simple.tcl -?
simple - A simple script to demo cmdline parsing
-server value Which server to search <>
-debug Turn on debugging, default=off
-help Print this message
-? Print this message
Tcllib has such a package, cmdline. It's a bit underdocumented, but it works.
Here is a simple, native, no-package argument parser:
#
# arg_parse: simple argument parser
# Example: `arg_parse {help version} {with-value} {-with-value 123 positional arguments}`
# will return:
# `positionals {positional arguments} with-value 123`
#
# @param boolean_flags flags which do not require additional arguments (like help)
# @param argument_flags flags which require values (-with-value value)
# @param args the command-line arguments received
#
# @return stringified array of parsed arguments
#
proc arg_parse { boolean_flags argument_flags args } {
    set argsarr(positionals) {}
    for {set i 0} {$i < [llength $args]} {incr i} {
        set arg [lindex $args $i]
        if { [string match "-*" $arg] } {
            set flag [string range $arg 1 end]
            if { [lsearch $boolean_flags $flag] >= 0 } {
                set argsarr($flag) 1
            } elseif { [lsearch $argument_flags $flag] >= 0 } {
                incr i
                set argsarr($flag) [lindex $args $i]
            } else {
                puts "ERROR: Unknown flag argument: $arg"
                return
            }
        } else {
            lappend argsarr(positionals) $arg
        }
    }
    return [array get argsarr]
}
USE argument parser
#
# USE argument parser:
#
proc my_awesome_proc { args } {
    array set argsarr [arg_parse "help version" "with-value" {*}$args]
    parray argsarr
}
USE my_awesome_proc :
% my_awesome_proc -help
argsarr(help) = 1
argsarr(positionals) =
% my_awesome_proc -with-value 123
argsarr(positionals) =
argsarr(with-value) = 123
% my_awesome_proc -wrong
ERROR: Unknown flag argument: -wrong
% my_awesome_proc positional arguments
argsarr(positionals) = positional arguments
%

Empty function in BASH

I'm using the FPM tool to create a .deb package. This tool creates before/after-remove scripts for the package from the supplied files.
Unfortunately, the bash script generated by FPM contains a function like this:
dummy() {
}
And the script exits with an error:
Syntax error: "}" unexpected
Does bash not allow empty functions? Which versions of bash/Linux have this limitation?
You could use :, which is equivalent to true and is mostly used
as a do-nothing operator...
dummy(){
:
}
A one liner
dummy(){ :; }
: is the null command
; is needed in the one line format
An empty bash function is illegal, and a function that contains only comments is considered empty too.
A ":" (null command) can be placed in the function body if you want it to "DO NOTHING".
see: http://tldp.org/LDP/abs/html/functions.html
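For example, a function body that contains only a comment fails in the same way as an empty one, while adding the null command makes it valid (a quick sketch):
# A comments-only body is still "empty" and is rejected with something like:
#   syntax error near unexpected token `}'
broken() {
    # TODO: implement later
}

# Adding the null command makes the function valid
works() {
    # TODO: implement later
    :
}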
I recommend this one:
dummy(){ unused(){ :;} }
If you use the : null command, it will be printed when the xtrace option is on:
(
set -o xtrace
dummy(){ :; }
dummy "null command"
)
echo ------
(
set -o xtrace
dummy(){ unused(){ :;} }
dummy "unused function"
)
output:
+ dummy 'null command'
+ :
------
+ dummy 'unused function'
For debugging I use a wrapper like this:
main() {(
pwd # doing something in subshell
)}
print_and_run() {
    clear
    (
        eval "$1() { unused() { :; } }"
        set -o xtrace
        "$@"
    )
    time "$@"
}
print_and_run main aaa "bb bb" ccc "ddd"
# output:
# + main aaa 'bb bb' ccc ddd
# ..
dummy_success(){ true; } #always returns 0
dummy_fail(){ false; } #always returns 1
These are minimal functions that always return an OK or an ERROR status.
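Their exit status can then be tested directly, for example:
# Usage sketch: branch on the stub functions' exit status
if dummy_success; then echo "dummy_success returned 0"; fi
dummy_fail || echo "dummy_fail returned non-zero"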
Such stubs are also useful for redefining missing functions as empty functions
(or for giving them some default action, for example a debug warning):
#!/bin/sh
# avoid an error when calling unimportant_func, which is undefined
declare -F unimportant_func >/dev/null || unimportant_func() { true; }
# get an error when calling important_func, which is undefined
declare -F important_func >/dev/null || important_func() { false; }
# print a debug assert if the function does not exist
declare -F some_func >/dev/null || some_func() {
    echo "assert: '${FUNCNAME[0]}() is not defined. called from ${BASH_SOURCE[1]##*/}[${BASH_LINENO[0]}]:${FUNCNAME[1]}($#)" 1>&2; }
my_func(){
    echo $(some_func a1 a2 a3)
    important_func && echo "important_func ok" || echo "important_func error"
    unimportant_func && echo "unimportant_func ok" || echo "unimportant_func error"
}
my_func
output:
$> testlib.sh
assert: 'some_func() is not defined. called from testlib.sh[15]:my_func(a1 a2 a3)
important_func error
unimportant_func ok

idioms for returning multiple values in shell scripting

Are there any idioms for returning multiple values from a bash function within a script?
http://tldp.org/LDP/abs/html/assortedtips.html describes how to echo multiple values and process the results (e.g., example 35-17), but that gets tricky if some of the returned values are strings containing spaces.
A more structured way to return would be to assign to global variables, like
foo () {
    FOO_RV1="bob"
    FOO_RV2="bill"
}
foo
echo "foo returned ${FOO_RV1} and ${FOO_RV2}"
I realize that if I need re-entrancy in a shell script I'm probably doing it wrong, but I still feel very uncomfortable throwing global variables around just to hold return values.
Is there a better way? I would prefer portability, but it's probably not a real limitation if I have to specify #!/bin/bash.
In the special case where your values never contain spaces, this read trick can be a simple solution:
get_vars () {
    #...
    echo "value1" "value2"
}
read var1 var2 < <(get_vars)
echo "var1='$var1', var2='$var2'"
But of course, it breaks as soon as there is a space in one of the values. You could modify IFS and use a special separator in your function's echo, but then the result is not really simpler than the other suggested solutions.
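For completeness, here is a sketch of that IFS variant, using a tab as the separator (assuming tabs never appear in the values):
get_vars () {
    # return two values separated by a tab
    printf 'first value\tsecond value\n'
}
IFS=$'\t' read -r var1 var2 < <(get_vars)
echo "var1='$var1', var2='$var2'"     # var1='first value', var2='second value'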
This question was posted 5 years ago, but I have an interesting answer to add. I have just started learning bash, and I encountered the same problem as you did. I think this trick might be helpful:
#!/bin/sh
foo=""
bar=""
my_func(){
    echo 'foo="a"; bar="b"'
}
eval $(my_func)
echo $foo $bar
# result: a b
This trick is also useful for working around the problem that a child process cannot send a value back to its parent process.
Much as I love shell, it's probably the case that as soon as you're throwing arbitrary structured data around, Unix bourne/posix shell is not the right choice.
If there are characters which do not occur inside fields, then separate with one of those. The classic example is /etc/passwd, /etc/group and various other files which use a colon as a field separator.
If using a shell which can handle a NUL character inside strings, then joining on the NUL and separating on it (via $IFS or whatever) can work well. But several common shells, including bash, break on NUL. A test would be an old .sig of mine:
foo=$'a\0b'; [ ${#foo} -eq 3 ] && echo "$0 rocks"
Even if that would work for you, you've just reached one of the warning signs that it's time to switch to a more structured language (Python, Perl, Ruby, Lua, Javascript ... pick your preferred poison). Your code is likely to become hard to maintain; even if you can, there's a smaller pool of people who'll understand it well enough to maintain it.
Yet another way:
function get_tuple()
{
    echo -e "Value1\nValue2"
}
IFS=$'\n' read -d '' -ra VALUES < <(get_tuple)
echo "${VALUES[0]}" # Value1
echo "${VALUES[1]}" # Value2
In older versions of Bash that don't support nameref (introduced in Bash 4.3-alpha), I may define a helper function in which the return value is assigned to the given variable. It's sort of like using eval to do the same kind of variable assignment.
Example 1
## Add two complex numbers and return the result.
## re: real part, im: imaginary part.
##
## The helper function named by the 5th positional parameter
## has to have been defined before this function is called.
complexAdd()
{
    local re1="$1" im1="$2" re2="$3" im2="$4" fnName="$5" sumRe sumIm
    sumRe=$(($re1 + $re2))
    sumIm=$(($im1 + $im2))
    ## Call the helper function to "return" 2 values.
    "$fnName" "$sumRe" "$sumIm"
}
main()
{
    local fooRe='101' fooIm='37' barRe='55' barIm='123' bazRe bazIm quxRe quxIm
    ## Define the function that receives multiple return values
    ## before calling complexAdd().
    retValAssign() { bazRe="$1"; bazIm="$2"; }
    ## Call complexAdd() for the first time.
    complexAdd "$fooRe" "$fooIm" "$barRe" "$barIm" 'retValAssign'
    ## Redefine the function that receives multiple return values.
    retValAssign() { quxRe="$1"; quxIm="$2"; }
    ## Call complexAdd() for the second time.
    complexAdd "$barRe" "$barIm" "$bazRe" "$bazIm" 'retValAssign'
    echo "foo = $fooRe + $fooIm i"
    echo "bar = $barRe + $barIm i"
    echo "baz = foo + bar = $bazRe + $bazIm i"
    echo "qux = bar + baz = $quxRe + $quxIm i"
}
main
Example 2
## Add two complex numbers and return the result.
## re: real part, im: imaginary part.
##
## The helper functions
## getRetRe(), getRetIm(), setRetRe() and setRetIm()
## have to have been defined before this function is called.
complexAdd()
{
    local re1="$1" im1="$2" re2="$3" im2="$4"
    setRetRe "$re1"
    setRetRe $(($(getRetRe) + $re2))
    setRetIm $(($im1 + $im2))
}
main()
{
    local fooRe='101' fooIm='37' barRe='55' barIm='123' bazRe bazIm quxRe quxIm
    ## Define getter and setter functions before calling complexAdd().
    getRetRe() { echo "$bazRe"; }
    getRetIm() { echo "$bazIm"; }
    setRetRe() { bazRe="$1"; }
    setRetIm() { bazIm="$1"; }
    ## Call complexAdd() for the first time.
    complexAdd "$fooRe" "$fooIm" "$barRe" "$barIm"
    ## Redefine getter and setter functions.
    getRetRe() { echo "$quxRe"; }
    getRetIm() { echo "$quxIm"; }
    setRetRe() { quxRe="$1"; }
    setRetIm() { quxIm="$1"; }
    ## Call complexAdd() for the second time.
    complexAdd "$barRe" "$barIm" "$bazRe" "$bazIm"
    echo "foo = $fooRe + $fooIm i"
    echo "bar = $barRe + $barIm i"
    echo "baz = foo + bar = $bazRe + $bazIm i"
    echo "qux = bar + baz = $quxRe + $quxIm i"
}
main
You can make use of associative arrays if you have bash 4, e.g.
declare -A ARR
function foo(){
    ...
    ARR["foo_return_value_1"]="VAR1"
    ARR["foo_return_value_2"]="VAR2"
}
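A small usage sketch (the keys and values are the ones from the snippet above):
foo                                    # fills the ARR array
echo "${ARR[foo_return_value_1]}"      # prints VAR1
echo "${ARR[foo_return_value_2]}"      # prints VAR2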
You can combine them as strings:
function foo(){
    ...
    echo "$var1|$var2|$var3"
}
then whenever you need to use those return values,
ret="$(foo)"
IFS="|"
set -- $ret
echo "var1 one is: $1"
echo "var2 one is: $2"
echo "var3 one is: $3"
I would go for the solution I suggested here, but using an array variable instead. Older versions of bash don't support associative arrays.
E.g.,
function some_func() # ARRVAR args...
{
    local _retvar=$1 # I use underscore to avoid clashes with return variable names
    local -a _out
    # ... some processing ... (_out[2]=xxx etc.)
    eval $_retvar='("${_out[@]}")'
}
Calling site:
function caller()
{
    local -a tuple_ret # Do not use leading '_' here.
    # ...
    some_func tuple_ret "arg1"
    printf " %s\n" "${tuple_ret[@]}" # Print tuple members on separate lines
}
Later versions of Bash support nameref. Use declare -n var_name to give var_name the nameref attribute. A nameref gives your function the ability to "pass by reference", which is commonly used in C++ functions to return multiple values. According to the Bash man page:
A variable can be assigned the nameref attribute using the -n option to the declare or local builtin commands to create a nameref, or a reference to another variable. This allows variables to be manipulated indirectly. Whenever the nameref variable is referenced or assigned to, the operation is actually performed on the variable specified by the nameref variable's value. A nameref is commonly used within shell functions to refer to a variable whose name is passed as an argument to the function.
The following are some interactive command line examples.
Example 1:
$ unset xx yy
$ xx=16
$ yy=xx
$ echo "[$xx] [$yy]"
[16] [xx]
$ declare -n yy
$ echo "[$xx] [$yy]"
[16] [16]
$ xx=80
$ echo "[$xx] [$yy]"
[80] [80]
$ yy=2016
$ echo "[$xx] [$yy]"
[2016] [2016]
$ declare +n yy # Use -n to add and +n to remove nameref attribute.
$ echo "[$xx] [$yy]"
[2016] [xx]
Example 2:
$ func()
> {
> local arg1="$1" arg2="$2"
> local -n arg3ref="$3" arg4ref="$4"
>
> echo ''
> echo 'Local variables:'
> echo " arg1='$arg1'"
> echo " arg2='$arg2'"
> echo " arg3ref='$arg3ref'"
> echo " arg4ref='$arg4ref'"
> echo ''
>
> arg1='1st value of local assignment'
> arg2='2st value of local assignment'
> arg3ref='1st return value'
> arg4ref='2nd return value'
> }
$
$ unset foo bar baz qux
$
$ foo='value of foo'
$ bar='value of bar'
$ baz='value of baz'
$ qux='value of qux'
$
$ func foo bar baz qux
Local variables:
arg1='foo'
arg2='bar'
arg3ref='value of baz'
arg4ref='value of qux'
$
$ {
> echo ''
> echo '2 values are returned after the function call:'
> echo " foo='$foo'"
> echo " bar='$bar'"
> echo " baz='$baz'"
> echo " qux='$qux'"
> }
2 values are returned after the function call:
foo='value of foo'
bar='value of bar'
baz='1st return value'
qux='2nd return value'
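Putting it together, here is a short sketch of a function returning two values through namerefs (requires bash 4.3 or later; all names are made up):
#!/bin/bash
# get_pair stores its two "return values" in the variables whose
# names are passed as $1 and $2, via nameref.
get_pair() {
    local -n _out1=$1 _out2=$2
    _out1='first value'
    _out2='second value with spaces'
}
get_pair first second
echo "first='$first', second='$second'"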
I am new to bash, but found this code helpful.
function return_multiple_values() {
    eval "$1='What is your name'"
    eval "$2='my name is: BASH'"
}
return_var=''
res2=''
return_multiple_values return_var res2
echo $return_var
echo $res2
Shell script functions can only return the exit status of the last command executed, or an exit status specified explicitly by a return statement.
One way to return a string is this:
function fun()
{
    echo "a+b"
}
var=`fun` # Invoke the function in a new child shell and capture the results
echo $var # use the stored result
This may reduce your discomfort, although it adds the overhead of creating a new shell and hence is marginally slower.
