Bash: local array shadowing parameter ends up empty

I have just noticed (by accident) that in bash, if I use a local array inside a function to hold an array passed by name as a parameter, and give that local array the same name as the global array being passed, the local array ends up empty. This sounds a little convoluted, so here is an example:
foo() {
declare -a bar=("${!1}")
echo "${bar[#]}"
}
bar=(1 2 3)
foo bar[@]
On my system (Linux, running GNU bash 4.4.23) this prints just a blank line. However, both of the following variants output 1 2 3:
foo() {
echo "${bar[#]}"
}
bar=(1 2 3)
foo bar[@]
As well as:
foo() {
declare -a foobar=("${!1}")
echo "${foobar[#]}"
}
bar=(1 2 3)
foo bar[@]
I would like to know why this happens. I would guess it has something to do with how bash performs name resolution, but I'm not sure at all. Note that I'm not looking for an alternative way to do the same thing; I'd just like an explanation.
EDIT: the third snippet previously contained echo "${bar[@]}" but should have read echo "${foobar[@]}".

When you declare a local, it starts out empty.
Indirect variable references use names that are in-scope at lookup time -- meaning, they'll match an empty local before a non-empty global with the same name.
...which is also to say that foo bar[@] isn't in any respect passing the contents of "${bar[@]}" as it exists in the current scope; it is just passing the string bar[@] (if you're lucky; if a file named bar@ exists in the current directory, the unquoted argument could be expanded as a glob). And when an indirect lookup is then done on bar[@] in the context of a function where bar is a local... well, there's your problem.
So, a more informative example of a working alternative is this:
foo() {
declare -a local_bar=("${!1}")
echo "${local_bar[#]}"
}
bar=(1 2 3)
foo 'bar[@]'
...where your local has a different name (local_bar), so the global isn't shadowed by an empty local.
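To see the lookup order described above in isolation, here is a minimal sketch (shadow_demo and name are made-up identifiers, not part of the original question):
bar=(1 2 3)
shadow_demo() {
declare -a bar                   # empty local shadows the global bar
local name='bar[@]'
echo "inside: '${!name}'"        # the indirect lookup resolves to the empty local, so this prints ''
}
shadow_demo
name='bar[@]'
echo "outside: '${!name}'"       # the same lookup at global scope prints '1 2 3'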

Related

Setting a local array to its value

I have these two functions:
function two() {
local -a -x var=( ${var[@]} )
echo "${var[@]}"
}
function one() {
local -a -x var=(11 22 33)
two
}
If I call one, then nothing is printed. Why is that?
Here you have the same identifier, var, in both functions.
The var you defined in one can be accessed by two, because two is called from one. However,
when declaring and setting a local array in a single command, apparently the order of operations is to first make the variable local (so it starts out empty), and only afterwards perform the assignment.
So in
local -a -x var=( "${var[@]}" )
the ${var[@]} part will be empty, because by the time it is expanded, var already refers to the new, empty local.
To verify this you could change the variable name in one to var1 and in two do
local -a -x var=( "${var1[@]}" ) # var1, though local to one, is still accessible here.
You could use @inian's answer as a work-around to pass variables easily and not have to worry about such dark corners of bash.
Your code does not reflect what you are trying to do. Yes, the locals you've defined are visible only within the scope of the function, but if you want to hand them to another function, pass them as positional arguments, i.e. "$@". In the function below, when you do two "${var[@]}", you are passing the local array as the positional-argument list of the other function.
two() {
local -a -x var=( "$@" )
echo "${var[@]}"
}
The "$@" here represents the argument list passed to the function two; now, from the function one, pass the array as
one() {
local -a -x var=(11 22 33)
two "${var[#]}"
}
Also, the function keyword is non-standard; POSIX does not define it. If you plan to re-use the script across multiple shells, drop the keyword. Also quote your variables/arrays to keep them from being word-split and glob-expanded, which could leave unexpected values in the final array.
It is also worth noting that variables/arrays are global by default unless you override that with the local keyword inside a function.
$ x=2
$ test_local(){ local x=1; }
$ test_local; echo "$x"
2
But the same function without local would print 1, which proves the point explained above.
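For comparison, that no-local variant might look like this (test_global is a made-up name):
$ x=2
$ test_global(){ x=1; }    # no 'local', so the assignment writes the global x
$ test_global; echo "$x"
1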
When you declare var in two that declaration hides the one in one. Curiously, local variables are visible in called functions. The easiest way to make this work is to do nothing: simply access $var in two.
two() {
echo "${var[#]}"
}
one() {
local var=(11 22 33)
two "${var[#]}"
}
I don't necessarily recommend doing this, though. It makes it hard to understand what two does just by reading it. It's better to explicitly pass the values as arguments.
two() {
local var=("$@")
echo "${var[@]}"
}
one() {
local var=(11 22 33)
two "${var[#]}"
}
By the way, you should always quote your variable expansions to prevent them from being subjected to word splitting and globbing. In your original code you should quote ${var[@]}:
local -a -x var=( "${var[@]}" )
Also, for portability you should either write one() or function one, but not both. I prefer the former.

How to call a bash function providing environment variables stored in a Bash array?

I have two variables in a bash script. One contains the name of a function within the script, while the other is an array containing KEY=VALUE or KEY='VALUE WITH SPACES' pairs. They are the result of parsing a specific file, and I can't change that.
What I want to do is to invoke the function whose name I got. This is quite simple:
# get the value for the function
myfunc="some_function"
# invoke the function whose name is stored in $myfunc
$myfunc
Consider the function foo to be defined as
function foo
{
echo "MYVAR: $MYVAR"
echo "MYVAR2: $MYVAR2"
}
If I get the variables
funcname="foo"
declare -a funcenv=(MYVAR=test "MYVAR2='test2 test3'")
How would I use them to call foo with the pairs from funcenv added to its environment? A (non-variable) invocation would look like
MYVAR=test MYVAR2='test2 test3' foo
I tried to script it like
"${funcenv[#]}" "$funcname"
But this leads to an error (MYVAR=test: command not found).
How do I properly call the function with the pairs from the array placed in its environment? (I do not want to export them; they should just be available to the invoked function.)
You can do it like this:
declare -a funcenv=(MYVAR=test "MYVAR2='test2 test3'")
for pairs in "${funcenv[@]}"; do
eval "$pairs"
done
"$funcname"
Note however that the variables will be visible outside the function too.
If you want to avoid that, then you can wrap all the above in a (...) subshell.
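A minimal sketch of that subshell variant, reusing funcname and funcenv from above, might look like:
(
for pairs in "${funcenv[@]}"; do
eval "$pairs"              # the assignments take effect only inside this subshell
done
"$funcname"
)
echo "${MYVAR-unset in the parent}"   # the parent shell never sees MYVAR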
Why don't you pass them as arguments to your function?
function f() { echo "first: $1"; echo "second: $2"; }
fn=f; $fn oneword "two words"

Accessing function-definition-time, not evaluation-time, value for a variable in bash

I hope that I can do something like this, and the output would be "hello"
#!/bin/bash
foo="hello"
dummy() {
local local_foo=`echo $foo`
echo $local_foo
}
foo=''
dummy
In other words, I would like to capture the value of some global variables at function-definition time. This usually comes up when a file is loaded via source blablabla.bash, and I would like it to define a function that captures the current value of a variable.
The Sane Way
Functions are evaluated when they're run, not when they're defined. Since you want to capture a variable as it exists at definition time, you'll need a separate variable assigned at that time.
foo="hello"
# By convention, global variables prefixed by a function name and double underscore are for
# the exclusive use of that function.
readonly dummy__foo="$foo" # capture foo as of dummy definition time, and prevent changes
dummy() {
local local_foo=$dummy__foo # ...and refer to that captured copy
echo "$local_foo"
}
foo=""
dummy
The Insane Way
If you're willing to commit crimes against humanity, however, it is possible to do code generation to capture a value. For instance:
# usage: with_locals functionname k1=v1 [k2=v2 [...]]
with_locals() {
local func_name func_text assignments
func_name=$1; shift || return ## fail if out of arguments
(( $# )) || return ## noop if not given at least one assignment
func_text=$(declare -f "$func_name")
for arg; do
if [[ $arg = *=* ]]; then ## if we already look like an assignment, leave be
printf -v arg_q 'local %q; ' "$arg"
else ## otherwise, assume we're a bare name and run a lookup
printf -v arg_q 'local %q=%q; ' "$arg" "${!arg}"
fi
assignments+="$arg_q"
done
# suffix first instance of { in the function definition with our assignments
eval "${func_text/{/{ $assignments}"
}
...thereafter:
foo=hello
dummy() {
local local_foo="$foo"
echo "$local_foo"
}
with_locals dummy foo ## redefine dummy to always use the current value of "foo"
foo=''
dummy
Well, you can comment out or remove the foo='' line, and that will do it. The function dummy does not execute until you call it, which is after you've blanked out the foo value, so it makes sense that you would get a blank line echoed. Hope this helps.
There is no way to execute the code inside a function until that function is called by bash. The only alternative is to call some other function that defines the function you want to call later; that is what a dynamic function definition is. I don't believe that is what you want.
An alternative is to store the value of foo on the first call of the function and then compare it on later calls, after the value has changed. Something hackish like this:
#!/bin/bash
foo="hello"
dummy() {
${global_foo+false} &&                     # expands to 'false' once global_foo is set; to nothing (success) on the first call
global_foo="$foo" ||                       # first call: capture the current value of foo
echo "old_foo=$global_foo new_foo=$foo"    # later calls: show the captured value next to the current one
}
dummy
foo='new'
dummy
foo="a whole new foo"
dummy
Calling it will print:
$ ./script
old_foo=hello new_foo=new
old_foo=hello new_foo=a whole new foo
As I am not sure this addresses your real problem: I hope this helps.
Inspired by @CharlesDuffy, I think using eval might solve some of the problems, and the example can be modified as follows:
#!/bin/bash
foo="hello"
eval "
dummy() {
local local_foo=$foo
echo \$local_foo
}
"
foo=''
dummy
Which will give the result 'hello' instead of nothing.
@CharlesDuffy pointed out that such a solution is quite dangerous:
local local_foo=$foo is dangerously buggy: If your foo value contains
an expansion such as $(rm -rf $HOME), it'll be executed
Using eval is good for performance but bad for security, so I'd suggest @CharlesDuffy's answer instead.
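For what it's worth, a less dangerous sketch of the same eval idea quotes the captured value with printf %q first (as the with_locals helper above does), so it is assigned as a literal string instead of being re-expanded:
#!/bin/bash
foo='hello; $(date)'           # hostile-looking value, for illustration only
printf -v foo_q '%q' "$foo"    # escape it so eval sees a literal string
eval "
dummy() {
local local_foo=$foo_q
echo \"\$local_foo\"
}
"
foo=''
dummy    # prints: hello; $(date)   -- the command substitution is never executed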

Variable scope with child functions

I read this today
"Local can only be used within a function; it makes the variable name have a
visible scope restricted to that function and its children."
The ABS Guide author considers this behavior to be a bug.
§ Local Variables
and I came up with this script
begin () {
local foo
alpha
}
alpha () {
foo=333 bar=444
bravo
}
bravo () {
printf 'foo %3s bar %s\n' "$foo" "$bar"
}
begin
bravo
Output
foo 333 bar 444
foo     bar 444
So as you can see, because I did not local bar, it leaked out into global
scope. Questions:
Is a local variable being available to a child function actually a bug, or was
that just his opinion?
Does Bash have a way to mark everything local, similar to how set -a marks
everything for export?
Failing that, does Bash have a way I can check for these leaked global
variables?
Is a local variable being available to a child function actually a bug, or was that just his opinion?
No, it's not a bug. That's just his opinion.
Does Bash have a way to mark everything local, similar to how set -a marks everything for export?
No.
Failing that, does Bash have a way I can check for these leaked global variables?
Yes. Just run set or declare, both without any parameters.
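For example, with the begin/alpha/bravo script above, a quick spot-check might look like:
begin                # runs alpha, which leaks bar into the global scope
declare -p foo bar   # foo: 'declare: foo: not found' (it stayed local); bar: declare -- bar="444"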
Failing that, does Bash have a way I can check for these leaked global variables?
No. Bash has an undocumented concept called "hidden variables" that makes it impossible to test whether a local is set without disturbing the variable.
This test demonstrates a hidden variable together with the scope-sensitive nature of the unset builtin.
function f {
case $1 in 1)
typeset x=1
f 2
;;
2)
typeset x
unset -v x # Does nothing (demonstrates hidden local)
f 3
;;
[345])
printf "x is %sunset\n" ${x+"not "}
unset -v x
f $(($1 + 1))
esac
}
f 1
# output:
# x is unset
# x is not unset
# x is unset
Bash has a way to force setting a global using declare -g; however, there is no way to force bash to dereference the global or test whether it is set while a local of the same name exists, which makes that feature of very limited utility.
This hopefully demonstrates the problem clearly:
f() {
local x="in x" # Assign a local
declare -g x=global # Assign a global
declare -p x # prints "in x"
unset -v x # try unsetting the local
declare -p x # error (x is invisible)
}
f
declare -p x # x is visible again, but there's no way to test for that before now.

Passing argument to alias in bash [duplicate]

This question already has answers here:
Make a Bash alias that takes a parameter?
(24 answers)
Closed 5 years ago.
Is it possible to do the following:
I want to run the following:
mongodb bin/mongod
In my bash_profile I have
alias = "./path/to/mongodb/$1"
An alias expands to the string it represents. Anything after the alias will appear after its expansion, without needing to be (or being able to be) passed as explicit arguments (e.g. $1).
$ alias foo='/path/to/bar'
$ foo some args
will get expanded to
$ /path/to/bar some args
If you want to use explicit arguments, you'll need to use a function
$ foo () { /path/to/bar "$@" fixed args; }
$ foo abc 123
will be executed as if you had done
$ /path/to/bar abc 123 fixed args
To undefine an alias:
unalias foo
To undefine a function:
unset -f foo
To see the type and definition (for each defined alias, keyword, function, builtin or executable file):
type -a foo
Or type only (for the highest precedence occurrence):
type -t foo
To use parameters in aliases, I use this method:
alias myalias='function __myalias() { echo "Hello $*"; unset -f __myalias; }; __myalias'
It's a self-destructing function wrapped in an alias, so it's pretty much the best of both worlds, and it doesn't take up extra lines in your definitions (which I hate). Oh, and if you need the return value, you'll have to store it before calling unset and then return it using the return keyword in that self-destructing function:
alias myalias='function __myalias() { echo "Hello $*"; myresult=$?; unset -f __myalias; return $myresult; }; __myalias'
So, if you need to have that parameter in there, you could use:
alias mongodb='function __mongodb() { ./path/to/mongodb/$1; unset -f __mongodb; }; __mongodb'
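With that in place, the invocation from the question works as intended:
mongodb bin/mongod    # runs ./path/to/mongodb/bin/mongod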
Of course,
alias mongodb='./path/to/mongodb/'
would actually do the same thing without the need for parameters. But like I said, if you want or need them for some reason (for example, you need $2 instead of $1), you would need to use a wrapper like that. If it grows beyond one line, you might consider just writing a function outright, since the alias becomes more of an eyesore as it gets larger. Functions are great since you get all the perks that functions give (see completion, traps, bind, etc. in the bash man page for the goodies that functions can provide).
I hope that helps you out :)
Usually when I want to pass arguments to an alias in Bash, I use a combination of an alias and a function like this, for instance:
function __t2d {
if [ "$1x" != 'x' ]; then
date -d "#$1"
fi
}
alias t2d='__t2d'
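Usage might then look like this (the timestamp is just an example epoch value):
t2d 1500000000    # prints the calendar date for that Unix timestamp (mid-July 2017)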
This is a solution which avoids using a function:
alias addone='{ num=$(cat -); echo "input: $num"; echo "result:$(($num+1))"; }<<<'
Test result:
addone 200
input: 200
result:201
In csh (as opposed to bash) you can do exactly what you want.
alias print 'lpr \!^ -Pps5'
print memo.txt
The notation \!^ causes the argument to be inserted in the command at this point.
The ! character is preceded by a \ to prevent it from being interpreted as a history command.
You can also pass multiple arguments:
alias print 'lpr \!* -Pps5'
print part1.ps glossary.ps figure.ps
(Examples taken from http://unixhelp.ed.ac.uk/shell/alias_csh2.1.html .)
To simplify leed25d's answer, use a combination of an alias and a function. For example:
function __GetIt {
cp ./path/to/stuff/$* .
}
alias GetIt='__GetIt'

Resources