I have these two functions:
function two() {
local -a -x var=( ${var[@]} )
echo "${var[@]}"
}
function one() {
local -a -x var=(11 22 33)
two
}
If I call one, then nothing is printed. Why is that?
Here you have the same identifier name var in both functions.
The var you defined in one can be accessed by two because two is called from one. However,
when declaring and setting a local variable in a single command,
apparently the order of operations is to first make the variable local
(leaving it empty), and only afterwards perform the assignment.
So in
local -a -x var=( "${var[#]}" )
the ${var[#]} part will be empty as the variable var is set local first.
To verify this you could change the variable name in one to var1 and in two do
local -a -x var=( "${var1[@]}" ) # var1, though local to one, should be accessible here.
You could use @inian's answer as a work-around to pass variables easily and yet not bother about such dark corners in bash.
Your code is not reflecting what you are trying to do! The locals you've defined are only within the scope of the function, yes, but if you are passing them to the other function, pass them as positional arguments via "$@". In the function below, when you do two "${var[@]}", you are passing the local array as positional arguments to be used in the other function.
two() {
local -a -x var=( "$#" )
echo "${var[#]}"
}
The argument list "$@" represents the argument list passed to the function two; now from the function one, pass it as
one() {
local -a -x var=(11 22 33)
two "${var[#]}"
}
Also, the use of the function keyword is non-standard; POSIX does not define it. If you are planning to re-use the script across multiple shells, drop the keyword. Also quote your variables/arrays to avoid word splitting and glob expansion, which could produce unexpected values in the final array.
Also worth noting that variables/arrays are global by default unless you override this with the local keyword inside a function.
$ x=2
$ test_local(){ local x=1; }
$ test_local; echo "$x"
2
But the same function without local would print 1, which proves the point explained above.
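For comparison, a minimal sketch of that claim (the function name test_global is mine):
$ x=2
$ test_global(){ x=1; }
$ test_global; echo "$x"
1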
When you declare var in two, that declaration hides the one in one. Curiously, local variables are visible in called functions, so the easiest way to make this work is to do nothing: simply access $var in two.
two() {
echo "${var[#]}"
}
one() {
local var=(11 22 33)
two "${var[#]}"
}
I don't necessarily recommend doing this, though. It makes it hard to understand what two does just by reading it. It's better to explicitly pass the values as arguments.
two() {
local var=("$#")
echo "${var[#]}"
}
one() {
local var=(11 22 33)
two "${var[#]}"
}
By the way, you should always quote your variable expansions to prevent them from being subjected to word splitting and globbing. In your original code you should quote ${var[@]}:
local -a -x var=( "${var[@]}" )
Also, for portability you should either write one() or function one, but not both. I prefer the former.
Related
I have just noticed (by accident) that with bash, if I use a local array inside a function to hold an array passed as a parameter and give that local array the same name as the global parameter passed to the function, the local array ends up empty. This sound a little convoluted so here is an example:
foo() {
declare -a bar=("${!1}")
echo "${bar[#]}"
}
bar=(1 2 3)
foo bar[@]
On my system (Linux running GNU bash 4.4.23) this prints a newline. However, both of the following variants output 1 2 3:
foo() {
echo "${bar[#]}"
}
bar=(1 2 3)
foo bar[@]
As well as:
foo() {
declare -a foobar=("${!1}")
echo "${foobar[#]}"
}
bar=(1 2 3)
foo bar[@]
I would like to know why this happens. I would guess it has something to do with how bash performs name resolution, but I'm not sure at all. Note that I'm not looking for an alternative way to do the same thing; I'd just like an explanation.
EDIT: the third snippet previously contained echo "${bar[@]}" but should have read echo "${foobar[@]}".
When you declare a local, it starts out empty.
Indirect variable references use names that are in-scope at lookup time -- meaning, they'll match an empty local before a non-empty global with the same name.
...which is also to say that foo bar[@] isn't in any respect passing the contents of "${bar[@]}" as it exists in the current scope, but is just passing the string bar[@] (if you're lucky; if a file named bar@ exists in the current directory, it could be expanded as a glob). And when an indirect lookup is done on bar[@] in the context of a function where bar is a local... well, there's your problem.
So, a more informative example of a working alternative is this:
foo() {
declare -a local_bar=("${!1}")
echo "${local_bar[#]}"
}
bar=(1 2 3)
foo 'bar[@]'
...where your local has a different name (local_bar), so the global isn't shadowed by an empty local.
I have two variables in a bash script. One contains the name of a function within the script, while the other one is an array containing KEY=VALUE or KEY='VALUE WITH SPACES' pairs. They are the result of parsing a specific file, and I can't change that.
What I want to do is to invoke the function whose name I got. This is quite simple:
# get the value for the function
myfunc="some_function"
# invoke the function whose name is stored in $myfunc
$myfunc
Consider the function foo be defined as
function foo
{
echo "MYVAR: $MYVAR"
echo "MYVAR2: $MYVAR2"
}
If I get the variables
funcname="foo"
declare -a funcenv=(MYVAR=test "MYVAR2='test2 test3'")
How would I use them to call foo with the pairs of funcenv being added to the environment? A (non-variable) invocation would look like
MYVAR=test MYVAR2='test2 test3' foo
I tried to script it like
"${funcenv[#]}" "$funcname"
But this leads to an error (MYVAR=test: command not found).
How do I properly call the function with the arguments of the array put in its environment (I do not want to export them, they should just be available for the invoked function)?
You can do it like this:
declare -a funcenv=(MYVAR=test "MYVAR2='test2 test3'")
for pairs in "${funcenv[@]}"; do
eval "$pairs"
done
"$funcname"
Note however that the variables will be visible outside the function too.
If you want to avoid that, then you can wrap all the above in a (...) subshell.
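For instance, a minimal sketch of that subshell variant (reusing funcname and funcenv from above):
(
    for pairs in "${funcenv[@]}"; do
        eval "$pairs"      # the assignments stay confined to this subshell
    done
    "$funcname"
)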
Why don't you pass them as arguments to your function?
function f() { echo "first: $1"; echo "second: $2"; }
fn=f; $fn oneword "two words"
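Run as written, that call should print:
first: oneword
second: two words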
I have these two functions in a bash script. I am just trying to pass arguments directly from one function to another without using global vars, but I can't seem to do it.
function suman {
NODE_EXEC_ARGS= "--inspect";
__handle_global_suman "${NODE_EXEC_ARGS}" "$#"
}
function __handle_global_suman {
# I want $1 to be node exec args and $2 to be args to node script
node $1 ${Z}/cli.js $2;
}
The problem I am having: in the __handle_global_suman function,
the values for $1 and $2 seem to represent the original arguments passed to the suman function, not the arguments passed to __handle_global_suman! I want to be able to access the arguments passed to the __handle_global_suman function.
One solution is to use global variables like the following (but this is bad programming in general):
NODE_EXEC_ARGS=""; // default
ORIGINAL_ARGS=""; // default
function suman {
NODE_EXEC_ARGS="--inspect";
ORIGINAL_ARGS="$#"; // assume this captures the arguments passed to this function, not the original script...
__handle_global_suman
}
# ideally there would be a way to make this function truly private
function __handle_global_suman {
# I want $1 to be node exec args and $2 to be args to node script
node ${NODE_EXEC_ARGS} ${Z}/cli.js ${ORIGINAL_ARGS};
}
hopefully you see what I am trying to do and can help, thanks
In the below, we're passing an argument list stored in a local variable by reference:
suman() {
local -a args=( --inspect --debug-brk )
__handle_global_suman args "$@"
}
__handle_global_suman() {
local ref="$1[#]"; shift
node "${!ref}" "${Z}/cli.js" "$#"
}
Why is this different? Because we could also pass:
local -a args=( --inspect --argument-with-spaces="hello cruel world" )
...and --argument-with-spaces=... would be passed correctly, as exactly one argument.
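On bash 4.3 or newer, a nameref can express the same indirection. This is a sketch of my own (not part of the answer above), assuming the nameref name args_ref doesn't collide with the caller's variable name:
__handle_global_suman() {
    local -n args_ref=$1; shift   # nameref resolves to the caller's "args" array (bash 4.3+)
    node "${args_ref[@]}" "${Z}/cli.js" "$@"
}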
Your explanation is a little unclear, but I think I get the gist: word splitting isn't working as you expected in Bash.
You need to quote the ${NODE_EXEC_ARGS} parameter to your second function, since in the case that it is whitespace it will be stripped out and won't form a parameter to the called function:
__handle_global_suman "${NODE_EXEC_ARGS}" ${ORIGINAL_ARGS}
Also the ${ORIGINAL_ARGS} var is redundant in your example. You should just pass "$#" directly:
__handle_global_suman "${NODE_EXEC_ARGS}" "$#"
The second proposed alternative solution definitely isn't necessary, is definitely bad practice, and you can definitely achieve what you want with Bash function parameter passing.
Here is what works for me, thanks to the help of @S.Pinkus, whose answer I upvoted since the information contained in it was all I needed to fix the problem. Note that the below works, but that @CDhuffy's answer is in theory more generic and therefore better.
function suman {
__handle_global_suman "--inspect --debug-brk" "$#"
}
function __handle_global_suman {
# ${1} is "--inspect --debug-brk"
# ${2} is whatever I passed to the suman function
node ${1} ${Z}/cli.js ${2};
}
bash is awesome ;) ...not
I hope that I can do something like this, and the output would be "hello"
#!/bin/bash
foo="hello"
dummy() {
local local_foo=`echo $foo`
echo $local_foo
}
foo=''
dummy
In other words, I would like to capture the value of some global variables at function definition time (the script is usually used via source blablabla.bash), so that the defined function captures the variables' current values.
The Sane Way
Functions are evaluated when they're run, not when they're defined. Since you want to capture a variable as it exists at definition time, you'll need a separate variable assigned at that time.
foo="hello"
# By convention, global variables prefixed by a function name and double underscore are for
# the exclusive use of that function.
readonly dummy__foo="$foo" # capture foo as of dummy definition time, and prevent changes
dummy() {
local local_foo=$dummy__foo # ...and refer to that captured copy
echo "$local_foo"
}
foo=""
dummy
The Insane Way
If you're willing to commit crimes against humanity, however, it is possible to do code generation to capture a value. For instance:
# usage: with_locals functionname k1=v1 [k2=v2 [...]]
with_locals() {
local func_name func_text assignments arg arg_q
func_name=$1; shift || return ## fail if out of arguments
(( $# )) || return ## noop if not given at least one assignment
func_text=$(declare -f "$func_name")
for arg; do
if [[ $arg = *=* ]]; then ## if we already look like an assignment, leave be
printf -v arg_q 'local %q; ' "$arg"
else ## otherwise, assume we're a bare name and run a lookup
printf -v arg_q 'local %q=%q; ' "$arg" "${!arg}"
fi
assignments+="$arg_q"
done
# suffix first instance of { in the function definition with our assignments
eval "${func_text/{/{ $assignments}"
}
...thereafter:
foo=hello
dummy() {
local local_foo="$foo"
echo "$local_foo"
}
with_locals dummy foo ## redefine dummy to capture the value "foo" has right now
foo=''
dummy
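Run as written, this should print hello even though foo was cleared afterwards, because with_locals baked foo's value into dummy's redefinition.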
Well, you can comment out or remove the foo='' line, and that will do it. The function dummy does not execute until you call it, which is after you've blanked out the foo value, so it makes sense that you would get a blank line echoed. Hope this helps.
There is no way to execute the code inside a function unless that function gets called by bash. The only alternative is to call some other function that defines the function you want to call afterwards.
That is what a dynamic function definition is.
I don't believe that you want that.
An alternative is to store the value of foo by calling the function once, and then call it again after the value has changed. Something hackish like this:
#!/bin/bash
foo="hello"
dummy() {
${global_foo+false} &&
global_foo="$foo" ||
echo "old_foo=$global_foo new_foo=$foo"
}
dummy
foo='new'
dummy
foo="a whole new foo"
dummy
Calling it will print:
$ ./script
old_foo=hello new_foo=new
old_foo=hello new_foo=a whole new foo
As I am not sure this addresses your real problem, just: hope this helps.
Inspired by @CharlesDuffy, I think using eval might solve some of the problems, and the example can be modified as follows:
#!/bin/bash
foo="hello"
eval "
dummy() {
local local_foo=$foo
echo \$local_foo
}
"
foo=''
dummy
Which will give the result 'hello' instead of nothing.
@CharlesDuffy pointed out that such a solution is quite dangerous:
local local_foo=$foo is dangerously buggy: If your foo value contains
an expansion such as $(rm -rf $HOME), it'll be executed
Using eval is good for performance, but bad for security, and therefore I'd suggest @CharlesDuffy's answer instead.
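A sketch of a safer variant (my own addition, in the spirit of the with_locals approach above) is to %q-quote the captured value before baking it into the definition:
foo='hello'
printf -v foo_q '%q' "$foo"    # quote the value so any $(...) inside it can never execute
eval "
dummy() {
    local local_foo=$foo_q
    echo \"\$local_foo\"
}
"
foo=''
dummy    # prints: hello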
I need to pass a function as a parameter in Bash. For example, the following code:
function x() {
echo "Hello world"
}
function around() {
echo "before"
eval $1
echo "after"
}
around x
Should output:
before
Hello world
after
I know eval is not correct in that context but that's just an example :)
Any idea?
If you don't need anything fancy like delaying the evaluation of the function name or its arguments, you don't need eval:
function x() { echo "Hello world"; }
function around() { echo before; $1; echo after; }
around x
does what you want. You can even pass the function and its arguments this way:
function x() { echo "x(): Passed $1 and $2"; }
function around() { echo before; "$#"; echo after; }
around x 1st 2nd
prints
before
x(): Passed 1st and 2nd
after
I don't think anyone quite answered the question. He didn't ask if he could echo strings in order; rather, the author of the question wants to know if he can simulate function-pointer behavior.
There are a couple of answers that are much like what I'd do, and I want to expand on them with another example.
From the author:
function x() {
echo "Hello world"
}
function around() {
echo "before"
($1) <------ Only change
echo "after"
}
around x
To expand this, we will have function x echo "Hello world:$1" to show when the function execution really occurs. We will pass a string that is the name of the function "x":
function x() {
echo "Hello world:$1"
}
function around() {
echo "before"
($1 HERE) <------ Only change
echo "after"
}
around x
To describe this: the string "x" is passed to the function around(), which echoes "before", calls the function x (via the variable $1, the first parameter passed to around) passing the argument "HERE", and finally echoes "after".
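Run as written, the expected output is:
before
Hello world:HERE
after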
As another aside, this is the methodology for using variables as function names. The variables actually hold the string that is the name of the function, and ($variable arg1 arg2 ...) calls the function, passing the arguments. See below:
function x(){
echo $3 $1 $2 <== just rearrange the order of passed params
}
Z="x" # or just Z=x
($Z 10 20 30)
gives: 30 10 20, where we executed the function named "x" stored in variable Z and passed parameters 10 20 and 30.
Above, we reference functions by assigning their names to variables, so we can use the variable in place of actually knowing the function name. This is similar to a classic function-pointer situation in C, where you generalize program flow but pre-select the function calls you will make based on command-line arguments.
In bash these are not function pointers, but variables that refer to names of functions that you later use.
There's no need to use eval:
function x() {
echo "Hello world"
}
function around() {
echo "before"
var=$($1)
echo "after $var"
}
around x
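Note that var=$($1) captures x's stdout instead of letting it print in place, so this version prints before followed by after Hello world.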
You can't pass anything to a function other than strings. Process substitutions can sort of fake it; bash tends to hold the FIFO open until the command it's expanded for completes.
Here's a quick silly one
foldl() {
echo $(($(</dev/stdin)$2))
} < <(tr '\n' "$1" <$3)
# Sum 20 random ints from 0-998
foldl + 0 <(while ((n=RANDOM%999,x++<20)); do echo $n; done)
Functions can be exported, but this isn't as interesting as it first appears. I find it's mainly useful for making debugging functions accessible to scripts or other programs that run scripts.
(
id() {
"$#"
}
export -f id
exec bash -c 'echowrap() { echo "$1"; }; id echowrap hi'
)
id still only gets a string that happens to be the name of a function (automatically imported from a serialization in the environment) and its args.
Pumbaa80's comment on another answer is also good (eval $(declare -F "$1")), but it's mainly useful for arrays, not functions, since they're always global. If you were to run this within a function, all it would do is redefine it, so there's no effect. It can't be used to create closures or partial functions or "function instances" dependent on whatever happens to be bound in the current scope. At best this can be used to store a function definition in a string which gets redefined elsewhere, but those functions can also only be hardcoded, unless of course eval is used.
Basically Bash can't be used like this.
A better approach is to use local variables in your functions. The problem then becomes how you get the result to the caller. One mechanism is to use command substitution:
function myfunc()
{
local myresult='some value'
echo "$myresult"
}
result=$(myfunc) # or result=`myfunc`
echo "$result"
Here the result is output to the stdout and the caller uses command substitution to capture the value in a variable. The variable can then be used as needed.
You should have something along the lines of:
function around()
{
echo 'before';
echo `$1`;
echo 'after';
}
You can then call around x
eval is likely the only way to accomplish it. The only real downside is the security aspect, as you need to make sure that nothing malicious gets passed in and that only functions you want to be called will be called (along with checking that the string doesn't contain nasty characters like ';').
So if you're the one calling the code, then eval is likely the only way to do it. Note that there are other forms of eval that would likely work too, involving subcommands ($() and ``), but they're not safer and are more expensive.
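If you do go the eval route, one possible guard (a sketch of my own, not from this answer) is to check with declare -F that the name really is a defined function before invoking it:
call_if_function() {
    if declare -F -- "$1" >/dev/null; then
        "$@"    # $1 names a defined function; call it with its arguments
    else
        printf 'not a function: %s\n' "$1" >&2
        return 1
    fi
}
call_if_function x    # runs x only if it is a defined function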