I have these two functions in a bash script. I am just trying to pass arguments directly from one function to another without using global vars, but I can't seem to do it.
function suman {
NODE_EXEC_ARGS= "--inspect";
__handle_global_suman "${NODE_EXEC_ARGS}" "$#"
}
function __handle_global_suman {
# I want $1 to be node exec args and $2 to be args to node script
node $1 ${Z}/cli.js $2;
}
the problem I am having: in the __handle_global_suman function,
the values for $1 and $2 seem to represent the original arguments passed to the suman function, not the arguments passed to __handle_global_suman! I want to be able to access the arguments pass to the __handle_global_suman function.
One solution is to use global variables like the following (but this is bad programming in general):
NODE_EXEC_ARGS=""; // default
ORIGINAL_ARGS=""; // default
function suman {
NODE_EXEC_ARGS="--inspect";
ORIGINAL_ARGS="$#"; // assume this captures the arguments passed to this function, not the original script...
__handle_global_suman
}
# ideally there would be a way to make this function truly private
function __handle_global_suman {
# I want $1 to be node exec args and $2 to be args to node script
node ${NODE_EXEC_ARGS} ${Z}/cli.js ${ORIGINAL_ARGS};
}
hopefully you see what I am trying to do and can help, thanks
In the below, we're passing an argument list stored in a local variable by reference:
suman() {
local -a args=( --inspect --debug-brk )
__handle_global_suman args "$@"
}
__handle_global_suman() {
local ref="$1[#]"; shift
node "${!ref}" "${Z}/cli.js" "$#"
}
Why is this different? Because we could also pass:
local -a args=( --inspect --argument-with-spaces="hello cruel world" )
...and --argument-with-spaces=... would be passed through correctly, as exactly one argument.
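For instance, here is a quick runnable sketch of the same indirect-expansion trick (print_all and demo are hypothetical names, not from the original code):
print_all() {
local ref="$1[@]"; shift
printf '<%s> ' "${!ref}" "$@"; echo
}
demo() {
local -a args=( --inspect --argument-with-spaces="hello cruel world" )
print_all args extra1 extra2
}
demo
# prints: <--inspect> <--argument-with-spaces=hello cruel world> <extra1> <extra2>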
Your explanation is a little unclear, but I think I get the gist: word splitting isn't working as you expected in Bash.
You need to quote the ${NODE_EXEC_ARGS} parameter to your second function, since if it is empty or all whitespace it will be stripped out during word splitting and won't form a parameter to the called function:
__handle_global_suman "${NODE_EXEC_ARGS}" ${ORIGINAL_ARGS}
Also the ${ORIGINAL_ARGS} var is redundant in your example. You should just pass "$@" directly:
__handle_global_suman "${NODE_EXEC_ARGS}" "$#"
The proposed global-variable solution definitely isn't necessary, definitely is bad practice, and you can definitely achieve what you want with Bash function parameter passing.
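A quick demonstration of the stripping mentioned above (show_args is a hypothetical helper, not part of the original code):
show_args() { echo "got $# argument(s)"; }
EMPTY=""
show_args $EMPTY      # got 0 argument(s): the unquoted empty variable vanishes
show_args "$EMPTY"    # got 1 argument(s): quoting preserves it as an (empty) parameter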
Here is what works for me, thanks to the help from @S.Pinkus, whose answer I upvoted since the information in it was all I needed to fix the problem. Note that the below works, but @CDhuffy's answer is in theory more generic and therefore better.
function suman {
__handle_global_suman "--inspect --debug-brk" "$#"
}
function __handle_global_suman {
# ${1} is "--inspect --debug-brk"
# ${@:2} is whatever I passed to the suman function
node ${1} ${Z}/cli.js "${@:2}";
}
bash is awesome ;) ...not
I have these two functions:
function two() {
local -a -x var=( ${var[@]} )
echo "${var[@]}"
}
function one() {
local -a -x var=(11 22 33)
two
}
If I call one, then nothing is printed. Why is that?
Here you're using the same identifier name var in both the functions.
The var you defined in one can be accessed by two because two is called from one. However,
when declaring and setting a local variable in a single command,
the order of operations is apparently to first restrict the variable to
local scope, and only afterwards perform the assignment.
So in
local -a -x var=( "${var[#]}" )
the ${var[#]} part will be empty as the variable var is set local first.
To verify this you could change the variable name in one to var1 and in two do
local -a -x var=( "${var1[#]}" ) # var1 though local to one should be accessible here.
You could use @inian's answer as a workaround to pass variables easily and not have to worry about such dark corners in bash.
Your code does not reflect what you are trying to do! The locals you've defined are only within the scope of the function, yes! But if you are passing them to another function, pass them as positional arguments with "$@". In the function below, when you do two "${var[@]}", you are passing the local array as the positional argument list to be used in the other function.
two() {
local -a -x var=( "$#" )
echo "${var[#]}"
}
The argument list "$#" represents the argument list passed to the function two, now from the function one pass it as
one() {
local -a -x var=(11 22 33)
two "${var[#]}"
}
Also, the use of the function keyword is non-standard; POSIX does not define it. If you are planning to re-use the script across multiple shells, drop the keyword. Also quote your variables/arrays to avoid word splitting and glob expansion, which could put unexpected values in the final array.
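A quick illustration of why the quoting matters for the final array:
set -- "two words"
arr1=( $@ );   echo "${#arr1[@]}"   # 2: unquoted expansion word-splits the argument
arr2=( "$@" ); echo "${#arr2[@]}"   # 1: quoting keeps it as a single element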
Also worth noting that variables/arrays are global by default unless you override with local keyword inside a function.
$ x=2
$ test_local(){ local x=1; }
$ test_local; echo "$x"
2
But the same without local would print the value as 1, which proves the point explained above.
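For comparison, a sketch of the same test without local:
$ x=2
$ test_global(){ x=1; }
$ test_global; echo "$x"
1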
When you declare var in two, that declaration hides the one in one. Curiously, local variables are visible in called functions. The easiest way to make this work is to do nothing: simply access $var in two.
two() {
echo "${var[#]}"
}
one() {
local var=(11 22 33)
two "${var[#]}"
}
I don't necessarily recommend doing this, though. It makes it hard to understand what two does just by reading it. It's better to explicitly pass the values as arguments.
two() {
local var=("$#")
echo "${var[#]}"
}
one() {
local var=(11 22 33)
two "${var[#]}"
}
By the way, you should always quote your variable expansions to prevent them from being subjected to word splitting and globbing. In your original code you should quote ${var[@]}:
local -a -x var=( "${var[@]}" )
Also, for portability you should either write one() or function one, but not both. I prefer the former.
I have two variables in a bash script. One contains the name of a function within the script, while the other is an array containing KEY=VALUE or KEY='VALUE WITH SPACES' pairs. They are the result of parsing a specific file, and I can't change this.
What I want to do is to invoke the function whose name I got. This is quite simple:
# get the value for the function
myfunc="some_function"
# invoke the function whose name is stored in $myfunc
$myfunc
Consider the function foo be defined as
function foo
{
echo "MYVAR: $MYVAR"
echo "MYVAR2: $MYVAR2"
}
If I get the variables
funcname="foo"
declare -a funcenv=(MYVAR=test "MYVAR2='test2 test3'")
How would I use them to call foo with the pairs of funcenv being added to the environment? A (non-variable) invocation would look like
MYVAR=test MYVAR2='test2 test3' foo
I tried to script it like
"${funcenv[#]}" "$funcname"
But this leads to an error (MYVAR=test: command not found).
How do I properly call the function with the arguments of the array put in its environment (I do not want to export them, they should just be available for the invoked function)?
You can do it like this:
declare -a funcenv=(MYVAR=test "MYVAR2='test2 test3'")
for pairs in "${funcenv[#]}"; do
eval "$pairs"
done
"$funcname"
Note however that the variables will be visible outside the function too.
If you want to avoid that, then you can wrap all the above in a (...) subshell.
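A sketch of that subshell variant, reusing funcenv and funcname from above, so the assignments don't leak into the calling shell:
(
for pairs in "${funcenv[@]}"; do
eval "$pairs"
done
"$funcname"
)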
Why don't you pass them as arguments to your function?
function f() { echo "first: $1"; echo "second: $2"; }
fn=f; $fn oneword "two words"
I hope that I can do something like this, and the output would be "hello":
#!/bin/bash
foo="hello"
dummy() {
local local_foo=`echo $foo`
echo $local_foo
}
foo=''
dummy
What this question means: I would like to capture the value of some global variables at definition time (the functions are usually loaded via source blablabla.bash), so that the defined function captures the variables' current values.
The Sane Way
Functions are evaluated when they're run, not when they're defined. Since you want to capture a variable as it exists at definition time, you'll need a separate variable assigned at that time.
foo="hello"
# By convention, global variables prefixed by a function name and double underscore are for
# the exclusive use of that function.
readonly dummy__foo="$foo" # capture foo as of dummy definition time, and prevent changes
dummy() {
local local_foo=$dummy__foo # ...and refer to that captured copy
echo "$local_foo"
}
foo=""
dummy
The Insane Way
If you're willing to commit crimes against humanity, however, it is possible to do code generation to capture a value. For instance:
# usage: with_locals functionname k1=v1 [k2=v2 [...]]
with_locals() {
local func_name func_text assignments
func_name=$1; shift || return ## fail if out of arguments
(( $# )) || return ## noop if not given at least one assignment
func_text=$(declare -f "$func_name")
for arg; do
if [[ $arg = *=* ]]; then ## if we already look like an assignment, leave be
printf -v arg_q 'local %q; ' "$arg"
else ## otherwise, assume we're a bare name and run a lookup
printf -v arg_q 'local %q=%q; ' "$arg" "${!arg}"
fi
assignments+="$arg_q"
done
# suffix first instance of { in the function definition with our assignments
eval "${func_text/{/{ $assignments}"
}
...thereafter:
foo=hello
dummy() {
local local_foo="$foo"
echo "$local_foo"
}
with_locals dummy foo ## redefine dummy to capture the value of "foo" as of now
foo=''
dummy
Well, you can comment out or remove the foo='' line, and that will do it. The function dummy does not execute until you call it, which is after you've blanked out the foo value, so it makes sense that you would get a blank line echoed. Hope this helps.
There is no way to execute the code inside a function unless that function gets called by bash. The only alternative is to call some other function that defines the function you want to call afterwards.
That is what a dynamic function definition is.
I don't believe that you want that.
An alternative is to store the value of foo (by calling the function) and then call the function again after the value has changed. Something hackish like this:
#!/bin/bash
foo="hello"
dummy() {
# On the first call, ${global_foo+false} expands to nothing (an empty command,
# status 0), so the current foo is captured; on later calls it expands to
# false, so the || branch reports the old value alongside the new one.
${global_foo+false} &&
global_foo="$foo" ||
echo "old_foo=$global_foo new_foo=$foo"
}
dummy
foo='new'
dummy
foo="a whole new foo"
dummy
Calling it will print:
$ ./script
old_foo=hello new_foo=new
old_foo=hello new_foo=a whole new foo
As I am not sure this addresses your real problem, just: hope this helps.
Inspired by @CharlesDuffy, I think using eval might solve some of the problems, and the example can be modified as follows:
#!/bin/bash
foo="hello"
eval "
dummy() {
local local_foo=$foo
echo \$local_foo
}
"
foo=''
dummy
This will give the result 'hello' instead of nothing.
@CharlesDuffy pointed out that such a solution is quite dangerous:
local local_foo=$foo is dangerously buggy: If your foo value contains
an expansion such as $(rm -rf $HOME), it'll be executed
Using eval is good for performance but bad for security, and therefore I'd suggest @CharlesDuffy's answer.
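That said, if you do want the eval-based definition, quoting the captured value with printf %q before splicing it in reduces that risk; this is my own sketch, not from the answers above:
#!/bin/bash
foo="hello"
printf -v foo_q '%q' "$foo"   # quote foo's current value so it splices in as a literal
eval "
dummy() {
local local_foo=$foo_q
echo \"\$local_foo\"
}
"
foo=''
dummy   # prints: hello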
I need to pass a function as a parameter in Bash. For example, the following code:
function x() {
echo "Hello world"
}
function around() {
echo "before"
eval $1
echo "after"
}
around x
Should output:
before
Hello world
after
I know eval is not correct in that context but that's just an example :)
Any idea?
If you don't need anything fancy like delaying the evaluation of the function name or its arguments, you don't need eval:
function x() { echo "Hello world"; }
function around() { echo before; $1; echo after; }
around x
does what you want. You can even pass the function and its arguments this way:
function x() { echo "x(): Passed $1 and $2"; }
function around() { echo before; "$#"; echo after; }
around x 1st 2nd
prints
before
x(): Passed 1st and 2nd
after
I don't think anyone quite answered the question. The author didn't ask whether he could echo strings in order; rather, he wants to know whether he can simulate function-pointer behavior.
There are a couple of answers that are much like what I'd do, and I want to expand it with another example.
From the author:
function x() {
echo "Hello world"
}
function around() {
echo "before"
($1)          # <------ Only change
echo "after"
}
around x
To expand on this, we will have function x echo "Hello world:$1" to show when the function execution really occurs. We will pass a string that is the name of the function, "x":
function x() {
echo "Hello world:$1"
}
function around() {
echo "before"
($1 HERE)     # <------ Only change
echo "after"
}
around x
To describe this: the string "x" is passed to the function around(), which echoes "before", calls the function x (via the variable $1, the first parameter passed to around) passing the argument "HERE", and finally echoes "after".
As another aside, this is the methodology for using variables as function names. The variables actually hold the string that is the name of the function, and ($variable arg1 arg2 ...) calls the function, passing the arguments. See below:
function x(){
echo $3 $1 $2   # just rearrange the order of the passed params
}
Z="x" # or just Z=x
($Z 10 20 30)
gives: 30 10 20, where we executed the function named "x" stored in the variable Z and passed the parameters 10, 20, and 30.
Above, we reference functions by assigning their names to variables, so we can use the variable in place of the actual function name (similar to a classic function-pointer pattern in C for generalizing program flow while pre-selecting the function calls based on command-line arguments).
In bash these are not function pointers, but variables holding the names of functions that you later call.
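For example, a small sketch of that dispatch pattern, selecting the function to call from a command-line argument (start/stop are made-up names):
start() { echo "starting"; }
stop()  { echo "stopping"; }
action="$1"              # e.g. ./script start
case "$action" in
start|stop) "$action" ;;
*) echo "usage: $0 {start|stop}" >&2; exit 1 ;;
esac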
There's no need to use eval:
function x() {
echo "Hello world"
}
function around() {
echo "before"
var=$($1)
echo "after $var"
}
around x
You can't pass anything to a function other than strings. Process substitutions can sort of fake it; Bash tends to hold the FIFO open until the command it's expanded for completes.
Here's a quick silly one
foldl() {
echo $(($(</dev/stdin)$2))
} < <(tr '\n' "$1" <$3)
# Sum 20 random ints from 0-998
foldl + 0 <(while ((n=RANDOM%999,x++<20)); do echo $n; done)
Functions can be exported, but this isn't as interesting as it first appears. I find it's mainly useful for making debugging functions accessible to scripts or other programs that run scripts.
(
id() {
"$#"
}
export -f id
exec bash -c 'echowrap() { echo "$1"; }; id echowrap hi'
)
id still only gets a string that happens to be the name of a function (automatically imported from a serialization in the environment) and its args.
Pumbaa80's comment on another answer is also good (eval $(declare -F "$1")), but it's mainly useful for arrays, not functions, since functions are always global. If you were to run it within a function, all it would do is redefine the function, so there's no effect. It can't be used to create closures or partial functions or "function instances" dependent on whatever happens to be bound in the current scope. At best it can be used to store a function definition in a string which gets redefined elsewhere - but those functions can also only be hardcoded, unless of course eval is used.
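For instance, a sketch of storing a definition in a string and redefining it later:
f() { echo "hello from f"; }
fdef=$(declare -f f)   # capture the full definition as text
unset -f f
eval "$fdef"           # redefine it from the string
f                      # hello from f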
Basically Bash can't be used like this.
A better approach is to use local variables in your functions. The problem then becomes how to get the result to the caller. One mechanism is to use command substitution:
function myfunc()
{
local myresult='some value'
echo "$myresult"
}
result=$(myfunc) # or result=`myfunc`
echo $result
Here the result is output to the stdout and the caller uses command substitution to capture the value in a variable. The variable can then be used as needed.
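On Bash 4.3 and newer, a nameref is another option for getting a result to the caller without the command-substitution subshell; a sketch (myfunc2 and result are my own names):
myfunc2() {
local -n out=$1        # nameref: 'out' aliases the caller's variable
out='some value'
}
myfunc2 result
echo "$result"         # some value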
You should have something along the lines of:
function around()
{
echo 'before';
echo `$1`;
echo 'after';
}
You can then call around x
eval is likely the only way to accomplish it. The only real downside is the security aspect, as you need to make sure that nothing malicious gets passed in and that only the functions you want called get called (along with checking that the input doesn't contain nasty characters like ';').
So if you're the one calling the code, then eval is likely the only way to do it. Note that there are other forms of eval that would likely work too, involving subcommands ($() and ``), but they're no safer and are more expensive.
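If you do go that route, it's worth validating the name first; here is my own guarded variation of around (not from the answers above), which checks the name and then calls it directly, sidestepping eval entirely:
around() {
# refuse anything that is not the name of an existing function
if ! declare -F "$1" >/dev/null; then
echo "not a function: $1" >&2
return 1
fi
echo "before"
"$1"
echo "after"
}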
Is it possible to do the following:
I want to run the following:
mongodb bin/mongod
In my bash_profile I have
alias = "./path/to/mongodb/$1"
An alias will expand to the string it represents. Anything after the alias will appear after its expansion; it does not need to be, and cannot be, passed as explicit arguments (e.g. $1).
$ alias foo='/path/to/bar'
$ foo some args
will get expanded to
$ /path/to/bar some args
If you want to use explicit arguments, you'll need to use a function
$ foo () { /path/to/bar "$@" fixed args; }
$ foo abc 123
will be executed as if you had done
$ /path/to/bar abc 123 fixed args
To undefine an alias:
unalias foo
To undefine a function:
unset -f foo
To see the type and definition (for each defined alias, keyword, function, builtin or executable file):
type -a foo
Or type only (for the highest precedence occurrence):
type -t foo
To use parameters in aliases, I use this method:
alias myalias='function __myalias() { echo "Hello $*"; unset -f __myalias; }; __myalias'
It's a self-destructing function wrapped in an alias, so it's pretty much the best of both worlds, and it doesn't take up extra lines in your definitions... which I hate. Oh yeah, and if you need the return value, you'll have to store it before calling unset, and then return it using the "return" keyword in that self-destructing function:
alias myalias='function __myalias() { echo "Hello $*"; myresult=$?; unset -f __myalias; return $myresult; }; __myalias'
So...
You could, if you need to have that variable in there:
alias mongodb='function __mongodb() { ./path/to/mongodb/$1; unset -f __mongodb; }; __mongodb'
of course...
alias mongodb='./path/to/mongodb/'
would actually do the same thing without the need for parameters, but like I said, if you want or need them for some reason (for example, you need $2 instead of $1), you would need to use a wrapper like that. If it is bigger than one line, consider just writing a function outright, since it will become more of an eyesore as it grows. Functions are great since you get all the perks that functions provide (see completion, traps, bind, etc. in the bash man page for the goodies that functions can offer).
I hope that helps you out :)
Usually when I want to pass arguments to an alias in Bash, I use a combination of an alias and a function like this, for instance:
function __t2d {
if [ "$1x" != 'x' ]; then
date -d "#$1"
fi
}
alias t2d='__t2d'
This is a solution that avoids using a function:
alias addone='{ num=$(cat -); echo "input: $num"; echo "result:$(($num+1))"; }<<<'
Test result:
addone 200
input: 200
result:201
In csh (as opposed to bash) you can do exactly what you want.
alias print 'lpr \!^ -Pps5'
print memo.txt
The notation \!^ causes the argument to be inserted in the command at this point.
The ! character is preceded by a \ to prevent it being interpreted as a history command.
You can also pass multiple arguments:
alias print 'lpr \!* -Pps5'
print part1.ps glossary.ps figure.ps
(Examples taken from http://unixhelp.ed.ac.uk/shell/alias_csh2.1.html .)
To simplify leed25d's answer, use a combination of an alias and a function. For example:
function __GetIt {
cp ./path/to/stuff/$* .
}
alias GetIt='__GetIt'