Evaluate variable at time of function declaration in shell - bash

I'm setting up my shell environments and I want to be able to use some of the same functions/aliases in zsh as in bash. One of these functions opens either .bashrc or .zshrc in an editor (whichever file is relevant), waits for the editor to close, then reloads the rc file.
# a very simplified version of this function
editrc() {
  local rcfile=".$(basename $SHELL)rc"
  code -w ~/$rcfile
  . ~/$rcfile
}
I use the value of rcfile in a few other functions, so I've pulled it out of the function declaration.
_rc=".$(basename $SHELL)rc"
editrc() {
  code -w ~/$_rc
  . ~/$_rc
}
# ... other functions that use it ...
unset _rc
However, because I'm a neat freak, I want to unset _rc at the end of my script, but I still want my functions to run correctly. Is there a clever way to evaluate $_rc at the time the function is declared?
I know I could use eval and place everything except $_rc instances within single quotes, but that seems like a pain, since the full version of my function uses both single-quotes and double-quotes.
_rc=".$(basename $SHELL)rc"
eval 'editrc() {
echo Here'"'"'s a thing that uses single quotes. As you can see it'"'"'s a pain.
code -w ~/'$_rc'
. ~/'$_rc'
}'
# ... other functions using `_rc`
unset _rc
I'm guessing I could declare my functions, then do some magic with eval "$(declare -f editrc | awk)". It may very well be more pain than it's worth, but I'm always interested in learning new things.
Note: I'd love to generalize this into a utility function that does this.
_myvar=foo
anothervar=bar
myfunc() {
echo $_myvar $anothervar
}
# redeclares myfunc with `$_myvar` expanded, but leaves `$anothervar` as-is
expandfunctionvars myfunc '$_myvar'

Is there a clever way to evaluate $_rc at the time the function is declared?
_rc=".$(basename "$SHELL")rc"
# while you could eval here, source lets you work with a stream
source <(
  cat <<EOF
editrc() {
  local _rc
  # first safely transfer context
  $(declare -p _rc)
EOF
  # use a quoted here document to write the rest without caring about quoting.
  cat <<'EOF'
  # do anything else
  echo "Here's a thing that uses single quotes. As you can see it's not a pain, just choose proper quoting."
  code -w ~/"$_rc"
  . ~/"$_rc"
}
EOF
)
unset _rc
Generally, first use declare -p to transfer variables as properly quoted strings to be evaluated. Then, after you have "imported" the variables, use a quoted here document to write the rest as you would in a normal script.
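For illustration, this is the kind of string declare -p emits, and why it is safe to re-source (a minimal sketch; the exact formatting is bash's):
_rc=".bashrc"
declare -p _rc
# prints: declare -- _rc=".bashrc"
msg="it's \"quoted\""
declare -p msg
# prints: declare -- msg="it's \"quoted\""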
References to read:
<<EOF is a here document. Note the difference in parsing when the here delimiter is quoted vs unquoted.
<(..) is a process substitution
The source command reads a pipe created by process substitution. Inside the process substitution I output the function to be sourced. With the first here document I output the beginning of the function definition, with a local of the variable so that it doesn't pollute the global namespace. Then with declare -p I output the variable definition as a properly quoted string, later to be evaluated by source. Then with a quoted here document I output the rest of the function, so that I do not need to care about quoting.
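To make the quoted-vs-unquoted delimiter point concrete, a minimal sketch (not from the original answer):
name=world
cat <<EOF
unquoted delimiter: $name expands
EOF
cat <<'EOF'
quoted delimiter: $name stays literal
EOF
# output:
# unquoted delimiter: world expands
# quoted delimiter: $name stays literal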
The code is bash specific, I know nothing about zsh and don't use it.
You could do it with eval too:
eval '
editrc() {
  local _rc
  # first safely transfer context
  '"$(declare -p _rc)"'
  # do anything else; inside the single quotes nothing expands at definition time
  echo "Here'\''s a thing that uses single quotes. As you can see it'\''s not a pain, just choose proper quoting."
  code -w ~/"$_rc"
  . ~/"$_rc"
}'
But for me using a quoted here document delimiter allows for easier writing.

While KamilCuck was working on their answer, I devised a function that will take in any function name and a set of variable names, expand just those variables, and redeclare the function.
expandFnVars() {
  if [[ $# -lt 2 ]]; then
    >&2 echo 'expandFnVars requires at least two arguments: the function name and the variable(s) to be expanded'
    return 1
  fi
  local fn="$1"
  shift
  local vars=("$@")
  if [[ -z "$(declare -F $fn 2> /dev/null)" ]]; then
    >&2 echo $fn is not a function.
    return 1
  fi
  foundAllVars=true
  for v in "${vars[@]}"; do
    if [[ -z "$(declare -p $v 2> /dev/null)" ]]; then
      >&2 echo $v is not a declared value.
      foundAllVars=false
    fi
  done
  [[ $foundAllVars != true ]] && return 1
  fn="$(declare -f $fn)"
  for v in "${vars[@]}"; do
    local val="$(eval 'echo $'$v)" # get the value of the variable represented by $v
    val="${val//\"/\\\"}" # escape any double-quotes
    val="${val//\\/\\\\\\}" # escape any backslashes
    fn="$(echo "$fn" | sed -r 's/"?\$'$v'"?/"'"$val"'"/g')" # replace instances of the variable, quoted or not, with its value
  done
  eval "$fn"
}
Usage:
foo="foo bar"
bar='$foo'
baz=baz
fn() {
  echo $bar $baz
}
expandFnVars fn bar
declare -f fn
# prints:
# fn ()
# {
#     echo "$foo" $baz
# }
expandFnVars fn foo
declare -f fn
# prints:
# fn ()
# {
#     echo "foo bar" $baz
# }
Looking at it now, I see one flaw. Suppose $bar in the original function was in single-quotes. We probably would not want its value to be replaced. This could be fixed by some clever regex lookbehinds that count the number of unescaped single quotes, but I'm happy with it as-is.

Related

Are there lisp-like macros in the shell?

I have a set of shell commands that look like this
if check-some-condition $a; then
  do stuff
  run-exit-code $a
fi
where check-some-condition and run-exit-code could be replaced by functions taking a single argument $a, while do stuff is a placeholder for possibly several shell commands. Is it possible to emulate the Lisp functionality of a macro where I could just write
(my-macro $a stuff)
and have it replaced by the code above? I am using Bash but I can use any other shell if they have features that make this easier. I thought at first of using functions but I don't think I can pass in a block of commands.
There isn't a macro definition system in the shell, and consequently shell syntax is not walked in order to expand macros. However, the shell has a textual eval command. You can write a function which synthesizes shell syntax, such as by inserting arguments it has been given, into a template. The function can print that syntax, which the caller can capture using $(...) command substitution syntax and pass to eval:
eval $(macro-like foo bar)
The expansion will happen every time that line of code is executed.
I've done something like this on a very small number of occasions. I don't remember all the details, but I remember that the code was also taking advantage of Bash local variables, which have dynamic scope, like ancient Lisp dialects and defvar variables in Common Lisp.
In Bash, eval takes place in a dynamic environment which sees the surrounding local variables, which is something you can exploit; it can help bring about some macro-like semantics. In a Lisp with lexically scoped local variables, eval-ed code has no access to those variables, but macro-substituted code does. Under dynamic scope, evaled code can access and assign surrounding locals.
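As a tiny aside before the fuller example below (my own illustration, not from the original answer), eval-ed text runs in the dynamic environment of its caller, so it can read and assign the surrounding locals:
demo() {
  local greeting="hi"
  # the eval-ed string sees and modifies the caller's local
  eval 'greeting="$greeting there"'
  echo "$greeting"
}
demo   # prints: hi there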
Here is an example. Note that because eval is a command which has no control over expansions taking place in its argument space because it is called (analogously to eval in Lisp being a function which doesn't control argument evaluation), the client code is encumbered with quoting responsibilities.
# $1 = variable
# $2 = low
# $3 = high
# $4 = body
dofor()
{
cat <<!
$1=$2 ;
while [ \$$1 -lt $3 ] ; do
$4
$1=\$(( $1 + 1 ))
done
!
}
eval "$(dofor i 0 100 'printf "[%d]\n" $i')"
We could make it so that
eval $(dofor i 0 100 'printf "[%d]\n" $i')
works without the quotes, at the cost of more heaps of arcane escapery inside dofor.
Imagine we extended the shell with a built-in command evalcmd, which let us write this instead of the above:
evalcmd dofor i 0 100 'printf "[%d]\n" $i'
Can we write that as a shell function? It turns out, yes:
# run the command specified in the arguments
# capturing its output, which is evaled in quotes
evcmd()
{
  eval "$("$@")"
}
evcmd dofor i 0 100 'printf "[%d]\n" $i'
Now, though still monstrously inefficient, it's substantially more ergonomic.
Finally, let's ask: could we split dofor into a dofor_impl which generates the code, and a dofor command which calls dofor_impl and invokes the evcmd semantics? Also, yes:
dofor_impl()
{
cat <<!
$1=$2 ;
while [ \$$1 -lt $3 ] ; do
$4
$1=\$(( $1 + 1 ))
done
!
}
dofor()
{
  # like evcmd, but inserting an operator into the left position
  eval "$(dofor_impl "$@")"
}
dofor i 0 100 'printf "[%d]\n" $i'
This is not bad for some simple uses, but what we can't achieve is avoiding the quotes: the $i has to be quoted so that the substitution doesn't take place before dofor is invoked.
In Bash you can define functions as follows:
function run_exit_code () {
  echo "EXIT $1"
}
function check_some_condition () {
  echo "CHECKING $1";
  true
}
And your code can execute commands associated with variables:
function my_code () {
  var=$1
  stuff=$2
  if check_some_condition $var; then
    echo "OK";
    $stuff;
    run_exit_code $var
  fi
}
So you can write, for example:
$ my_code /tmp 'ls /'
CHECKING /tmp
OK
bin boot cdrom dev etc home lib lib32 lib64 libx32 lost+found media mnt opt proc root run sbin srv swapfile sys tmp usr var
EXIT /tmp
If you want stuff to refer to $var, then you need to add eval:
function my_code () {
  var=$1
  stuff=$2
  if check_some_condition $var; then
    echo "OK";
    eval $stuff; # <<< eval
    run_exit_code $var
  fi
}
This allows you to write a quoted bash expression and have it evaluated in the context of your function:
$ my_code / 'ls $var'
CHECKING /
OK
bin boot cdrom dev etc home lib lib32 ...
EXIT /

Quoting parameters with spaces for later execution

I have this (test) script:
#!/bin/bash
my_cmd_bad_ ( ) {
  cmd="$@"
  $cmd
}
my_cmd_good_ ( ) {
  "$@"
}
my_cmd_bad_ ls -l "file with space"
my_cmd_good_ ls -l "file with space"
The output is (the file does not exist, which is not the point of this question):
» ~/test.sh
ls: cannot access file: No such file or directory
ls: cannot access with: No such file or directory
ls: cannot access space: No such file or directory
ls: cannot access file with space: No such file or directory
I am surprised that the first version does not work as expected: the parameter is not quoted, and instead of processing one file, it processes three. Why?
How can I save the command that I want to execute, properly quoted? I need to execute it later, where I do not have "$@" anymore.
A simple rework of this test script would be appreciated.
See similar question: How to pass command line parameters with quotes stored in single variable?
Use these utility functions to save a command to a string for later execution:
bash_escape() {
  # backtick indirection strictly necessary here: we use it to strip the
  # trailing newline from sed's output, which Solaris/BSD sed *always* outputs
  # (unlike GNU sed, which outputs "test": printf %s test | sed -e s/dummy//)
  out=`echo "$1" | sed -e s/\\'/\\''\\\\'\\'\\'/g`
  printf \'%s\' "$out"
}
append_bash_escape() {
  printf "%s " "$1"
  bash_escape "$2"
}
your_cmd_fixed_ ( ) {
  cmd=""
  while [ $# -gt 0 ] ; do
    cmd=`append_bash_escape "$cmd" "$1"` ; shift
  done
  $cmd
}
You can quote any single parameter and evaluate it later:
my_cmd_bad_ ( ) {
  j=0
  for i in "$@"; do
    cmd["$j"]=\""$i"\"
    j=$(( $j + 1 ))
  done;
  eval ${cmd[*]}
}
You are combining three space-delimited strings "ls", "-l", and "file with space" into a single space-delimited string cmd. There's no way to know which spaces were originally quoted (in "file with space") and which spaces were introduced during the assignment to cmd.
Typically, it is not a good idea to try to build up command lines into a single string. Use functions, or isolate the actual command and leave the arguments in "$@".
Rewrite the command like this:
my_cmd_bad_ () {
  cmd=$1; shift
  $cmd "$@"
}
See http://mywiki.wooledge.org/BashFAQ/050
Note that your second version is greatly preferred most of the time. The only exceptions are if you need to do something special. For example, you can't bundle an assignment or redirect or compound command into a parameter list.
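A quick sketch of that limitation (my own illustration): once a word arrives through the positional parameters it is plain data, so assignments, redirections, and compound commands no longer have their special meaning.
run() { "$@"; }
run echo hello           # fine: a simple command plus its arguments
run x=5                  # fails: bash looks for a command literally named "x=5"
run echo hi '>' out.txt  # ">" reaches echo as a literal argument, not a redirection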
The correct way to handle the quoting issue requires non-standard features. Semi-realistic example involving a template:
function myWrapper {
  typeset x IFS=$' \t\n'
  { eval "$(</dev/fd/0)"; } <<-EOF
for x in $(printf '%q ' "$@"); do
echo "\$x"
done
EOF
}
myWrapper 'foo bar' $'baz\nbork'
Make sure you understand exactly what's going on here and that you really have a good reason for doing this. It requires ensuring side-effects can't affect the arguments. This specific example doesn't demonstrate a very good use case because everything is hard-coded so you're able to correctly escape things in advance and expand the arguments quoted if you wanted.

Is there a way to avoid positional arguments in bash?

I have to write a function in bash. The function will take about 7 arguments. I know that I can call a function like this:
To call a function with parameters:
function_name $arg1 $arg2
And I can refer my parameters like this inside the function:
function_name () {
echo "Parameter #1 is $1"
}
My question is, is there a better way refer to the parameters inside the function? Can I avoid the $1, $2, $3, .... thing and simply use the $arg1, $arg2, ...?
Is there a proper method for this or do I need to re-assign these parameters to some other variables inside the function? E.g.:
function_name () {
  ARG1=$1
  echo "Parameter #1 is $ARG1"
}
Any example would be much appreciated.
The common way of doing that is assigning the arguments to local variables in the function, i.e.:
copy() {
local from=${1}
local to=${2}
# ...
}
Another solution may be getopt-style option parsing.
copy() {
  local arg from to
  while getopts 'f:t:' arg
  do
    case ${arg} in
      f) from=${OPTARG};;
      t) to=${OPTARG};;
      *) return 1 # illegal option
    esac
  done
}
copy -f /tmp/a -t /tmp/b
Sadly, bash can't handle long options which would be more readable, i.e.:
copy --from /tmp/a --to /tmp/b
For that, you either need to use the external getopt program (which I think has long option support only on GNU systems) or implement the long option parser by hand, i.e.:
copy() {
  local from to
  while [[ ${1} ]]; do
    case "${1}" in
      --from)
        from=${2}
        shift
        ;;
      --to)
        to=${2}
        shift
        ;;
      *)
        echo "Unknown parameter: ${1}" >&2
        return 1
    esac
    if ! shift; then
      echo 'Missing parameter argument.' >&2
      return 1
    fi
  done
}
copy --from /tmp/a --to /tmp/b
Also see: using getopts in bash shell script to get long and short command line options
You can also be lazy, and just pass the 'variables' as arguments to the function, i.e.:
copy() {
  local "${@}"
  # ...
}
copy from=/tmp/a to=/tmp/b
and you'll have ${from} and ${to} in the function as local variables.
Just note that the same issue as below applies — if a particular variable is not passed, it will be inherited from parent environment. You may want to add a 'safety line' like:
copy() {
  local from to # reset first
  local "${@}"
  # ...
}
to ensure that ${from} and ${to} will be unset when not passed.
And if something very bad is of your interest, you could also assign the arguments as global variables when invoking the function, i.e.:
from=/tmp/a to=/tmp/b copy
Then you could just use ${from} and ${to} within the copy() function. Just note that you should then always pass all parameters. Otherwise, a random variable may leak into the function.
from= to=/tmp/b copy # safe
to=/tmp/b copy # unsafe: ${from} may be declared elsewhere
If you have bash 4.1 (I think), you can also try using associative arrays. It will allow you to pass named arguments but it will be ugly. Something like:
args=( [from]=/tmp/a [to]=/tmp/b )
copy args
And then in copy(), you'd need to grab the array.
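The answer stops short of showing that part. One way to grab the array (a sketch of my own, assuming bash 4.3 namerefs, which are newer than the 4.1 associative arrays mentioned):
copy() {
  local -n argmap=$1   # nameref: argmap now aliases the caller's array
  echo "from=${argmap[from]} to=${argmap[to]}"
}
declare -A args=( [from]=/tmp/a [to]=/tmp/b )
copy args   # prints: from=/tmp/a to=/tmp/b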
You can always pass things through the environment:
#!/bin/sh
foo() {
echo arg1 = "$arg1"
echo arg2 = "$arg2"
}
arg1=banana arg2=apple foo
All you have to do is name variables on the way in to the function call.
function test() {
  echo $a
}
a='hello world' test
# prove the variable didn't leak
echo $a .
This isn't just a feature of functions; you could have that function in its own script and call a='hello world' test.sh and it would work just the same.
As an extra little bit of fun, you can combine this method with positional arguments (say you were making a script and some users mightn't know the variable names).
Heck, why not let it have defaults for those arguments too? Well sure, easy peasy!
function test2() {
  [[ -n "$1" ]] && local a="$1"; [[ -z "$a" ]] && local a='hi'
  [[ -n "$2" ]] && local b="$2"; [[ -z "$b" ]] && local b='bye'
  echo $a $b
}
# see the defaults
test2
# use positional as usual
test2 '' there
# use named parameter
a=well test2
# mix it up
b=one test2 nice
# prove the variables didn't leak
echo $a $b .
Note that if test was its own script, you don't need to use the local keyword.
Shell functions have full access to any variable available in their calling scope, except for those variable names that are used as local variables inside the function itself. In addition, any non-local variable set within a function is available on the outside after the function has been called. Consider the following example:
A=aaa
B=bbb
echo "A=$A B=$B C=$C"
example() {
echo "example(): A=$A B=$B C=$C"
A=AAA
local B=BBB
C=CCC
echo "example(): A=$A B=$B C=$C"
}
example
echo "A=$A B=$B C=$C"
This snippet has the following output:
A=aaa B=bbb C=
example(): A=aaa B=bbb C=
example(): A=AAA B=BBB C=CCC
A=AAA B=bbb C=CCC
The obvious disadvantage of this approach is that functions are not self-contained any more and that setting a variable outside a function may have unintended side-effects. It would also make things harder if you wanted to pass data to a function without assigning it to a variable first, since this function is not using positional parameters any more.
The most common way to handle this is to use local variables for arguments and any temporary variable within a function:
example() {
local A="$1" B="$2" C="$3" TMP="/tmp"
...
}
This avoids polluting the shell namespace with function-local variables.
I think I have a solution for you.
With a few tricks you can actually pass named parameters to functions, along with arrays.
The method I developed allows you to access parameters passed to a function like this:
testPassingParams() {
  @var hello
  l=4 @array anArrayWithFourElements
  l=2 @array anotherArrayWithTwo
  @var anotherSingle
  @reference table # references only work in bash >= 4.3
  @params anArrayOfVariedSize
  test "$hello" = "$1" && echo correct
  #
  test "${anArrayWithFourElements[0]}" = "$2" && echo correct
  test "${anArrayWithFourElements[1]}" = "$3" && echo correct
  test "${anArrayWithFourElements[2]}" = "$4" && echo correct
  # etc...
  #
  test "${anotherArrayWithTwo[0]}" = "$6" && echo correct
  test "${anotherArrayWithTwo[1]}" = "$7" && echo correct
  #
  test "$anotherSingle" = "$8" && echo correct
  #
  test "${table[test]}" = "works"
  table[inside]="adding a new value"
  #
  # I'm using * just in this example:
  test "${anArrayOfVariedSize[*]}" = "${*:10}" && echo correct
}
fourElements=( a1 a2 "a3 with spaces" a4 )
twoElements=( b1 b2 )
declare -A assocArray
assocArray[test]="works"
testPassingParams "first" "${fourElements[#]}" "${twoElements[#]}" "single with spaces" assocArray "and more... " "even more..."
test "${assocArray[inside]}" = "adding a new value"
In other words, not only can you call your parameters by their names (which makes for more readable code), you can actually pass arrays (and references to variables - this feature works only in bash 4.3 though)! Plus, the mapped variables are all in the local scope, just as $1 (and others).
The code that makes this work is pretty light and works both in bash 3 and bash 4 (these are the only versions I've tested it with). If you're interested in more tricks like this that make developing with bash much nicer and easier, you can take a look at my Bash Infinity Framework, the code below was developed for that purpose.
Function.AssignParamLocally() {
  local commandWithArgs=( $1 )
  local command="${commandWithArgs[0]}"
  shift
  if [[ "$command" == "trap" || "$command" == "l="* || "$command" == "_type="* ]]
  then
    paramNo+=-1
    return 0
  fi
  if [[ "$command" != "local" ]]
  then
    assignNormalCodeStarted=true
  fi
  local varDeclaration="${commandWithArgs[1]}"
  if [[ $varDeclaration == '-n' ]]
  then
    varDeclaration="${commandWithArgs[2]}"
  fi
  local varName="${varDeclaration%%=*}"
  # var value is only important if making an object later on from it
  local varValue="${varDeclaration#*=}"
  if [[ ! -z $assignVarType ]]
  then
    local previousParamNo=$(expr $paramNo - 1)
    if [[ "$assignVarType" == "array" ]]
    then
      # passing array:
      execute="$assignVarName=( \"\${@:$previousParamNo:$assignArrLength}\" )"
      eval "$execute"
      paramNo+=$(expr $assignArrLength - 1)
      unset assignArrLength
    elif [[ "$assignVarType" == "params" ]]
    then
      execute="$assignVarName=( \"\${@:$previousParamNo}\" )"
      eval "$execute"
    elif [[ "$assignVarType" == "reference" ]]
    then
      execute="$assignVarName=\"\$$previousParamNo\""
      eval "$execute"
    elif [[ ! -z "${!previousParamNo}" ]]
    then
      execute="$assignVarName=\"\$$previousParamNo\""
      eval "$execute"
    fi
  fi
  assignVarType="$__capture_type"
  assignVarName="$varName"
  assignArrLength="$__capture_arrLength"
}
Function.CaptureParams() {
  __capture_type="$_type"
  __capture_arrLength="$l"
}
alias @trapAssign='Function.CaptureParams; trap "declare -i \"paramNo+=1\"; Function.AssignParamLocally \"\$BASH_COMMAND\" \"\$@\"; [[ \$assignNormalCodeStarted = true ]] && trap - DEBUG && unset assignVarType && unset assignVarName && unset assignNormalCodeStarted && unset paramNo" DEBUG; '
alias @param='@trapAssign local'
alias @reference='_type=reference @trapAssign local -n'
alias @var='_type=var @param'
alias @params='_type=params @param'
alias @array='_type=array @param'
I was personally hoping to see some sort of syntax like
func(a b){
echo $a
echo $b
}
But since that's not a thing, and I see quite a few references to global variables (not without the caveat of scoping and naming conflicts), I'll share my approach.
Using the copy function from Michal's answer:
copy(){
cp $from $to
}
from=/tmp/a
to=/tmp/b
copy
This is bad, because from and to are such broad words that any number of functions could use this. You could quickly end up with a naming conflict or a "leak" on your hands.
letter(){
echo "From: $from"
echo "To: $to"
echo
echo "$1"
}
to=Emily
letter "Hello Emily, you're fired for missing two days of work."
# Result:
# From: /tmp/a
# To: Emily
# Hello Emily, you're fired for missing two days of work.
So my approach is to "namespace" them. I name the variable after the function and delete it after the function is done with it. Of course, I only use it for optional values that have default values. Otherwise, I just use positional args.
copy(){
  if [[ $copy_from ]] && [[ $copy_to ]]; then
    cp $copy_from $copy_to
    unset copy_from copy_to
  fi
}
copy_from=/tmp/a
copy_to=/tmp/b
copy # Copies /tmp/a to /tmp/b
copy # Does nothing, as it ought to
letter "Emily, you're 'not' re-hired for the 'not' bribe ;)"
# From: (no /tmp/a here!)
# To:
# Emily, you're 'not' re-hired for the 'not' bribe ;)
I would make a terrible boss...
In practice, my function names are more elaborate than "copy" or "letter".
The most recent example to my memory is get_input(), which has gi_no_sort and gi_prompt.
gi_no_sort is a true/false value that determines whether the completion suggestions are sorted or not. Defaults to true
gi_prompt is a string that is...well, that's self-explanatory. Defaults to "".
The actual arguments the function takes are the source of the aforementioned 'completion suggestions' for the input prompt. Since that list is taken from "$@" in the function, the "named args" are optional[1], and there's no obvious way to distinguish between a string meant as a completion and a boolean/prompt-message, or really anything space-separated in bash, for that matter[2]. The above solution ended up saving me a lot of trouble.
notes:
[1] So a hard-coded shift and $1, $2, etc. are out of the question.
[2] E.g. is "0 Enter a command: {1..9} $(ls)" a value of 0, "Enter a command:", and a set of 1 2 3 4 5 6 7 8 9 <directory contents>? Or are "0", "Enter", "a", and "command:" part of that set as well? Bash will assume the latter whether you like it or not.
Arguments get sent to functions as a tuple of individual items, so they have no names as such, just positions. This allows some interesting possibilities like those below, but it does mean that you are stuck with $1, $2, etc. As to whether to map them to better names, the question comes down to how big the function is and how much clearer named variables will make the code. If it's complex, then mapping meaningful names ($BatchID, $FirstName, $SourceFilePath) is a good idea. For simple stuff, though, it probably isn't necessary. I certainly wouldn't bother if you are using names like $arg1.
Now, if you just want to echo back the parameters, you can iterate over them:
for $arg in "$#"
do
echo "$arg"
done
Just a fun fact: unless you are processing a list, you are probably interested in something more useful.
This is an older topic, but I'd still like to share the function below (requires bash 4). It parses named arguments and sets the variables in the script's environment. Just make sure you have sane default values for all parameters you need. The export statement at the end could also just be an eval. It's great in combination with shift to extend existing scripts which already take a few positional parameters, where you don't want to change the syntax but still add some flexibility.
parseOptions()
{
  args=("$@")
  for opt in "${args[@]}"; do
    if [[ ! "${opt}" =~ .*=.* ]]; then
      echo "badly formatted option \"${opt}\" should be: option=value, stopping..."
      return 1
    fi
    local var="${opt%%=*}"
    local value="${opt#*=}"
    export ${var}="${value}"
  done
  return 0
}
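For example (hypothetical usage; the variable names are my own), set defaults first and let callers override them with option=value words:
batch=1
name="anonymous"
parseOptions batch=42 name="John Doe" || exit 1
echo "$batch $name"   # prints: 42 John Doe
parseOptions oops     # complains about the missing "=" and returns 1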

Store shell arguments in file while preserving quoting

How can shell arguments be stored in a file for later use while preserving quoting?
To be clear: I don't want to pass on the arguments in place, which could be easily done using "$#". But actually need to store them in a file for later use.
#!/bin/sh
storeargs() {
  : #-)
}
if "$1"
then
  # useargs is actually 'git filter-branch'
  useargs "$@"
  storeargs "$@"
else
  # without args use those from previous invocation
  eval useargs $(cat store)
fi
$ foo 'a "b"' "c 'd'" '\'' 'd
e'
$ foo # behave as if called with same arguments again
The question likely comes down to how to quote a string using common tools in general (awk, perl, ...). I would prefer a solution that does not make the quoted string unreadable. The content of store should look more or less like what I would specify on the commandline.
The question is complicated by the fact that the arguments/strings to be quoted might already contain any kind of valid (shell) quoting and/or any kind of (significant) whitespace, so unconditionally putting single or double quotes around every argument or storing one argument per line won't work.
Why do the heavy lifting?
storeargs() {
  while [ $# -gt 0 ]
  do
    printf "%q " "$1"
    shift
  done
}
You can now
storeargs "some" "weird $1 \`bunch\` of" params > myparams.txt
storeargs "some" 'weird $1 \`bunch\` of' params >> myparams.txt
cat myparams.txt
Output
some weird\ \ \`bunch\`\ of params
some weird\ \$1\ \\\`bunch\\\`\ of params
This version stores the arguments one per line, so it may be a bit ugly in terms of storage. I doubt that it is completely robust, but it satisfies your example (for useargs() { for i in "$@"; do echo $i; done; }):
storeargs() { printf "%q\n" "$@"; } > store
if test -n "$1"; then
  useargs "$@"
  storeargs "$@"
else
  eval useargs $(cat store)
fi
--EDIT--
Use %q in printf to quote the strings (shamelessly copied from sehe's answer). Note that %q is available in the bash built-in printf, but not in standard printf.
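A quick demonstration of %q run under bash (the escaping style shown is what bash's builtin printf produces):
printf '%q\n' "file with space" "it's here" '$HOME'
# file\ with\ space
# it\'s\ here
# \$HOME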

How to quotes in bash function parameters?

What I'd like to do is take, as an input to a function, a line that may include quotes (single or double) and echo that line exactly as it was provided to the function. For instance:
function doit {
  printf "%s " ${@}
  eval "${@}"
  printf " # [%3d]\n" ${?}
}
Which, given the following input
doit VAR=42
doit echo 'single quote $VAR'
doit echo "double quote $VAR"
Yields the following:
VAR=42 # [ 0]
echo single quote $VAR # [ 0]
echo double quote 42 # [ 0]
So the semantics of the variable expansion are preserved as I'd expect, but I can not get the exact format of the line as it was provided to the function. What I'd like is to have doit echo 'single quote $VAR' result in echo 'single quote $VAR'.
I'm sure this has to do with bash processing the arguments before they are passed to the function; I'm just looking for a way around that (if possible).
Edit
So what I had intended was to shadow the execution of a script while providing an exact replica of the execution that could be used as a diagnostic tool including exit status of each step.
While I can get the desired behavior described above by doing something like
while read line ; do
  doit ${line}
done < ${INPUT}
That approach fails in the face of control structures (i.e. if, while, etc). I thought about using set -x but that has its limitations as well: " becomes ' and exit status is not visible for commands that fail.
I was in a similar position to you in that I needed a script to wrap around an existing command and pass arguments preserving quoting.
I came up with something that doesn't preserve the command line exactly as typed but does pass the arguments correctly and show you what they were.
Here's my script set up to shadow ls:
CMD=ls
PARAMS=""
for PARAM in "$@"
do
  PARAMS="${PARAMS} \"${PARAM}\""
done
echo Running: ${CMD} ${PARAMS}
bash -c "${CMD} ${PARAMS}"
echo Exit Code: $?
And this is some sample output:
$ ./shadow.sh missing-file "not a file"
Running: ls "missing-file" "not a file"
ls: missing-file: No such file or directory
ls: not a file: No such file or directory
Exit Code: 1
So as you can see it adds quotes which weren't originally there but it does preserve arguments with spaces in which is what I needed.
The reason this happens is because bash interprets the arguments, as you thought. The quotes simply aren't there any more when it calls the function, so this isn't possible. It worked in DOS because programs could interpret the command line themselves, not that it helps you!
Although @Peter Westlake's answer is correct, and there are no quotes to preserve, one can try to deduce whether the quotes were required and thus passed in originally. Personally I used this requote function when I needed proof in my logs that a command ran with the correct quoting:
function requote() {
  local res=""
  for x in "${@}" ; do
    # try to figure out if quoting was required for the $x:
    grep -q "[[:space:]]" <<< "$x" && res="${res} '${x}'" || res="${res} ${x}"
  done
  # remove first space and print:
  sed -e 's/^ //' <<< "${res}"
}
And here is how I use it:
CMD=$(requote "${@}")
# ...
echo "${CMD}"
doit echo "'single quote $VAR'"
doit echo '"double quote $VAR"'
Both will work.
bash will only strip the outside set of quotes when entering the function.
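You can watch the stripping happen (a small sketch of my own): the function receives its arguments with only the outer layer of quotes removed.
show() { printf '<%s>\n' "$@"; }
show "'single quote'"   # prints: <'single quote'>
show '"double quote"'   # prints: <"double quote">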
Bash will remove the quotes when you pass a string with quotes in it as a command line argument. The quotes are simply not there anymore when the string is passed to your script. You have no way to know whether there was a single quote or a double quote.
What you probably can do is something like this:
doit VAR=42
doit echo \'single quote $VAR\'
doit echo \"double quote $VAR\"
In your script you get
echo 'single quote $VAR'
echo "double quote $VAR"
Or do this
doit VAR=42
doit echo 'single quote $VAR'
doit echo '"double quote $VAR"'
In your script you get
echo single quote $VAR
echo "double quote $VAR"
This:
ponerApostrofes1 ()
{
  for (( i=1; i<=$#; i++ ));
  do
    eval VAR="\${$i}";
    echo \'"${VAR}"\';
  done;
  return;
}
as an example, has problems when the parameters themselves contain apostrophes.
This function:
ponerApostrofes2 ()
{
  for ((i=1; i<=$#; i++ ))
  do
    eval PARAM="\${$i}";
    echo -n \'${PARAM//\'/\'\\\'\'}\'' ';
  done;
  return
}
solves the mentioned problem: you can use parameters containing apostrophes, like "Porky's", and it returns, apparently(?), the same string of parameters when each parameter is quoted; if not, it quotes it. Surprisingly, I don't understand why, if you use it recursively, it doesn't return the same list but instead quotes each parameter again. But if you echo each one, you recover the original parameter.
Example:
$ ponerApostrofes2 'aa aaa' 'bbbb b' 'c'
'aa aaa' 'bbbb b' 'c'
$ ponerApostrofes2 $(ponerApostrofes2 'aa aaa' 'bbbb b' 'c' )
''\''aa' 'aaa'\''' ''\''bbbb' 'b'\''' ''\''c'\'''
And:
$ echo ''\''bbbb' 'b'\'''
'bbbb b'
$ echo ''\''aa' 'aaa'\'''
'aa aaa'
$ echo ''\''c'\'''
'c'
And this one:
ponerApostrofes3 ()
{
  for ((i=1; i<=$#; i++ ))
  do
    eval PARAM="\${$i}";
    echo -n ${PARAM//\'/\'\\\'\'} ' ';
  done;
  return
}
which returns one level of quoting less, doesn't work either, nor does alternating the two recursively.
If one's shell does not support pattern substitution, i.e. ${param/pattern/string} then the following sed expression can be used to safely quote any string such that it will eval into a single parameter again:
sed "s/'/'\\\\''/g;1s/^/'/;\$s/\$/'/"
Combining this with printf it is possible to write a little function that will take any list of strings produced by filename expansion or "$#" and turn it into something that can be safely passed to eval to expand it into arguments for another command while safely preserving parameter separation.
# Usage: quotedlist=$(shell_quote args...)
#
# e.g.: quotedlist=$(shell_quote *.pdf) # filenames with spaces
#
# or: quotedlist=$(shell_quote "$#")
#
# After building up a quoted list, use it by evaling it inside
# double quotes, like this:
#
# eval "set -- $quotedlist"
# for str in "$#"; do
# # fiddle "${str}"
# done
#
# or like this:
#
# eval "\$a_command $quotedlist \$another_parameter"
#
shell_quote()
{
  local result=''
  local arg
  for arg in "$@" ; do
    # Append a space to our result, if necessary
    #
    result=${result}${result:+ }
    # Convert each embedded ' to \' , then insert ' at the
    # beginning of the line, and append ' at the end of
    # the line.
    #
    result=${result}$(printf "%s\n" "$arg" | \
      sed -e "s/'/'\\\\''/g" -e "1s/^/'/" -e "\$s/\$/'/")
  done
  # use printf(1) instead of echo to avoid weird "echo"
  # implementations.
  #
  printf "%s\n" "$result"
}
It may be easier (and maybe safer, i.e. avoid eval) in some situations to use an "impossible" character as the field separator and then use IFS to control expansion of the value again.
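A minimal sketch of that idea (my own, assuming no argument can ever contain the separator byte): join the arguments with an unlikely character, then split them back with IFS instead of eval.
SEP=$'\x1f'   # ASCII "unit separator", very unlikely in real arguments
joinargs() { local IFS=$SEP; printf '%s' "$*"; }   # "$*" joins with the first char of IFS
saved=$(joinargs "file with space" b c)
# later: split it back without eval
IFS=$SEP read -r -a restored <<< "$saved"
printf '<%s>\n' "${restored[@]}"
# prints: <file with space> <b> <c>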
The shell is going to interpret the quotes and the $ before it passes it to your function. There's not a lot your function can do to get the special characters back, because it has no way of knowing (in the double-quote example) whether 42 was hard-coded or if it came from a variable. You will have to escape the special characters if you want them to survive long enough to make it to your function.