Quoting parameters with spaces for later execution - bash

I have this (test) script:
#!/bin/bash
my_cmd_bad_ ( ) {
    cmd="$@"
    $cmd
}
my_cmd_good_ ( ) {
    "$@"
}
my_cmd_bad_ ls -l "file with space"
my_cmd_good_ ls -l "file with space"
The output is (the file does not exist, which is not the point of this question):
» ~/test.sh
ls: cannot access file: No such file or directory
ls: cannot access with: No such file or directory
ls: cannot access space: No such file or directory
ls: cannot access file with space: No such file or directory
I am surprised that the first version does not work as expected: the parameter is not quoted, and instead of processing one file, it processes three. Why?
How can I save the command that I want to execute, properly quoted? I need to execute it later, where I do not have "$@" anymore.
A simple rework of this test script would be appreciated.

See similar question: How to pass command line parameters with quotes stored in single variable?
Use these utility functions to save a command to a string for later execution:
bash_escape() {
    # backtick indirection strictly necessary here: we use it to strip the
    # trailing newline from sed's output, which Solaris/BSD sed *always* output
    # (unlike GNU sed, which outputs "test": printf %s test | sed -e s/dummy//)
    out=`echo "$1" | sed -e s/\\'/\\''\\\\'\\'\\'/g`
    printf \'%s\' "$out"
}
append_bash_escape() {
    printf "%s " "$1"
    bash_escape "$2"
}
your_cmd_fixed_ ( ) {
    cmd=""
    while [ $# -gt 0 ] ; do
        cmd=`append_bash_escape "$cmd" "$1"` ; shift
    done
    eval "$cmd"
}
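For instance, a rough sketch (not part of the answer above) of persisting the escaped string to a file and replaying it later; the store_cmd name and the store filename are illustrative only:
store_cmd() {                        # illustrative helper built on the functions above
    cmd=""
    while [ $# -gt 0 ]; do
        cmd=`append_bash_escape "$cmd" "$1"`; shift
    done
    printf '%s\n' "$cmd" > store     # "store" is an assumed filename
}

store_cmd ls -l "file with space"
eval "$(cat store)"                  # re-runs the command with quoting intact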

You can quote any single parameter and evaluate it later:
my_cmd_bad_ ( ) {
    j=0
    for i in "$@"; do
        cmd["$j"]=\""$i"\"
        j=$(( $j + 1 ))
    done;
    eval ${cmd[*]}
}

You are combining three space-delimited strings "ls", "-l", and "file with space" into a single space-delimited string cmd. There's no way to know which spaces were originally quoted (in "file with space") and which spaces were introduced during the assignment to cmd.
Typically, it is not a good idea to try to build up command lines into a single string. Use functions, or isolate the actual command and leave the arguments in "$@".
Rewrite the command like this:
my_cmd_bad_ () {
    cmd=$1; shift
    $cmd "$@"
}

See http://mywiki.wooledge.org/BashFAQ/050
Note that your second version is greatly preferred most of the time. The only exceptions are if you need to do something special. For example, you can't bundle an assignment or redirect or compound command into a parameter list.
The correct way to handle the quoting issue requires non-standard features. Semi-realistic example involving a template:
function myWrapper {
    typeset x IFS=$' \t\n'
    { eval "$(</dev/fd/0)"; } <<-EOF
    for x in $(printf '%q ' "$@"); do
        echo "\$x"
    done
EOF
}
myWrapper 'foo bar' $'baz\nbork'
Make sure you understand exactly what's going on here and that you really have a good reason for doing this. It requires ensuring side-effects can't affect the arguments. This specific example doesn't demonstrate a very good use case because everything is hard-coded so you're able to correctly escape things in advance and expand the arguments quoted if you wanted.
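For reference, this is roughly what the printf '%q ' step produces for those two arguments (the exact escaping can differ between bash versions):
$ printf '%q ' 'foo bar' $'baz\nbork'; echo
foo\ bar $'baz\nbork'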

bash variable with or without double quotes [duplicate]

I have a variable in my bash script whose value is something like this:
~/a/b/c
Note that it is unexpanded tilde. When I do ls -lt on this variable (call it $VAR), I get no such directory. I want to let bash interpret/expand this variable without executing it. In other words, I want bash to run eval but not run the evaluated command. Is this possible in bash?
How did I manage to pass this into my script without expansion? I passed the argument in surrounding it with double quotes.
Try this command to see what I mean:
ls -lt "~"
This is exactly the situation I am in. I want the tilde to be expanded. In other words, what should I replace magic with to make these two commands identical:
ls -lt ~/abc/def/ghi
and
ls -lt $(magic "~/abc/def/ghi")
Note that ~/abc/def/ghi may or may not exist.
If the variable var is input by the user, eval should not be used to expand the tilde using
eval var=$var # Do not use this!
The reason is: the user could by accident (or on purpose) type, for example, var="$(rm -rf $HOME/)" with possibly disastrous consequences.
A better (and safer) way is to use Bash parameter expansion:
var="${var/#\~/$HOME}"
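A quick illustration of that expansion (the resulting path assumes a home directory of /home/user):
$ var='~/a/b/c'
$ var="${var/#\~/$HOME}"
$ echo "$var"
/home/user/a/b/c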
Due to the nature of StackOverflow, I can't just make this answer unaccepted, but in the intervening 5 years since I posted this there have been far better answers than my admittedly rudimentary and pretty bad answer (I was young, don't kill me).
The other solutions in this thread are safer and better solutions. Preferably, I'd go with either of these two:
Charles Duffy's solution
Håkon Hægland's solution
Original answer for historic purposes (but please don't use this)
If I'm not mistaken, "~" will not be expanded by a bash script in that manner because it is treated as a literal string "~". You can force expansion via eval like this.
#!/bin/bash
homedir=~
eval homedir=$homedir
echo $homedir # prints home path
Alternatively, just use ${HOME} if you want the user's home directory.
Plagiarizing myself from a prior answer, to do this robustly without the security risks associated with eval:
expandPath() {
    local path
    local -a pathElements resultPathElements
    IFS=':' read -r -a pathElements <<<"$1"
    : "${pathElements[@]}"
    for path in "${pathElements[@]}"; do
        : "$path"
        case $path in
            "~+"/*)
                path=$PWD/${path#"~+/"}
                ;;
            "~-"/*)
                path=$OLDPWD/${path#"~-/"}
                ;;
            "~"/*)
                path=$HOME/${path#"~/"}
                ;;
            "~"*)
                username=${path%%/*}
                username=${username#"~"}
                IFS=: read -r _ _ _ _ _ homedir _ < <(getent passwd "$username")
                if [[ $path = */* ]]; then
                    path=${homedir}/${path#*/}
                else
                    path=$homedir
                fi
                ;;
        esac
        resultPathElements+=( "$path" )
    done
    local result
    printf -v result '%s:' "${resultPathElements[@]}"
    printf '%s\n' "${result%:}"
}
...used as...
path=$(expandPath '~/hello')
Alternately, a simpler approach that uses eval carefully:
expandPath() {
    case $1 in
        ~[+-]*)
            local content content_q
            printf -v content_q '%q' "${1:2}"
            eval "content=${1:0:2}${content_q}"
            printf '%s\n' "$content"
            ;;
        ~*)
            local content content_q
            printf -v content_q '%q' "${1:1}"
            eval "content=~${content_q}"
            printf '%s\n' "$content"
            ;;
        *)
            printf '%s\n' "$1"
            ;;
    esac
}
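Hypothetical usage of either variant (the printed paths are illustrative and depend on your $HOME and $PWD):
$ expandPath '~/hello'
/home/user/hello
$ expandPath '~+/notes.txt'
/home/user/project/notes.txt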
How about this:
path=`realpath "$1"`
Or:
path=`readlink -f "$1"`
A safe way to use eval is "$(printf "~/%q" "$dangerous_path")". Note that this is bash-specific.
#!/bin/bash
relativepath=a/b/c
eval homedir="$(printf "~/%q" "$relativepath")"
echo $homedir # prints $HOME/a/b/c
See this question for details
Also, note that under zsh this would be as simple as echo ${~dangerous_path}
Here is a ridiculous solution:
$ echo "echo $var" | bash
An explanation of what this command does:
create a new instance of bash, by... calling bash;
take the string "echo $var" and substitute $var with the value of the variable (thus after the substitution the string will contain the tilde);
take the string produced by step 2 and send it to the instance of bash created in step one, which we do here by calling echo and piping its output with the | character.
Basically the current bash instance we're running takes our place as the user of another bash instance and types in the command "echo ~..." for us.
Expanding (no pun intended) on birryree's and halloleo's answers: The general approach is to use eval, but it comes with some important caveats, namely spaces and output redirection (>) in the variable. The following seems to work for me:
mypath="$1"
if [ -e "`eval echo ${mypath//>}`" ]; then
    echo "FOUND $mypath"
else
    echo "$mypath NOT FOUND"
fi
Try it with each of the following arguments:
'~'
'~/existing_file'
'~/existing file with spaces'
'~/nonexistant_file'
'~/nonexistant file with spaces'
'~/string containing > redirection'
'~/string containing > redirection > again and >> again'
Explanation
The ${mypath//>} strips out > characters which could clobber a file during the eval.
The eval echo ... is what does the actual tilde expansion
The double-quotes around the -e argument are for support of filenames with spaces.
Perhaps there's a more elegant solution, but this is what I was able to come up with.
Why not delve straight into getting the user's home directory with getent?
$ getent passwd mike | cut -d: -f6
/users/mike
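A variation on the same idea that looks up whoever is running the script instead of a hard-coded user; a sketch assuming getent is available (it is not on macOS):
# resolve the current user's home directory; fall back to $HOME if getent finds nothing
homedir=$(getent passwd "$(id -un)" | cut -d: -f6)
homedir=${homedir:-$HOME}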
I believe this is what you're looking for
magic() { # returns unexpanded tilde expression on invalid user
    local _safe_path; printf -v _safe_path "%q" "$1"
    eval "ln -sf ${_safe_path#\\} /tmp/realpath.$$"
    readlink /tmp/realpath.$$
    rm -f /tmp/realpath.$$
}
Example usage:
$ magic ~nobody/would/look/here
/var/empty/would/look/here
$ magic ~invalid/this/will/not/expand
~invalid/this/will/not/expand
Here is the POSIX function equivalent of Håkon Hægland's Bash answer
expand_tilde() {
    tilde_less="${1#\~/}"
    [ "$1" != "$tilde_less" ] && tilde_less="$HOME/$tilde_less"
    printf '%s' "$tilde_less"
}
2017-12-10 edit: add '%s' per @CharlesDuffy in the comments.
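Hypothetical usage (output shown for an assumed $HOME of /home/user):
$ expand_tilde '~/project/notes.txt'; echo
/home/user/project/notes.txt
$ expand_tilde '/absolute/path'; echo
/absolute/path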
Here's my solution:
#!/bin/bash
expandTilde()
{
    local tilde_re='^(~[A-Za-z0-9_.-]*)(.*)'
    local path="$*"
    local pathSuffix=
    if [[ $path =~ $tilde_re ]]
    then
        # only use eval on the ~username portion !
        path=$(eval echo ${BASH_REMATCH[1]})
        pathSuffix=${BASH_REMATCH[2]}
    fi
    echo "${path}${pathSuffix}"
}
result=$(expandTilde "$1")
echo "Result = $result"
Simplest: replace 'magic' with 'eval echo'.
$ eval echo "~"
/whatever/the/f/the/home/directory/is
Problem: You're going to run into issues with other variables because eval is evil. For instance:
$ # home is /Users/Hacker$(s)
$ s="echo SCARY COMMAND"
$ eval echo $(eval echo "~")
/Users/HackerSCARY COMMAND
Note that the issue of the injection doesn't happen on the first expansion. So if you were to simply replace magic with eval echo, you should be okay. But if you do echo $(eval echo ~), that would be susceptible to injection.
Similarly, if you do eval echo ~ instead of eval echo "~", that would count as twice expanded and therefore injection would be possible right away.
For anyone's reference, a function to mimic python's os.path.expanduser() behavior (no eval usage):
# _expand_homedir_tilde ~/.vim
/root/.vim
# _expand_homedir_tilde ~myuser/.vim
/home/myuser/.vim
# _expand_homedir_tilde ~nonexistent/.vim
~nonexistent/.vim
# _expand_homedir_tilde /full/path
/full/path
And the function:
function _expand_homedir_tilde {
    (
        set -e
        set -u
        p="$1"
        if [[ "$p" =~ ^~ ]]; then
            u=`echo "$p" | sed 's|^~\([a-z0-9_-]*\)/.*|\1|'`
            if [ -z "$u" ]; then
                u=`whoami`
            fi
            h=$(set -o pipefail; getent passwd "$u" | cut -d: -f6) || exit 1
            p=`echo "$p" | sed "s|^~[a-z0-9_-]*/|${h}/|"`
        fi
        echo $p
    ) || echo $1
}
Just to extend birryree's answer for paths with spaces: You cannot use the eval command as is, because it separates evaluation by spaces. One solution is to replace spaces temporarily for the eval command:
mypath="~/a/b/c/Something With Spaces"
expandedpath=${mypath// /_spc_}       # replace spaces with a placeholder
eval expandedpath=${expandedpath}     # expand the tilde
expandedpath=${expandedpath//_spc_/ } # put the spaces back
echo "$expandedpath" # prints e.g. /Users/fred/a/b/c/Something With Spaces"
ls -lt "$expandedpath" # outputs dir content
This example relies of course on the assumption that mypath never contains the char sequence "_spc_".
You might find this easier to do in python.
(1) From the unix command line:
python -c 'import os; import sys; print os.path.expanduser(sys.argv[1])' ~/fred
Results in:
/Users/someone/fred
(2) Within a bash script as a one-off - save this as test.sh:
#!/usr/bin/env bash
thepath=$(python -c 'import os; import sys; print os.path.expanduser(sys.argv[1])' $1)
echo $thepath
Running bash ./test.sh results in:
/Users/someone/fred
(3) As a utility - save this as expanduser somewhere on your path, with execute permissions:
#!/usr/bin/env python
import sys
import os
print os.path.expanduser(sys.argv[1])
This could then be used on the command line:
expanduser ~/fred
Or in a script:
#!/usr/bin/env bash
thepath=$(expanduser $1)
echo $thepath
Just use eval correctly: with validation.
case $1${1%%/*} in
    ([!~]*|"$1"?*[!-+_.[:alnum:]]*|"") ! :;;
    (*/*) set "${1%%/*}" "${1#*/}" ;;
    (*) set "$1"
esac && eval "printf '%s\n' $1${2+/\"\$2\"}"
I have done this with variable parameter substitution after reading in the path using read -e (among others). So the user can tab-complete the path, and if the user enters a ~ path it gets sorted.
read -rep "Enter a path: " -i "${testpath}" testpath
testpath="${testpath/#~/${HOME}}"
ls -al "${testpath}"
The added benefit is that if there is no tilde nothing happens to the variable, and if there is a tilde but not in the first position it is also ignored.
(I include the -i for read since I use this in a loop so the user can fix the path if there is a problem.)
For some reason, when the string is already quoted, only Perl saves the day:
#val="${val/#\~/$HOME}" # for some reason does not work !!
val=$(echo $val|perl -ne 's|~|'$HOME'|g;print')
I think that
thepath=( ~/abc/def/ghi )
is easier than all the other solutions... or am I missing something? It works even if the path does not really exist.
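For example (the printed path assumes a home directory of /home/user):
$ thepath=( ~/abc/def/ghi )
$ printf '%s\n' "${thepath[0]}"
/home/user/abc/def/ghi
Note that this only helps when the tilde is unquoted at the point of assignment; a tilde that is already stored inside a variable or arrived in double quotes (the situation in the question) is not expanded by this trick.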

Evaluate variable at time of function declaration in shell

I'm setting up my shell environments and I want to be able to use some of the same functions/aliases in zsh as in bash. One of these functions opens either .bashrc or .zshrc in an editor (whichever file is relevant), waits for the editor to close, then reloads the rc file.
# a very simplified version of this function
editrc() {
    local rcfile=".$(basename $SHELL)rc"
    code -w ~/$rcfile
    . ~/$rcfile
}
I use the value of rcfile in a few other functions, so I've pulled it out of the function declaration.
_rc=".$(basename $SHELL)rc"
editrc() {
    code -w ~/$_rc
    . ~/$_rc
}
# ... other functions that use it ...
unset _rc
However, because I'm a neat freak, I want to unset _rc at the end of my script, but I still want my functions to run correctly. Is there a clever way to evaluate $_rc at the time the function is declared?
I know I could use eval and place everything except $_rc instances within single quotes, but that seems like a pain, since the full version of my function uses both single-quotes and double-quotes.
_rc=".$(basename $SHELL)rc"
eval 'editrc() {
    echo Here'"'"'s a thing that uses single quotes. As you can see it'"'"'s a pain.
    code -w ~/'$_rc'
    . ~/'$_rc'
}'
# ... other functions using `_rc`
unset _rc
I'm guessing I could declare my functions, then do some magic with eval "$(declare -f editrc | awk)". It may very well be more pain than it's worth, but I'm always interested in learning new things.
Note: I'd love to generalize this into a utility function that does this.
_myvar=foo
anothervar=bar
myfunc() {
    echo $_myvar $anothervar
}
# redeclares myfunc with `$_myvar` expanded, but leaves `$anothervar` as-is
expandfunctionvars myfunc '$_myvar'
Is there a clever way to evaluate $_rc at the time the function is declared?
_rc=".$(basename "$SHELL")rc"
# while you could eval here, source lets you work with a stream
source <(
    cat <<EOF
editrc() {
    local _rc
    # first safely transfer context
    $(declare -p _rc)
EOF
    # use a quoted here document to do anything inside without caring.
    cat <<'EOF'
    # do anything else
    echo "Here's a thing that uses single quotes. As you can see it's not a pain, just choose proper quoting."
    code -w ~/"$_rc"
    . ~/"$_rc"
}
EOF
)
unset _rc
Generally first use declare -p to transfer variables as strings to be evaluated. Then after you "import" variables, use a quoted here document to do anything as in a normal script.
References to read:
<<EOF is a here document. Note the difference in parsing when the here delimiter is quoted vs unquoted.
<(..) is a process substitution
The source command reads a pipe created by process substitution. Inside the process substitution I output the function to be sourced. With the first here document I output the function definition, with a local declaration of the variable so that it doesn't pollute the global namespace. Then with declare -p I output the variable definition as a properly quoted string, later to be sourced by source. Then with a quoted here document I output the rest of the function, so that I do not need to care about quoting.
The code is bash specific, I know nothing about zsh and don't use it.
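As a small illustration of the declare -p step (the variable value here is just an example), the text it injects into the generated function looks like this:
$ _rc=".bashrc"
$ declare -p _rc
declare -- _rc=".bashrc"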
You could do it with eval too:
eval '
editrc() {
    local _rc
    # first safely transfer context
    '"$(declare -p _rc)"'
    # use quoted here string to do anything inside without caring.
    # do anything else
    echo "Here'\''s a thing that uses single quotes. As you can see it'\''s not a pain, just choose proper quoting."
    code -w ~/"$_rc"
    . ~/"$_rc"
}'
But for me using a quoted here document delimiter allows for easier writing.
While KamilCuk was working on their answer, I devised a function that will take in any function name and a set of variable names, expand just those variables, and redeclare the function.
expandFnVars() {
    if [[ $# -lt 2 ]]; then
        >&2 echo 'expandFnVars requires at least two arguments: the function name and the variable(s) to be expanded'
        return 1
    fi
    local fn="$1"
    shift
    local vars=("$@")
    if [[ -z "$(declare -F $fn 2> /dev/null)" ]]; then
        >&2 echo $fn is not a function.
        return 1
    fi
    foundAllVars=true
    for v in "${vars[@]}"; do
        if [[ -z "$(declare -p $v 2> /dev/null)" ]]; then
            >&2 echo $v is not a declared value.
            foundAllVars=false
        fi
    done
    [[ $foundAllVars != true ]] && return 1
    fn="$(declare -f $fn)"
    for v in "${vars[@]}"; do
        local val="$(eval 'echo $'$v)" # get the value of the variable named by $v
        val="${val//\"/\\\"}" # escape any double-quotes
        val="${val//\\/\\\\\\}" # escape any backslashes
        fn="$(echo "$fn" | sed -r 's/"?\$'$v'"?/"'"$val"'"/g')" # replace quoted and unquoted references to the variable with $val
    done
    eval "$fn"
}
Usage:
foo="foo bar"
bar='$foo'
baz=baz
fn() {
    echo $bar $baz
}
expandFnVars fn bar
declare -f fn
# prints:
# fn ()
# {
# echo "$foo" $baz
# }
expandFnVars fn foo
declare -f fn
# prints:
# fn ()
# {
# echo "foo bar" $baz
# }
Looking at it now, I see one flaw. Suppose $bar in the original function was in single-quotes. We probably would not want its value to be replaced. This could be fixed by some clever regex lookbehinds to count the number of unescaped 's, but I'm happy with it as-is.

Parse ${} placeholder into absolute path in shell script

I have a app.properties file something like below
Base.dir="/user/test/application"
Result.dir="${base.dir}/result"
and I've created a bash script to parse the above properties:
function readConfigFile()
{
    (grep -E "^${2}=" -m 1 "${1}" 2>/dev/null || echo "VAR=__UNDEFINED__") | head -n 1 | cut -d '=' -f 2-;
}
function setConfigFile()
{
    sourceFile=${1}
}
function configGet()
{
    if [ ! -z $sourceFile ]; then
        val="$(readConfigFile $sourceFile "${1}")";
        if [ "${val}" = "__UNDEFINED__" ]; then
            echo "${1} value not exist"
            # return empty string
            printf -- "%s" "";
        fi
        printf -- "%s" "${val}";
    else
        echo "config file not exist"
        # return empty string
        printf -- "%s" "";
    fi
}
and the way I call the above parser is something like below:
Result_dir=$(configGet Result.dir)
However, I can't really translate the ${} placeholder into Base_dir,
and I got the following error:
ls $Result_dir
ls: cannot access ${Base_dir}/result: No such file or directory
Is there any way that i can translate ${Base.dir} into /user/test/application?
I guess you're not going to be able to substitute ${base.dir} (by the way, shouldn't it be ${Base.dir}?) the way you were hoping, mainly because, as far as I know, dots are not allowed in variable names in bash.
What you could do is manually substitute the ${base.dir} part with the corresponding path using bash's substitution syntax. For example:
setConfigFile 'app.properties'
Result_dir_raw=$(configGet Result.dir)
Result_dir=${Result_dir_raw/'${base.dir}'/$(configGet Base.dir)}
echo ${Result_dir}
I say "manually" because you still specify in your source code that the pattern you want to replace is ${base.dir} which I'm guessing isn't what you wanted.
Now if you run this you'll see that the ${Result_dir} variable evaluates to "/user/test/application"/result (quotes included), which obviously isn't a valid path. This is because you're surrounding the paths in app.properties with double quotes, so you either need to strip them in your readConfigFile function or drop them from your config file altogether, which to me makes more sense.
Why do you have a . in your variable name? That is not allowed in bash:
$ Base.dir="/user/test/application"
-bash: Base.dir=/user/test/application: No such file or directory
$ Base_dir="/user/test/application"
$
So, why do you get No such file or directory? Here is an explanation:
Create a file called Base.dir=gash.sh, yes, that's a legal filename
$ echo 'echo Hello World' > Base.dir=gash.sh
Make the file executable:
$ PATH=$PATH:.
$ chmod u+x Base.dir=gash.sh
Now type the command:
$ Base.dir="gash.sh"
Hello World
Use an underscore, not a dot. By the way, the Korn shell (ksh) not only allows the dot, it gives it a special meaning: it denotes a compound variable.

Store shell arguments in file while preserving quoting

How can shell arguments be stored in a file for later use while preserving quoting?
To be clear: I don't want to pass on the arguments in place, which could be easily done using "$@". I actually need to store them in a file for later use.
#!/bin/sh
storeargs() {
    : #-)
}
if [ -n "$1" ]
then
    # useargs is actually 'git filter-branch'
    useargs "$@"
    storeargs "$@"
else
    # without args use those from previous invocation
    eval useargs $(cat store)
fi
$ foo 'a "b"' "c 'd'" '\'' 'd
e'
$ foo # behave as if called with same arguments again
The question likely comes down to how to quote a string using common tools in general (awk, perl, ...). I would prefer a solution that does not make the quoted string unreadable. The content of store should look more or less like what I would specify on the commandline.
The question is complicated by the fact that the arguments/strings to be quoted might already contain any kind of valid (shell) quoting and/or any kind of (significant) whitespace, so unconditionally putting single or double quotes around every argument or storing one argument per line won't work.
Why do the heavy lifting?
storeargs() {
    while [ $# -gt 0 ]
    do
        printf "%q " "$1"
        shift
    done
}
You can now
storeargs "some" "weird $1 \`bunch\` of" params > myparams.txt
storeargs "some" 'weird $1 \`bunch\` of' params >> myparams.txt
cat myparams.txt
Output
some weird\ \ \`bunch\`\ of params
some weird\ \$1\ \\\`bunch\\\`\ of params
This version stores the arguments one per line, so it may be a bit ugly in terms of storage. I doubt that it is completely robust, but it satisfies your example (for useargs() { for i in "$@"; do echo $i; done; } ):
storeargs() { printf "%q\n" "$@"; } > store
if test -n "$1"; then
    useargs "$@"
    storeargs "$@"
else
    eval useargs $(cat store)
fi
--EDIT--
Use %q in printf to quote the strings (shamelessly copied from sehe's answer). Note that %q is available in the bash built-in printf, but not in standard printf.
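To complete the round trip, a sketch of reading the stored arguments back in; it assumes the file holds a single space-separated line of %q-quoted words (as one storeargs call from the first answer produces) and that useargs is the command to replay:
# restore the quoted words as positional parameters, then replay the command
eval "set -- $(cat myparams.txt)"
useargs "$@"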

How to preserve quotes in bash function parameters?

What I'd like to do is take, as an input to a function, a line that may include quotes (single or double) and echo that line exactly as it was provided to the function. For instance:
function doit {
    printf "%s " "${@}"
    eval "${@}"
    printf " # [%3d]\n" ${?}
}
Which, given the following input
doit VAR=42
doit echo 'single quote $VAR'
doit echo "double quote $VAR"
Yields the following:
VAR=42 # [ 0]
echo single quote $VAR # [ 0]
echo double quote 42 # [ 0]
So the semantics of the variable expansion are preserved as I'd expect, but I can not get the exact format of the line as it was provided to the function. What I'd like is to have doit echo 'single quote $VAR' result in echo 'single quote $VAR'.
I'm sure this has to do with bash processing the arguments before they are passed to the function; I'm just looking for a way around that (if possible).
Edit
So what I had intended was to shadow the execution of a script while providing an exact replica of the execution that could be used as a diagnostic tool including exit status of each step.
While I can get the desired behavior described above by doing something like
while read line ; do
    doit ${line}
done < ${INPUT}
That approach fails in the face of control structures (i.e. if, while, etc). I thought about using set -x but that has its limitations as well: " becomes ' and exit status is not visible for commands that fail.
I was in a similar position to you in that I needed a script to wrap around an existing command and pass arguments preserving quoting.
I came up with something that doesn't preserve the command line exactly as typed but does pass the arguments correctly and show you what they were.
Here's my script set up to shadow ls:
CMD=ls
PARAMS=""
for PARAM in "$@"
do
    PARAMS="${PARAMS} \"${PARAM}\""
done
echo Running: ${CMD} ${PARAMS}
bash -c "${CMD} ${PARAMS}"
echo Exit Code: $?
And this is some sample output:
$ ./shadow.sh missing-file "not a file"
Running: ls "missing-file" "not a file"
ls: missing-file: No such file or directory
ls: not a file: No such file or directory
Exit Code: 1
So as you can see it adds quotes which weren't originally there but it does preserve arguments with spaces in which is what I needed.
The reason this happens is because bash interprets the arguments, as you thought. The quotes simply aren't there any more when it calls the function, so this isn't possible. It worked in DOS because programs could interpret the command line themselves, not that it helps you!
Although @Peter Westlake's answer is correct and there are no quotes to preserve, one can try to deduce whether the quotes were required and thus passed in originally. Personally I used this requote function when I needed proof in my logs that a command ran with the correct quoting:
function requote() {
    local res=""
    for x in "${@}" ; do
        # try to figure out if quoting was required for the $x:
        grep -q "[[:space:]]" <<< "$x" && res="${res} '${x}'" || res="${res} ${x}"
    done
    # remove first space and print:
    sed -e 's/^ //' <<< "${res}"
}
And here is how I use it:
CMD=$(requote "${@}")
# ...
echo "${CMD}"
doit echo "'single quote $VAR'"
doit echo '"double quote $VAR"'
Both will work.
bash will only strip the outside set of quotes when entering the function.
Bash removes the quotes when you pass a string with quotes in it as a command line argument. The quotes are simply not there anymore when the string is passed to your script. You have no way to know whether there was a single quote or a double quote.
What you probably can do is something like this:
doit VAR=42
doit echo \'single quote $VAR\'
doit echo \"double quote $VAR\"
In your script you get
echo 'single quote $VAR'
echo "double quote $VAR"
Or do this
doit VAR=42
doit echo 'single quote $VAR'
doit echo '"double quote $VAR"'
In your script you get
echo single quote $VAR
echo "double quote $VAR"
This:
ponerApostrofes1 ()
{
    for (( i=1; i<=$#; i++ ));
    do
        eval VAR="\${$i}";
        echo \'"${VAR}"\';
    done;
    return;
}
As an example, it has problems when the parameters themselves contain apostrophes.
This function:
ponerApostrofes2 ()
{
    for ((i=1; i<=$#; i++ ))
    do
        eval PARAM="\${$i}";
        echo -n \'${PARAM//\'/\'\\\'\'}\'' ';
    done;
    return
}
solves that problem: you can use parameters containing apostrophes, like "Porky's", and it returns, apparently(?), the same string of parameters when each parameter is already quoted; if not, it quotes it. Surprisingly, I don't understand why, when you use it recursively, it doesn't return the same list; instead each parameter is quoted again. But if you echo each one you recover the original parameter.
Example:
$ ponerApostrofes2 'aa aaa' 'bbbb b' 'c'
'aa aaa' 'bbbb b' 'c'
$ ponerApostrofes2 $(ponerApostrofes2 'aa aaa' 'bbbb b' 'c' )
''\''aa' 'aaa'\''' ''\''bbbb' 'b'\''' ''\''c'\'''
And:
$ echo ''\''bbbb' 'b'\'''
'bbbb b'
$ echo ''\''aa' 'aaa'\'''
'aa aaa'
$ echo ''\''c'\'''
'c'
And this one:
ponerApostrofes3 ()
{
    for ((i=1; i<=$#; i++ ))
    do
        eval PARAM="\${$i}";
        echo -n ${PARAM//\'/\'\\\'\'} ' ';
    done;
    return
}
which returns one level of quoting less, doesn't work either; neither does alternating the two recursively.
If one's shell does not support pattern substitution, i.e. ${param/pattern/string} then the following sed expression can be used to safely quote any string such that it will eval into a single parameter again:
sed "s/'/'\\\\''/g;1s/^/'/;\$s/\$/'/"
Combining this with printf it is possible to write a little function that will take any list of strings produced by filename expansion or "$#" and turn it into something that can be safely passed to eval to expand it into arguments for another command while safely preserving parameter separation.
# Usage: quotedlist=$(shell_quote args...)
#
# e.g.: quotedlist=$(shell_quote *.pdf) # filenames with spaces
#
# or: quotedlist=$(shell_quote "$@")
#
# After building up a quoted list, use it by evaling it inside
# double quotes, like this:
#
# eval "set -- $quotedlist"
# for str in "$@"; do
# # fiddle "${str}"
# done
#
# or like this:
#
# eval "\$a_command $quotedlist \$another_parameter"
#
shell_quote()
{
    local result=''
    local arg
    for arg in "$@" ; do
        # Append a space to our result, if necessary
        #
        result=${result}${result:+ }
        # Convert each embedded ' to \' , then insert ' at the
        # beginning of the line, and append ' at the end of
        # the line.
        #
        result=${result}$(printf "%s\n" "$arg" | \
            sed -e "s/'/'\\\\''/g" -e "1s/^/'/" -e "\$s/\$/'/")
    done
    # use printf(1) instead of echo to avoid weird "echo"
    # implementations.
    #
    printf "%s\n" "$result"
}
It may be easier (and maybe safer, i.e. avoid eval) in some situations to use an "impossible" character as the field separator and then use IFS to control expansion of the value again.
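A minimal sketch of that IFS approach, assuming bash and at least one argument; the separator byte $'\x1f' (ASCII unit separator), the packed filename, and some_command are all placeholders, and the arguments must not contain that byte or newlines:
sep=$'\x1f'
{
    printf '%s' "$1"; shift
    for arg; do printf '%s%s' "$sep" "$arg"; done
    printf '\n'
} > packed

# later: split the stored line back into positional parameters without eval
IFS=$sep read -r -a args < packed
some_command "${args[@]}"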
The shell is going to interpret the quotes and the $ before it passes it to your function. There's not a lot your function can do to get the special characters back, because it has no way of knowing (in the double-quote example) whether 42 was hard-coded or if it came from a variable. You will have to escape the special characters if you want them to survive long enough to make it to your function.
