eval bash function arguments without misescaping

In a Bash script, I want to create a function that wraps commands, printing them before executing them.
So, in a script command like this:
mkdir -p "~/new/dir/tree/"
rsync -e 'ssh -p 22' -av "src/" "user@${HOST}:dest/"
I could put a command in front of the others like this:
run mkdir -p "~/new/dir/tree/"
run rsync -e 'ssh -p 22' -av "src/" "user@${HOST}:dest/"
I define the function run as:
run (){
echo -e "FANCY FORMATTING CMD> $@ FANCY FORMAT ENDING"
eval "$@"
return $?
}
It works fine in most cases. A command prefixed with run should behave exactly the same whether run is left in place, removed, or unset. I mean, all three of these should work exactly the same way:
run original_command arg1 "arg2 subarg2" etc;
unset run;
run original_command arg1 "arg2 subarg2" etc;
original_command arg1 "arg2 subarg2" etc;
But if I try with this line:
run rsync -e 'ssh -p 22' -av "src/" "user@${HOST}:dest/"
the -e argument becomes unquoted, and the command actually run is
rsync -e ssh -p 22 -av src/ user@${HOST}:dest/
If I escape the quote on the -e argument, it may work with the run command, but it would not work without it.
As far as I can tell from the Bash documentation, "$@" should work here, but clearly I'm missing something.

The arguments you pass to run have already undergone every kind of expansion by the time the function receives them, and they are already properly separated into words, so you don't need eval to invoke the command in $1. In fact, it is an error to use eval in this case (it erroneously applies expansions and word-splitting yet another time), as the many error messages you get attest.
The proper way to invoke the command in $1 and pass it all the arguments in "$@" (except for the first, i.e. "${@:2}") is to simply put "$@" on a line of its own, without eval.
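As a minimal sketch of that fix (the log format here is illustrative, not the asker's exact one): print the arguments, then execute them directly.

```shell
#!/bin/bash
# Corrected wrapper: no eval. "$@" preserves the word boundaries the
# caller established, so quoted arguments stay intact.
run () {
    printf 'CMD> %s\n' "$*" >&2   # log the command to stderr
    "$@"                          # execute it with arguments intact
}

run printf '%s|' "a b" c   # the quoted "a b" stays one argument: a b|c|
```

The function also forwards the command's exit status automatically, since a function returns the status of its last command.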

Your arguments are not "unquoted" by calling your function. They are "unquoted" by putting them in a string and echoing it. The quotes are not part of the arguments; they are special shell characters that prevent word-splitting (and, in the case of single quotes, variable expansion).
The shell does word-splitting (which quotes protect against), then quote removal. This means that the quotes would be removed, yes, but that the arguments would not be split on quoted whitespace.
In
utility a "b c"
the utility would not get "b c" as its second argument, but b c (not b and c, but b<space>c). The same goes for your function.
This shows that you still get properly separated arguments in the function:
#!/bin/sh
run () {
printf 'Arg: %s\n' "$@"
}
HOST=example.com
run mkdir -p "~/new/dir/tree/"
echo '---'
run rsync -e 'ssh -p 22' -av "src/" "user#${HOST}:dest/"
This generates
Arg: mkdir
Arg: -p
Arg: ~/new/dir/tree/
---
Arg: rsync
Arg: -e
Arg: ssh -p 22
Arg: -av
Arg: src/
Arg: user@example.com:dest/
As you can see, ssh -p 22 is still delivered to printf as a separate argument, not three arguments.
Using "$@" (without eval) in your function will do the right thing. Escaping the quotes would definitely not do the right thing, as it would include the quotes in the actual arguments (mkdir \"dir\" would create a directory called "dir", quotes included, and rsync -e \"ssh -p 22\" would be delivered to rsync as -e, "ssh, -p, 22").
Also, note that ~ is not expanded since it was in double quotes. Use $HOME instead.
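The eval-versus-direct difference can be demonstrated with a small counting helper (count_args is a name invented for this sketch):

```shell
#!/bin/bash
# count_args just reports how many arguments it received.
count_args () { echo "$#"; }

with_eval ()    { eval "$@"; }   # eval joins its args and re-parses them
without_eval () { "$@"; }        # runs the first arg with the rest intact

with_eval    count_args "a b" c   # prints 3: "a b" got re-split
without_eval count_args "a b" c   # prints 2: "a b" stayed one word
```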

Commands executed with arguments in shell script is escaped with single quotes

I have managed to track down a weird problem in an init script I am working on. I have simplified the problem down in the following example:
> set -x # <--- Make Bash show the commands it runs
> cmd="echo \"hello this is a test\""
+ cmd='echo "hello this is a test"'
> $cmd
+ echo '"hello' this is a 'test"' # <--- Where have the single quotes come from?
"hello this is a test"
Why is bash inserting those extra single quotes into the executed command?
The extra quotes don't cause any problems in the above example, but they are really giving me a headache.
For the curious, the actual problem code is:
cmd="start-stop-daemon --start $DAEMON_OPTS \
--quiet \
--oknodo \
--background \
--make-pidfile \
$* \
--pidfile $CELERYD_PID_FILE
--exec /bin/su -- -c \"$CELERYD $CELERYD_OPTS\" - $CELERYD_USER"
Which produces this:
start-stop-daemon --start --chdir /home/continuous/ci --quiet --oknodo --make-pidfile --pidfile /var/run/celeryd.pid --exec /bin/su -- -c '"/home/continuous/ci/manage.py' celeryd -f /var/log/celeryd.log -l 'INFO"' - continuous
And therefore:
/bin/su: invalid option -- 'f'
Note: I am using the su command here as I need to ensure the user's virtualenv is setup before celeryd is run. --chuid will not provide this
Because when you try to execute your command with
$cmd
only one layer of expansion happens. $cmd contains echo "hello this is a test", which is expanded into 6 whitespace-separated tokens:
echo
"hello
this
is
a
test"
and that's what the set -x output is showing you: it's putting single quotes around the tokens that contain double quotes, in order to be clear about what the individual tokens are.
If you want $cmd to be expanded into a string which then has all the bash quoting rules applied again, try executing your command with:
bash -c "$cmd"
or (as @bitmask points out in the comments, and this is probably more efficient)
eval "$cmd"
instead of just
$cmd
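A quick way to see all three behaviours side by side:

```shell
#!/bin/bash
# Sketch of the three invocation forms discussed above.
cmd='echo "hello this is a test"'

$cmd            # one expansion layer: prints "hello this is a test", quotes included
eval "$cmd"     # re-parsed as shell code: prints hello this is a test
bash -c "$cmd"  # same result, evaluated in a child shell
```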
Use Bash arrays to achieve the behavior you want, without resorting to the very dangerous (see below) eval and bash -c.
Using arrays:
declare -a CMD=(echo --test-arg \"Hello\ there\ friend\")
set -x
echo "${CMD[@]}"
"${CMD[@]}"
outputs:
+ echo echo --test-arg '"Hello there friend"'
echo --test-arg "Hello there friend"
+ echo --test-arg '"Hello there friend"'
--test-arg "Hello there friend"
Be careful to ensure that your array expansion is wrapped in double quotes; otherwise Bash word-splits the expanded elements again (the single quotes in the set -x output merely show the resulting token boundaries):
declare -a CMD=(echo --test-arg \"Hello\ there\ friend\")
set -x
echo "${CMD[@]}"
${CMD[@]}
outputs:
+ echo echo --test-arg '"Hello there friend"'
echo --test-arg "Hello there friend"
+ echo --test-arg '"Hello' there 'friend"'
--test-arg "Hello there friend"
ASIDE: Why is eval dangerous?
eval is only safe if you can guarantee that every input passed to it will not unexpectedly change the way that the command under eval works.
Example: As a totally contrived example, let's say we have a script that runs as part of our automated code deployment process. The script sorts some input (in this case, three lines of hardcoded text) and writes the sorted text to a file whose name is based on the current directory name. Similar to the original SO question posed here, we want to dynamically construct the --output= parameter passed to sort, but we must (must? not really) rely on eval because of Bash's auto-quoting "safety" feature.
echo $'3\n2\n1' | eval sort -n --output="$(pwd | sed 's:.*/::')".txt
Running this script in the directory /usr/local/deploy/project1/ results in a new file being created at /usr/local/deploy/project1/project1.txt.
So if a user were somehow able to create a project subdirectory named owned.txt; touch hahaha.txt; echo, the script would actually run the following series of commands:
echo $'3\n2\n1'
sort -n --output=owned.txt; touch hahaha.txt; echo .txt
As you can see, that's totally not what we want. But you may ask: in this contrived example, isn't it unlikely that the user could create a project directory named owned.txt; touch hahaha.txt; echo, and if they could, aren't we in trouble already?
Maybe, but what about a scenario where the script is parsing not the current directory name, but instead the name of a remote git source code repository branch? Unless you plan to be extremely diligent about restricting or sanitizing every user-controlled artifact whose name, identifier, or other data is used by your script, stay well clear of eval.
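For what it's worth, the contrived example can be written without eval at all, since an ordinary variable expansion is never re-parsed as shell code. A sketch (it uses a scratch directory via mktemp so the demo is self-contained; the --output option is GNU sort's, as in the example above):

```shell
#!/bin/bash
# Derive the output file name from the directory name without eval.
# Even a hostile directory name like 'owned.txt; touch hahaha.txt; echo'
# would reach sort as one inert argument, not as shell code.
cd "$(mktemp -d)"                      # scratch directory for the demo
outfile="$(basename "$PWD").txt"
printf '3\n2\n1\n' | sort -n --output="$outfile"
cat "$outfile"                         # prints 1, 2, 3, one per line
```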

Does semicolon split multiple commands in double quoted string?

The command
cd /tmp; echo Hello
generates
Hello
Quoted, the command
"cd /tmp; echo Hello"
generates
-bash: cd /tmp; echo Hello: No such file or directory
Any idea why this is so? I am trying to use the quotes so I can build up a command chain and pass it through ssh on to a remote host. Thank you.
Quotes don't define strings; they define words, so in this case your command consists of exactly one word (containing lots of whitespace in addition to a ;). The first (non-assignment) word on a command line is treated as the name of the command, resulting in the error you see.
ssh works differently because the entire string is passed to a second shell on the remote end to be evaluated again. Just like you can run sh -c "cd /tmp; echo hello" on your local host, the following two commands are roughly equivalent:
ssh host "cd /tmp; echo hello"
ssh host sh -c "cd /tmp; echo hello"
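The local sh -c form of this is easy to verify:

```shell
#!/bin/sh
# The whole string is ONE argument to sh -c; the inner shell re-parses
# it, so the semicolon separates two commands there.
sh -c "cd /tmp; echo hello"       # prints: hello

# Without -c, the same string is a single word looked up as a command
# name, reproducing the "No such file or directory" error.
"cd /tmp; echo hello" 2>/dev/null || echo "not a command"
```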
The semicolon is interpreted literally inside double quotes.
More explanation can be found here: https://www.gnu.org/software/bash/manual/html_node/Double-Quotes.html

Pass all args to a command called in a new shell using bash -c

I've simplified my example to the following:
file1.sh:
#!/bin/bash
bash -c "./file2.sh $@"
file2.sh:
#!/bin/bash
echo "first $1"
echo "second $2"
I expect that if I call ./file1.sh a b to get:
first a
second b
but instead I get:
first a
second
In other words, my later arguments after the first one are not getting passed through to the command that I'm executing inside a new bash shell. I've tried many variations of removing and moving around the quotation marks in the file1.sh file, but haven't got this to work.
Why is this happening, and how do I get the behavior I want?
(UPDATE - I realize it seems pointless that I'm calling bash -c in this example, my actual file1.sh is a proxy script for a command that gets called locally to run in a docker container so it's actually docker exec -i mycontainer bash -c '')
Change file1.sh to this with different quoting:
#!/bin/bash
bash -c './file2.sh "$@"' - "$@"
The lone - fills $0 of the inner shell, and the outer "$@" populates all its other positional parameters, which the embedded "$@" in the command string then passes on to file2.sh.
You can also make it:
bash -c './file2.sh "$@"' "$0" "$@"
However there is no real need to use bash -c here and you can just use:
./file2.sh "$@"
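A self-contained sketch of the accepted pattern, with a shell function standing in for file2.sh (an assumption made for the demo):

```shell
#!/bin/bash
# The lone - fills $0 of the inner shell; the outer "$@" fills its
# $1, $2, ... with word boundaries intact, so an argument containing
# spaces survives the trip through bash -c.
outer () {
    bash -c 'printf "first %s\nsecond %s\n" "$1" "$2"' - "$@"
}

outer a "b c"    # prints: first a / second b c
```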

How do I get Tcl's exec to run a command whose arguments have quoted strings with spaces?

I want to use pgrep to find the pid of a process from its command-line. In the shell, this is done as so:
pgrep -u andrew -fx 'some_binary -c some_config.cfg'
But when I try this from Tcl, like this:
exec pgrep -u $user -fx $cmdLine
I get:
pgrep: invalid option -- 'c'
Which makes sense, because it's seeing this:
pgrep -u andrew -fx some_binary -c some_config.cfg
But it's the same when I add single quotes:
exec pgrep -u $user -fx '$cmdLine'
And that also makes sense, because single quotes aren't special to Tcl. I think it considers 'some_binary one argument, then -c, then some_config.cfg'.
I've also tried:
exec pgrep -u $user -fx {$cmdLine}
and
set cmd "pgrep -u $user -fx '$cmdLine'"
eval exec $cmd
to no avail.
From my reading it seems the {*} feature in Tcl 8.5+ might help, but my company infrastructure runs Tcl 8.0.5.
The problem is partially that ' means nothing at all to Tcl, and partially that you're losing control of where the word boundaries are.
Firstly, double check that this actually works:
exec pgrep -u $user -fx "some_binary -c some_config.cfg"
or perhaps this (Tcl uses {…} like Unix shells use single quotes but with the added benefit of being nestable; that's what braces really do in Tcl):
exec pgrep -u $user -fx {some_binary -c some_config.cfg}
What ought to work is this:
set cmdLine "some_binary -c some_config.cfg"
exec pgrep -u $user -fx $cmdLine
where you have set cmdLine to exactly the characters that you want to have in it (check by printing out if you're unsure; what matters is the value in the variable, not the quoted version that you write in your script). I'll use the set cmdLine "…" form below, but really use whatever you need for things to work.
Now, if you are going to be passing this past eval, then you should use list to add in the extra quotes that you need to make things safe:
set cmdLine "some_binary -c some_config.cfg"
set cmd [list pgrep -u $user -fx $cmdLine]
eval exec $cmd
The list command produces lists, but it uses a canonical form that is also a script fragment that is guaranteed to lack “surprise” substitutions or word boundaries.
If you were on a more recent version of Tcl (specifically 8.5 or later), you'd be able to use expansion. That's designed to specifically work very well with list, and gets rid of the need to use eval in about 99% of all cases. That'd change the:
eval exec $cmd
into:
exec {*}$cmd
The semantics are a bit different except when cmd is holding a canonical list, when they actually run the same operation. (The differences come when you deal with non-canonical lists, where eval would do all sorts of things — imagine the havoc with set cmd {ab [exit] cd}, which is a valid but non-canonical list — whereas expansion just forces things to be a list and uses the words in the list without further interpretation.)
Since you are on a old version, you have to make sure that what eval sees will be converted to a properly quoted Tcl string.
Single quotes do nothing. They are not used by exec, nor are they passed on. exec utilizes the underlying exec(3) system call, and no argument interpretation will take place unless you purposely use something like: /bin/sh -c "some-cmd some-arg" where the shell is invoked and will reinterpret the command line.
What you have to do is construct a string that eval will interpret as a quoted Tcl string. You can use "{part1 part2}" or "\"part1 part2\"" for these constructs.
First, a little test script to verify that the arguments are being passed correctly:
#!/bin/bash
for i in "$@"; do
echo $i
done
Then the Tcl script:
#!/usr/bin/tclsh
exec ./t.sh -u andrew -fx "some_binary -c some_config.cfg" >@ stdout
eval exec ./t.sh -u andrew -fx "{some_binary -c some_config.cfg}" \
>@ stdout
eval exec ./t.sh -u andrew -fx "\"some_binary -c some_config.cfg\"" \
>@ stdout
# the list will be converted to a string that is already properly
# quoted for interpretation by eval.
set cmd [list ./t.sh -u andrew -fx "some_binary -c some_config.cfg"]
eval exec $cmd >@ stdout

Safely pass arguments to a command executed via su

With sudo, it is possible to execute a command as an other user and really safely pass arguments to that command.
Example nasty argument:
nastyArg='"double quoted" `whoami` $(whoami) '"'simple quoted "'$(whoami)'"'"
Expected output, run in a terminal as congelli501:
% echo "$nastyArg"
"double quoted" `whoami` $(whoami) 'simple quoted $(whoami)'
Execute as congelli501, via sudo:
# sudo -u congelli501 -- echo "$nastyArg"
"double quoted" `whoami` $(whoami) 'simple quoted $(whoami)'
Execute as congelli501, via su (usual escape method):
# su congelli501 -c "echo '$nastyArg'"
"double quoted" `whoami` $(whoami) simple quoted congelli501
As you can see, the argument is not safely passed as it is re-interpreted by a shell.
Is there a way to launch a command via su and pass its arguments directly, as you can with sudo?
Passing the command as the shell script argument in su seems to work:
# su congelli501 -s "$(which echo)" -- "$nastyArg" "'another arg'"
"double quoted" `whoami` $(whoami) 'simple quoted $(whoami)' 'another arg'
Example usage:
# Safely execute a command as an other user, via su
#
# $1 -> username
# $2 -> program to run
# $3 .. n -> arguments
function execAs() {
user="$1"; shift
cmd="$1"; shift
su "$user" -s "$(which -- "$cmd")" -- "$@"
}
execAs congelli501 echo "$nastyArg" "'another arg'"
The fundamental problem here is that you are trying to nest quotes. You would hope su -c "stuff "with" double "quotes" to be parsed as |su|, |-c|, |stuff "with" double "quotes"| but it actually gets parsed as |su|, |-c|, |stuff |, |with|, | double|, |quotes| where however the last four tokens are pasted together as one string after evaluation (notice the spaces where the "inner" quotes terminated the ostensible "outer" quotes instead of wrapping inside them).
Within double quotes, `whoami` gets expanded by the shell before anything gets passed to su.
What you can do instead is either (a) add yet more quoting around the values in $nastyArg (pretty much doomed) or (b) define it in the context of the shell executed by su. A distinct third option is to (c) pass in the value in a way which disarms it, such as on standard input.
printf '%s\n' "$nastyArg" | su -c 'cat' congelli501
This may seem overly simplistic, but it really is the way to go when you do not have complete trust in what might end up being evaluated by root.
This is one of the reasons sudo is preferred over su.
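A sketch of the stdin approach, with plain sh standing in for su (since su needs root): the nasty string travels on standard input, so the shell started by -c never parses it as code.

```shell
#!/bin/sh
# Build the hostile argument exactly as in the question, then pass it
# through a -c shell on stdin; it comes out verbatim, unevaluated.
nastyArg='"double quoted" `whoami` $(whoami) '"'simple quoted "'$(whoami)'"'"
printf '%s\n' "$nastyArg" | sh -c 'cat'
```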
