Escaping dollar sign in sh -c - bash

I am working with an application that runs commands in bash. This is the "template" it is using:
sh -c '<command> "<argument>"'
Please note that I cannot edit the quotes; the only things I can edit are the command and the argument. I cannot escape the dollar sign either.
This "template" works fine unless there is a dollar sign in the argument:
sh -c 'echo "x=test$test"'
gives the following output:
x=test
How can I get the exact output, which is:
x=test$test
I could do this if I could switch the quotes:
sh -c "echo '"'x=test$test'"'"
x=test$test
Any way to accomplish this?

If your sh -c command is being run by bash (not by a baseline POSIX shell), and your argument doesn't contain ' literals, then you can do some trickery, as follows:
#!/bin/bash
sh -c 'echo ""'$'\'x=test$test\'""'
That is to say, the string "'$'\' should be prepended to <argument>, and \'" should be appended.
Note that this is ABSOLUTELY NOT SECURE against shell injection attacks. To make it so, we'd need to modify the inner contents of the argument; at bare minimum, replacing \ with \\ and ' with \'.
Even better (particularly from a security perspective) would be to fix the program you're working with to pass arguments out-of-band from code. To do that would mean an argv looking like the following (given below in Python syntax suitable for subprocess.Popen(cmd, shell=False)):
['sh', '-c', 'echo "$1"', '_', 'x=test$test']
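For comparison, the same out-of-band approach typed directly at a shell prompt would look like this (the _ fills $0, and the payload arrives as $1, so it never needs escaping):
sh -c 'echo "$1"' _ 'x=test$test'
which prints x=test$test.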

Related

Strange issue resolving bash environmental variable in nested double quotes

I have a setup script that needs to be run remotely on an arbitrary machine (which can be Windows). So I had something along the lines of bash -c "do things that need environmental variables".
I found some strange things happening with nested quotes + environmental variables that I don't understand (demonstrated below):
# This worked because my environment was polluted.
bash -c "NAME=me echo $NAME"
> me
# I think this was a weird cross platform issue with how I was running.
# I couldn't reproduce it locally.
bash -c "NAME=me echo "Hi $NAME""
> Hi $NAME
# This was my workaround, and I have no clue why this works.
# I get that "Start "" end" does string concatenation in bash,
# but I have no clue why that would make this print 'Hi me' instead
# of 'Hi'.
#
# This works because echo Hi name prints "Hi name". I thought echo only
# took the first argument passed in.
bash -c "NAME=me echo Hi "" $NAME"
> Hi me
# This is the same as the first case. NAME was just empty this time.
bash -c "NAME=me echo Hi $NAME"
> Hi
Edit: A bunch of people have pointed out that the variables get expanded in double quotes before bash -c gets run. This makes sense, but I feel like it doesn't explain why case 1 works.
Shouldn't bash -c "NAME=me echo $NAME" be expanded to bash -c "NAME=me echo ", since NAME isn't set before we run this?
Edit 2: A bunch of this stuff worked because my environment was polluted. I've tried to describe what mistakes I made in my assumptions.
There are at least three sources of confusion here: quotes don't (generally) nest, $variable references are expanded by the shell even if they're in double-quotes, and variable references are resolved before var=value assignments are done.
Let me look at the second problem first. Here's an interactive example showing the effect:
$ NAME=Gordon
$ bash -c "NAME=me echo $NAME"
Gordon
Here, the outer (interactive) shell expanded $NAME before passing it to bash -c, so the command essentially became bash -c "NAME=me echo Gordon". There are several ways to avoid this: you can escape the $ with a backslash (which removes its special meaning to the outer shell; the backslash itself is stripped, so the inner shell sees a plain $ and applies it normally), or use single-quotes instead of double (which remove the special effect of all characters, except for another single-quote which ends the single-quoted string). So let's try those:
$ bash -c "NAME=me echo \$NAME"
$ bash -c 'NAME=me echo $NAME'
(You can't really see it, but there's a blank line after the second command as well, because it didn't print anything either.) What happened here is that the inner shell (the one created by bash -c) indeed got the command NAME=me echo $NAME, but when executing it expands $NAME first (giving nothing, because it's not defined in that shell), and then executes NAME=me echo which runs the echo command with NAME set to "me" in its environment. Let's try that interactively:
$ NAME=me echo $NAME
Gordon
(Remember that I set NAME=Gordon in my interactive shell earlier.) To get the intended effect, you'd need to set NAME and then as a separate command use it in an echo command:
$ bash -c "NAME=me; echo \$NAME"
me
$ bash -c 'NAME=me; echo $NAME'
me
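Incidentally, the prefix assignment really does reach the command's environment; it just isn't visible to the shell's own $NAME expansion. A quick way to check is to use printenv (which reads its environment) instead of echo (which never looks at the variable at all):
$ bash -c 'NAME=me printenv NAME'
me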
Ok, with that out of the way let's move on to the original question about quoting. As I said, quotes don't (generally) nest. To understand what's going on, let's analyze some of the example commands. You can get a better idea how the shell interprets things by using set -x, which makes the shell print each command's equivalent just before it's executed:
$ set -x
$ bash -c "NAME=me echo "Hi $NAME""
+ bash -c 'NAME=me echo Hi' Gordon
Hi
What happened here is that the shell parsed "NAME=me echo "Hi as a double-quoted string immediately followed by two unquoted characters; since there's no gap between them, they get merged into a single argument to bash -c. It may seem a little weird having only part of an argument quoted, but it's actually entirely normal in shell syntax. It's even normal to have part of a single argument be unquoted, part single-quoted, part double-quoted, and even part in ANSI-C mode ($'ANSI-c-escaped stuff goes here').
With set -x, bash will print something equivalent to the command being executed. All of these commands are equivalent in shell syntax:
bash -c "NAME=me echo "Hi Gordon
bash -c "NAME=me echo Hi" Gordon
bash -c 'NAME=me echo Hi' Gordon
bash -c NAME=me\ echo\ Hi Gordon
bash -c NAME=me' 'echo' 'Hi Gordon
bash -c 'NAME=me'\ "echo Hi" Gordon
...and lots more. With set -x, bash will print one of these equivalents, and it just happens to choose the one with single-quotes around the entire argument.
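For example, a single argument can be built from an unquoted part, a single-quoted part, a double-quoted part, and an ANSI-C-quoted part all at once; printf prints each of its arguments on its own line, so one line of output means the pieces fused into one word:
printf '%s\n' plain'single'"double"$'\x41NSI'
That prints plainsingledoubleANSI (the \x41 is just the ANSI-C escape for the letter A).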
Just for completeness, what happened to $NAME""? It's treated as an unquoted variable reference (which expands to Gordon) immediately followed by a zero-length double-quoted string, which doesn't do anything at all.
But... why does that just print "Hi"? Well, bash -c treats the next argument as a command to run, and any further arguments as the argument vector ($0, $1, etc) for that command's environment. Here's an illustration:
$ bash -c 'echo "Args: $0 $1 $2"' zeroth first second third
+ bash -c 'echo "Args: $0 $1 $2"' zeroth first second third
Args: zeroth first second
("third" doesn't get printed because the command doesn't print $3.)
Thus, when you run bash -c 'NAME=me echo Hi' Gordon, it executes NAME=me echo Hi with $0 set to "Gordon".
Ok, here's the last example I'll look at:
$ bash -c "NAME=me echo Hi "" $NAME"
+ bash -c 'NAME=me echo Hi Gordon'
Hi Gordon
What's happening here is that there's a double-quoted section "NAME=me echo Hi " immediately followed by another one, " $NAME", so they get merged into a single long argument (which happens to contain two spaces in a row -- one part of the first quoted section, one part of the second). Essentially, the "" in the middle ends one double-quotes section and immediately starts another, thus having no overall effect. And again, the shell decided to print a single-quoted equivalent rather than any of the various other possible equivalents.
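You can make the merge (and the doubled space) visible by printing that argument with delimiters around it:
printf '<%s>\n' "NAME=me echo Hi "" $NAME"
With NAME still set to Gordon, that prints <NAME=me echo Hi  Gordon>, with two spaces before Gordon.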
So how do we actually get this to work right? Here's what I'd actually recommend:
$ bash -c 'NAME=me; echo "Hi $NAME"'
+ bash -c 'NAME=me; echo "Hi $NAME"'
Hi me
Since the entire command string is in single-quotes, none of these problems occur. The double-quotes are just normal characters being passed as part of the argument (so double-quotes sort of nest inside single-quotes -- and vice versa -- but it's really just 'cause they're ignored), and the $ doesn't get its special meaning to the outer shell either. Oh, and the ; makes this two separate commands, so the NAME=me part can take effect before the echo "Hi $NAME" part uses it.
Another equivalent would be:
$ bash -c "NAME=me; echo \"Hi \$NAME\""
+ bash -c 'NAME=me; echo "Hi $NAME"'
Hi me
Here the escapes remove the special meanings of the $ and enclosed double-quotes. Note that the shell prints exactly the same thing as last time for its set -x output, indicating that this really is equivalent to the single-quoted version.
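For completeness, another way to sidestep the problem is to pass the value as a positional parameter (the extra _ fills $0), so the command string never needs the value embedded in it at all:
$ bash -c 'NAME=$1; echo "Hi $NAME"' _ me
+ bash -c 'NAME=$1; echo "Hi $NAME"' _ me
Hi me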

How to use echo command to output escape sequence for color

domain="www.google.com"
echo -e "\e[1;34m"$domain"\e[0m"
I expected this to output www.google.com in blue letters.
Instead I got
-e \e[1;34mwww.google.com\e[0m
The environment or shell used can have an effect here; one thing you could do is use ANSI-C quoting:
echo $'\e[1;34m'${domain}$'\e[0m'
Words of the form $'string' are treated specially. The word expands to string, with backslash-escaped characters replaced as specified by the ANSI C standard.
https://www.gnu.org/software/bash/manual/html_node/ANSI_002dC-Quoting.html
If you run a script with sh script.sh, you're explicitly using sh as the shell rather than the one in the shebang line. That's bad news if sh isn't a link to bash. A plain sh shell may not support echo -e.
Type ./script.sh to use the interpreter in the shebang line.
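If portability matters (a plain POSIX sh guarantees neither echo -e nor $'...'), printf with octal escapes is an alternative that avoids both; \033 is the same ESC byte as \e:
printf '\033[1;34m%s\033[0m\n' "$domain"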

Assign a Variable in bash login shell

I am trying to do this from a Windows command prompt:
C:\cygwin64\bin\bash --login -c "$var="<hallo>" && echo "$var""
and I get the error:
The system cannot find the file specified.
but this works:
C:\cygwin64\bin\bash --login -c "var="hello" && echo "$hello""
The login shell seems to cause the problem when it gets a '<'. How can I still assign the string with angle brackets to the shell variable?
When you write
C:\cygwin64\bin\bash --login -c "$var="<hallo>" && echo "$var""
You are expecting the shell to strip off the outer quotes from that argument to -c and end up with a string that looks like
$var="<hallo>" && echo "$var"
but that's not what the shell does.
The shell just matches quotes as it goes along, so the shell sees:
["$var="][<hallo>][" && echo "][$var][""].
You need to escape the inner quotes from the current shell or use different quotes to avoid this parsing problem.
C:\cygwin64\bin\bash --login -c 'var="<hallo>" && echo "$var"'
Note also that I removed the $ from the start of the variable name in the assignment and that I used single quotes on the outside so that the current shell didn't expand $var.
With double quotes on the outside you'd need to use something like this instead.
C:\cygwin64\bin\bash --login -c "var='<hallo>' && echo \"\$var\""
For a similar discussion of shell parsing and how things nest (or don't) with backticks you can see my answer here.

"bash -c" doesn't export vars from sourced scripts

I have an inclusion file test.inc:
export XXX=xxx
I use it when call bash to interpret a string:
bash -c ". test.inc; echo $XXX"
But the variable is not set at the point of the echo command. If I do 'export' I can see it, though:
bash -c ". test.inc; export"
Shows
declare -x XXX="xxx"
How do I make my first command see the exported variables from sourced files when I use bash -c syntax?
You are using double quotes. Therefore your current shell expands $XXX long before the bash -c instance sees it. Switch to single quotes, or escape the dollar sign.
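For example, either of the following keeps $XXX unexpanded until the inner bash has sourced test.inc:
bash -c '. test.inc; echo $XXX'
bash -c ". test.inc; echo \$XXX"
Both print xxx.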

Emulating sudo's behaviour with su

I'm trying to write a wrapper around su to make it more like sudo, so that su_wrapper foo bar baz == su -c "foo bar baz".
However, I'm not sure how to approach this problem. I came up with this:
su_wrapper ()
{
    su -c "$@"
}
However, in the above, only a single argument can be passed; this fails with multiple arguments (as su sees them as its own arguments).
There's also another problem: since the argument is passed through the shell, I think I must explicitly specify the shell to avoid other problems. Perhaps what I want to do could be expressed in pseudo-bash(!) as su -c 'bash -c "$@"'.
So, how could I make it accept multiple arguments?
Use printf "%q" to escape the parameters so they can safely be combined into the single command string passed to su -c:
su_wrapper() {
    su -s /bin/bash -c "$(printf "%q " "$@")"
}
Unlike $*, this works even when the parameters contain special characters and whitespace.
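To see what that escaping produces, you can run the printf part by itself (the path below is just an example, and the exact quoting style may vary a little between bash versions):
printf "%q " touch '/tmp/a file with spaces'; echo
touch /tmp/a\ file\ with\ spaces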
You need $* and not $@:
su_wrapper() {
    local IFS=' '
    su -c "$*"
}
See the Special Parameters section in the Bash Reference Manual for the difference between $* and $@.
I added local IFS=' ' just in case IFS is set to something else (after reading what $* does, it should be clear why you want to make sure that IFS is set to a space).
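If the difference is hard to picture, a small throwaway function (show is just an illustrative name) makes it visible without involving su: "$*" joins the positional parameters into one word using the first character of IFS, while "$@" keeps them as separate words:
show() { printf 'joined: <%s>\n' "$*"; printf 'separate: <%s>\n' "$@"; }  # illustrative helper
show foo 'bar baz'
joined: <foo bar baz>
separate: <foo>
separate: <bar baz>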
