Everybody says eval is evil and that you should use $() as a replacement. But I've run into a situation where quote removal isn't handled the same way inside $().
The background is that I've been burned too often by file paths with spaces in them, so I like to quote all such paths. More paranoia: I want to know where all my executables are coming from. Even more paranoid: I don't trust myself, so I like being able to display the constructed commands I'm about to run.
Below I try variations on using eval vs. $(), and on whether the command name is quoted (because it could contain spaces).
BIN_LS="/bin/ls"
thefile="arf"
thecmd="\"${BIN_LS}\" -ld -- \"${thefile}\""
echo -e "\n Running command '${thecmd}'"
$($thecmd)
Running command '"/bin/ls" -ld -- "arf"'
./foo.sh: line 8: "/bin/ls": No such file or directory
echo -e "\n Eval'ing command '${thecmd}'"
eval $thecmd
Eval'ing command '"/bin/ls" -ld -- "arf"'
/bin/ls: cannot access arf: No such file or directory
thecmd="${BIN_LS} -ld -- \"${thefile}\""
echo -e "\n Running command '${thecmd}'"
$($thecmd)
Running command '/bin/ls -ld -- "arf"'
/bin/ls: cannot access "arf": No such file or directory
echo -e "\n Eval'ing command '${thecmd}'"
eval $thecmd
Eval'ing command '/bin/ls -ld -- "arf"'
/bin/ls: cannot access arf: No such file or directory
$("/bin/ls" -ld -- "${thefile}")
/bin/ls: cannot access arf: No such file or directory
So... this is confusing. A quoted command path is valid everywhere except inside a $() construct? A shorter, more direct example:
$ c="\"/bin/ls\" arf"
$ $($c)
-bash: "/bin/ls": No such file or directory
$ eval $c
/bin/ls: cannot access arf: No such file or directory
$ $("/bin/ls" arf)
/bin/ls: cannot access arf: No such file or directory
$ "/bin/ls" arf
/bin/ls: cannot access arf: No such file or directory
How does one explain the simple $($c) case?
The use of " to quote words is part of your interaction with Bash. When you type
$ "/bin/ls" arf
at the prompt, or in a script, you're telling Bash that the command consists of the words /bin/ls and arf, and the double-quotes are really emphasizing that /bin/ls is a single word.
When you type
$ eval '"/bin/ls" arf'
you're telling Bash that the command consists of the words eval and "/bin/ls" arf. Since the purpose of eval is to pretend that its argument is an actual human-input command, this is equivalent to running
$ "/bin/ls" arf
and the " gets processed just like at the prompt.
Note that this pretense is specific to eval; Bash doesn't usually go out of its way to pretend that something was an actual human-typed command.
When you type
$ c='"/bin/ls" arf'
$ $c
the $c gets substituted, and then undergoes word splitting (see §3.5.7 "Word Splitting" in the Bash Reference Manual), so the words of the command are "/bin/ls" (note the double-quotes!) and arf. Needless to say, this doesn't work. (It's also not very safe, since in addition to word-splitting, $c also undergoes filename-expansion and whatnot. Generally your parameter-expansions should always be in double-quotes, and if they can't be, then you should rewrite your code so they can be. Unquoted parameter-expansions are asking for trouble.)
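If you want to see exactly which words Bash produces from the unquoted expansion, one quick check (my own illustration, not part of the original example) is to hand them to printf, one per placeholder:
$ c='"/bin/ls" arf'
$ printf '<%s> ' $c; echo
<"/bin/ls"> <arf>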
When you type
$ c='"/bin/ls" arf'
$ $($c)
this is the same as before, except that now you're also trying to use the output of the nonworking command as a new command. Needless to say, that doesn't cause the nonworking command to suddenly work.
As Ignacio Vazquez-Abrams says in his answer, the right solution is to use an array, and handle the quoting properly:
$ c=("/bin/ls" arf)
$ "${c[#]}"
which sets c to an array with two elements, /bin/ls and arf, and uses those two elements as the word of a command.
With the fact that it doesn't make sense in the first place. Use an array instead.
$ c=("/bin/ls" arf)
$ "${c[#]}"
/bin/ls: cannot access arf: No such file or directory
From the man page for bash, regarding eval:
eval [arg ...]:
The args are read and concatenated together into a single command.
This command is then read and executed by the shell, and its exit
status is returned as the value of eval.
When c is defined as "\"/bin/ls\" arf", the outer quotes will cause the entire thing to be processed as the first argument to eval, which is expected to be a command or program. You need to pass your eval arguments in such a way that the target command and its arguments are listed separately.
The $(...) construct behaves differently than eval because it is not a command that takes arguments. It can process the entire command at once instead of processing arguments one at a time.
A note on your original premise: the main reason people say that eval is evil is that it is commonly used by scripts to execute a user-provided string as a shell command. While handy at times, this is a major security problem (there's typically no practical way to safety-check the string before executing it). That security problem doesn't apply if you are using eval on hard-coded strings inside your script, as you are doing. However, it's typically easier and cleaner to use $(...) or `...` inside of scripts for command substitution, leaving few real use cases for eval.
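For instance, a minimal sketch of the injection problem (the variable name userinput and its value are made up for illustration):
$ userinput='foo; echo pwned'   # imagine this arrived from an untrusted source
$ eval "ls $userinput"          # the semicolon is re-parsed, so the second command runs too
ls: cannot access foo: No such file or directory
pwned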
Using set -vx helps us understand how Bash processes the command string.
Running the examples with tracing on shows that "command" works because the quotes are stripped during processing. However, when $c (quoted twice) is used, only the outer single quotes are removed. eval, by contrast, takes the string as its argument and removes the outer quotes step by step.
It is probably just related to how Bash semantically processes the string and its quotes.
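Since the picture isn't reproduced here, a rough sketch of what the -x part of the trace looks like (exact xtrace formatting varies between Bash versions):
$ set -x
$ c='"/bin/ls" arf'
+ c='"/bin/ls" arf'
$ $c
+ '"/bin/ls"' arf
-bash: "/bin/ls": No such file or directory
$ eval $c
+ eval '"/bin/ls"' arf
+ /bin/ls arf
/bin/ls: cannot access arf: No such file or directory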
Bash does have many weird behaviours around quote processing:
Bash inserting quotes into string before execution
How do you stop bash from stripping quotes when running a variable as a command?
Bash stripping quotes - how to preserve quotes
Related
What is the correct way to call some command stored in variable?
Are there any differences between 1 and 2?
#!/bin/sh
cmd="ls -la $APPROOTDIR | grep exception"
#1
$cmd
#2
eval "$cmd"
Unix shells apply a series of transformations to each line of input before executing it. For most shells it looks something like this (taken from the Bash man page):
initial word splitting
brace expansion
tilde expansion
parameter, variable and arithmetic expansion
command substitution
secondary word splitting
path expansion (aka globbing)
quote removal
Using $cmd directly gets it replaced by your command during the parameter expansion phase, and it then undergoes all following transformations.
Using eval "$cmd" does nothing until the quote removal phase, where $cmd is returned as is, and passed as a parameter to eval, whose function is to run the whole chain again before executing.
So basically, they're the same in most cases and differ when your command makes use of the transformation steps up to parameter expansion. For example, using brace expansion:
$ cmd="echo foo{bar,baz}"
$ $cmd
foo{bar,baz}
$ eval "$cmd"
foobar foobaz
If you just do eval $cmd with cmd="ls -l" (interactively or in a script), you get the desired result. In your case, though, the string contains a pipe: with plain $cmd the |, grep and its pattern are handed to ls as literal arguments, so ls fails with an error message instead of running a pipeline; eval re-parses the string, so the pipeline actually runs.
So try using eval (see "The args are read and concatenated together" in the man page) with a complete, working command, not one that generates an error message.
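For example (a sketch, using /tmp just as a stand-in directory):
$ cmd='ls -ld /tmp | grep tmp'
$ $cmd          # '|', 'grep' and 'tmp' are passed to ls as literal arguments,
                # so ls complains about nonexistent files instead of piping anything
$ eval "$cmd"   # the string is re-parsed, so the pipe really is a pipe and grep runs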
$cmd would just replace the variable with its value to be executed on the command line.
eval "$cmd" does variable expansion and command substitution before executing the resulting value on the command line.
The second method is helpful when you want to run commands that aren't flexible, e.g.
a loop of the form for i in {$a..$b} won't work, because brace expansion happens before variable expansion. In this case, a pipe to bash or eval is a workaround (see the sketch below).
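A minimal sketch of that workaround (the variable names a and b are just examples):
a=1; b=3
for i in {$a..$b}; do echo "$i"; done              # prints the literal string {1..3}
eval "for i in {$a..$b}; do echo \"\$i\"; done"    # prints 1, 2 and 3
for i in $(seq "$a" "$b"); do echo "$i"; done      # alternative without eval, if seq is available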
Tested on Mac OSX 10.6.8, Bash 3.2.48
I think you should put backtick (`) symbols around your variable.
What else could be going wrong? Sorry, I'm pretty new to programming, so I'm not sure if this is the proper way to frame my question.
Here is the code from the terminal file:
echo "Patcher Coded by _Retro_"
PLACE=`dirname $0`
ROM=`ls ${PLACE}/Rom/*.nds | head -n 1`
PATCH=`ls ${PLACE}/Patch/*.* | head -n 1`
NAME=${ROM%.[^.]*}
$PLACE/xdelta3 -dfs $ROM $PATCH $NAME-patched.nds
Your script says this:
PLACE=`dirname $0`
First, the shell performs parameter expansion. That means (in this case) it expands $0. The variable $0 expands to the path used by the shell to execute your script, so that line expands to this:
PLACE=`dirname /Users/ShakeyBanks/Desktop/Perfect Heart CE./DS_Rom_Patcher/Rom_Patcher`
Note that there are no backslashes in the expansion! The backslashes were consumed by your interactive shell before starting the script.
Then the shell performs command substitution: it executes the command enclosed in `...`. The shell splits the command on spaces, so the command contains four words. The first word is the program to run, and the remaining three words are arguments to that command:
dirname
/Users/ShakeyBanks/Desktop/Perfect
Heart
CE./DS_Rom_Patcher/Rom_Patcher
The problem here is that the dirname program only wants one argument, but you're passing it three. It detects this and fails with an error:
usage: dirname path
To fix this, quote the $0 with double-quotes, like this:
PLACE=`dirname "$0"`
You also need to quote all subsequent uses of $PLACE, ${PLACE}, $ROM, $PATCH, and $NAME with double-quotes, because they will have the same problem.
OR, rename your directory to not contain spaces.
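For reference, a fully quoted version of the script might look like this (a sketch that keeps the original ls-based globbing; it still assumes filenames without newlines):
echo "Patcher Coded by _Retro_"
PLACE=`dirname "$0"`
ROM=`ls "${PLACE}"/Rom/*.nds | head -n 1`
PATCH=`ls "${PLACE}"/Patch/*.* | head -n 1`
NAME=${ROM%.[^.]*}
"$PLACE"/xdelta3 -dfs "$ROM" "$PATCH" "$NAME-patched.nds"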
The POSIX shell standard at
http://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3_chap02.html#tag_18_07_04
says in section 2.6.3:
Command substitution allows the output of a command to be substituted in place of the command name itself
This would seem to imply that it is only guaranteed to work if you substitute for the whole command name; if you substitute for part of it, or for something else, then it may or may not work.
Experimenting:
$ echo ;
$ $(echo echo) ;
$
So far so good...
$ e$(echo cho) ;
$ echo $(echo ';')
;
$ echo $(echo foobar)
foobar
The first and third example above seem to "work" but the second "does not work". Is this all simply undocumented and random behavior, as the standard seems to imply, and in reality for some other POSIX shell, none of these three are guaranteed to "work"?
(By "work", I mean "produce the same result as if the results of the substitution were typed in the command itself on the terminal")
The reason why this:
$ echo $(echo ';')
Does not output the same as this:
$ echo ;
Is the same reason why this:
$ ;
bash: syntax error near unexpected token `;'
Does not output the same as this:
$ `echo ';'`
;: command not found
In the latter case, the output of the command substitution (the would-be token ;) is interpreted as a command name rather than as the shell's ; operator, because the results of a substitution are not re-scanned for shell operators; they only undergo field splitting and pathname expansion.
In my interpretation, this isn't against the standard.
EDIT: Answering the question
You stated:
This would seem to imply that it is only guaranteed to work, if you
substitute for the whole command name; if you substitute for a part of
it, or something else, then it may or may not work.
And the POSIX standard states:
The shell shall expand the command substitution by executing command
in a subshell environment (see Shell Execution Environment) and
replacing the command substitution (the text of command plus the
enclosing "$()" or backquotes) with the standard output of the
command, removing sequences of one or more <newline> characters at the
end of the substitution.
The standard seems clear about what happens: it is not a question of what's present in the command containing the command substitution, only of the command that's inside the enclosing $() or backquotes.
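A quick illustration of that (my own example): the ; produced by a substitution is handed to the command as data, not re-read as an operator:
$ echo $(echo 'foo; echo bar')
foo; echo bar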
I found this as a suggestion of how to store the output of "eval" into a variable called line. So, what's the use of \$$?
command = "some command"
line = $(eval \$$command)
The \$ prevents the shell from trying to treat the $ as the beginning of a parameter expansion. However, the code as a whole doesn't do anything useful. After fixing the whitespace issues and adding a real command to the example, your code looks like
command="ls -l"
line=$(eval \$$command)
command is simply a string ls -l. To evaluate the next line, the shell first evaluates the command substitution. The first step is to expand the parameter command, yielding line=$(eval \$ls -l). Quote removal gets rid of the backslash, so eval receives the arguments $ls and -l. Since ls presumably is not a variable, $ls is expanded to the empty string, and eval is left simply with -l to execute. There being no such command, you get an error.
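If you actually run those two lines, the failure looks roughly like this (the exact message wording can vary between Bash versions):
$ command="ls -l"
$ line=$(eval \$$command)
bash: -l: command not found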
You might think, then, that the correct form is simply
line=$(eval $command)
or slightly better
line=$(eval "$command")
That will work for simple cases, but not in general. This has been hashed over many times in many questions; see Bash FAQ 50, "I'm trying to put a command in a variable, but the complex cases always fail!" for the details.
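For completeness, a sketch of the array-based alternative that the FAQ recommends (the names cmd and line here are just examples):
cmd=(ls -l /tmp)        # each word of the command is a separate array element
line=$("${cmd[@]}")     # expands to exactly those words, spaces and all
printf '%s\n' "$line"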
To answer the literal question, though, \$$ is useful for outputting the string $$ instead of expanding it to the current process ID:
# The exact output will vary
$ echo $$
86542
# Literal dollar signs
$ echo \$\$
$$
# Escaping either dollar sign is sufficient
$ echo \$$ $\$
$$ $$
I used this command in my Bash Shell:
printf $VAR1 >> `printf $VAR2`
and it normally worked. But when I write this into the script file and run it in Shell, it does not work. File "script.sh" contains this:
#!/bin/bash
printf $VAR1 >> `printf $VAR2`
and the output in Shell is:
script.sh: line 2: `printf $VAR2`: ambiguous redirect
I don't know how this is possible, because the command is exactly the same. And of course, I ran the script on the same system and in the same shell window.
Thank you for your help.
There are 3 points worth addressing here:
Shell variables vs. environment variables:
Scripts (unless invoked with . / source) run in a child process that only sees the parent [shell]'s environment variables, not its regular shell variables.
This is what likely happened in the OP's case: $VAR1 and $VAR2 existed as regular shell variables, but not environment variables, so script script.sh didn't see them.
Therefore, for a child process to see a parent shell's shell variables, the parent must export them first, as a result of which they (also) become environment variables: export VAR1=... VAR2=...
Bash's error messages relating to output redirection (>, >>):
If the filename argument of an output redirection is an unquoted command substitution (`...`, or its modern equivalent, $(...)) - i.e., the output of a command - Bash reports the error ambiguous redirect in the following cases:
The command output has embedded whitespace, i.e., contains more than one word.
The command output is empty, which is what likely happened in the OP's case.
As an aside: In this case, the error message's wording is unfortunate, because there's nothing ambiguous about a missing filename - it simply cannot work, because files need names.
It is generally advisable to double-quote command substitutions (e.g., >> "$(...)") and also variable references (e.g., "$VAR2"): this will allow you to return filenames with embedded whitespace, and, should the output be unexpectedly empty, you'll get the (slightly) more meaningful error message No such file or directory.
Not double-quoting a variable reference or command substitution subjects its value to so-called shell expansions: further, often unintended, interpretation by the shell.
The wisdom of using a command substitution to generate a filename:
Leaving aside that printf $VAR2 is a fragile way to print the value of variable $VAR2 in general (the robust form again involves double-quoting: printf "$VAR2", or, even more robustly, to rule out inadvertent interpretation of escape sequences in the variable value, printf %s "$VAR2"), there is no good reason to employ a command substitution to begin with if all that's needed is a variable's value:
>> "$VAR2" is enough to robustly specify the value of variable $VAR2 as the target filename.
I tried this on my Mac (10.11.1) in a terminal window and it worked fine.
Are you sure your default shell is bash?
echo $SHELL
Did you use export to set your shell vars?
$ export VAR1="UselessData"
$ export VAR2="FileHoldingUselessData"
$ ./script.sh
$ cat FileHoldingUselessData
UselessData$
However, I think echo does a better job here, since with printf the (unquoted) output terminates at the first space, so....
$ cat script.sh
#!/bin/bash
echo $VAR1 >> `printf $VAR2`
$ ./script.sh
$ cat FileHoldingUselessData
Some Useless Data
Which leads me to believe you might want to just use echo instead of printf altogether.
#!/bin/bash
echo $VAR1 >> `echo $VAR2`