Is it necessary to quote command substitutions during variable assignment in bash?

Nearly everywhere I've read, including Google's bash scripting style guide, mentions the necessity of quoting command substitutions (except when splitting is specifically desired, of course).
I understand the when/where/why of quoting command substitutions during general use. For example: echo "$(cat <<< "* useless string *")" rather than echo $(...)
However, for variable assignments specifically, I have seen many examples such as:
variable="$(command)"
Yet I have found no instances where variable=$(command) is not equivalent.
variable="$(echo "*")" and variable=$(echo "*") both set the value to '*'.
Can anyone give any situations where leaving the substitution unquoted during variable assignment would actually cause a problem?

The shell does not perform word splitting for variable assignments (that behavior is standardized by POSIX, so you can rely on it). Thus you do not need double quotes (though you can add them without changing the result) in
variable=$(command) # same as variable="$(command)"
However, word-splitting is performed before executing commands, so in
echo $(command)
echo "$(command)"
the result may be different. The latter preserves all whitespace in the output, while the former word-splits it, passing each word to echo as a separate argument. It is up to you to decide which is the desired behavior.
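As a small illustration (the filler text is arbitrary):
out=$(printf 'one   two')   # assignment: no quotes needed, spacing preserved
echo "$out"                 # prints: one   two
echo $out                   # prints: one two   (word splitting collapses the spaces)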
Interesting shell quirk: there is one more place where quoting a substitution or not makes no difference, namely the expression in a case expr in construct.
case $FOO in
(frob) ...;;
esac
is indistinguishable from
case "$FOO" in
(frob) ...;;
esac
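For example, even an unquoted value containing spaces and glob characters is matched as a single literal string (a small sketch):
FOO='two  words  *'
case $FOO in
  ('two  words  *') echo "matched as one literal string";;
  (*) echo "something else";;
esac
# prints: matched as one literal string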

When using BASH, those two lines are 100% equivalent:
variable="$(command)"
variable=$(command)
while these two aren't:
echo $(command)
echo "$(command)"
Alas, the human brain isn't a computer, and it's especially unreliable when it comes to repeating a job.
So chances are that if you mix styles (i.e. you quote in command arguments but not in variable assignments), you'll get it wrong once in a while.
Worse, once in a while you will actually want the command result expanded into individual words, and the next person who reads your code will wonder whether they're looking at a bug.
Conclusion: since the first two lines are the same, it's easier on the average brain to always quote $() (even when it's not necessary), so that you are sure to quote when you have to.

I tested with many different commands and the result is the same; I can't find a difference.
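For instance, a quick check along these lines (a minimal sketch; the test string is arbitrary) shows no difference even with whitespace and glob characters:
var1=$(printf 'a  b  *')
var2="$(printf 'a  b  *')"
[ "$var1" = "$var2" ] && echo "identical"   # prints: identical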

Related

Why are quotes preserved when using bash $() syntax, but not if executed manually?

I have the following bash script:
$ echo $(dotnet run --project Updater)
UPDATE_NEEDED='0' MD5_SUM="7e3ad68397421276a205ac5810063e0a"
$ export UPDATE_NEEDED='0' MD5_SUM="7e3ad68397421276a205ac5810063e0a"
$ echo $UPDATE_NEEDED
0
$ export $(dotnet run --project Updater)
$ echo $UPDATE_NEEDED
'0'
Why is $UPDATE_NEEDED 0 after the 3rd command, but '0' after the 5th command?
What would I need to do to get it to simply set 0? Using UPDATE_NEEDED=0 instead is not an option, as some of the other variables may contain a space (and I'd like to quote them defensively so that spaces are parsed properly).
Also, this is a bit of a XY problem. If anyone knows an easier way to export multiple variables from an executable that can be used later on in the bash script, that could also be useful.
To expand on the answer by Glenn:
1. When you write something like export UPDATE_NEEDED='0' in Bash code, this is 100% identical to export UPDATE_NEEDED=0. The quotes are used by Bash to parse the command expression, but are then discarded immediately. Their only purpose is to prevent word splitting and to avoid having to escape special characters. In the same vein, the code fragment 'foo bar' is exactly identical to foo\ bar as far as Bash is concerned: both cause the space to be treated as a literal character rather than as a word splitter.
2. Conversely, parameter expansion and command substitution follow different rules and preserve literal quotes.
3. When you use eval, the command line arguments passed to eval are treated as if they were Bash code, and thus follow the same expansion rules as regular Bash code, which leads to the same result as (1).
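A minimal interactive sketch of points (1) and (2), with made-up variable names:
$ export A='0'             # quotes are parsed and removed by the shell: A is 0
$ echo "$A"
0
$ export $(echo "B='0'")   # quotes come out of an expansion: they stay literal
$ echo "$B"
'0'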
Apparently that Updater project is doing the equivalent of
echo "UPDATE_NEEDED=\'0\' MD5_SUM=\"7e3ad68397421276a205ac5810063e0a\""
It's explicitly outputting the quotes.
When you do export UPDATE_NEEDED='0' MD5_SUM="7e3ad68397421276a205ac5810063e0a",
bash will eventually remove the quotes before actually setting the variables.
I agree with @pynexj that eval is warranted here, although additional quoting is recommended:
eval export "$(dotnet ...)"
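If avoiding eval altogether is preferred, one alternative (a sketch only, assuming the Updater can be changed to print plain KEY=VALUE lines, one per line, with no quotes of its own) is to read and export the pairs directly:
# Sketch: values may contain spaces or further '=' signs; everything after the
# first '=' on each line becomes the value.
while IFS='=' read -r name value; do
    export "$name=$value"
done < <(dotnet run --project Updater)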

zsh substitution - what's the difference between $VAR and ${VAR}?

I recently converted a shell script from bash to zsh and got a strange error. I had a command like
HOST="User#1.1.1.1"
scp "$BASE_DIR/path/to/file" $HOST:some\\path
This worked fine in bash, but zsh failed with a bad substitution. I fixed this by changing $HOST to ${HOST}, but I'm curious as to why this was necessary. Also, strangely, I had a few such scp commands, and all of them "worked" except the first one. However, I ended up with a file called User@1.1.1.1 on my filesystem, which was really unexpected. Why did this subtle change make such a big difference?
Two possible problems: (1) an extra '$' at the beginning of the assignment, and (2) embedded spaces.
The first potential problem is an assignment in the style $var=foo. In zsh, as in other sh-like shells (ksh, bash, ...), the assignment syntax is VAR=value, with no $.
The second potential problem is spaces. No spaces are allowed between the variable name, the '=', and the value, and spaces within the value must be escaped (with quotes or a backslash).
Potential correction:
HOST=User@1.1.1.1
scp "$BASE_DIR/path/to/file" $HOST:some\\path
As chepner mentioned in the comments, zsh has modifiers that are added via :. So in $HOST:some, zsh interpreted the :s as a modifier applied to $HOST.
A list of modifiers can be found here: https://web.cs.elte.hu/local/texinfo/zsh/zsh_23.html
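A small illustration of both behaviors (using the values from the question):
HOST=User@1.1.1.1
echo $HOST:some     # zsh tries to parse ":s..." as a modifier -> the "bad substitution" error from the question
echo ${HOST}:some   # the braces end the parameter name, so ":some" stays literal
                    # prints: User@1.1.1.1:some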

prevent script injection when spawning command line with input arguments from external source

I've got a Python script that wraps a bash command line tool and gets its variables from an external source (environment variables). Is there any way to perform some sort of escaping to prevent a malicious user from executing bad code via one of those parameters?
For example, if the script looks like this:
/bin/sh
/usr/bin/tool ${VAR1} ${VAR2}
and someone set VAR2 as follows
export VAR2=123 && \rm -rf /
then the script might not treat VAR2 as pure input and might end up running the rm command.
Is there any way to make the variable non-executable and pass the string as-is to the command line tool as input?
The correct and safe way to pass the values of variables VAR1 and VAR2 as arguments to /usr/bin/tool is:
/usr/bin/tool -- "$VAR1" "$VAR2"
The quotes prevent any special treatment of separator or pattern matching characters in the strings.
The -- should prevent the variable values being treated as options if they begin with - characters. You might have to do something else if tool is badly written and doesn't accept -- to terminate command line options.
See Quotes - Greg's Wiki for excellent information about quoting in shell programming.
Shellcheck can detect many cases where quotes are missing. It's available as either an online tool or an installable program. Always use it if you want to eliminate many common bugs from your shell code.
The curly braces in the line of code in the question are completely redundant, as they usually are. Some people mistakenly think that they act as quotes. To understand their use, see When do we need curly braces around shell variables?.
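For instance (a small sketch with made-up names):
name=world
echo "${name}_backup"   # braces needed here: otherwise bash would look up $name_backup
echo "${VAR1}"          # here they add nothing: "$VAR1" is exactly equivalent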
I'm guessing that the /bin/sh in the question was intended to be a #! /bin/sh shebang. Since the question was tagged bash, note that #! /bin/sh should not be used with code that includes Bashisms. /bin/sh may not be Bash, and even if it is Bash it behaves differently when invoked as /bin/sh rather than /bin/bash.
Note that even if you forget the quotes the line of code in the question will not cause commands (like rm -rf /) embedded in the variable values to be run at that point. The danger is that badly-written code that uses the variables will create and run commands that include the variable values in unsafe ways. See should I avoid bash -c, sh -c, and other shells' equivalents in my shell scripts? for an explanation of (only) some of the dangers.
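A sketch of the kind of unsafe pattern that warning is about (hypothetical code, not from the question):
# Unsafe: the value is pasted into a string that is re-parsed as shell code
sh -c "/usr/bin/tool $VAR2"      # e.g. VAR2='123; rm -rf /' would run the rm
# Safe: the value reaches tool as a single literal argument
/usr/bin/tool -- "$VAR1" "$VAR2"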
To best avoid injections, consider switching to [T]csh.
Unlike Bourne Shells, the C Shell is "limited", thus instructing one to take different, safer paths to write scripts. The "limitations" imposed by the C Shell make it one of the most reliable Shells to work with.
(E.g.: nesting is minimal to impossible, thus preventing injections at all costs; there are better ways to achieve what one wants.)

Bash tilde not expanding in certain arguments, such as --home_dir=~

Bash is not expanding the ~ character in the argument --home_dir=~. For example:
$ echo --home_dir=~
--home_dir=~
Bash does expand ~ when I leave out the hyphens:
$ echo home_dir=~
home_dir=/home/reedwm
Why does Bash have this behavior? This is irritating, as paths with ~ are not expanded when I specify that path as an argument to a command.
bash is somewhat mistakenly treating home_dir=~ as an assignment. As such, the ~ is eligible for expansion:
Each variable assignment is checked for unquoted tilde-prefixes immediately following a : or the first =. In these cases, tilde expansion is also performed.
Since --home_dir is not a valid identifier, that string is not mistaken for an assignment.
Arguably, you have uncovered a bug in bash. (I say arguably, because if you use set -k, then home_dir=~ is an assignment, even though it is after, not before, the command name.)
However, when in doubt, quote a string that is meant to be treated literally whether or not it is subject to any sort of shell processing.
echo '--home_dir=~'
Update: This is intentional, according to the maintainer, to allow assignment-like arguments for commands like make to take advantage of tilde expansion. (And commands like export, which for some reason I was thinking were special because they are builtins, but tilde expansion would have to occur before the actual command is necessarily known.)
Like chepner says in their answer, according to the documentation, it shouldn't expand it even in echo home_dir=~. But for some reason it does expand it in any word that even looks like an assignment, and has done so at least as far back as bash 3.2.
Most other shells also don't expand the tilde except in cases where it really is at the start of the word, so depending on it working might not be such a good idea.
Use "$HOME" instead if you want it to expand, and "~" if you want a literal tilde. E.g.
$ echo "~" --foo="$HOME"
~ --foo=/home/itvirta
(The more complex cases are harder to do manually, but most of the time it's the running user's own home directory one wants.)
Well, that's because in echo --home_dir=~, the '~' does not begin the word and the argument to echo is not a variable assignment. Specifically, man bash under "Tilde Expansion" provides for expansion if:
a word begins with an unquoted tilde character (~); or
a variable assignment has an unquoted tilde-prefix immediately following a : or the first =.
Your case doesn't qualify as either.

Brace expansion with range in fish shell

In bash, I can do the following
$ echo bunny{1..6}
bunny1 bunny2 bunny3 bunny4 bunny5 bunny6
Is there a way to achieve the same result in fish?
The short answer is echo bunny(seq 6)
Longer answer: In keeping with fish's philosophy of replacing magical syntax with concrete commands, we should hunt for a Unix command that substitutes for the syntactic construct {1..6}. seq fits the bill; it outputs numbers in some range, and in this case, integers from 1 to 6. fish (to its shame) omits a help page for seq, but it is a standard Unix/Linux command.
Once we have found such a command, we can leverage command substitutions. The expression (foo)bar performs a command substitution, expanding the output of foo into a list that may contain multiple elements; 'bar' is appended to each element, so the result may be multiple arguments.
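Putting those together reproduces the bash example:
$ echo bunny(seq 6)
bunny1 bunny2 bunny3 bunny4 bunny5 bunny6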
