I have a script like this -
#!/usr/bin/env bash
set -e
volume=$(docker volume inspect volume_name | jq '.[0].Mountpoint')
echo $volume
createdir=$volume/testdir
mkdir -p ${createdir}
But it does not create any directory in the volume path. echo $volume does print the correct path: /var/lib/docker/volumes/volume_name/_data
When I run mkdir -p /var/lib/docker/volumes/volume_name/_data/testdir by hand, it creates the directory. What am I doing wrong with the substitution?
Your issue is that your jq call is missing the -r option, so it outputs a JSON string (quotes included) rather than a raw string usable as a path.
See man jq:
--raw-output / -r:
With this option, if the filter's result is a string then it will be written directly to standard output rather than being formatted as a JSON string with quotes. This can be useful for making jq filters talk to non-JSON-based systems.
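To see why the original script seemed to do nothing: without -r, the variable holds the quotes as literal characters, so the path no longer starts with / and mkdir -p quietly builds the whole tree relative to the current directory, under a directory literally named ". A minimal sketch of what actually happened (safe to run; it only touches a throwaway temp directory):

```shell
cd "$(mktemp -d)"

# This is the value jq prints WITHOUT -r: the quotes are part of it.
volume='"/var/lib/docker/volumes/volume_name/_data"'

# The leading '"' makes this a *relative* path, so mkdir -p succeeds,
# but it creates the tree under a directory named '"' in $PWD.
mkdir -p "$volume/testdir"

ls -d '"'   # a directory literally named " now exists here
```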
Also, to prevent word splitting of paths, always add double quotes around variable expansions.
I detail the cases where double quotes are optional or mandatory in the code's comments. When in doubt, adding double quotes is safe, except in the special cases of:
deliberately wanting word splitting or glob pattern matching;
expanding a variable as a regex pattern.
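A quick self-contained illustration of the word splitting that double quotes prevent:

```shell
path='my dir/file.txt'

# Unquoted: the shell splits the value on whitespace -> two arguments.
printf '<%s>\n' $path
# <my>
# <dir/file.txt>

# Quoted: one argument, exactly as stored.
printf '<%s>\n' "$path"
# <my dir/file.txt>
```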
Here is your code with fixes:
#!/usr/bin/env bash
# Expansion of sub-shell output does not need double quotes,
# because it is a simple variable assignment.
# It would be mandatory if the output was an argument to a command.
volume=$(docker volume inspect volume_name | jq -r '.[0].Mountpoint')
# Here double quotes are technically optional, but good practice anyway.
# If they were absent and volume contained a globbing pattern,
# it would be expanded, which also triggers file-system accesses
# to match the pattern. Better be safe with double quotes.
echo "$volume"
# Here double quotes are optional because it is an assignment and not
# an argument to a command.
createdir=$volume/testdir
# Here double quotes are mandatory,
# as the variable is an argument to the mkdir command.
mkdir -p -- "$createdir"
Related
So I got the following alias:
alias current_dir="pwd | sed -e 's/ /\\ /'"
In another place, I first save the returned string in order to use parameter expansion to lowercase it, like so:
CURRENT_DIR=$(current_dir)
echo "${CURRENT_DIR,,}"
But I wonder if it is possible to use parameter expansion directly on an alias/function call? I tried the following possibilities, but none of them worked for me:
echo "${current_dir,,}" # just an empty echo
echo "${$(current_dir),,}" # bad substitution
echo "${"$(current_dir)",,}" # bad substitution
No, it's not possible. You have to save the output in an intermediate variable. It's unavoidable.
You could use
declare -l CURRENT_DIR=$(current_dir)
although Shellcheck has some sage words about declare and command substitution on the same line (it masks the command's exit status).
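A sketch of the Shellcheck-friendly variant, with a hypothetical current_dir function standing in for the alias (declare -l lowercases the value on every assignment; bash 4+):

```shell
# Hypothetical stand-in for the current_dir alias above.
current_dir() { pwd; }

# Declare the -l (lowercase) attribute first, then assign separately,
# so declare does not mask the command substitution's exit status:
declare -l lower_dir
lower_dir=$(current_dir)
echo "$lower_dir"
```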
However, to get a properly shell-quoted/escaped version of the string, use one of
$ mkdir '/tmp/a dir "with quotes and spaces"'
$ cd !$
$ printf -v CURRENT_DIR "%q" "$PWD"
$ echo "$CURRENT_DIR"
/tmp/a\ dir\ \"with\ quotes\ and\ spaces\"
$ CURRENT_DIR=${PWD@Q}
$ echo "$CURRENT_DIR"
'/tmp/a dir "with quotes and spaces"'
Get out of the habit of using ALLCAPS variable names; leave those as reserved by the shell. One day you'll write PATH=something and then wonder why your script is broken.
${var@operator} showed up in bash 4.4:
${parameter@operator}
Parameter transformation. The expansion is either a transformation of the
value of parameter or information about parameter itself, depending on the
value of operator. Each operator is a single letter:
Q The expansion is a string that is the value of parameter quoted in
a format that can be reused as input.
E The expansion is a string that is the value of parameter with backslash escape sequences expanded as with the $'...' quoting mechanism.
P The expansion is a string that is the result of expanding the value
of parameter as if it were a prompt string (see PROMPTING below).
A The expansion is a string in the form of an assignment statement or
declare command that, if evaluated, will recreate parameter with
its attributes and value.
a The expansion is a string consisting of flag values representing
parameter's attributes.
If parameter is @ or *, the operation is applied to each positional parameter in turn, and the expansion is the resultant list. If parameter is an array variable subscripted with @ or *, the operation is applied to each member of the array in turn, and the expansion is the resultant list.
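A couple of these transformations in action (requires bash 4.4 or newer; a minimal sketch):

```shell
msg=$'tab\there'          # contains a real tab character

# @Q: re-quote the value so it can be reused as shell input.
quoted=${msg@Q}
eval "copy=$quoted"       # round-trips losslessly
[ "$copy" = "$msg" ] && echo "round-trip ok"

# @A: an assignment statement that would recreate the variable.
var=hello
echo "${var@A}"           # var='hello'
```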
My problem boils down to this:
echo $(echo '*')
That outputs the names of all the files in the current directory.
I do not want that. I want a literal asterisk (*).
How do I do this in a generic way?
My above example is simplified. The asterisk is not literally written in my bash script - it comes from the result of another command.
So this is perhaps closer to my real situation:
echo $(my-special-command)
I just want to get the literal output of my-special-command; I do not want any embedded asterisks (or other special characters) to be expanded.
How do I do this in a general-purpose way?
I suppose I could do set -f before running the command, but how can I be sure that covers everything? That turns off pathname expansion, but what about other kinds? I have zero control over what output might be produced by my-special-command, so must be able to handle everything properly.
Just enclose the command substitution in double quotes:
echo "$(my-special-command)"
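A self-contained way to see the difference, using echo '*' in a scratch directory since my-special-command isn't available here:

```shell
cd "$(mktemp -d)" && touch a.txt b.txt

out=$(echo '*')

echo $out      # unquoted: * is glob-expanded -> a.txt b.txt
echo "$out"    # quoted: prints the literal *
```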
It's called globbing; you have multiple ways to prevent it:
echo * # will expand to files / directories
echo "*" # Will print *
echo '*' # Will also print *
In your example you can simply write:
echo "$(echo '*')"
You can also turn off globbing in your script by calling it with bash -f script.sh or inside your code:
#!/usr/bin/env bash
set -f
echo *
From the "Command Substitution" section of the man page:
If the [command] substitution appears within double quotes, word splitting and
pathname expansion are not performed on the results.
By quoting the command expansion, you prevent its result, *, from undergoing pathname expansion.
$ echo "$(echo "*")"
I have a bash script that I want to use to replace some lines with a string, adding a date to the end of the line:
#! /bin/bash
today=`date '+%Y_%m_%d__%H_%M_%S'`;
sed -i '3s/.*/CONFIG_LOCALVERSION=/' ~/Desktop/file1 file2 ...
Also, can I do this for a range of files that start with a string like "file"?
To use variable expansion in bash, the variables must be unquoted or double-quoted; single quotes prevent the expansion. On the other hand, you'd want to avoid glob expansion of the unquoted 3s/.* in case you have a directory named 3s containing files starting with a dot.
Fortunately, you can just chain strings together, so you can do
#!/bin/bash
today=$(date '+%Y_%m_%d__%H_%M_%S');
sed -i '3s/.*/CONFIG_LOCALVERSION='"$today"'/' ~/Desktop/file{1,2,Foo}
Also, can I do this for a range of files that start with a string like "file"?
The glob ~/Desktop/file{1,2,Foo} will expand to ~/Desktop/file1 ~/Desktop/file2 ~/Desktop/fileFoo. If instead you want to match all files on your Desktop with a name starting with 'file', use ~/Desktop/file* instead.
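The difference matters because brace expansion generates the names unconditionally, while a glob only matches files that exist. For instance:

```shell
cd "$(mktemp -d)" && touch file1 fileFoo   # note: no file2

echo file{1,2,Foo}   # brace expansion: file1 file2 fileFoo (file2 need not exist)
echo file*           # glob: file1 fileFoo (only existing files)
```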
Sample bash script
QRY="select * from mysql"
CMD="mysql -e \"$QRY\""
`$CMD`
I get errors because the * is being evaluated as a glob (enumerating files in my CWD).
I have seen other posts that talk about quoting the $CMD reference for echo output, but in this case
"$CMD"
fails because the whole literal string is treated as a single command name.
If I
echo "$CMD"
And then copy/paste it to the command line, things seems to work.
You can just use:
qry='select * from db'
mysql -e "$qry"
This will not be subject to * expansion by the shell.
If you want to store mysql command line also then use BASH arrays:
cmd=(mysql -e "$qry")
"${cmd[@]}"
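For instance, with printf standing in for mysql -e (the array elements are passed through as-is, so the * survives untouched):

```shell
qry='select * from db'

# Build the command as an array: each element becomes one argument.
cmd=(printf '%s\n' "$qry")

# Expanding with "${cmd[@]}" runs it with the arguments intact;
# the * is never word-split or glob-expanded.
"${cmd[@]}"    # prints: select * from db
```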
Note: anubhava's answer has the right solution.
This answer provides background information.
As for why your approach didn't work:
"$CMD" doesn't work, because bash sees the entire value as a single token that it interprets as a command name, which obviously fails.
`$CMD`
i.e., enclosing $CMD in backticks, is pointless in this case (and will have unintended side effects if the command produces stdout output[1]); using just:
$CMD
yields the same - broken - result (only more efficiently - by enclosing in backticks, you needlessly create a subshell; use backticks - or, better, $(...) only when embedding one command in another - see command substitution).
$CMD doesn't work,
because unquoted use of * subjects it to pathname expansion (globbing) - among other shell expansions.
\-escaping glob chars. in the string causes the \ to be preserved when the string is executed.
While it may seem that you've enclosed the * in double quotes by placing it (indirectly) between escaped double quotes (\"$QRY\") inside a double-quoted string, the shell does not see what's between these escaped double quotes as a single, double-quoted string.
Instead, these double quotes become literal parts of the tokens they abut, and the shell still performs word splitting (parsing into separate arguments by whitespace) on the string, and expansions such as globbing on the resulting tokens.
If we assume for a moment that globbing is turned off (via set -f), here is the breakdown of the arguments passed to mysql when the shell evaluates (unquoted) $CMD:
-e # $1 - all remaining arguments are the unintentionally split SQL command.
"select # $2 - note that " has become a literal part of the argument
* # $3
from # $4
mysql" # $5 - note that " has become a literal part of the argument
The only way to get your solution to work with the existing, single string variable is to use eval as follows:
eval "$CMD"
That way, the embedded escaped double-quoted string is properly parsed as a single, double-quoted string (to which no globbing is applied), which (after quote removal) is passed as a single argument to mysql.
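A sketch of that behavior, with printf standing in for mysql -e so it can be run anywhere:

```shell
qry='select * from db'
cmd="printf '%s\n' \"$qry\""   # $qry is expanded into the string, as in the question

# eval re-parses the string, so the embedded double quotes are honored
# and the query is passed as a single, unglobbed argument:
eval "$cmd"    # prints: select * from db
```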
However, eval is generally to be avoided due to its security implications (if you don't (fully) control the string's content, arbitrary commands could be executed).
Again, refer to anubhava's answer for the proper solution.
[1] A note re using `$CMD` as a command by itself:
It causes bash to execute stdout output from $CMD as another command, which is rarely the intent, and will typically result in a broken command or, worse, a command with unintended effects.
Try running `echo ha` (with the backticks - same as: $(echo ha)); you'll get -bash: ha: command not found, because bash tries to execute the command's output - ha - as a command, which fails.
I have a simple bash script to run a remote command on a given set of servers.
#!/bin/bash
echo "Command to be run:"
echo "$*"
read nothing
servers="server1 server2 server3"
for server in `echo $servers`
do
echo $server
ssh $server "$*"
echo ""
done
The problem is that the command could contain any number of arguments, hence the use of $* and could also have many different characters including quotes and regular expressions. The basic need here is for the shell to take the arguments, whatever they are, literally so they are passed to the remote server intact without removing quotes or interpreting parenthesis etc.
There are a number of variations I have seen but most deal with a specific character problem or overcomplicate the script or arguments required, and I'm looking to keep at least the arguments free of escape characters etc.
An example with using "#":
./cmd tw_query --no-headings "search Host where name matches '(?i)peter' show summary, nodecount(traverse :::Detail where name matches 'bob')"
Gives:
Command to be run:
tw_query --no-headings search Host where name matches '(?i)peter' show summary, nodecount(traverse :::Detail where name matches 'bob')
You seem to be looking for $@. Say:
ssh $server "$@"
instead. From the manual:
*
Expands to the positional parameters, starting from one. When the expansion occurs within double quotes, it expands to a single word
with the value of each parameter separated by the first character of
the IFS special variable. That is, "$*" is equivalent to "$1c$2c…",
where c is the first character of the value of the IFS variable. If
IFS is unset, the parameters are separated by spaces. If IFS is null,
the parameters are joined without intervening separators.
@
Expands to the positional parameters, starting from one. When the expansion occurs within double quotes, each parameter expands to a
separate word. That is, "$@" is equivalent to "$1" "$2" …. If the
double-quoted expansion occurs within a word, the expansion of the
first parameter is joined with the beginning part of the original
word, and the expansion of the last parameter is joined with the last
part of the original word. When there are no positional parameters,
"$@" and $@ expand to nothing (i.e., they are removed).
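The difference is easy to see with a quoting-sensitive printf (a minimal sketch):

```shell
set -- "one arg" two   # set the positional parameters

printf '<%s>' "$*"; echo   # <one arg two>  -- one word, joined by IFS
printf '<%s>' "$@"; echo   # <one arg><two> -- one word per parameter
```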
You actually don't want the arguments passed to the remote server intact, you want them passed to the remote command intact. But that means they need to be wrapped in an extra layer of quotes/escapes/etc so that so that they will come out intact after the remote shell has parsed them.
bash actually has a feature in its printf builtin to add quoting/escaping to a string, but it quotes suitably for interpretation by bash itself -- if the remote shell were something else, it might not understand the quoting mode that it chooses. So in this case I'd recommend a simple-and-dumb quoting style: just add single-quotes around each argument, and replace each single-quote within the argument with '\'' (that'll end the current quoted string, add an escaped (literal) quote, then start another quoted string). It'll look a bit weird, but should decode properly under any POSIX-compliant shell.
Converting to this format is a bit tricky, since bash does inconsistent things with quotes in its search-and-replace patterns. Here's what I came up with:
#!/bin/bash
quotedquote="'\''"
printf -v quotedcommand "'%s' " "${@//\'/$quotedquote}"
echo "Command to be run:"
echo "$quotedcommand"
read nothing
servers="server1 server2 server3"
for server in $servers
do
echo $server
ssh $server "$quotedcommand"
echo ""
done
And here's how it quotes your example command:
'tw_query' '--no-headings' 'search Host where name matches '\''(?i)peter'\'' show summary, nodecount(traverse :::Detail where name matches '\''bob'\'')'
It looks strange to have the command itself quoted, but as long as you aren't trying to use an alias this doesn't cause any actual trouble. There is one significant limitation, though: there's no way to pass shell metacharacters (like > for output redirection) to the remote shell:
./cmd somecommand >outfile # redirect is done on local computer
./cmd somecommand '>outfile' # ">outfile" is passed to somecommand as an argument
If you need to do things like remote redirects, things get a good deal more complicated.
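That the quoting really round-trips under a second shell can be checked like this, with printf standing in for the remote command:

```shell
quotedquote="'\''"
set -- tw_query "name matches '(?i)peter'"   # sample arguments

printf -v quoted "'%s' " "${@//\'/$quotedquote}"

# A second shell (standing in for the remote side) re-parses the string
# and recovers the original arguments, one per line:
bash -c "printf '%s\n' $quoted"
```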
Besides the issue with $* versus $@, if this is for use in a production environment, you might want to consider a tool such as pdsh.
Otherwise, you can try feeding the commands to your script through stdin rather than passing them as arguments, so you avoid one level of parsing.
#!/bin/bash
read cmd
echo "Command to be run:"
echo "$cmd"
servers="server1 server2 server3"
for server in `echo $servers`; do
echo $server
ssh $server "$cmd"
echo ""
done
and use it like this
$ ./cmd <<'EOT'
> tw_query --no-headings "search Host where name matches '(?i)peter' show summary, nodecount(traverse :::Detail where name matches 'bob')"
> EOT
Command to be run:
tw_query --no-headings "search Host where name matches '(?i)peter' show summary, nodecount(traverse :::Detail where name matches 'bob')"
Maybe a little far-fetched, but it could work.