Bash arguments literal interpretation

I have a simple bash script to run a remote command on a given set of servers.
#!/bin/bash
echo "Command to be run:"
echo "$*"
read nothing
servers="server1 server2 server3"
for server in `echo $servers`
do
echo $server
ssh $server "$*"
echo ""
done
The problem is that the command could contain any number of arguments (hence the use of $*) and could also contain many different characters, including quotes and regular expressions. The basic need here is for the shell to take the arguments, whatever they are, literally, so they are passed to the remote server intact without removing quotes or interpreting parentheses etc.
There are a number of variations I have seen but most deal with a specific character problem or overcomplicate the script or arguments required, and I'm looking to keep at least the arguments free of escape characters etc.
An example with using "#":
./cmd tw_query --no-headings "search Host where name matches '(?i)peter' show summary, nodecount(traverse :::Detail where name matches 'bob')"
Gives:
Command to be run:
tw_query --no-headings search Host where name matches '(?i)peter' show summary, nodecount(traverse :::Detail where name matches 'bob')

You seem to be looking for $@. Say:
ssh $server "$@"
instead. From the manual:
*
Expands to the positional parameters, starting from one. When the expansion occurs within double quotes, it expands to a single word
with the value of each parameter separated by the first character of
the IFS special variable. That is, "$*" is equivalent to "$1c$2c…",
where c is the first character of the value of the IFS variable. If
IFS is unset, the parameters are separated by spaces. If IFS is null,
the parameters are joined without intervening separators.
@
Expands to the positional parameters, starting from one. When the expansion occurs within double quotes, each parameter expands to a
separate word. That is, "$@" is equivalent to "$1" "$2" …. If the
double-quoted expansion occurs within a word, the expansion of the
first parameter is joined with the beginning part of the original
word, and the expansion of the last parameter is joined with the last
part of the original word. When there are no positional parameters,
"$@" and $@ expand to nothing (i.e., they are removed).

You actually don't want the arguments passed to the remote server intact, you want them passed to the remote command intact. But that means they need to be wrapped in an extra layer of quotes/escapes/etc so that they will come out intact after the remote shell has parsed them.
bash actually has a feature in its printf builtin to add quoting/escaping to a string, but it quotes suitably for interpretation by bash itself -- if the remote shell were something else, it might not understand the quoting mode that it chooses. So in this case I'd recommend a simple-and-dumb quoting style: just add single-quotes around each argument, and replace each single-quote within the argument with '\'' (that'll end the current quoted string, add an escaped (literal) quote, then start another quoted string). It'll look a bit weird, but should decode properly under any POSIX-compliant shell.
Converting to this format is a bit tricky, since bash does inconsistent things with quotes in its search-and-replace patterns. Here's what I came up with:
#!/bin/bash
quotedquote="'\''"
printf -v quotedcommand "'%s' " "${@//\'/$quotedquote}"
echo "Command to be run:"
echo "$quotedcommand"
read nothing
servers="server1 server2 server3"
for server in $servers
do
echo $server
ssh $server "$quotedcommand"
echo ""
done
And here's how it quotes your example command:
'tw_query' '--no-headings' 'search Host where name matches '\''(?i)peter'\'' show summary, nodecount(traverse :::Detail where name matches '\''bob'\'')'
It looks strange to have the command itself quoted, but as long as you aren't trying to use an alias this doesn't cause any actual trouble. There is one significant limitation, though: there's no way to pass shell metacharacters (like > for output redirection) to the remote shell:
./cmd somecommand >outfile # redirect is done on local computer
./cmd somecommand '>outfile' # ">outfile" is passed to somecommand as an argument
If you need to do things like remote redirects, things get a good deal more complicated.
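For illustration (server1 and the paths are placeholders, not from the question), the difference is where the redirect is parsed:
ssh server1 ls /tmp > outfile     # the > is consumed by the local shell; outfile is created locally
ssh server1 'ls /tmp > outfile'   # the > is inside the quoted command; outfile is created on server1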

Besides the issue with $* versus $@, if this is for use in a production environment, you might want to consider a tool such as pdsh.
Otherwise, you can try feeding the command to your script through stdin rather than putting it in the arguments, so you avoid one level of parsing.
#!/bin/bash
read cmd
echo "Command to be run:"
echo "$cmd"
servers="server1 server2 server3"
for server in `echo $servers`
do
echo $server
ssh $server "$cmd"
echo ""
done
and use it like this
$ ./cmd <<'EOT'
> tw_query --no-headings "search Host where name matches '(?i)peter' show summary, nodecount(traverse :::Detail where name matches 'bob')"
> EOT
Command to be run:
tw_query --no-headings "search Host where name matches '(?i)peter' show summary, nodecount(traverse :::Detail where name matches 'bob')"
Maybe a little far-fetched, but it could work.

Related

How do I redirect output of echo command in bash script to script's directory?

Fairly simple, I have an echo statement in my shell (bash) script that I want to redirect to an external file as a means of logging. The script is run by the root user's crontab every hour (via sudo crontab -e).
I want this log file to reside in the same directory as the script.
My first attempt was,
echo "$(date) | blah blah" >> "$(pwd)/script.log"
However, this clearly does not work as the working directory of the root crontab (/root/) is not the same as the directory of the script. So following some advice on SO I did instead,
script_dir=$(dirname $0)
echo "$(date) | blah blah" >> "$(script_dir)/script.log"
This time, for reasons I do not yet understand, the log file is saved under /, as in /script.log.
Logically one would assume that the variable script_dir was evaluated to an empty string and so "$(script_dir)/script.log" was evaluated as "/script.log".
But as a test, I wrote a simple test script,
echo "$(dirname $0)"
and ran it from /. Sure enough, I get a proper non-empty output: /home/pi/scripts/testscripts/dirname.sh (where the test script I wrote resides).
So what's the deal, and how do I get this to work?
p.s. bash --version says I am currently running GNU bash, version 4.3.30(1)-release
You need curly braces (not parentheses) within the double quotes to expand the variable in bash, something like:
echo "$(date) | blah blah" >> "${script_dir}/script.log"
Shell Parameter Expansion
The ‘$’ character introduces parameter expansion, command substitution, or arithmetic expansion.
The basic form of parameter expansion is ${parameter}. The value of parameter is substituted. The braces are required when parameter is a positional parameter with more than one digit, or when parameter is followed by a character that is not to be interpreted as part of its name.
More on Parameter expansion
${PARAMETER}
The easiest form is to just use a parameter's name within braces. This is identical to using $FOO like you see it everywhere, but has the advantage that it can be immediately followed by characters that would be interpreted as part of the parameter name otherwise. Compare these two expressions (WORD="car" for example), where we want to print a word with a trailing "s":
echo "The plural of $WORD is most likely $WORDs"
echo "The plural of $WORD is most likely ${WORD}s"
Why does the first one fail? It prints nothing, because a parameter (variable) named "WORDs" is undefined and thus printed as "" (nothing). Without using braces for parameter expansion, Bash will interpret the sequence of all valid characters from the introducing "$" up to the last valid character as name of the parameter. When using braces you just force Bash to only interpret the name inside your braces.
This line has an error:
echo "$(date) | blah blah" >> "$(script_dir)/script.log"
It should be:
echo "$(date) | blah blah" >> "$script_dir/script.log"
The "$(script_dir)" syntax tries to execute a command named script_dir and capture its output to use as a value inside the string. What you need is a simple variable expansion, $script_dir, which simply extracts the value of a variable named script_dir.

Escaping a literal asterisk as part of a command

Sample bash script
QRY="select * from mysql"
CMD="mysql -e \"$QRY\""
`$CMD`
I get errors because the * is getting evaluated as a glob (enumerating files in my CWD).
I have seen other posts that talk about quoting the "$CMD" reference for purposes of echo output, but in this case
"$CMD"
makes bash complain, treating the whole literal string as a single command name.
If I
echo "$CMD"
and then copy/paste it to the command line, things seem to work.
You can just use:
qry='select * from db'
mysql -e "$qry"
This will not be subject to * expansion by the shell.
If you want to store the mysql command line as well, then use bash arrays:
cmd=(mysql -e "$qry")
"${cmd[#]}"
Note: anubhava's answer has the right solution.
This answer provides background information.
As for why your approach didn't work:
"$CMD" doesn't work, because bash sees the entire value as a single token that it interprets as a command name, which obviously fails.
`$CMD`
i.e., enclosing $CMD in backticks, is pointless in this case (and will have unintended side effects if the command produces stdout output[1]); using just:
$CMD
yields the same - broken - result (only more efficiently - by enclosing in backticks, you needlessly create a subshell; use backticks - or, better, $(...) only when embedding one command in another - see command substitution).
$CMD doesn't work,
because unquoted use of * subjects it to pathname expansion (globbing) - among other shell expansions.
\-escaping glob chars. in the string causes the \ to be preserved when the string is executed.
While it may seem that you've enclosed the * in double quotes by placing it (indirectly) between escaped double quotes (\"$QRY\") inside a double-quoted string, the shell does not see what's between these escaped double quotes as a single, double-quoted string.
Instead, these double quotes become literal parts of the tokens they abut, and the shell still performs word splitting (parsing into separate arguments by whitespace) on the string, and expansions such as globbing on the resulting tokens.
If we assume for a moment that globbing is turned off (via set -f), here is the breakdown of the arguments passed to mysql when the shell evaluates (unquoted) $CMD:
-e # $1 - all remaining arguments are the unintentionally split SQL command.
"select # $2 - note that " has become a literal part of the argument
* # $3
from # $4
mysql" # $5 - note that " has become a literal part of the argument
The only way to get your solution to work with the existing, single string variable is to use eval as follows:
eval "$CMD"
That way, the embedded escaped double-quoted string is properly parsed as a single, double-quoted string (to which no globbing is applied), which (after quote removal) is passed as a single argument to mysql.
However, eval is generally to be avoided due to its security implications (if you don't (fully) control the string's content, arbitrary commands could be executed).
Again, refer to anubhava's answer for the proper solution.
[1] A note re using `$CMD` as a command by itself:
It causes bash to execute stdout output from $CMD as another command, which is rarely the intent, and will typically result in a broken command or, worse, a command with unintended effects.
Try running `echo ha` (with the backticks - same as: $(echo ha)); you'll get -bash: ha: command not found, because bash tries to execute the command's output - ha - as a command, which fails.
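For contrast, command substitution is meant for embedding one command's output inside another command; a small example (the file name and query are just illustrations):
outfile="result-$(date +%F).txt"            # date's output becomes part of the file name
mysql -e 'select * from mysql.user' > "$outfile"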

Jenkins text parameter - special characters garbled (unwanted variable substitution)

I have a job in Jenkins (under Linux) with a build parameter of type "Text". I use the parameter to form the contents of a file which is used in the build process, with a bash "shell execute" step like echo "$TEXTPARAM" > file.
It works quite well for general text. But when characters like "$" appear, it behaves strangely.
E. g. text
Some $one and $$two and $$$more bucks and $ bucks $$ that $$$ stand alone and$ after$$ words$$$
is transformed into
Some $one and $two and $-sl bucks and $ bucks $ that $$ stand alone and$ after$ words$$
though I want the text to appear in file just as it appears in input textbox.
Is it a bug in jenkins (so I should post an issue to their tracker) or am I doing something wrong?
UPDATE
I suppose that is due to variable substitution done by Jenkins, i.e. all $VARNAMEs are substituted with VARNAME values before any "shell execute" steps are executed. And this substitution cannot be turned off for now.
According to a comment in this ticket https://issues.jenkins-ci.org/browse/JENKINS-16143
This appears to not be a bug. Compare the parameter values $JENKINS_URL and $$JENKINS_URL.
Jenkins internally resolves placeholders in variables,
and dollars are used for that. $$ is an escaped $.
I am observing the same behavior for string and text fields on Jenkins ver. 1.562
Expansion with jenkins-cli.jar
It's true, Jenkins expands build variable names like JENKINS_URL or BUILD_NUMBER when they are prefixed by $. However, there are additional transformations—rather unexpected ones—if you use jenkins-cli.jar.
carriage return becomes \r (backslash plus "r")
new line becomes \n (backslash plus "n")
tab becomes \t (backslash plus "t")
Here is the corresponding part of the source code of jenkins-cli.jar.
I do not know any way of escaping or quoting to keep a white space character that is part of the value of a parameter for a Jenkins job when using jenkins-cli.jar
Expansion when using "raw" ssh (without jenkins-cli.jar)
The Jenkins master handles white space, backslashes and quotes on the command line somewhat like a shell:
a remains a
'a' becomes a
"a b" becomes a b
a b is an error, because the command line parser of Jenkins will see b
a" " becomes a ("a" plus space)
My idea was to re-implement the code that does the quoting in jenkins-cli.jar (minus the bugs when handling tab characters and the like). So here's my recipe:
For each argument, escape each backslash and
each single quote with a backslash.
Then surround it with single quotes.
Example: Instead of a'"b, send 'a\'"b'.
This has proven to protect white space and quotes. Instead of single quotes, you can also use double quotes.
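A minimal bash sketch of that recipe (quote_for_cli is my own name; the replacement strings are kept in variables to sidestep bash's quirky quoting inside ${...//...}):
quote_for_cli() {
  local arg=$1
  local esc_bs='\\'        # literal two-character string: backslash backslash
  local esc_sq="\\'"       # literal two-character string: backslash single-quote
  arg=${arg//\\/$esc_bs}   # escape each backslash first
  arg=${arg//\'/$esc_sq}   # then escape each single quote
  printf "'%s'\n" "$arg"   # finally surround the argument with single quotes
}
quote_for_cli "a'\"b"      # prints 'a\'"b', as in the example above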
Pipeline for Testing
This is how I tested: I created the pipeline test-quoting with the string parameter "PARAM" and the following script:
import groovy.json.JsonOutput
node {
println JsonOutput.toJson(env.PARAM)
}
Then I started the pipeline (in bash), adding an additional layer of quoting that the local shell will remove:
# for testing 'a b'
$ ssh -x -p 50022 <jenkins-server> -l <user-name> build -s -v test-quoting -p PARAM="'a b'"
# for testing the behaviour of the cli with tab character
$ java -jar jenkins-cli.jar -ssh -s <jenkins-url> -user <user-name> build -s -v test-quoting -p PARAM="$(printf '\t')"
If you are unsure about what your local shell really passes to ssh (or any other command), prefix the command with strace -e execve.
This has nothing to do with Jenkins.
Write a bash script with echo "some $$$more" > file and execute that on a linux command prompt, and you will get the same gibberish.
Special characters, like $, must be escaped, since this is a Linux environment and $ means a variable. There are several ways to do it.
Option 1.
Use per-character escape, i.e. for every $ that you want to appear literally, use \$ instead. So it becomes:
echo "some \$\$\$more" > file
Option 2.
Use strong-quoting, i.e. single quotes '. Nothing within single quotes has any special meaning, except for a second single quote to close the string:
echo 'some $$$more' > file
Of course with this method, you have to make sure your $TEXTPARAM string does not have any single quotes of its own.
In either case, you will have to sanitize your input. Before you output it to the file, either replace all $ in $TEXTPARAM with \$ and use Option 1, or remove all single quotes ' from $TEXTPARAM and use Option 2.
Edit
In your case, I think you just want:
echo $TEXTPARAM > file, without any extra quotes:
root# ~ $ cat test.sh
#!/bin/bash
TEXTPARAM='Some $one and $$two and $$$more bucks and $ bucks $$ that $$$ stand alone and$ after$$ words$$$'
echo $TEXTPARAM
echo $TEXTPARAM > file
cat file
root# ~ $ ./test.sh
Some $one and $$two and $$$more bucks and $ bucks $$ that $$$ stand alone and$ after$$ words$$$
Some $one and $$two and $$$more bucks and $ bucks $$ that $$$ stand alone and$ after$$ words$$$

Bash escaping issue with $@

I've written a script to simplify running a long launch command:
# in ~/.bash_profile
function runProgram() { sbt "run-main com.longpackagename.mainclass $@ arg3"; };
export -f runProgram;
However, it fails when I try to pass multiple arguments:
$ runProgram arg1 arg2
...
[info] Running com.longpackagename.mainclass arg1
What happened to arg2 and arg3? Were they eaten by bash or by sbt?
The script works as expected if I run it like this:
$ runProgram "arg1 arg2"
--
Additionally: this type of issue happens all the time for me. I would also appreciate a reference on how to escape properly in bash. The first & second resources that I tried didn't address this situation.
The best reference for bash, including how quoting works, is the bash manual itself, which is almost certainly installed on your machine where you can read it without an internet connection by typing man bash. It's a lot to read, but there's no real substitute.
Nonetheless, I will try to explain this particular issue. There are two important things to know: first, how (and when) bash splits a command line into separate "words" (or command line arguments); second, what $@ and $* mean. These are not entirely unrelated.
Word-splitting is partially controlled by the special parameter IFS, but I just mention that; I'm assuming it hasn't been altered. For more details, see man bash.
Below, I call quoting a string with double-quotes ("...") weak quoting, and quoting with apostrophes ('...') strong quoting. The backslash (\) is also a form of strong quoting.
Word-splitting happens:
after parameters (shell variables) have been substituted with their values,
wherever there is a sequence of whitespace characters,
except if the whitespace is quoted in any way (" ", ' ', \ are three ways),
before quotes are removed.
Once a command has been split into words, the first word is used to find the program or function to invoke, and the remaining words become the program's arguments. (I'm ignoring lots of stuff like shell metacharacters, redirections, pipes, etc., etc. For more details, see man bash.)
Parameters are substituted with their values (step 1) if their name is preceded by a $ unless the $name is strongly quoted (that is, '$name' or, for example, \$name). There are many more forms of parameter substitution. For more details, see man bash.
Now, $@ and $* both mean "all of the positional parameters to the current command/function", and if they are used without quotes, they do precisely the same thing. They are replaced by all of the positional parameters, with a single space between each parameter. Since this is a type of parameter substitution (as above), word-splitting happens after the substitution except if the substitution is in quotes, as in the above list.
If the substitution is in quotes, then according to the above rules, the whitespace which was inserted between the parameters is not subject to word-splitting. And that's precisely how $* works. $* is replaced by the space-separated command-line parameters and the result is word-split; "$*" is replaced by the space-separated command-line parameters as a single word.
"$#" is an exception. And, in fact, this is why $# exists at all. If the $# is inside weak quotes ("$#"), then the quotes are removed, and each positional parameter is individually quoted. These quoted positional parameters are then spaced-separated and substituted for the $#. Since the $# is no longer quoted itself, the inserted spaces do cause word-splitting. The final result is that the individual parameters are retained as individual words.
In case that was not totally clear, here's an example. printf has the virtue of repeating the provided format until it runs out of parameters, which makes it easy to see what's going on.
showargs() {
echo -n '$*: '; printf "<%s> " $*; echo
echo -n '"$*": '; printf "<%s> " "$*"; echo
echo -n '"$#": '; printf "<%s> " "$#"; echo
}
showargs one two three
showargs "one two" three
(Try to figure out what that prints before you execute it.)
It's often said that you almost always want "$@" and almost never unquoted $@ or $*. That's generally true, but it's also the case that you almost never want "something with $@ inside of it". To understand that, you need to know what "something with $@ inside of it" does. It's a bit weird, but it shouldn't be unexpected. We'll take the invocation of sbt from the OP as an example:
sbt "run-main com.longpackagename.mainclass $# arg3"
with two positional parameters supplied to the function, so that $1 is arg1 and $2 is arg2.
First, bash removes the quotes around $@. However, it can't just remove them altogether, since there is also quoted text there. So it has to close off the quoted text and reopen the quotes afterwards, producing:
sbt "run-main com.longpackagename.mainclass "$#" arg3"
Now, it can substitute in the quoted, space-separated arguments:
sbt "run-main com.longpackagename.mainclass ""arg1" "arg2"" arg3"
This is now word-split:
sbt
"run-main com.longpackagename.mainclass ""arg1"
"arg2"" arg3"
and the quotes are removed:
sbt
run-main com.longpackagename.mainclass arg1
arg2 arg3
sbt is expecting only one positional parameter. You gave it two, and it ignored the second one.
Now, suppose the function were called with a single argument, "arg1 arg2". In that case, the substitution of $@ results in:
sbt "run-main com.longpackagename.mainclass ""arg1 arg2"" arg3"
and word-splitting produces
sbt
"run-main com.longpackagename.mainclass ""arg1 arg2"" arg3"
without quotes:
sbt
run-main com.longpackagename.mainclass arg1 arg2 arg3
and there is only one positional parameter for sbt.
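The answer stops at the diagnosis. One way to make the original function behave (my suggestion, not part of the answer) is to let "$*" join the function's arguments inside the single string that sbt expects:
# in ~/.bash_profile
function runProgram() { sbt "run-main com.longpackagename.mainclass $* arg3"; }
export -f runProgram
# runProgram arg1 arg2  now passes a single argument to sbt:
#   run-main com.longpackagename.mainclass arg1 arg2 arg3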

Bash: Passing a variable into a script that has spaces

I currently have a bash script. It looks like this:
#!/bin/bash
case "$1" in
sendcommand)
do X with $2
exit
;;
esac
How would I send all of "this command with spaces" into $2, without "command" acting as $3, "with" as $4 and so on? Is there something like PHP or Javascript's encodeURI for bash?
You also have to call your script with the second argument in quotes:
./scriptname sendcommand "command with spaces"
Your script should look like this
#!/bin/bash
case "$1" in
sendcommand)
something "$2"
exit
;;
esac
You can just use double quotes:
do X with "$2"
Enclose it in double quotes:
do_X_with "$2"
The double quotes preserve the internal spacing in the variable, including newlines if they are in the value in the first place. Indeed, it is important to understand the uses of double quotes with "$@" and "$*" too, not to mention when using bash arrays.
You can't easily have a command called do because the shell uses do as a keyword in its loop structure. To invoke it, you would have to specify a path to the command, such as ./do or $HOME/bin/do.
But $2 is "this" and the OP wants it to be "this command with spaces".
OK. We need to review command line invocations and desired behaviours. Let's assume that the script being executed is called script. Further, assume that the command being executed is othercommand (we can't use command; that is a standard command).
Possible invocations include:
script sendcommand "this command with spaces"
script sendcommand 'this command with spaces'
script sendcommand this command with spaces
The single-quote and double-quote invocations are equivalent in this example. They wouldn't be equivalent if there were variables to be expanded or commands to be invoked inside the argument lists.
It is possible to write script to handle all three cases:
#!/bin/bash
case "$1" in
sendcommand)
shift
othercommand "$*"
exit
;;
esac
Suppose that the invocation encloses the arguments in quotes. The shift command removes $1 from the argument list and renumbers the remaining (single) argument as $1. It then invokes othercommand with a single string argument consisting of the contents of the arguments concatenated together. If there were several arguments, the contents would be separated by a single 'space' (first character of $IFS).
Suppose that the invocation does not enclose the arguments in quotes. The shift command still removes $1 (the sendcommand) from the argument list, and "$*" then joins the remaining arguments into a single space-separated argument.
In all three cases, the othercommand sees a single argument that consists of "this command with spaces" (where the program does not see the double quotes, of course).
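To check this quickly, you can substitute a throwaway function for othercommand (showargs is my own name) and run all three invocations:
showargs() { printf '<%s>\n' "$@"; }    # print each argument on its own line

script() {                              # stand-in for the script above, with othercommand replaced
  case "$1" in
    sendcommand)
      shift
      showargs "$*"
      ;;
  esac
}

script sendcommand "this command with spaces"
script sendcommand 'this command with spaces'
script sendcommand this command with spaces
# each call prints exactly one line: <this command with spaces>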
