Passing escaped quotes as a parameter in a shell script - bash

I'm writing a shell script with user input like so:
echo -n "Enter string for Read Group parameter: "
read readgroup
I am passing this variable to another application that requires this parameter to be a string that starts with # and is enclosed in quotes on the command line.
This is the correct command-line syntax:
application -R '#text'
application -R "#text"
These commands would produce errors if run on command line:
application -R #text
application -R "text"
application -R 'text'
In my script, I have tried putting escaped quotes in the passed variable:
echo $readgroup
'#text'
application -R "$readgroup" # error: does not start with #
Or passing it without the quotes:
echo $readgroup
#text
application -R "$readgroup" # error: no parameter entry recognized
application -R \""$readgroup"\" # error: does not start with #
application -R "'"'$readgroup'"'" # error: does not start with #
And tried other solutions from stackoverflow like arrays:
readgroup=("#text")
application -R "${readgroup[#]}" # error: does not start with #
The problem is that when I include the quotes in the variable itself, or escape them in the application call, the application receives them as literal characters (so the value begins with a quote and is invalid, since it must begin with #). But when I don't include the quotes, the call returns an error as if no quotes had been supplied.
Is there a way to force the shell to interpret the escaped quotes (in the script) as actual command-line quotes, or to add those quotes explicitly in the application call within the script?
Or is a different approach considered better, such as using arguments for the script instead?
Thanks.

The application environment is not specified here; can you elaborate on your application and its environment?
Also, did you try application -R "#$readgroup"?

I think eval "application -R '$readgroup'" would do literally the same as you wrote in correct command line sytax, since it gets the whole command as a string.
On the other hand there might be something tricky going on. I think inside application it can't be decided if $readgroup was specified with or without quotes. At last the main() will only get unquoted char *argv[]!
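The argv point is easy to demonstrate: quotes are shell syntax, consumed before the program runs, so the two "correct" invocations above hand the application exactly the same argument. A quick check, with printf standing in for the application:
readgroup='#text'
printf '[%s]\n' -R '#text'        # [-R] then [#text]
printf '[%s]\n' -R "$readgroup"   # identical arguments; the program cannot tell the quoting styles apart
printf '[%s]\n' -R #text          # only [-R]: the unquoted # starts a shell comment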

Related

Bash command works when I run it myself but fails in the script

My company has a tool that dynamically generates commands to run based on an input JSON. It works very well when all arguments to the compiled command are single words, but it fails when we attempt multi-word args. Here is a minimal example of how it fails.
# Print and execute the command.
print_and_run() {
  local command=("$@")
  if [[ ${command[0]} == "time" ]]; then
    echo "Your command: time ${command[@]:1}"
    time ${command[@]:1}
  fi
}
# How print_and_run is called in the script
print_and_run time docker run our-container:latest $generated_flags
# Output
Your command: time docker run our-container:latest subcommand --arg1=val1 --arg2="val2 val3"
Usage: our-program [OPTIONS] COMMAND1 [ARGS]... [COMMAND2 [ARGS]...]...
Try 'our-program --help' for help.
Error: No such command 'val3"'.
But if I copy the printed command and run it myself, it works fine (I've omitted the docker flags). Shelling into the container and running the program directly with these arguments works as well, so the parsing logic there is solid (it's a Python program that uses click to parse the args).
Now, I have a working solution that uses eval, but my entire team jumped down my throat at that suggestion. I've also proposed a solution using delineating characters for multi-word arguments, but that was shot down as well.
No other solutions proposed by other engineers have worked either. So can I ask someone to perhaps explain why val3 is being treated as a separate command, or to help me find a solution to get bash to properly evaluate the dynamically determined command without using eval?
Your command after expanding $generated_flags is:
print_and_run time docker run our-container:latest subcommand --arg1=val1 --arg2="val2 val3"
Your specific problem is that in --arg2="val2 val3" the quotes are literal, not syntactical, because quotes are processed before variables are expanded. This means --arg2="val2 and val3" are being split into two separate arguments. Then, I assume, docker is trying to interpret val3" as some kind of docker command because it's not part of any argument, and it's throwing out an error because it doesn't know what that means.
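The splitting is easy to reproduce in isolation, with printf standing in for docker:
generated_flags='subcommand --arg1=val1 --arg2="val2 val3"'
printf '[%s]\n' $generated_flags
# [subcommand]
# [--arg1=val1]
# [--arg2="val2]
# [val3"]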
Normally you'd fix this via an array to properly maintain the string boundary.
generated_flags=( "subcommand" "--arg1=val1" "--arg2=val2 val3" )
print_and_run time docker run our-container:latest "${generated_flags[@]}"
This will maintain --arg2=val2 val3 as a single argument as it gets passed into print_and_run, then you just have to expand your command array correctly inside the function (make sure to quote the expansion).
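For completeness, a corrected version of the function under that approach might look like this; the quoted "${command[@]:1}" on the time line is the essential change:
print_and_run() {
  local command=("$@")
  if [[ ${command[0]} == "time" ]]; then
    echo "Your command: time ${command[*]:1}"
    time "${command[@]:1}"
  fi
}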
The question is:
why val3 is being treated as a separate command
Unquoted variable expansions undergo word splitting and filename expansion. Word splitting splits the result of the expansion on spaces, tabs, and newlines, breaking it into separate "words".
a="something else"
$a # results in two "words"; 'something' and 'else'
It is irrelevant what you put inside the variable value or how many quotes or escape sequences it contains. Every run of consecutive whitespace splits it into words. Quotes " ' and escapes \ are parsed when they are part of the input line, not when they are part of the result of an unquoted expansion.
help me find a solution to
Write a parser that actually parses the command and splits it according to the rules you want to use, then execute the resulting words. For example, a very crude parser of this kind is included in xargs:
$ echo " 'quotes quotes' not quotes" | xargs printf "'%s'\n"
'quotes quotes'
'not'
'quotes'
For example, Python has shlex.split, which you can just use, and at the same time you'd be introducing Python, which is waaaaay easier to manage than badly written Bash scripts.
tool that dynamically generates commands to run based on an input json
Overall, the proper way forward would be to upgrade the tool to generate a JSON array that represents the words of the command to be executed. Then you can just execute that array of words, which is, again, trivial to do properly in Python with json and subprocess.run, but will require some gymnastics with jq, read, and Bash arrays in shell.
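If the tool can emit such an array, the shell side might look like this sketch (args.json is a hypothetical file name; mapfile -d '' needs Bash 4.4+):
# args.json: ["docker","run","our-container:latest","subcommand","--arg2=val2 val3"]
# jq -j joins output without newlines, so each element can be NUL-terminated:
mapfile -d '' -t cmd < <(jq -j '.[] + "\u0000"' args.json)
"${cmd[@]}"   # execute; every JSON element stays exactly one word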
Check your scripts with shellcheck.

How to inject a bash script with a dollar sign ($) in terraform?

I have a simple bash script that does something like the following:
#!/bin/bash
a=$(curl -s http://metadata/endpoint/internal)
echo "$a - bar"
(this is just a simplification). Note the use of the two $ signs to execute a command and resolve a variable.
Using Terraform, I want to write this file to a GCP instance during startup. Per Terraform's instructions, I'm attempting to avoid using the File Provisioner. Using the metadata_startup_script field in google_compute_instance, I am including the script so that I can write it to a particular location.
E.g.
metadata_startup_script = <<-EOF
#!/bin/bash -xe
sudo tee /etc/myservice/serv.conf > /dev/null <<EOI
${file("${path.module}/scripts/simple_bash.sh")}
EOI
EOF
Terraform is interpolating the $ in the sub-script somewhere (either when loading it into metadata_startup_script, or when writing the script out to disk).
So, depending on what I use to try to escape the interpolation, it still fails to write. For example, I have tried (in the subscript):
echo "\$a - bar"
echo "${ā€œ$ā€}a - bar"
echo "$$a - bar"
According to the terraform docs, I'm supposed to use $$, but when I do it in the above, I get:
echo "1397a - bar"
All which fail to replicate the original script.
I'm just looking for the exact bash script, as written, to be written to disk.
My goal would be to do the above without extra escape sequences (as detailed here - Escaping dollar sign in Terraform) so that i can continue to run the original script (for debugging purposes).
I would also prefer not to build a packer image with the original script in it.
Thanks!
I don't think it's Terraform eating your variable interpolations here, because Terraform only understands ${ (a dollar sign followed by a brace) as starting an interpolation, whereas your example contains only $a.
However, you do seem to be embedding one bash script inside another, so it seems plausible to me that the outer bash is resolving your $a before the inner bash gets a chance to look at it. If so, you can use the literal variant of bash heredoc syntax, as described in answers to How to cat <<EOF >> a file containing code?, so that the outer bash will take the content as literal and leave it to the inner bash to evaluate.
metadata_startup_script = <<-EOF
#!/bin/bash -xe
sudo tee /etc/myservice/serv.conf > /dev/null <<'EOI'
${file("${path.module}/scripts/simple_bash.sh")}
EOI
EOF
Notice that I wrote <<'EOI' instead of <<EOI, following the guidance from that other question in combination with the main Bash documentation on "here documents" (bold emphasis mine):
This type of redirection instructs the shell to read input from the current source until a line containing only word (with no trailing blanks) is seen. All of the lines read up to that point are then used as the standard input (or file descriptor n if n is specified) for a command.
The format of here-documents is:
[n]<<[-]word
here-document
delimiter
No parameter and variable expansion, command substitution, arithmetic expansion, or filename expansion is performed on word. If any part of word is quoted, the delimiter is the result of quote removal on word, and the lines in the here-document are not expanded. If word is unquoted, all lines of the here-document are subjected to parameter expansion, command substitution, and arithmetic expansion, the character sequence \newline is ignored, and \ must be used to quote the characters \, $, and `.
If the redirection operator is <<-, then all leading tab characters are stripped from input lines and the line containing delimiter. This allows here-documents within shell scripts to be indented in a natural fashion.
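A minimal runnable illustration of that rule:
cat <<EOI
expanded: $HOME
EOI
cat <<'EOI'
literal: $HOME
EOI
The first here-document prints your home directory; the second prints the literal text $HOME, because quoting any part of the delimiter suppresses expansion of the body.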
If your machine image is configured to run cloud-init at startup -- this is often but not always what is responsible for executing metadata_startup_script -- you may be able to achieve a similar effect without so much Bash scripting indirection by using Cloud Config YAML instead of a shell script directly.
For example, if your intent is only to write the content of that file into the designated location in the filesystem, you could potentially follow the Writing out arbitrary files example:
metadata_startup_script = <<-EOF
#cloud-config
${yamlencode({
write_files = [
{
encoding = "b64"
content = filebase64("${path.module}/scripts/simple_bash.sh")
path = "/etc/myservice/serv.conf"
owner = "root:root"
permissions = "0644"
},
]
})}
EOF
Cloud-init evaluates its modules at various points in the startup lifecycle. The Write Files module used here is specified to run once on the first boot of an instance, which matches how Cloud-init would typically treat a naked shell script too.
I do not think your issue is related to TF interpolation. I think you have problems because of normal bash interpolation, as it's bash that is going to try to resolve $ in your /etc/myservice/serv.conf while writing its content.
The regular solution is to use 'EOI', not EOI:
metadata_startup_script = <<-EOF
#!/bin/bash -xe
sudo tee /etc/myservice/serv.conf > /dev/null <<'EOI'
${file("${path.module}/scripts/simple_bash.sh")}
EOI
EOF

How to pass variables with special characters into a bash script when called from terminal

Hello all, I have a program running on a Linux OS that lets me call a bash script upon a trigger (such as a file transfer). I run something like:
/usr/bin/env bash -c "updatelog.sh '${filesize}' '${filename}'"
and the script's job is to update the log file with the file name and file size. But if I pass in a file name containing a single quote, it breaks the script with the error "unexpected EOF while looking for matching `'`".
I realize that a file name with a single quote makes the calling command invalid, since the quote interferes with the command itself. However, I don't want to sanitize the variables if I can help it, because I would like my log to show the exact file name, to make it easier to cross-reference later. Is this possible, or is sanitizing the only option here?
Thanks very much for your time and assistance.
Sanitization is absolutely not needed.
The simplest solution, assuming your script is properly executable (has +x permissions and a valid shebang line), is:
./updatelog.sh "$filesize" "$filename"
If for some reason you must use bash -c, use single quotes instead of double quotes surrounding your code, and keep your data out-of-band from that code:
bash -c 'updatelog.sh "$@"' 'updatelog' "$filesize" "$filename"
Note that only updatelog.sh "$@" is inside the -c argument and parsed as code, and that this string is in single quotes, passed through without any changes whatsoever.
Following it are your arguments $0, $1 and $2; $0 is used when printing error messages, while $1 and $2 go into the list of arguments -- aka $@ -- passed through to updatelog.sh.
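A quick way to see that the data stays out-of-band from the code, with printf standing in for updatelog.sh:
bash -c 'printf "size=%s name=%s\n" "$1" "$2"' updatelog "1024" "it's a file.txt"
# size=1024 name=it's a file.txt
The apostrophe in the file name never touches the single-quoted code string, so there is nothing to escape.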

How do I avoid calling part of my string as a command?

I run with the file with command line arguments:
samplebash.bsh fakeusername fakepassword&123
.bsh file:
echo "Beginning script..."
argUsername='$1'
argPassword='$2'
protractor indv.js --params.login.username=$argUsername --params.login.password=$argPassword
Output:
Beginning script...
123: command not found
The issue: for some reason, it interprets what follows the & symbol in the password as a command. How do I avoid this?
The problem isn't happening in your script, it's happening in your original command line. & is a command terminator, which specifies that the command before it should be executed in the background. So your command was equivalent to:
samplebash.bsh fakeusername fakepassword &
123
You need to quote the argument to prevent special characters from being interpreted by the shell.
samplebash.bsh fakeusername 'fakepassword&123'
Also, you shouldn't put single quotes around a variable the way you do in your assignments; that prevents the variable from being expanded. So it should be:
argUsername=$1
argPassword=$2
And you should put double quotes around the variables when you use them in the command, to prevent wildcards and whitespace from being interpreted.
protractor indv.js --params.login.username="$argUsername" --params.login.password="$argPassword"
As a general rule, you should always put double quotes around variables unless you know they're not needed.
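As a quick illustration, with printf standing in for any command:
arg='my file*'
printf '[%s]\n' $arg     # splits on the space; the * may also match file names
printf '[%s]\n' "$arg"   # [my file*]: exactly one argument, exactly as stored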

Calling a shell command from Applescript with quotes

This seems like it should be simple, but I'm pulling out my remaining hair trying to get it to work. In a shell script I want to run some Applescript code that defines a string, then pass that string (containing a single quote) to a shell command that calls PHP's addslashes function, to return a string with that single quote escaped properly.
Here's the code I have so far - it's returning a syntax error.
STRING=$(osascript -- - <<'EOF'
set s to "It's me"
return "['test'=>'" & (do shell script "php -r 'echo addslashes(\"" & s & "\");") & "']"
EOF)
echo -e $STRING
It's supposed to return this:
['test'=>'It\'s me']
First, when asking a question like this, please include what's happening, not just what you're trying to do. When I try this, I get:
42:99: execution error: sh: -c: line 0: unexpected EOF while looking for matchin
sh: -c: line 1: syntax error: unexpected end of file (2)
(which is actually two error messages, with one partly overwriting the other.) Is that what you're getting?
If it is, the problem is that the inner shell command you're creating has quoting issues. Take a look at the AppleScript snippet that tries to run a shell command:
do shell script "php -r 'echo addslashes(\"" & s & "\");"
Since s is set to It's me, this runs the shell command:
php -r 'echo addslashes("It's me");
Which has the problem that the apostrophe in It's me is acting as a close-quote for the string that starts 'echo .... After that, the double-quote in me"); is seen as opening a new quoted string, which doesn't get closed before the end of the "file", causing the unexpected EOF problem.
The underlying problem is that you're trying to pass a string from AppleScript to shell to php... but each of those has its own rules for parsing strings (with different ideas about how quoting and escaping work). Worse, it looks like you're doing this so you can get an escaped string (following which set of escaping rules?) to pass to something else... This way lies madness.
I'm not sure what the real goal is here, but there has to be a better way; something that doesn't involve a game of telephone with players that all speak different languages. If not, you're pretty much doomed.
BTW, there are a few other dubious shell-scripting practices in the script:
Don't use all-caps variable names in shell scripts. There are a bunch of all-caps variables that have special meanings, and if you accidentally use one of those for something else, weird results can happen.
Put double-quotes around all variable references in scripts, to avoid them getting split into multiple "words" and/or expanded as shell wildcards. For example, if the variable string was set to "['test'=>'It\'s-me']", and you happened to have files named "t" and "m" in the current directory, echo -e $string will print "m t" because those are the files that match the [] pattern.
Don't use echo with options and/or to print strings that might contain escapes, since different versions treat these things differently. Some versions, for example, will print the "-e" as part of the output string. Use printf instead. The first argument to printf is a format string that tells it how to format all of the rest of the arguments. To emulate echo -e "$string" in a more reliable form, use printf '%b\n' "$string".
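The difference is easy to check:
string='col1\tcol2'
printf '%s\n' "$string"   # prints col1\tcol2 literally
printf '%b\n' "$string"   # expands the \t, like a well-behaved echo -e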
To complement Gordon Davisson's helpful answer with a pragmatic solution:
Shell strings cannot contain \0 (NUL) characters, but the following sed command emulates all the other escaping that PHP's (oddly named) addslashes function performs (\-escaping instances of ', " and \):
string=$(osascript <<'EOF'
set s to "It's me\\you and we got 3\" of rain."
return "['test'=>'" & (do shell script "sed 's/[\"\\\\'\\'']/\\\\&/g' <<<" & quoted form of s) & "']"
EOF
)
printf '%s\n' "$string"
yields
['test'=>'It\'s me\\you and we got 3\" of rain.']
Note the use of quoted form of, which is crucial for passing a string from AppleScript to a do shell script shell command with proper quoting.
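On the plain-Bash side, the rough analog of AppleScript's quoted form of is printf %q, which re-quotes a string so that it survives one more round of shell parsing; a small sketch:
s="It's me"
printf -v quoted '%q' "$s"            # yields It\'s\ me (or equivalent quoting)
bash -c "printf '[%s]\n' $quoted"     # prints [It's me]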
Also note how the closing here-doc delimiter, EOF, is on its own line to ensure that it is properly recognized. In Bash 3.2.57, as used on macOS 10.12 (also when called as /bin/sh, which is what do shell script does), this isn't strictly necessary, but Bash 4.x would rightfully complain about the question's EOF) with: warning: here-document at line <n> delimited by end-of-file (wanted 'EOF')
