how to keep nested BASH parameter untokenized - bash

Consider this trite BASH script:
g1="f 2.txt"
g2="f1.txt $g1"
cp $g2
It fails because I passed three parameters to the cp command. How do I escape the $g1 call to make it pass just two on the final line (er, make it work without passing two variables on line three)? I tried putting quotes around it with no success; it then proceeds to pass the quotes as part of the parameter, which is doubly weird.
In my real scenario I have some optional parameters that themselves take parameters. I had wanted to parse them all out at the top of the script and leave their final parsed values blank if they weren't passed in.

You can do it using shell arrays:
g1="f 2.txt"
g2=("f1.txt" "$g1")
cp "${g2[#]}"

Related

Programmatically create bash command with flags for items in array

I have a list/array like so:
['path/to/folder/a', 'path/to/folder/b']
This is an example, the array can be of any length. But for each item in the array I'd like to set up the following as a single command:
$ someTool <command> --flag <item-1> --flag <item-2> ... --flag <item-N>
At the moment I am doing a loop over the array, but I am wondering whether doing them individually has a different behaviour to doing them all at once (which the tool specifies I should do).
for i in "${array[#]}"; do
someTool command --flag $i
done
Whether passing all flag arguments to a single invocation of the tool does the same thing as passing them one-at-a-time to separate invocations depends entirely on the tool and what it does. Without more information, it's impossible to say for sure, but if the instructions recommend passing them all at once, I'd go with that.
The simplest way to do this in bash is generally to create a second array with the flags and arguments as they need to be passed to the tool:
flagsArray=()
for i in "${array[#]}"; do
flagsArray+=(--flag "$i")
done
someTool command "${flagsArray[#]}"
Note: all of the above syntax -- all the quotes, braces, brackets, parentheses, etc -- matter to making this run properly and robustly. Don't leave anything out unless you know why it's there, and that leaving it out won't cause trouble.
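If you want to confirm exactly what the tool will receive, you can print the array one argument per line before invoking it (a small sketch reusing the flagsArray built above):
printf '%s\n' "${flagsArray[@]}"   # each --flag and each path appears as its own argument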
BTW, if the option (--flag) doesn't have to be passed as a separate argument (i.e. if the tool allows --flag=path/to/folder/a instead of --flag path/to/folder/a), then you can use a substitution to add the --flag= bit to each element of the array in a single step:
someTool command "${array[#]/#/--flag=}"
Explanation: the /# means "replace at the beginning (of each element)", then the empty string for the thing to replace, / to delimit that from the replacement string, and --flag= as the replacement (/addition) string.
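For example (a throwaway sketch, with printf standing in for the real tool so the word boundaries are visible):
array=("path/to/folder/a" "path/to/folder b")
printf '<%s> ' "${array[@]/#/--flag=}"; echo
# prints: <--flag=path/to/folder/a> <--flag=path/to/folder b>
Note that the space in the second element stays inside its single word, which is exactly what the quoted expansion guarantees.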

How can I adjust my bash function such that I can omit the double-quotes?

Throughout the day, I type something like this frequently:
git stash push -u -m "some phrase as a message"
I would prefer to type instead:
stpu some phrase as a message
So with help from this answer, I created a function in my ~/.bashrc:
function stpu() {
git stash push -u -m "${@}"
}
Now I'm able to type stpu "some phrase as a message", which is pretty close to what I want.
How can I adjust my function such that I can omit the double-quotes?
I've tried many different variations (adding more double-quotes that are escaped, adding single-quotes, etc) but haven't gotten it to work.
You can sometimes omit the quotes if you use "$*" instead of "$@".
This will concatenate all your arguments together into a single string, separated with spaces (by default; the first character in IFS, if it's been overridden). -m expects a single string to follow it (instead of a separate argument per word), so this is exactly what it wants.
This is not reliable, and it's better to just use the quotes.
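A sketch of what that looks like for this function (with the same caveat: globs and $(...) in the message are still expanded by the shell before stpu runs):
function stpu() {
  git stash push -u -m "$*"   # join all arguments into one message string
}
stpu some phrase as a message   # roughly equivalent to -m "some phrase as a message"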
Security
Consider as an example if you want to use the commit message: Make $(rm -rf ~) safe in an argument name for a security fix. If this string is unquoted (or double quoted), the command is executed before your function is ever started (which makes sense: a function can't be called until after its argument list is known), so there's nothing your function can do to fix it. In this context, using single quotes to prevent the command substitution from taking place is the correct and safe practice.
(To single-quote a string that contains single quotes, consider using ANSI C-like strings: $'I\'m a single-quoted string that contains a single quote')
Correctness
Or, as another example: Process only files matching *.csv -- if it's not quoted, the *.csv can be replaced with a list of CSV files that exist in the directory where you ran the command. Again, this happens before your function is ever started, so nothing inside the function can prevent it.
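You can see both effects without touching git by echoing what a function actually receives (a harmless sketch, using $(date) and a glob instead of the dangerous examples above):
show_args() { printf '<%s> ' "$@"; echo; }
show_args Ran at $(date) on *.csv     # $(date) runs, and *.csv expands to matching files, before show_args starts
show_args 'Ran at $(date) on *.csv'   # single quotes keep everything literal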

are there security issues with using eval on an environment variable in a bash script?

I have a Bash script in which I call rsync in order to perform a backup to a remote server. To specify that my Downloads folder be backed up, I'm passing "'${HOME}/Downloads'" as an argument to rsync which produces the output:
rsync -avu '/Volumes/Norman Data/Downloads' me@example.com:backup/
Running the command with the variable expanded as above (through the terminal or in the script) works fine, but because of the space in the expanded variable and the fact that the quotes (single ticks) are ignored when included in the variable being passed as part of an argument (see here), the only way I can get it not to choke on the space is to do:
stmt="rsync -avu '${HOME}/Downloads' me#examle.com:backup/"
eval ${stmt}
It seems like there would be some vulnerabilities presented by running eval on anything not 100% private to that script. Am I correct in thinking I should be doing it a different way? If so, any hints for a bash-script-beginner would be greatly appreciated.
** EDIT ** - I actually have a bit more involved use case than the example above. For the paths passed, I have an array of them, each containing spaces, that I'm then combining into one string, kind of like:
include_paths=(
"'${HOME}/dir_a'"
"'${HOME}/dir_b' --exclude=video"
)
for item in "${include_paths[#]}"
do
inc_args="${inc_args}" ${item}
done
inc_args evaluates to '/Volumes/Norman Data/me/dir_a' '/Volumes/Norman Data/me/dir_b' --exclude=video
which I then try to pass as an argument to rsync but the single ticks are read as literals and it breaks after the 1st /Volumes/Norman because of the space.
rsync -avu "${inc_args}" me#example.com:backup/
Using eval seems to read the single ticks as quotes and executes:
rsync -avu '/Volumes/Norman Data/me/dir_a' '/Volumes/Norman Data/me/dir_b' --exclude=video me@example.com:backup/
like I need it to. I can't seem to get any other way to work.
** EDIT 2 - SOLUTION **
So the 1st thing I needed to do was modify the include_paths array to:
remove single ticks from within double quoted items
move any path-specific flags (ex. --exclude) to their own items directly after the path it should apply to
I then built up an array containing the rsync command and its options, added the expanded include_paths and exclude_paths arrays and the connection string to the remote host.
And finally expanded that array, which ran my entire, properly quoted rsync command. In the end the modified array include_paths is:
include_paths=(
"${HOME}/dir_a"
"${HOME}/dir_b"
"--exclude=video"
"${HOME}/dir_c"
)
and I put everything together with:
cmd=(rsync -auvzP)
for item in "${exclude_paths[#]}"
do
cmd+=("--exclude=${item}")
done
for item in "${include_paths[#]}"
do
cmd+=("${item}")
done
cmd+=("me#example.com:backup/")
set -x
"${cmd[#]}"
Use an array for the command/options instead of a plain variable.
stmt=(rsync -avu "${HOME}/Downloads" me@example.com:backup/)
Execute it using the command builtin:
command "${stmt[@]}"
...Or I personally just put the options/arguments in an array.
options=(-avu "${HOME}/Downloads" me@example.com:backup/)
Then execute it with rsync:
rsync "${options[@]}"
If you have a newer version of bash that supports the @Q parameter expansion, then you could also quote the array.
options=(-avu "${HOME}/Downloads" me@example.com:backup/)
Check the output by applying the parameter expansion:
echo "${options[@]@Q}"
Should print
'-avu' '/Volumes/Norman Data/Downloads' 'me@example.com:backup/'
Then you can just
rsync "${options[@]@Q}"

Append bash parameters and pass forward to other script

I need to pass further original parameters and also I want to add some others. Something like this:
#!/bin/bash
params="-D FOREGROUND"
params+=" -c Include conf/dev.conf"
/usr/local/apache/bin/apachectl $params "$@"
The code above doesn't work as expected if params contains two or more parameters; it is treated as one parameter.
The code in your example should work if the following command is valid when executed at the command line written exactly like this :
/usr/local/apache/bin/apachectl -D FOREGROUND -c Include conf/dev.conf "$@"
A quick web search leads me to think that what you want is this (notice the additional double quotes) :
/usr/local/apache/bin/apachectl -D FOREGROUND -c "Include conf/dev.conf" "$#"
Here is how to achieve that simply and reliably with arrays, in a way that sidesteps "quoting inside quotes" issues:
#!/bin/bash
declare -a params=()
params+=(-D FOREGROUND)
params+=(-c "Include conf/dev.conf")
/usr/local/apache/bin/apachectl "${params[@]}" "$@"
The params array contains 4 strings ("-D", "FOREGROUND", "-c" and "Include conf/dev.conf"). The array expansion ("${params[@]}"; note that the double quotes are important here) expands them to these 4 strings, as if you had written them with double quotes around them (i.e. without further word splitting).
Using arrays with this kind of expansion is a flexible and reliable way to build commands and then execute them with a simple expansion.
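To convince yourself of what the expansion produces, you can print the array with %q, which re-quotes each word so the boundaries are visible (a small sketch; the exact quoting style of %q varies slightly between bash versions):
printf '%q ' "${params[@]}"; echo
# -D FOREGROUND -c Include\ conf/dev.conf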
If the issue is the space in the parameter "-c Include conf/dev.conf" then you could just use a backslash to preserve the space character:
params+="-c Include\ conf/dev.conf"

Bash variable character replacement ends up to an empty string or a command not valid

I am working on a shell script to retrieve variable content from a JSON file via JQ. The JSON file is in string format (no matter whether this is a real string or a number) and to retrieve the variable in my bash script I did something like this
my_domain=$(cat /vagrant/data_bags/config.json | jq ."app"[0]."domain")
The above code, once echoed, results in "mydomain" with a beginning and a trailing quote sign. I thought this was normal behaviour of the echo command. However, while concatenating my variable with another shell command, the system raises an error. For instance, the following command
cp /vagrant/public_html/index.php "/var/www/"+$my_domain+"/index.php"
fails with the following error
cp: cannot create regular file `/var/www/+"mydomain"+/index.php': No such file or directory
At this stage, I wasn't able to identify whether it's me doing the wrong concatenation with the plus sign, or whether the variable effectively includes the quotes, which in any case would end up generating an error.
I have tried to replace the quotes in my variable, but I ended up with the system raising a "Command not found" error.
Can somebody suggest what am I doing wrong?
+ is not used for string concatenation in bash (or perl, or php). Just:
cp /vagrant/public_html/index.php "/var/www/$my_domain/index.php"
Embedding a variable inside a double-quoted text string is known as interpolation, and is one of the reasons why we need the $ prefix, to indicate that this is a variable. Interpolation is specifically not done inside single quoted strings.
Braces ${my_domain} are not required because the / directory separators are not valid characters in a variable name, so there is no ambiguity.
For example:
var='thing'
echo "Give me your ${var}s" # Correct, appends an 's' after 'thing'
echo "Give me your $vars" # incorrect, looks for a variable called vars.
If a variable (like 'vars') does not exist then (by default) bash will not complain, it will just give an empty string. Braces (curly brackets) are required more in the C shell (csh or tcsh) because of additional syntax for modifying variables, which involves special trailing characters.
You don't need to use + to concatenate strings in bash; change your command to:
cp /vagrant/public_html/index.php "/var/www/"${my_domain}"/index.php"
My problem was not related only to the wrong concatenation, but also to jq, which after parsing the value from the JSON file was returning the text between quotes.
To stop jq doing this, just add the --raw-output (or -r) option when calling jq.
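A sketch of the corrected retrieval, assuming the same JSON layout as above (jq can read the file directly, so the cat is optional):
my_domain=$(jq -r '.app[0].domain' /vagrant/data_bags/config.json)
cp /vagrant/public_html/index.php "/var/www/${my_domain}/index.php"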

Resources