Append bash parameters and pass forward to other script - bash

I need to pass the original parameters forward and also add some others. Something like this:
#!/bin/bash
params="-D FOREGROUND"
params+=" -c Include conf/dev.conf"
/usr/local/apache/bin/apachectl $params "$@"
The code above doesn't work as expected when params contains two or more parameters; it is treated as one parameter.

The code in your example should work if the following command is valid when executed at the command line, written exactly like this:
/usr/local/apache/bin/apachectl -D FOREGROUND -c Include conf/dev.conf "$@"
A quick web search leads me to think that what you want is this (notice the additional double quotes):
/usr/local/apache/bin/apachectl -D FOREGROUND -c "Include conf/dev.conf" "$@"
Here is how to achieve that simply and reliably with arrays, in a way that sidesteps "quoting inside quotes" issues:
#!/bin/bash
declare -a params=()
params+=(-D FOREGROUND)
params+=(-c "Include conf/dev.conf")
/usr/local/apache/bin/apachectl "${params[#]}" "$#"
The params array contains 4 strings ("-D", "FOREGROUND", "-c" and "Include conf/dev.conf"). The array expansion ("${params[@]}", note that the double quotes are important here) expands them to these 4 strings, as if you had written them with double quotes around them (i.e. without further word splitting).
Using arrays with this kind of expansion is a flexible and reliable way to build commands and then execute them with a simple expansion.
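If you want to double-check what the child process will receive, a quick sketch like this prints one bracketed field per argument (the angle brackets are just visual markers, not part of the arguments):
printf '<%s> ' "${params[@]}"; echo
# prints: <-D> <FOREGROUND> <-c> <Include conf/dev.conf>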

If the issue is the space in the parameter "-c Include conf/dev.conf" then you could just use a backslash to preserve the space character:
params+="-c Include\ conf/dev.conf"

Related

Programmatically create bash command with flags for items in array

I have a list/array like so:
['path/to/folder/a', 'path/to/folder/b']
This is an example, the array can be of any length. But for each item in the array I'd like to set up the following as a single command:
$ someTool <command> --flag <item-1> --flag <item-2> ... --flag <item-N>
At the moment I am currently doing a loop over the array but I am just wondering if doing them individually has a different behaviour to doing them all at once (which the tool specifies I should do).
for i in "${array[#]}"; do
someTool command --flag $i
done
Whether passing all flag arguments to a single invocation of the tool does the same thing as passing them one-at-a-time to separate invocations depends entirely on the tool and what it does. Without more information, it's impossible to say for sure, but if the instructions recommend passing them all at once, I'd go with that.
The simplest way to do this in bash is generally to create a second array with the flags and arguments as they need to be passed to the tool:
flagsArray=()
for i in "${array[#]}"; do
flagsArray+=(--flag "$i")
done
someTool command "${flagsArray[#]}"
Note: all of the above syntax -- all the quotes, braces, brackets, parentheses, etc. -- matters to making this run properly and robustly. Don't leave anything out unless you know why it's there, and that leaving it out won't cause trouble.
BTW, if the option (--flag) doesn't have to be passed as a separate argument (i.e. if the tool allows --flag=path/to/folder/a instead of --flag path/to/folder/a), then you can use a substitution to add the --flag= bit to each element of the array in a single step:
someTool command "${array[@]/#/--flag=}"
Explanation: the /# means "replace at the beginning (of each element)", then the empty string for the thing to replace, / to delimit that from the replacement string, and --flag= as the replacement (/addition) string.
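For example, with a made-up array similar to the one in the question (the second path deliberately contains a space), you can preview the generated arguments like this:
array=('path/to/folder/a' 'path/to/folder b')
printf '%s\n' "${array[@]/#/--flag=}"
# --flag=path/to/folder/a
# --flag=path/to/folder b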

are there security issues with using eval on an environment variable in a bash script?

I have a Bash script in which I call rsync in order to perform a backup to a remote server. To specify that my Downloads folder be backed up, I'm passing "'${HOME}/Downloads'" as an argument to rsync which produces the output:
rsync -avu '/Volumes/Norman Data/Downloads' me@example.com:backup/
Running the command with the variable expanded as above (through the terminal or in the script) works fine, but because of the space in the expanded variable and the fact that the quotes (single ticks) are ignored when included in the variable being passed as part of an argument (see here), the only way I can get it not to choke on the space is to do:
stmt="rsync -avu '${HOME}/Downloads' me#examle.com:backup/"
eval ${stmt}
It seems like there would be some vulnerabilities presented by running eval on anything not 100% private to that script. Am I correct in thinking I should be doing it a different way? If so, any hints for a bash-script-beginner would be greatly appreciated.
** EDIT ** - I actually have a bit more involved use case than the example above. For the paths passed, I have an array of them, each containing spaces, that I'm then combining into one string, kind of like:
include_paths=(
"'${HOME}/dir_a'"
"'${HOME}/dir_b' --exclude=video"
)
for item in "${include_paths[#]}"
do
inc_args="${inc_args} ${item}"
done
inc_args evaluates to '/Volumes/Norman Data/me/dir_a' '/Volumes/Norman Data/me/dir_b' --exclude=video
which I then try to pass as an argument to rsync but the single ticks are read as literals and it breaks after the 1st /Volumes/Norman because of the space.
rsync -avu "${inc_args}" me#example.com:backup/
Using eval seems to read the single ticks as quotes and executes:
rsync -avu '/Volumes/Norman Data/me/dir_a' '/Volumes/Norman Data/me/dir_b' --exclude=video me@example.com:backup/
like I need it to. I can't seem to get any other way to work.
** EDIT 2 - SOLUTION **
So the 1st thing I needed to do was modify the include_paths array to:
remove single ticks from within double quoted items
move any path-specific flags (ex. --exclude) to their own items directly after the path it should apply to
I then built up an array containing the rsync command and its options, added the expanded include_paths and exclude_paths arrays and the connection string to the remote host.
And finally expanded that array, which ran my entire, properly quoted rsync command. In the end the modified array include_paths is:
include_paths=(
"${HOME}/dir_a"
"${HOME}/dir_b"
"--exclude=video"
"${HOME}/dir_c"
)
and I put everything together with:
cmd=(rsync -auvzP)
for item in "${exclude_paths[#]}"
do
cmd+=("--exclude=${item}")
done
for item in "${include_paths[#]}"
do
cmd+=("${item}")
done
cmd+=("me#example.com:backup/")
set -x
"${cmd[#]}"
Use an array for the command/options instead of a plain variable.
stmt=(rsync -avu "${HOME}/Downloads" me@example.com:backup/)
Execute it using the command builtin:
command "${stmt[@]}"
...Or I personally just put the options/arguments in an array.
options=(-avu "${HOME}/Downloads" me@example.com:backup/)
Then execute it using rsync:
rsync "${options[@]}"
If you have a newer version of bash (4.4 or later) that supports the @Q parameter expansion operator, then you can quote the array.
options=(-avu "${HOME}/Downloads" me@example.com:backup/)
Check the output by applying the parameter expansion:
echo "${options[@]@Q}"
Should print
'-avu' '/Volumes/Norman Data/Downloads' 'me@example.com:backup/'
Then you can just
rsync "${options[#]#Q}"

zip exclude subfolder passed as argument or variable [duplicate]

I want to run a command from a bash script which has single quotes and some other commands inside the single quotes and a variable.
e.g. repo forall -c '....$variable'
In this format, $ is escaped and the variable is not expanded.
I tried the following variations but they were rejected:
repo forall -c '...."$variable" '
repo forall -c " '....$variable' "
" repo forall -c '....$variable' "
repo forall -c "'" ....$variable "'"
If I substitute the value in place of the variable the command is executed just fine.
Please tell me where I am going wrong.
Inside single quotes everything is preserved literally, without exception.
That means you have to close the quotes, insert something, and then re-enter again.
'before'"$variable"'after'
'before'"'"'after'
'before'\''after'
Word concatenation is simply done by juxtaposition. As you can verify, each of the above lines is a single word to the shell. Quotes (single or double quotes, depending on the situation) don't isolate words. They are only used to disable interpretation of various special characters, like whitespace, $, ;... For a good tutorial on quoting see Mark Reed's answer. Also relevant: Which characters need to be escaped in bash?
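A quick way to convince yourself that such a concatenation is still a single word (the variable value here is made up for illustration):
variable='two words'
printf '<%s>\n' 'before '"$variable"' after'
# prints a single line: <before two words after>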
Do not concatenate strings interpreted by a shell
You should absolutely avoid building shell commands by concatenating variables. This is a bad idea similar to concatenation of SQL fragments (SQL injection!).
Usually it is possible to have placeholders in the command, and to supply the command together with variables so that the callee can receive them from the invocation arguments list.
For example, the following is very unsafe. DON'T DO THIS
script="echo \"Argument 1 is: $myvar\""
/bin/sh -c "$script"
If the contents of $myvar are untrusted, here is an exploit:
myvar='foo"; echo "you were hacked'
Instead of the above invocation, use positional arguments. The following invocation is better -- it's not exploitable:
script='echo "arg 1 is: $1"'
/bin/sh -c "$script" -- "$myvar"
Note the use of single ticks in the assignment to script, which means that it's taken literally, without variable expansion or any other form of interpretation.
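To make the contrast concrete, here is a small sketch that runs both invocations with the malicious value from above:
myvar='foo"; echo "you were hacked'
script="echo \"Argument 1 is: $myvar\""
/bin/sh -c "$script"               # the injected command runs and prints "you were hacked"
script='echo "arg 1 is: $1"'
/bin/sh -c "$script" -- "$myvar"   # prints the value verbatim: arg 1 is: foo"; echo "you were hacked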
The repo command can't care what kind of quotes it gets. If you need parameter expansion, use double quotes. If that means you wind up having to backslash a lot of stuff, use single quotes for most of it, and then break out of them and go into doubles for the part where you need the expansion to happen.
repo forall -c 'literal stuff goes here; '"stuff with $parameters here"' more literal stuff'
Explanation follows, if you're interested.
When you run a command from the shell, what that command receives as arguments is an array of null-terminated strings. Those strings may contain absolutely any non-null character.
But when the shell is building that array of strings from a command line, it interprets some characters specially; this is designed to make commands easier (indeed, possible) to type. For instance, spaces normally indicate the boundary between strings in the array; for that reason, the individual arguments are sometimes called "words". But an argument may nonetheless have spaces in it; you just need some way to tell the shell that's what you want.
You can use a backslash in front of any character (including space, or another backslash) to tell the shell to treat that character literally. But while you can do something like this:
reply=\"That\'ll\ be\ \$4.96,\ please,\"\ said\ the\ cashier
...it can get tiresome. So the shell offers an alternative: quotation marks. These come in two main varieties.
Double-quotation marks are called "grouping quotes". They prevent wildcards and aliases from being expanded, but mostly they're for including spaces in a word. Other things like parameter and command expansion (the sorts of thing signaled by a $) still happen. And of course if you want a literal double-quote inside double-quotes, you have to backslash it:
reply="\"That'll be \$4.96, please,\" said the cashier"
Single-quotation marks are more draconian. Everything between them is taken completely literally, including backslashes. There is absolutely no way to get a literal single quote inside single quotes.
Fortunately, quotation marks in the shell are not word delimiters; by themselves, they don't terminate a word. You can go in and out of quotes, including between different types of quotes, within the same word to get the desired result:
reply='"That'\''ll be $4.96, please," said the cashier'
So that's easier - a lot fewer backslashes, although the close-single-quote, backslashed-literal-single-quote, open-single-quote sequence takes some getting used to.
Modern shells have added another quoting style not specified by the POSIX standard, in which the leading single quotation mark is prefixed with a dollar sign. Strings so quoted follow similar conventions to string literals in the ANSI standard version of the C programming language, and are therefore sometimes called "ANSI strings" and the $'...' pair "ANSI quotes". Within such strings, the above advice about backslashes being taken literally no longer applies. Instead, they become special again - not only can you include a literal single quotation mark or backslash by prepending a backslash to it, but the shell also expands the ANSI C character escapes (like \n for a newline, \t for tab, and \xHH for the character with hexadecimal code HH). Otherwise, however, they behave as single-quoted strings: no parameter or command substitution takes place:
reply=$'"That\'ll be $4.96, please," said the cashier'
The important thing to note is that the single string that gets stored in the reply variable is exactly the same in all of these examples. Similarly, after the shell is done parsing a command line, there is no way for the command being run to tell exactly how each argument string was actually typed – or even if it was typed, rather than being created programmatically somehow.
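If you want to verify that, a minimal sketch that assigns each of the quoted variants above and compares them:
a="\"That'll be \$4.96, please,\" said the cashier"
b='"That'\''ll be $4.96, please," said the cashier'
c=$'"That\'ll be $4.96, please," said the cashier'
[ "$a" = "$b" ] && [ "$b" = "$c" ] && echo "all three variants are identical"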
Below is what worked for me -
QUOTE="'"
hive -e "alter table TBL_NAME set location $QUOTE$TBL_HDFS_DIR_PATH$QUOTE"
EDIT: (As per the comments in question:)
I've been looking into this since then. I was lucky enough that I had repo lying around. Still, it's not clear to me whether you need to enclose your commands in single quotes by force. I looked into the repo syntax and I don't think you do. You could use double quotes around your command, and then use whatever single and double quotes you need inside, provided you escape the double ones.
just use printf
instead of
repo forall -c '....$variable'
use printf to replace the variable token with the expanded variable.
For example:
template='.... %s'
repo forall -c "$(printf "${template}" "${variable}")"
Variables can contain single quotes.
myvar=\'....$variable\'
repo forall -c $myvar
I was wondering why I could never get my awk statement to print from an ssh session so I found this forum. Nothing here helped me directly but if anyone is having an issue similar to below, then give me an up vote. It seems any sort of single or double quotes were just not helping, but then I didn't try everything.
check_var="df -h / | awk 'FNR==2{print $3}'"
getckvar=$(ssh user@host "$check_var")
echo $getckvar
What do you get? A load of nothing.
Fix: escape \$3 in your print function.
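For completeness, here is a sketch of the corrected snippet (user@host is a placeholder):
check_var="df -h / | awk 'FNR==2{print \$3}'"
getckvar=$(ssh user@host "$check_var")
echo "$getckvar"
The \$ stops the local shell from expanding $3 inside the double quotes, so the remote awk receives $3 intact.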
Does this work for you?
eval repo forall -c '....$variable'

Tricky brace expansion in shell

When using a POSIX shell, the following
touch {quick,man,strong}ly
expands to
touch quickly manly strongly
Which will touch the files quickly, manly, and strongly, but is it possible to dynamically create the expansion? For example, the following illustrates what I want to do, but does not work because of the order of expansion:
TEST=quick,man,strong #possibly output from a program
echo {$TEST}ly
Is there any way to achieve this? I do not mind constricting myself to Bash if need be. I would also like to avoid loops. The expansion should be given as complete arguments to any arbitrary program (i.e. the program cannot be called once for each file, it can only be called once for all files). I know about xargs but I'm hoping it can all be done from the shell somehow.
... There is so much wrong with using eval. What you're asking is only possible with eval, BUT what you might want is easily possible without having to resort to bash bug-central.
Use arrays! Whenever you need to keep multiple items in one datatype, you need (or, should use) an array.
TEST=(quick man strong)
touch "${TEST[#]/%/ly}"
That does exactly what you want without the thousand bugs and security issues introduced and concealed in the other suggestions here.
The way it works is:
"${foo[#]}": Expands the array named foo by expanding each of its elements, properly quoted. Don't forget the quotes!
${foo/a/b}: This is a type of parameter expansion that replaces the first a in foo's expansion by a b. In this type of expansion you can use % to signify the end of the expanded value, sort of like $ in regular expressions.
Put all that together and "${foo[#]/%/ly}" will expand each element of foo, properly quote it as a separate argument, and replace each element's end by ly.
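As a quick check that each result really arrives as a separate, complete argument, you can substitute printf for the arbitrary program:
TEST=(quick man strong)
printf '<%s>\n' "${TEST[@]/%/ly}"
# <quickly>
# <manly>
# <strongly>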
In bash, you can do this:
#!/bin/bash
TEST=quick,man,strong
eval echo $(echo {$TEST}ly)
#eval touch $(echo {$TEST}ly)
That last line is commented out but will touch the specified files.
Zsh can easily do that:
TEST=quick,man,strong
print ${(s:,:)^TEST}ly
The variable content is split at commas, then each element is distributed to the string around the braces:
quickly manly strongly
Taking inspiration from the answers above:
$ TEST=quick,man,strong
$ touch $(eval echo {$TEST}ly)

Shell script input containing asterisk

How do I write a shell script (bash on HPUX) that receives a string as an argument containing an asterisk?
e.g. myscript my_db_name "SELECT * FROM table;"
The asterisk gets expanded to all the file names in the current directory, even if I assign it to a variable like this:
DB_QUERY="$2"
echo $DB_QUERY
The asterisk "*" is not the only character you have to watch out for, there's lots of other shell meta-charaters that can cause problems, like < > $ | ; &
The simple answer is always to put your arguments in quotes (that's the double-quote, " ) when you don't know what they might contain.
For your example, you should write:
DB_QUERY="$2"
echo "$DB_QUERY"
It starts getting awkward when you want your argument to be used as multiple parameters or you start using eval, but you can ask about that separately.
You always need to put double quotes around a variable reference if you want to prevent it from triggering filename expansion. So, in your example, use:
DB_QUERY="$2"
echo "$DB_QUERY"
In the first example, use single quotes:
myscript my_db_name 'SELECT * FROM table;'
In the second example, use double quotes:
echo "$DB_QUERY"
