shell script pass argument with space - bash

I have the following script (example):
#!/bin/bash
while getopts a: opt; do
case "$opt" in
a) val="$OPTARG";;
?) echo "use the flag \"-a\""
exit 2;;
esac
done
echo "a specified with: ${val}"
When I now call this script with test.sh -a "here is a string", the output is a specified with: here, but I would like it to be a specified with: here is a string.
I know that I can call the script with test.sh -a here\ is\ a\ string or test.sh -a "here\ is\ a\ string" and it will work. But in my case I cannot manipulate the string I want to pass.
So how can I change my getopts function to make it work?
I also tried getopt, but it worked even worse:
commandsShort="a:"
commandsLong="aval:"
TEMP=`getopt \
-o $commandsShort \
-l $commandsLong \
-q \
-n "$0" -- "$#"`
What am I doing wrong?

This got solved in comments on your question. :-)
You're calling the script with:
eval "test.sh $#"
The effect of this "eval" line, if "here is a string" is your option, is to create the command line that is in the quotes:
test.sh here is a string
and evaluate it.
Per the additional comments, if you can avoid eval, you should.
That said, if you need it, you could always quote the string within the eval:
eval "test.sh \"$#\""
Or if you don't like escaping quotes, use singles, since your $@ will be expanded due to the outer quotes being double:
eval "test.sh '$@'"
And finally, as you mentioned in comments, just running directly may be the best option:
test.sh "$#"
Note that if your $@ includes the -a option, you may have a new problem. Consider the command line:
test.sh "-a here is a string"
In this case, your entire string, starting with -a, is found in $1, and you will have no options for getopts and no OPTARG.
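For comparison, a short sketch of the two invocations (test.sh being the getopts script from the question):
./test.sh -a "here is a string"   # getopts sees -a and OPTARG='here is a string'
./test.sh -a here is a string     # unquoted: OPTARG='here' and the rest stay behind as extra positional arguments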

Related

Is there any alternative to using eval in a shell script to achieve variable expansion

I have the following case where exec and eval will handle variables passed as arguments differently.
Here, eval seems to produce the intended output.
But is there any alternative to using that?
$ cat arg.sh
#!/bin/bash
eval ./argtest $*
$ ./arg.sh "arg1 'subarg1 subarg2'"
Args: 2
Arg1: arg1
Arg2: subarg1 subarg2
But at the same time if I use exec instead of eval call, the single quotes are not getting honored.
$ ./arg.sh "arg1 'subarg1 subarg2'"
Args: 3
Arg1: arg1
Arg2: 'subarg1
Arg3: subarg2'
You should do:
#!/bin/bash
./argtest "$#"
To properly pass unchanged arguments.
Then do:
$ ./arg.sh arg1 'subarg1 subarg2'
As you would do with any other command.
Research when to use quoting in shell, how the "$@" positional-parameter expansion is handled specially inside quotes, how $* and $@ differ, and how word splitting works. Also research what variable expansion is, in which contexts it happens, and how single quotes differ from double quotes. And because eval is mentioned, see the BashFAQ entry on the eval command and security issues. Remember to check your scripts with https://shellcheck.net .
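A small sketch of those expansions side by side (set -- just simulates two positional parameters here):
#!/bin/bash
set -- "one two" three
printf '<%s> ' "$@"; echo   # <one two> <three>   - boundaries preserved
printf '<%s> ' "$*"; echo   # <one two three>     - joined into a single word
printf '<%s> ' $*; echo     # <one> <two> <three> - re-split on $IFS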
Is there any alternative to using eval in a shell script to achieve variable expansion
Yes - use envsubst for variable expansion, it's a tool just for that.
#!/bin/bash
arg=$(VARIABLE=something envsubst '$VARIABLE' <<<"$1")
./argtest "$arg"
$ bash -x ./arg.sh 'string with **not-expanded** $VARIABLE'
+ ./argtest 'string with **not-expanded** something'
Is there any alternative to using eval in a shell script to achieve *single quotes parsing
Yes - you would potentially write your own parser, probably in awk, that would split the string and then reload. A very very crude example:
#!/bin/bash
readarray -t args < <(sed "s/ *'\([^']*\)' */\n\1\n/; s/\n$//" <<<"$*")
./argtest "${args[@]}"
$ bash -x ./arg.sh "arg1 'subarg1 subarg2'"
+ ./argtest 'arg1' 'subarg1 subarg2'
Using $*, the shell applies word splitting to the parameters and passes the result of that word splitting to eval or exec, respectively. What happens afterwards differs between them:
exec simply replaces the current process with a new one, based on the first parameter it gets. It then passes the remaining parameters unmodified to this process.
eval, on the other hand, concatenates the parameters into a single string (using one space as a separator), then treats this resulting string as a new command to which the usual expansion and word splitting mechanisms of bash are applied, and finally runs this command.
The mechanism is completely different, which is not surprising, since these commands serve a different purpose.
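A rough side-by-side sketch of that difference, assuming ./argtest is the argument-printing helper from the question:
#!/bin/bash
set -- "arg1 'subarg1 subarg2'"   # simulate the single argument passed to arg.sh
( exec ./argtest $* )             # word splitting only: arg1 / 'subarg1 / subarg2'  (3 args)
eval ./argtest $*                 # joined and re-parsed: arg1 / subarg1 subarg2     (2 args)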

How do I pass a string argument in bash and maintain the string?

I'm making a wrapper script for a command in bash and I want to pass string arguments with whitespace while keeping it as a string inside the bash file.
I've seen many questions about passing a string as one item to a bash script, but not about how to maintain the string when the intention is for it to remain a string when it is used in the script.
I've also tried numerous different ways of quoting and escaping characters to see if it makes any difference
An example of how my script is laid out is
#!/usr/bin/env bash
exec my_program /path/to/code.py "${@}"
When I execute my_script -arg "A string", the desired behaviour is to have the wrapped command execute as my_program /path/to/code.py -arg "A string", but instead it runs as my_program /path/to/code.py -arg A string, resulting in error: unrecognized arguments: string.
Is there some way that I can ensure that, when passed a string, it will maintain the string all the way down?
A Minimal, Complete, and Verifiable example:
bash_strings.sh
echo "${#}"
Output
$ bash bash_strings.sh --test "A String"
--test A String
Your bash_strings.sh is buggy, insofar as echo "$@" throws away data (making ./yourscript "hello world" look identical to ./yourscript "hello" "world"). To accurately reflect a command's argument list, use printf %q to generate a quoted version, as follows:
#!/usr/bin/env bash
(( $# )) || exit
printf '%q ' "$#"; echo
...will, for ./bash_strings.sh --test "A string", emit as output:
--test 'A string'
...or, potentially, a different semantically-identical representation such as:
--test A\ string
...either way, reflecting that the original quoting was genuinely passed through.
Bash will take the quotes out. You have to insert them again. But if you escape the quotes in the wrapper, every argument will be interpreted inside a single pair of quotes:
exec my_program /path/to/code.py \"${@}\"
will become
my_program /path/to/code.py "-arg A string"
which you also don't want.
I think the best approach, if your wrapper targets one specific program, is to handle the different flags in a case, like:
while getopts "a:r:v" arg; do
case "$arg" in
a)
exec my_program /path/to/code.py -arg "$OPTARG"
;;
r)
exec my_program /path/to/code.py -r "$OPTARG"
;;
v)
exec my_program /path/to/code.py -v
;;
esac
done
Edited as OP has edited the question for clarity.
If you change your input to:
bash bash_strings.sh --test "\"A String\""
It will work, for this specific example. Bash is stripping quotes somewhere in there.
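With the extra escaped quotes, the inner quotes survive as literal characters; a sketch of the resulting output with the bash_strings.sh above:
$ bash bash_strings.sh --test "\"A String\""
--test "A String"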

Is there any difference in bash between using `eval $cmd` and just `$cmd`?

In bash, you can treat a string as command (and run it) in two different ways:
#!/bin/bash
cmd="echo -n sometext"
eval $cmd # Not sure if quotes make a difference here
and
#!/bin/bash
cmd="echo -n sometext"
$cmd # Not sure if quotes make a difference here either
Is there any difference between the two? Is there a situation where quotes around cmd make a difference? What about performance?
Yes, there is a difference :)
You need to first understand how eval works. Basically, eval is a shell builtin command. Whatever argument passed to eval is first treated as a string.
Let's take below example:
cmd="echo -n sometext"
eval $cmd
The complete run process of this command is as follows:
eval $cmd
+ eval echo -n sometext
++ echo -n sometext
sometext
Here, $cmd first got evaluated and then the whole string was passed to eval as its argument. eval then evaluates the command, treating the first word as a command or an executable file, and runs it as a normal command. So there are two rounds of evaluation performed for the execution of the complete command.
(NOTE: The + symbol above shows the step wise execution when used in bash -x mode)
The main consequence lies in variable expansion. With eval we have two rounds of expansion. One of course, when cmd is defined, and one when eval is executed.
var="inital"
cmd="echo -n $var \$var"
var="chanded in the mean time"
eval $cmd
inital chanded in the mean time
However, when you use $cmd alone without eval, bash takes care of everything, from variable expansion to the final execution, in a single pass. Just look at the debugging details (bash -x) while running only $cmd:
$cmd
+ echo -n sometext
sometext
Performance-wise, direct use of $cmd is good enough. However, when you are trying to run some external command or a script that requires environment changes, you can use eval.
In cmd="echo -n sometext", the quotes are necessary; without them, bash treats cmd=echo as a temporary variable assignment and tries to run -n as a command, raising an error like below:
cmd=echo -n sometext
-n: command not found
I hope the explanation will be helpful.
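One more concrete case where the difference matters is quotes embedded in the string; a minimal sketch (the file name is made up for illustration):
#!/bin/bash
echo 'two words here' > /tmp/demo.txt
cmd='grep "two words" /tmp/demo.txt'
$cmd          # word splitting only: the pattern becomes '"two' and 'words"' is treated as a file name, so this fails
eval "$cmd"   # the string is re-parsed, so "two words" is a single pattern and the matching line is printed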

How do I pass on script arguments that contain quotes/spaces?

I'm trying to write a script notify-finish that can be prepended to any command. When done, it will run the command given by the arguments following, then email the user when the command is complete. Here's what I have:
PROG=$1
# Run command given by arguments
$@
ECODE=$?
echo -e "Subject: `hostname`: $PROG finished\r\nTo: <$USER>\r\n\r\nExited with $ECODE\r\n" | sendmail $USER
This works most of the time, but when arguments contain spaces, the quoting is stripped off.
Working example:
notify-finished rsync -avz source/ user@remote:dest/
Failing example:
notify-finished rsync -avz -e 'ssh -c blowfish' source/ user@remote:dest/
In the second case, $@ is expanded out to rsync -avz -e ssh -c blowfish source/ user@remote:dest/, missing the single quotes. It does not work with double-quotes either, nor with $*.
After reading other posts I tried putting the command in an array, but I get the exact same issue:
CMD=(notify-finished rsync -avz -e 'ssh -c blowfish' source/ user@remote:dest/)
${CMD[@]}
How do I make this work for all arguments?
Use "$#" with quotes:
prog="$1"
"$#"
ecode="$?"
echo "$prog exited with $ecode"
This will pass each argument exactly as it was received. If you don't include the quotes, each element will be split according to $IFS:
"$#" is like "$1" "$2" "$3" ..., passing each element as a separate argument.
"$*" is like "$1 $2 $3 ...", passing all elements concatenated as a single argument
$* and $@ (unquoted) are like $1 $2 $3 ..., breaking up each element on whitespace according to $IFS, expanding all globs, and passing each resulting word as a separate argument.
The same is true for arrays, such as "${array[@]}" and "${array[*]}"
Put double-quotes around your variable substitutions to keep them from being word-split (note that this applies to all variables: $@, $1, and $PROG). Also: don't put a $ before a variable name when assigning to it; use # for comments; and, on the last line, the single-quotes will prevent variables from being substituted at all.
PROG="$1"
shift
# Run program below
"$PROG" "$#"
ECODE=$? # note: this will always be a number, so it doesn't have to be protected with double-quotes
echo -e "Subject: $(hostname): $PROG finished\r\nTo: <$USER>\r\n\r\nExited with $ECODE\r\n' | sendmail "$USER"

How to keep quotes in Bash arguments? [duplicate]

This question already has answers here:
How can I preserve quotes in printing a bash script's arguments
(7 answers)
Closed 3 years ago.
I have a Bash script where I want to keep quotes in the arguments passed.
Example:
./test.sh this is "some test"
then I want to use those arguments, and re-use them, including quotes and quotes around the whole argument list.
I tried using \"$@\", but that removes the quotes inside the list.
How do I accomplish this?
using "$#" will substitute the arguments as a list, without re-splitting them on whitespace (they were split once when the shell script was invoked), which is generally exactly what you want if you just want to re-pass the arguments to another program.
Note that this is a special form and is only recognized as such if it appears exactly this way. If you add anything else inside the same quotes, that text is joined onto the first or last parameter rather than staying a separate word.
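A quick sketch of that behaviour:
set -- one "two three"
printf '<%s> ' "$@"; echo          # <one> <two three>
printf '<%s> ' "pre $@ post"; echo # <pre one> <two three post>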
What are you trying to do and in what way is it not working?
There are two safe ways to do this:
1. Shell parameter expansion: ${variable@Q}:
When expanding a variable via ${variable@Q}:
The expansion is a string that is the value of parameter quoted in a format that can be reused as input.
Example:
$ expand-q() { for i; do echo "${i@Q}"; done; }  # Same as: for i in "$@"; do ...
$ expand-q word "two words" 'new
> line' "single'quote" 'double"quote'
word
'two words'
$'new\nline'
'single'\''quote'
'double"quote'
2. printf %q "$quote-me"
printf supports quoting internally. The manual's entry for printf says:
%q Causes printf to output the corresponding argument in a format that can be reused as shell input.
Example:
$ cat test.sh
#!/bin/bash
printf "%q\n" "$#"
$
$ ./test.sh this is "some test" 'new
>line' "single'quote" 'double"quote'
this
is
some\ test
$'new\nline'
single\'quote
double\"quote
$
Note the 2nd way is a bit cleaner if displaying the quoted text to a human.
Related: For bash, POSIX sh and zsh: Quote string with single quotes rather than backslashes
Yuku's answer only works if you're the only user of your script, while Dennis Williamson's is great if you're mainly interested in printing the strings, and expect them to have no quotes-in-quotes.
Here's a version that can be used if you want to pass all arguments as one big quoted-string argument to the -c parameter of bash or su:
#!/bin/bash
C=''
for i in "$#"; do
i="${i//\\/\\\\}"
C="$C \"${i//\"/\\\"}\""
done
bash -c "$C"
So, all the arguments get a quote around them (harmless if it wasn't there before, for this purpose), but we also escape any escapes and then escape any quotes that were already in an argument (the syntax ${var//from/to} does global substring substitution).
You could of course only quote stuff which already had whitespace in it, but it won't matter here. One utility of a script like this is to be able to have a certain predefined set of environment variables (or, with su, to run stuff as a certain user, without that mess of double-quoting everything).
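For instance, a hypothetical run, assuming the loop above is saved as run-quoted.sh:
$ ./run-quoted.sh printf '<%s>\n' "two words"
<two words>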
Update: I recently had reason to do this in a POSIX way with minimal forking, which led to this script (the last printf there outputs the command line used to invoke the script, which you should be able to copy-paste in order to invoke it with equivalent arguments):
#!/bin/sh
C=''
for i in "$#"; do
case "$i" in
*\'*)
i=`printf "%s" "$i" | sed "s/'/'\"'\"'/g"`
;;
*) : ;;
esac
C="$C '$i'"
done
printf "$0%s\n" "$C"
I switched to '' since shells also interpret things like $ and !! in ""-quotes.
If it's safe to make the assumption that an argument that contains white space must have been (and should be) quoted, then you can add them like this:
#!/bin/bash
whitespace="[[:space:]]"
for i in "$#"
do
if [[ $i =~ $whitespace ]]
then
i=\"$i\"
fi
echo "$i"
done
Here is a sample run:
$ ./argtest abc def "ghi jkl" $'mno\tpqr' $'stu\nvwx'
abc
def
"ghi jkl"
"mno pqr"
"stu
vwx"
You can also insert literal tabs and newlines using Ctrl-V Tab and Ctrl-V Ctrl-J within double or single quotes instead of using escapes within $'...'.
A note on inserting characters in Bash: If you're using Vi key bindings (set -o vi) in Bash (Emacs is the default - set -o emacs), you'll need to be in insert mode in order to insert characters. In Emacs mode, you're always in insert mode.
I needed this for forwarding all arguments to another interpreter.
What ended up right for me is:
bash -c "$(printf ' %q' "$#")"
Example (when named as forward.sh):
$ ./forward.sh echo "3 4"
3 4
$ ./forward.sh bash -c "bash -c 'echo 3'"
3
(Of course the actual script I use is more complex, involving in my case nohup and redirections etc., but this is the key part.)
Like Tom Hale said, one way to do this is with printf using %q to quote-escape.
For example:
send_all_args.sh
#!/bin/bash
if [ "$#" -lt 1 ]; then
quoted_args=""
else
quoted_args="$(printf " %q" "${#}")"
fi
bash -c "$( dirname "${BASH_SOURCE[0]}" )/receiver.sh${quoted_args}"
send_fewer_args.sh
#!/bin/bash
if [ "$#" -lt 2 ]; then
quoted_last_args=""
else
quoted_last_args="$(printf " %q" "${#:2}")"
fi
bash -c "$( dirname "${BASH_SOURCE[0]}" )/receiver.sh${quoted_last_args}"
receiver.sh
#!/bin/bash
for arg in "$#"; do
echo "$arg"
done
Example usage:
$ ./send_all_args.sh
$ ./send_all_args.sh a b
a
b
$ ./send_all_args.sh "a' b" 'c "e '
a' b
c "e
$ ./send_fewer_args.sh
$ ./send_fewer_args.sh a
$ ./send_fewer_args.sh a b
b
$ ./send_fewer_args.sh "a' b" 'c "e '
c "e
$ ./send_fewer_args.sh "a' b" 'c "e ' 'f " g'
c "e
f " g
Just use:
"${#}"
For example:
# cat t2.sh
for I in "${#}"
do
echo "Param: $I"
done
# cat t1.sh
./t2.sh "${#}"
# ./t1.sh "This is a test" "This is another line" a b "and also c"
Param: This is a test
Param: This is another line
Param: a
Param: b
Param: and also c
Changed unhammer's example to use an array.
printargs() { printf "'%s' " "$@"; echo; }; # http://superuser.com/a/361133/126847
C=()
for i in "$#"; do
C+=("$i") # Need quotes here to append as a single array element.
done
printargs "${C[#]}" # Pass array to a program as a list of arguments.
My problem was similar and I used mixed ideas posted here.
We have a server with a PHP script that sends e-mails. And then we have a second server that connects to the 1st server via SSH and executes it.
The script name is the same on both servers and both are actually executed via a bash script.
On server 1 (local) bash script we have just:
/usr/bin/php /usr/local/myscript/myscript.php "$@"
This resides on /usr/local/bin/myscript and is called by the remote server. It works fine even for arguments with spaces.
But then at the remote server we can't use the same logic because the 1st server will not receive the quotes from "$@". I used the ideas from JohnMudd and Dennis Williamson to recreate the options and parameters array with the quotations. I like the idea of adding escaped quotations only when the item has spaces in it.
So the remote script runs with:
CSMOPTS=()
whitespace="[[:space:]]"
for i in "$#"
do
if [[ $i =~ $whitespace ]]
then
CSMOPTS+=(\"$i\")
else
CSMOPTS+=($i)
fi
done
/usr/bin/ssh "$USER@$SERVER" "/usr/local/bin/myscript ${CSMOPTS[@]}"
Note that I use "${CSMOPTS[@]}" to pass the options array to the remote server.
Thanks to everyone who posted here! It really helped me! :)
Quotes are interpreted by bash and are not stored in command line arguments or variable values.
If you want to use quoted arguments, you have to quote them each time you use them:
val="$3"
echo "Hello World" > "$val"
As Gary S. Weaver showed in his source code tips, the trick is to call bash with the parameter '-c' and then quote the next part.
e.g.
bash -c "<your program> <parameters>"
or
docker exec -it <my docker> bash -c "$SCRIPT $quoted_args"
If you need to pass all arguments to bash from another programming language (for example, if you'd want to execute bash -c or emit_bash_code | bash), use this:
escape all single quote characters you have with '\''.
then, surround the result with single quotes
The argument abc'def will thus be converted to 'abc'\''def'. The characters '\'' are interpreted as follows: the first quote terminates the already existing quoting, then the escaped single quote \' follows, and then a new quoted section starts.
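A minimal bash sketch of that conversion (quote_for_sh is a made-up helper name):
quote_for_sh() {
printf "'%s'\n" "$(printf '%s' "$1" | sed "s/'/'\\\\''/g")"
}
quote_for_sh "abc'def"   # prints 'abc'\''def'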
Yes, it seems that it is not possible to preserve the quotes, but for the issue I was dealing with it wasn't necessary.
I have a bash function that searches folders recursively and greps for a string; the problem is passing a string that has spaces, such as "find this string". Passing this to the bash script will then take the base argument $n and pass it to grep, which has grep believing these are different arguments. The way I solved this is by using the fact that when you quote the argument in the call, bash groups the items in the quotes into a single argument. I just needed to decorate that argument with quotes and pass it to the grep command.
If you know which argument you are receiving in bash needs quotes for its next step, you can just decorate it with quotes.
Just use single quotes around the string containing the double quotes:
./test.sh this is '"some test"'
That way, the double quotes inside the single quotes are also treated as part of the string.
But I would recommend putting the whole string between single quotes:
./test.sh 'this is "some test" '
In order to understand how the shell interprets the arguments passed to a script, you can write a little script like this:
#!/bin/bash
echo $@
echo "$@"
Then you can see and test what happens when you call the script with different strings.
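A slightly more revealing variant uses printf, so the argument boundaries become visible (a small sketch):
#!/bin/bash
printf 'unquoted: <%s>\n' $@
printf 'quoted:   <%s>\n' "$@"
Calling it with one "two three" prints three unquoted entries but only two quoted ones.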

Resources