I don't seem to be able to assign a string that begins with "-e" or "-E" to a bash shell variable:
$ options="-e stuff"
$ echo $options
stuff
Other letters work fine:
$ options="-g stuff"
$ echo $options
-g stuff
What is the reason for this?
You should quote your variable:
echo "${options}"
otherwise it's being expanded to
echo -e stuff
which is interpreted by echo as its -e option, which actually exists (see man echo), and that's why -e "did not work" while the other letters you tried "did".
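For example, in a typical bash session (the quoted form hands echo a single argument, which is not a valid option string):
$ options="-e stuff"
$ echo $options
stuff
$ echo "$options"
-e stuff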
First: To reliably determine the value of a variable in bash, use declare -p, not echo. Thus:
declare -p options
will emit something like:
declare -- options="-e stuff"
This tells you much more than echo does:
Because it's declare -- rather than declare -x, you know that the variable is not exported.
Because it's not declare -a, you know it's not giving you an array (echo "$array" will print only the first element of a shell array and ignore the rest).
Because it's not declare -i, you know the value wasn't declared to be an integer... etc.
If you're only worried about the string case, but want to ensure that you get a printable value no matter which version of bash is in use (as some historical releases will not always guarantee printable escaping for values printed with declare -p), consider instead:
printf '%q=%q\n' options "$options"
...which will emit unambiguous output even if there are cursor control characters, newlines, or other non-textual contents in your string.
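For instance, with a value containing a newline (the exact quoting style of %q can vary a little between bash versions):
$ options=$'-e stuff\nmore'
$ printf '%q=%q\n' options "$options"
options=$'-e stuff\nmore'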
Follow the advice of the POSIX specification for echo, and use printf instead. To quote the APPLICATION USAGE section in full, emphasis added, noting that in bash, -e enables XSI-style interpretation of escape sequences:
It is not possible to use echo portably across all POSIX systems
unless both -n (as the first argument) and escape sequences are
omitted.
The printf utility can be used portably to emulate any of the
traditional behaviors of the echo utility as follows (assuming that
IFS has its standard value or is unset):
The historic System V echo and the requirements on XSI implementations
in this volume of POSIX.1-2008 are equivalent to:
printf "%b\n" "$*"
The BSD echo is equivalent to:
if [ "X$1" = "X-n" ]
then
shift
printf "%s" "$*"
else
printf "%s\n" "$*"
fi
New applications are encouraged to use printf instead of echo.
So, how does this apply to you? Since you want -e to be treated as data rather than as one of echo's options, the BSD emulation's non -n branch (the else case above) applies:
options="-e stuff"
printf '%s\n' "$options"
Seems that the recommended way of doing indirect variable setting in bash is to use eval:
var=x; val=foo
eval $var=$val
echo $x # --> foo
The problem is the usual one with eval:
var=x; val=1$'\n'pwd
eval $var=$val # bad output here
(and since it is recommended in many places, I wonder just how many scripts are vulnerable because of this...)
In any case, the obvious solution of using (escaped) quotes doesn't really work:
var=x; val=1\"$'\n'pwd\"
eval $var=\"$val\" # fail with the above
The thing is that bash has indirect variable reference baked in (with ${!foo}), but I don't see any such way to do indirect assignment -- is there any sane way to do this?
For the record, I did find a solution, but this is not something that I'd consider "sane"...:
eval "$var='"${val//\'/\'\"\'\"\'}"'"
A slightly better way, avoiding the possible security implications of using eval, is
declare "$var=$val"
Note that declare is a synonym for typeset in bash. The typeset command is more widely supported (ksh and zsh also use it):
typeset "$var=$val"
In modern versions of bash, one should use a nameref.
declare -n var=x   # var is now a nameref for x (bash 4.3 and later)
var=$val           # assigning through the nameref sets x
It's safer than eval, but still not perfect.
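A minimal sketch of the usual pattern, assuming bash 4.3 or later; the function and variable names here (assign_to, _target) are purely illustrative:
# assign_to NAME VALUE: set the variable named NAME to VALUE via a nameref
assign_to() {
    declare -n _target=$1   # local nameref to the caller's variable named by $1
    _target=$2
}
assign_to x "some value"
echo "$x"   # -> some value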
Bash has an extension to printf that saves its result into a variable:
printf -v "${VARNAME}" '%s' "${VALUE}"
This prevents all possible escaping issues.
If you use an invalid identifier for $VARNAME, the command will fail and return status code 2:
$ printf -v ';;;' '%s' foobar; echo $?
bash: printf: `;;;': not a valid identifier
2
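For example, with the problematic value from the question:
var=x; val=1$'\n'pwd
printf -v "$var" '%s' "$val"
echo "${!var}"   # prints 1, then pwd on the next line, as plain data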
eval "$var=\$val"
The argument to eval should always be a single string enclosed in either single or double quotes. All code that deviates from this pattern has some unintended behavior in edge cases, such as file names with special characters.
When the argument to eval is expanded by the shell, the $var is replaced with the variable name, and the \$ is replaced with a simple dollar. The string that is evaluated therefore becomes:
varname=$val
This is exactly what you want.
Generally, all expressions of the form $varname should be enclosed in double quotes, to prevent accidental expansion of filename patterns like *.c.
There are only two places where the quotes may be omitted since they are defined to not expand pathnames and split fields: variable assignments and case. POSIX 2018 says:
Each variable assignment shall be expanded for tilde expansion, parameter expansion, command substitution, arithmetic expansion, and quote removal prior to assigning the value.
This list of expansions is missing the pathname expansion and the field splitting. Sure, that's hard to see from reading this sentence alone, but that's the official definition.
Since this is a variable assignment, the quotes are not needed here. They don't hurt, though, so you could also write the original code as:
eval "$var=\"the value is \$val\""
Note that the second dollar is escaped using a backslash, to prevent it from being expanded in the first run. What happens is:
eval "$var=\"the value is \$val\""
The argument to the command eval is sent through parameter expansion and unescaping, resulting in:
varname="the value is $val"
This string is then evaluated as a variable assignment, which (assuming val contains the string value) assigns the following to the variable varname:
the value is value
The main point is that the recommended way to do this is:
eval "$var=\$val"
with the RHS expanded indirectly too. Since eval is used in the same
environment, it will have $val bound, so deferring the expansion works.
And since $val has a known, fixed name,
there are no issues with quoting, and it could even have been written as:
eval $var=\$val
But since it's better to always add quotes, the former is better, or
even this:
eval "$var=\"\$val\""
A better alternative in bash, mentioned above, that avoids eval completely
(and is less subtle than declare etc.):
printf -v "$var" "%s" "$val"
Though this is not a direct answer to what I originally asked...
Newer versions of bash support something called "parameter transformation", documented in a section of the same name in bash(1).
"${value#Q}" expands to a shell-quoted version of "${value}" that you can re-use as input.
Which means the following is a safe solution:
eval="${varname}=${value#Q}"
Just for completeness I also want to suggest the possible use of the bash builtin read. I've also made corrections regarding -d '' based on socowi's comments.
But much care needs to be exercised when using read to ensure the input is sanitized (-d '' reads until null termination and printf "...\0" terminates the value with a null), and that read itself is executed in the main shell where the variable is needed and not in a sub-shell (hence the < <( ... ) syntax).
var=x; val=foo0shouldnotterminateearly
read -d '' -r "$var" < <(printf "$val\0")   # note the space: -d '' passes an empty (NUL) delimiter as a separate argument
echo $x # --> foo0shouldnotterminateearly
echo ${!var} # --> foo0shouldnotterminateearly
I tested this with \n, \t, \r, spaces, 0, etc.; it worked as expected on my version of bash.
The -r stops read from interpreting backslash escapes, so if your value contains the two characters "\" and "n" rather than an actual newline, x will also contain those two characters.
This method may not be as aesthetically pleasing as the eval or printf solutions, but it is more useful when the value is coming from a file or another input file descriptor:
read -d '' -r "$var" < <(cat "$file")
And here are some alternative suggestions to the < <() syntax:
read -d '' -r "$var" <<< "$val"$'\0'
read -d '' -r "$var" < <(printf "$val") # Apparently I didn't even need the \0; the printf process ending was enough to trigger the read to finish.
read -d '' -r "$var" <<< $(printf "$val")
read -d '' -r "$var" <<< "$val"
read -d '' -r "$var" < <(printf "$val")
Yet another way to accomplish this, without eval, is to use "read":
INDIRECT=foo
read -d '' -r "${INDIRECT}" <<<"$(( 2 * 2 ))"
echo "${foo}" # outputs "4"
From the bash man page:
"$*" is equivalent to "$1c$2c...", where c is the first character of the value of the IFS variable.
"$#" is equivalent to "$1" "$2" ...
Any example where "$#" cannot work in place of "$*"?
My favorite use is replacing the field separator.
$ set -- 'My word' but this is a bad 'example!'
$ IFS=,
$ echo "$*"
My word,but,this,is,a,bad,example!
There are other ways to replace delimiters, but IFS and "$*" are often one of the simplest.
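A common way to package this up without clobbering the global IFS is a small helper function (join_by is just an illustrative name):
join_by() {
    local IFS=$1   # the first character of IFS becomes the separator used by "$*"
    shift
    printf '%s\n' "$*"
}
join_by , 'My word' but this is a bad 'example!'
# -> My word,but,this,is,a,bad,example!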
These are completely different tools and should be used in completely different situations. There's no reasonable question about one substituting for the other, because in any given situation, only one or the other will be correct.
"$*" is most applicable when you're trying to form a single string argument from an argument list -- mostly for logging (but not cases where division between arguments is important; then, "$#" is appropriate with something like print '%q '). "$#" is useful in... well, any other case.
Examples:
die() {
    local stat=$1; shift
    log "ERROR: $*"
    exit $stat
}
Using "$*" when formatting a string to be passed to log uses only a single argv entry, allowing other optional positional arguments to be added to log's usage in the future.
$* expands to all parameters as just one word, with the first character of IFS between parameters.
$@ expands to all parameters as a list of separate words.
Try the following code in a file named list.sh:
#!/bin/bash
echo "using '$*'"
for i in "$*"
do
    echo $i
done

echo "using '$@'"
for i in "$@"
do
    echo $i
done
use it:
./list.sh apple pear kiwi
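The expected output: the two echo lines look the same, but the loops differ, because "$*" yields a single word while "$@" yields one word per argument:
using 'apple pear kiwi'
apple pear kiwi
using 'apple pear kiwi'
apple
pear
kiwi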
I'm a bit confused with printing a variable that contain a new line symbol in bash.
var="Age:\n20\ncolor:\nred"
echo -e $var
Age:
20
color:
red
This is working, but a lot of people say that echo with options is not portable and it is better to use printf.
I have never used printf. According to the manuals, to emulate the echo command:
printf '%s\n' "$var"
Age:\n20\ncolor:\nred
But this doesn't parse \n inside the variable. Manuals usually have this example:
printf "Surname: %s\nName: %s\n" "$SURNAME" "$LASTNAME"
But that's not my case, and from my point of view it's not convenient to use. I found out simply by experimenting that I can use this:
printf "$var\n"
Is it portable?
If I then pass $var to a mail command, will it preserve the line breaks?
printf "$var\n" | mail -s subj email@domain.com
printf's %b format specifier was meant specifically to replace echo -e (actually, the XSI extension to echo, which calls for special interpretation of its arguments by default; -e was never specified and is disallowed by POSIX), and it is identical in virtually every way, including a few differences from $'...' and from the format string argument to printf.
$ ( var='Age:\n20\ncolor:\nred'; printf '%b\n' "$var" )
Age:
20
color:
red
You should generally avoid expanding variables into the format string unless your program controls the exact value and it is intended specifically to be a format string. Your last example in particular has the potential to be quite dangerous in Bash due to printf's -v option.
# Bad!
var='-v_[$(echo "oops, arbitrary code execution" >&2)0]'
printf "$var" foo
It is usually good practice to avoid %b unless you have a special portability requirement. Storing the escape codes in a variable instead of the literal data violates principles of separation of code and data. There are contexts in which this is ok, but it is usually better to assign the value using $'...' quoting, which is specified for the next version of POSIX, and has long been available in Bash and most ksh flavours.
x=$'foo\nbar'; printf '%s\n' "$x" # Good
x=(foo bar); printf '%s\n' "${x[@]}" # Also good (depending on the goal)
x='foo\nbar'; printf '%b\n' "$x" # Ok, especially for compatibility
x='foo\nbar'; printf -- "$x" # Avoid if possible, without specific reason
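As for the mail question: following the same reasoning, a reasonable sketch is to let %b expand the escapes once and pipe the result in as data (the subject and address are the placeholders from the question):
var='Age:\n20\ncolor:\nred'
printf '%b\n' "$var" | mail -s subj email@domain.com   # the newlines survive; $var itself is never used as a format string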
http://wiki.bash-hackers.org/commands/builtin/printf
VAR="-e xyz"
echo $VAR
This prints xyz, for some reason. I don't seem to be able to find a way to get a string to start with -e.
What is going on here?
The answers that say to put $VAR in quotes are only correct by side effect. That is, when put in quotes, echo(1) receives a single argument of -e xyz, and since that is not a valid option string, echo just prints it out. It is a side effect as echo could just as easily print an error regarding malformed options. Most programs will do this, but it seems GNU echo (from coreutils) and the version built into bash simply echo strings that start with a hyphen but are not valid argument strings. This behaviour is not documented so it should not be relied upon.
Further, if $VAR contains a valid echo option argument, then quoting $VAR will not help:
$ VAR="-e"
$ echo "$VAR"
$
Most GNU programs take -- as an argument to mean no more option processing — all the arguments after -- are to be processed as non-option arguments. bash echo does not support this so you cannot use it. Even if it did, it would not be portable. echo has other portability issues (-n vs \c, no -e).
The correct and portable solution is to use printf(1).
printf "%s\n" "$VAR"
The variable VAR contains -e xyz; if you expand the variable unquoted with $VAR, the -e is interpreted as a command-line option by echo. Note that the content of $VAR is not wrapped in "" automatically.
Use echo "$VAR" to fix your problem.
In zsh, you can use a single dash (-) before your arguments. This ensures that no following arguments are interpreted as options.
% VAR="-e xyz"
% echo - $VAR
-e xyz
From the zsh docs:
echo [ -neE ] [ arg ... ]
...
Note that for standards compliance a double dash does not
terminate option processing; instead, it is printed directly.
However, a single dash does terminate option processing, so the
first dash, possibly following options, is not printed, but
everything following it is printed as an argument.
The single dash behaviour is different from other shells.
Keep in mind this behavior is specific to zsh.
Try:
echo "$VAR"
instead.
(-e is a valid option for echo - this is what causes this phenomenon).