Why doesn't `FOO=42 echo "$FOO"` print 42 in Bash?

I'm confused by this behaviour:
$ FOO=42 echo "$FOO"
$ FOO=42 && echo "$FOO"
42
I'm accustomed to using VARNAME=value cmd to specify environment variables for other commands, so I was surprised to discover it doesn't work when cmd is echo. This might have something to do with echo being a Bash built-in. Is this the case? If so, is using && the best way to specify environment variables for Bash built-ins?
Update: && is no good, as it results in the environment variables being set permanently, which I'm trying to avoid.

It doesn't matter that echo is a built-in. The reason is that variables on the command line are expanded by the current shell before the command is executed; they are not expanded by the command itself. You could get the result you want with:
FOO=42 bash -c 'echo "$FOO"'
This starts a new shell to execute echo "$FOO". Since the argument to bash is in single quotes, the variable is not replaced by the original shell.

The replacement happens before the command is executed:
$FOO is replaced with its current value (empty, since FOO is not set in the current shell).
echo "" is then executed with FOO=42 in its environment.
Try:
FOO=42 sh -c 'echo "$FOO"'
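Note also that the VAR=value cmd form only sets the variable for that single command, so it does not leave FOO set in your interactive shell afterwards, which addresses the concern in the update. A quick check, assuming FOO was not previously set:
$ FOO=42 bash -c 'echo "$FOO"'
42
$ echo "$FOO"
<blank>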

Related

In which order are these instructions executed?

I am using a shell script named script.sh that looks like this:
#!/bin/bash
STRING=$(cat my_string.txt)
${1}
my_string.txt contains only:
this_is_my_string
When I execute the commands:
$ STRING="not_my_string"
$ ./script.sh "echo $STRING"
The shell prints not_my_string instead of this_is_my_string and I don’t understand why.
Could you explain this? And is there any way to force it to print the value of the STRING variable that is defined inside the script?
The variable $STRING is being expanded before the script is called, which is why not_my_string is what gets passed to the script and printed.
To delay expansion until after the script is called you should replace "echo $STRING" with 'echo $STRING'. The single quotes cause the expansion to be delayed.
There is some discussion of delayed expansion here:
How to delay expansion of variable in bash if command should be executed on an other machine?
You will also need to replace ${1} in your script with eval ${1}, which will force the string to be executed and expanded.
$ STRING="not_my_string"
$ ./script.sh "echo $STRING"
When parsing the command line, bash expands all variables to their values, so the command that is actually executed is:
./script.sh "echo not_my_string"
You can use the following:
./script.sh 'echo $STRING' to pass the string as-is, and eval "${1}" inside the script to execute the argument.
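Putting the pieces together, a corrected script and invocation would look like this (a sketch that assumes my_string.txt sits next to the script in the current directory):
#!/bin/bash
STRING=$(cat my_string.txt)
# eval re-parses its argument, so $STRING is expanded here, inside the script
eval "${1}"
and calling it with single quotes so the outer shell passes the string through untouched:
$ ./script.sh 'echo $STRING'
this_is_my_string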

How to save environment variables to file [duplicate]

Suppose you have the following command stored in a variable:
COMMAND='echo hello'
What's the difference between
$ eval "$COMMAND"
hello
$ bash -c "$COMMAND"
hello
$ $COMMAND
hello
? Why is the last version almost never used if it is shorter and (as far as I can see) does exactly the same thing?
The third form is not at all like the other two -- but to understand why, we need to go into the order of operations bash follows when interpreting a command, and look at which of those steps are followed when each method is used.
Bash Parsing Stages
1. Quote Processing
2. Splitting Into Commands
3. Special Operator Parsing
4. Expansions
5. Word Splitting
6. Globbing
7. Execution
Using eval "$string"
eval "$string" follows all the above steps starting from #1. Thus:
Literal quotes within the string become syntactic quotes
Special operators such as >() are processed
Expansions such as $foo are honored
Results of those expansions are split into separate words on the characters in IFS (whitespace by default)
Those words are expanded as globs if they contain glob characters and have matches, and finally the command is executed.
Using sh -c "$string"
...performs the same steps as eval does, but in a new shell launched as a separate process; thus, changes to variable state, the current directory, etc. expire when this new process exits. (Note, too, that the new shell may be a different interpreter supporting a different language; i.e., sh -c "foo" will not necessarily support the same syntax that bash, ksh, zsh, etc. do.)
Using $string
...starts at step 5, "Word Splitting".
What does this mean?
Quotes are not honored.
printf '%s\n' "two words" stored in a string will thus be split into the literal words printf, '%s\n', "two and words" (the quote characters stay in the words), as opposed to the usual/expected behavior where two words is passed as a single argument and the quotes are consumed by the shell.
Splitting into multiple commands (on ;s, &s, or similar) does not take place.
Thus:
s='echo foo && echo bar'
$s
...will emit the following output:
foo && echo bar
...instead of the following, which would otherwise be expected:
foo
bar
Special operators and expansions are not honored.
No $(foo), no $foo, no <(foo), etc.
Redirections are not honored.
>foo or 2>&1 is just another word created by string-splitting, rather than a shell directive.
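A short transcript makes the quote and redirection points concrete (using /tmp/out as a scratch file for illustration):
$ s='echo hello > /tmp/out'
$ $s
hello > /tmp/out
$ eval "$s"
$ cat /tmp/out
hello
With plain $s the > is just another word passed to echo; with eval it is an actual redirection.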
$ bash -c "$COMMAND"
This version starts up a new bash interpreter, runs the command, and then exits, returning control to the original shell. You don't need to be running bash at all in the first place to do this, you can start a bash interpreter from tcsh, for example. You might also do this from a bash script to start with a fresh environment or to keep from polluting your current environment.
EDIT:
As @CharlesDuffy points out, starting a new bash shell in this way means shell variables are not carried over, but environment (exported) variables will be inherited by the spawned shell process.
Using eval causes the shell to parse your command twice. In the example you gave, executing $COMMAND directly or doing an eval are equivalent, but have a look at the answer here to get a more thorough idea of what eval is good (or bad) for.
There are at least times when they are different. Consider the following:
$ cmd="echo \$var"
$ var=hello
$ $cmd
$var
$ eval $cmd
hello
$ bash -c "$cmd"
$ var=world bash -c "$cmd"
world
which shows the different points at which variable expansion is performed. It's even clearer if we run set -x first:
$ set -x
$ $cmd
+ echo '$var'
$var
$ eval $cmd
+ eval echo '$var'
++ echo hello
hello
$ bash -c "$cmd"
+ bash -c 'echo $var'
$ var=world bash -c "$cmd"
+ var=world
+ bash -c 'echo $var'
world
We can see here much of what Charles Duffy talks about in his excellent answer. For example, attempting to execute the variable directly prints the literal string $var: by the time the contents of cmd are run, the expansion step has already passed and its result is not scanned again, so we don't get the value of var the way we do with eval.
A shell started with bash -c only inherits exported variables from the parent shell, and since I didn't export var, it's not available to the new shell.
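To confirm that last point, exporting var makes it visible to the child shell (same cmd and var as above, tracing turned off for brevity):
$ export var
$ bash -c "$cmd"
hello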

Assign a Variable in bash login shell

I am trying to do this from a Windows command prompt:
C:\cygwin64\bin\bash --login -c "$var="<hallo>" && echo "$var""
and I get the error:
The system cannot find the file specified.
but this works:
C:\cygwin64\bin\bash --login -c "var="hello" && echo "$hello""
The login shell seems to cause the problem when it gets a '<'. How can I still assign a string containing angle brackets to the shell variable?
When you write
C:\cygwin64\bin\bash --login -c "$var="<hallo>" && echo "$var""
You are expecting the shell to strip off the outer quotes from that argument to -c and end up with a string that looks like
$var="<hallo>" && echo "$var"
but that's not what the shell does.
The shell just matches quotes as it goes along, so the shell sees:
["$var="][<hallo>][" && echo "][$var][""].
You need to escape the inner quotes from the current shell or use different quotes to avoid this parsing problem.
C:\cygwin64\bin\bash --login -c 'var="<hallo>" && echo "$var"'
Note also that I removed the $ from the start of the variable name in the assignment and that I used single quotes on the outside so that the current shell didn't expand $var.
With double quotes on the outside you'd need to use something like this instead.
C:\cygwin64\bin\bash --login -c "var='<hallo>' && echo \"\$var\""
For a similar discussion of shell parsing and how things nest (or don't) with backticks you can see my answer here.

Setting environment variable for one program call in bash using env

I am trying to invoke a shell command with a modified environment via the command env.
According to the manual
env HELLO='Hello World' echo $HELLO
should echo Hello World, but it doesn't.
If I do
HELLO='Hello World' bash -c 'echo $HELLO'
it prints Hello World as expected (thanks to this answer for this info).
What am I missing here?
It's because in your first case your current shell expands the $HELLO variable before running the command, and there is no HELLO variable set in your current shell.
env HELLO='Hello World' echo $HELLO
will do this:
expand any variables given, in this case $HELLO (which is unset, so the unquoted expansion produces nothing and the word disappears entirely)
run env with the 2 arguments 'HELLO=Hello World' and 'echo'
The env command will run and set HELLO='Hello World' in its environment
env will run echo with no arguments, so it prints an empty line
As you see, the current shell expanded the $HELLO variable, which isn't set.
HELLO='Hello World' bash -c 'echo $HELLO'
will do this:
set the variable HELLO='Hello World' for the following command
run bash with the 2 arguments '-c' and 'echo $HELLO'
since the last argument is enclosed in single quotes, nothing inside it is expanded
the new bash in turn will run the command echo $HELLO
To run echo $HELLO in the new bash sub-shell, bash first expands anything it can, $HELLO in this case, and the parent shell set that to Hello World for us.
The subshell runs echo 'Hello World'
If you tried to do e.g. this:
env HELLO='Hello World' echo '$HELLO'
The current shell would expand anything it can, which is nothing since $HELLO is enclosed in single quotes
run env with the 3 arguments 'HELLO=Hello World', 'echo' and '$HELLO'
The env command will run and set the HELLO='Hello World' in its environment
env will run echo with the argument '$HELLO'
In this case, there's no shell that will expand the $HELLO, so echo receives the string $HELLO and prints out that. Variable expansion is done by shells only.
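Putting the two cases side by side makes the difference visible (assuming HELLO is not already set in your current shell):
$ env HELLO='Hello World' echo "$HELLO"
<blank>
$ env HELLO='Hello World' sh -c 'echo "$HELLO"'
Hello World
In the first command the current shell supplies the (empty) $HELLO; in the second, the inner sh does the expansion after env has set HELLO.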
I think what happens is similar to this situation in which I was also puzzled.
In a nutshell, the variable expansion in the first case is done by the current shell which doesn't have $HELLO in its environment. In the second case, though, single quotes prevent the current shell from doing the variable expansion, so everything works as expected.
Note how changing single quotes to double quotes prevents this command from working the way you want:
HELLO='Hello World' bash -c "echo $HELLO"
Now this will be failing for the same reason as the first command in your question.
This works and is good for me:
$ MY_VAR='Hello' ANOTHER_VAR='World!!!' && echo "$MY_VAR $ANOTHER_VAR"
Hello World!!!
Here is an easy way to confirm that the shell is working as expected.
env A=42 env
env
The first command sets A to 42 and runs env. The second command also runs env. Compare the output of both.
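For example, filtering the output shows the difference directly (assuming A is not already exported in your shell):
$ env A=42 env | grep '^A='
A=42
$ env | grep '^A='
The second grep prints nothing, because A was only placed in the environment of the first env invocation.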
As nos noted, expansion of your variable passed to echo occurs before that variable gets set.
One alternative not yet mentioned in the earlier answers is to use:
$ a=abc eval 'echo $a'
abc
$ echo $a
<blank>
Note that you have to use the a=abc cmd syntax and not the env a=abc cmd syntax; env can only execute external programs, and eval is a shell builtin with no external counterpart, so env has nothing to run.
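A quick way to see this (the exact error text depends on your env implementation):
$ env a=abc eval 'echo $a'
env: 'eval': No such file or directory
$ a=abc eval 'echo $a'
abc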
