I am using a shell script named script.sh that looks like this:
#!/bin/bash
STRING=$(cat my_string.txt)
${1}
In my_string.txt, there is only:
this_is_my_string
When I execute the commands :
$ STRING="not_my_string"
$ ./script.sh "echo $STRING"
The shell prints not_my_string instead of this_is_my_string and I don’t understand why.
Could you explain this to me? And is there any way to force it to print the value of the STRING variable that is defined inside the script?
The variable $STRING is being expanded by your interactive shell before the script is called, which is why not_my_string is what gets printed.
To delay expansion until after the script is called, you should replace "echo $STRING" with 'echo $STRING'. The single quotes cause the expansion to be delayed.
There is some discussion of delayed expansion here:
How to delay expansion of variable in bash if command should be executed on an other machine?
You will also need to replace ${1} in your script with eval "${1}", which forces the argument string to be expanded and executed inside the script.
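A minimal sketch of the modified script.sh (note the quotes around ${1}, so the whole argument reaches eval as one string even if it contains spaces):
#!/bin/bash
STRING=$(cat my_string.txt)
eval "${1}"    # the argument is expanded and run here, after STRING has been set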
$ STRING="not_my_string"
$ ./script.sh "echo $STRING"
Before running the command, bash expands all variables to their values, so what is actually executed is:
./script.sh "echo not_my_string"
You can use the following:
Use ./script.sh 'echo $STRING' to send the string as-is, and eval "${1}" inside the script to execute the argument.
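Putting it together, assuming script.sh now uses eval "${1}" as described above:
$ STRING="not_my_string"
$ ./script.sh "echo $STRING"    # expanded by the calling shell
not_my_string
$ ./script.sh 'echo $STRING'    # expanded inside the script, after STRING is set
this_is_my_string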
Related
I have a bash variable which is stored as
var="\"abc\""
So, when I print this variable locally, it gives me the proper output (with double quotes):
echo "$var"
"abc"
What I want to do is print it using the 'bash -c' option, but when I do that, it prints only the value, without the double quotes.
bash -c "echo \"$var\""
abc
bash -c "echo $var"
abc
Can anyone help me preserve the double quotes in my string when I use it with 'bash -c'? And what does -c actually mean?
In the statement
bash -c "echo $var"
the following happens:
(1) var is expanded (resulting in "abc", including the quotes).
(2) A bash child process is invoked, receiving as its first parameter -c and as its second parameter echo "abc"
(3) This child process runs the command, i.e. echo "abc", and according to the rules about quote removal, this is equivalent to echo abc, so you don't see any quotes in the output.
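To see exactly what the child shell receives after quote removal, you can swap echo for printf and print each argument between markers (a quick diagnostic, not part of the original question):
$ var="\"abc\""
$ bash -c "printf '<%s>\n' $var"
<abc>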
You may be tempted to do a
bash -c 'echo "$var"'
instead. In this case, the following happens:
(1) A bash child process is invoked, receiving as its first parameter -c and as its second parameter echo "$var". Note that $var is not expanded yet, because it is between single quotes.
(2) This child process runs the command, i.e. echo "$var". However, since we are in a child process, the variable var does not exist, and the command is equivalent to echo "", i.e. only a newline is printed.
You can, however, combine both solutions:
export var
bash -c 'echo "$var"'
In this case, var is made available to the child processes of your script, and you will see the quotes being printed.
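An alternative that avoids export is to hand the value to the child shell as a positional parameter; as described in the -c documentation quoted below, the argument after the command string becomes $0 (a placeholder _ here) and the next one becomes $1:
$ var="\"abc\""
$ bash -c 'echo "$1"' _ "$var"
"abc"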
From info bash
-c If the -c option is present, then commands are read from the first non-option argument command_string.
If there are arguments after the command_string, the first argument is assigned to $0
and any remaining arguments are assigned to the positional parameters.
So the -c option executes the commands within the quotes.
To preserve your double quotes while calling via bash -c, you would need to quote the variable separately.
$ bash -c "echo '"$var"'"
"abc"
Suppose you have the following command stored in a variable:
COMMAND='echo hello'
What's the difference between
$ eval "$COMMAND"
hello
$ bash -c "$COMMAND"
hello
$ $COMMAND
hello
? Why is the last version almost never used if it is shorter and (as far as I can see) does exactly the same thing?
The third form is not at all like the other two. To understand why, we need to go into the order of operations bash follows when interpreting a command, and look at which of those steps are applied when each method is used.
Bash Parsing Stages
1. Quote Processing
2. Splitting Into Commands
3. Special Operator Parsing
4. Expansions
5. Word Splitting
6. Globbing
7. Execution
Using eval "$string"
eval "$string" follows all the above steps starting from #1. Thus:
Literal quotes within the string become syntactic quotes
Special operators such as >() are processed
Expansions such as $foo are honored
Results of those expansions are split on whitespace into separate words
Those words are expanded as globs if they contain glob characters and have matching files, and finally the command is executed.
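A small illustration, assuming a variable foo set as shown; the quotes, the expansion of $foo and the && operator inside the string are all honored:
$ foo="hello world"
$ s='printf "%s\n" "$foo" && echo done'
$ eval "$s"
hello world
done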
Using sh -c "$string"
...performs the same steps as eval does, but in a new shell launched as a separate process; thus, changes to variable state, the current directory, etc. will expire when this new process exits. (Note, too, that the new shell may be a different interpreter supporting a different language; e.g. sh -c "foo" will not necessarily support the same syntax that bash, ksh, zsh, etc. do.)
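For example (assuming x is not already set in the calling shell), neither the cd nor the assignment survives once the child process exits:
$ cd /tmp
$ bash -c 'cd /etc; x=5'
$ pwd
/tmp
$ echo "${x:-x is unset}"
x is unset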
Using $string
...starts at step 5, "Word Splitting".
What does this mean?
Quotes are not honored.
printf '%s\n' "two words" will thus parse as printf %s\n "two words", as opposed to the usual/expected behavior of printf %s\n two words (with the quotes being consumed by the shell).
Splitting into multiple commands (on ;s, &s, or similar) does not take place.
Thus:
s='echo foo && echo bar'
$s
...will emit the following output:
foo && echo bar
...instead of the following, which would otherwise be expected:
foo
bar
Special operators and expansions are not honored.
No $(foo), no $foo, no <(foo), etc.
Redirections are not honored.
>foo or 2>&1 is just another word created by string-splitting, rather than a shell directive.
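A short demonstration of those last two points, using /tmp/out as a throwaway file for illustration:
$ s='echo hello >/tmp/out'
$ $s
hello >/tmp/out
$ eval "$s"     # eval, by contrast, honors the redirection
$ cat /tmp/out
hello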
$ bash -c "$COMMAND"
This version starts up a new bash interpreter, runs the command, and then exits, returning control to the original shell. You don't need to be running bash in the first place to do this; you can start a bash interpreter from tcsh, for example. You might also do this from a bash script to start with a fresh environment or to keep from polluting your current environment.
EDIT:
As @CharlesDuffy points out, starting a new bash shell in this way will clear shell variables, but environment variables will be inherited by the spawned shell process.
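To see the difference between the two kinds of variables when spawning the new shell (illustrative variable names):
$ shellvar=1
$ export envvar=2
$ bash -c 'echo "shellvar=$shellvar envvar=$envvar"'
shellvar= envvar=2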
Using eval causes the shell to parse your command twice. In the example you gave, executing $COMMAND directly or doing an eval are equivalent, but have a look at the answer here to get a more thorough idea of what eval is good (or bad) for.
There are at least times when they are different. Consider the following:
$ cmd="echo \$var"
$ var=hello
$ $cmd
$var
$ eval $cmd
hello
$ bash -c "$cmd"
$ var=world bash -c "$cmd"
world
which shows the different points at which variable expansion is performed. It's even more clear if we do set -x first
$ set -x
$ $cmd
+ echo '$var'
$var
$ eval $cmd
+ eval echo '$var'
++ echo hello
hello
$ bash -c "$cmd"
+ bash -c 'echo $var'
$ var=world bash -c "$cmd"
+ var=world
+ bash -c 'echo $var'
world
We can see here much of what Charles Duffy talks about in his excellent answer. For example, attempting to execute the variable directly prints the literal $var, because parameter expansion is not applied again to the text produced by expanding $cmd, so we don't get the value of var as we do with eval.
A shell started with bash -c only inherits exported variables from the parent shell, and since I didn't export var, it's not available to the new shell.
I'm confused by this behaviour:
$ FOO=42 echo "$FOO"
$ FOO=42 && echo "$FOO"
42
I'm accustomed to using VARNAME=value cmd to specify environment variables for other commands, so I was surprised to discover it doesn't work when cmd is echo. This might have something to do with echo being a Bash built-in. Is this the case? If so, is using && the best way to specify environment variables for Bash built-ins?
Update: && is no good, as it results in the environment variables being set permanently, which I'm trying to avoid.
It doesn't matter that echo is a built-in. The reason is that variables on the command line are evaluated before executing the command; they're not evaluated by the command. You could get the result you want with:
FOO=42 bash -c 'echo "$FOO"'
This starts a new shell to execute echo "$FOO". Since the argument to bash is in single quotes, the variable is not expanded by the original shell.
In your first command, the replacement happens before the command is executed:
$FOO is replaced with its current (empty) value.
echo "" is executed with FOO set to 42 in its environment.
Try:
FOO=42 sh -c 'echo "$FOO"'
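Another way to confirm that FOO really is placed in the command's environment is to ask an external program such as printenv, which reads its own environment instead of relying on the calling shell's expansion:
$ FOO=42 printenv FOO
42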
I have a parent script
while read cmd
do
    nohup ./script ${cmd[@]} &>> log &
done < ~/list
that executes this child script
while true
do
    eval "${CMD[@]}"
    #${CMD[@]}
    #./panic
done
with this list of commands
node ~/www/splash/app.js
node ~/www/splash-two/app.js
When the child script calls
eval ${CMD[@]}
it executes the way I expect running that command with no complaints but when I try to remove the eval and run the command using
${CMD[@]}
It throws the error
Error: Cannot find module '/home/rumplefraggle/SYS/RABBOT/~/www/splash/app.js'
Now I thought possibly this had something to do with the node command so I tried to execute
ls ~
as the command and it throws the error that ~ can not be found.
Echoing ${@} without running it expands as I would expect it to.
Also, manually inserting the command into the child script works as expected.
I don't understand why eval works and simply running the command using ${@} does not. What is causing ${@} to not expand the ~?
Why is node prepending the current directory to the path when ${@} is used?
Because bash expands the tilde before it expands variables, so a ~ that comes out of a variable is never tilde-expanded; node receives the literal ~/... path and resolves it relative to the current directory. You should stick with eval or use ${HOME} in your commands.
The expansion order is like this: brace expansion, tilde expansion, parameter, variable, and arithmetic expansion and command substitution (done in a left-to-right fashion), word splitting, and filename expansion
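A minimal illustration of that ordering, independent of node (using the home directory shown in the question's error message for the expanded result):
$ d='~/www'
$ echo $d          # tilde expansion has already happened, so the ~ stays literal
~/www
$ eval echo $d     # eval re-parses the expanded text, so the tilde is expanded
/home/rumplefraggle/www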
I am trying to invoke a shell command with a modified environment via the command env.
According to the manual
env HELLO='Hello World' echo $HELLO
should echo Hello World, but it doesn't.
If I do
HELLO='Hello World' bash -c 'echo $HELLO'
it prints Hello World as expected (thanks to this answer for this info).
What am I missing here?
It's because in your first case, your current shell expands the $HELLO variable before running the commands. And there's no HELLO variable set in your current shell.
env HELLO='Hello World' echo $HELLO
will do this:
expand any variables given, in this case $HELLO
run env with the arguments 'HELLO=Hello World' and 'echo' (the unquoted $HELLO expands to nothing and disappears entirely, since there's no HELLO variable set in the current shell)
The env command will run and set the HELLO='Hello World' in its environment
env will run echo with no arguments
As you see, the current shell expanded the $HELLO variable, which isn't set.
HELLO='Hello World' bash -c 'echo $HELLO'
will do this:
set the variable HELLO='Hello World' for the following command
run bash with the 2 arguments '-c' and 'echo $HELLO'
since the last argument is enclosed in single quotes, nothing inside it is expanded
the new bash in turn will run the command echo $HELLO
To run echo $HELLO in the new bash sub-shell, bash first expands anything it can, $HELLO in this case, and the parent shell set that to Hello World for us.
The subshell runs echo 'Hello World'
If you tried to do e.g. this:
env HELLO='Hello World' echo '$HELLO'
The current shell would expand anything it can, which is nothing since $HELLO is enclosed in single quotes
run env with the 3 arguments 'HELLO=Hello World', 'echo' and '$HELLO'
The env command will run and set the HELLO='Hello World' in its environment
env will run echo with the argument '$HELLO'
In this case, there's no shell that will expand the $HELLO, so echo receives the string $HELLO and prints out that. Variable expansion is done by shells only.
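To confirm that env really does put HELLO into the command's environment, run a program that prints its own environment instead of relying on the calling shell, for example printenv:
$ env HELLO='Hello World' printenv HELLO
Hello World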
I think what happens is similar to this situation in which I was also puzzled.
In a nutshell, the variable expansion in the first case is done by the current shell which doesn't have $HELLO in its environment. In the second case, though, single quotes prevent the current shell from doing the variable expansion, so everything works as expected.
Note how changing single quotes to double quotes prevents this command from working the way you want:
HELLO='Hello World' bash -c "echo $HELLO"
Now this will be failing for the same reason as the first command in your question.
This works and is good for me
$ MY_VAR='Hello' ANOTHER_VAR='World!!!' && echo "$MY_VAR $ANOTHER_VAR"
Hello World!!!
Here is an easier way to confirm the shell is working as expected.
env A=42 env
env
The first command sets A to 42 and runs env. The second command also runs env. Compare the output of both.
As nos noted, expansion of your variable passed to echo occurs before that variable gets set.
One alternative not yet mentioned in the earlier answers is to use:
$ a=abc eval 'echo $a'
abc
$ echo $a
<blank>
Note that you have to use the a=abc cmd syntax and not the env a=abc cmd syntax; env runs its command as an external program, and eval is a shell builtin, so there is usually no eval executable for env to find.