Define variable in single line command in Bash [duplicate]

I'm trying to execute a list of commands through the command:
bash -l -c "commands"
However, when I define a variable and then try to use it, the variable turns out to be undefined (empty). To be clear:
bash -l -c "var=MyVariable; echo $var"

Bash expansion (as explained here) replaces the variable inside the double quotes with its value in the calling shell, before the commands are actually executed. To avoid this, you can follow either of these options:
Option 1:
Avoid the expansion using single quotes
bash -l -c 'var=MyVariable; echo $var'
Option 2:
Avoid the expansion inside the double quotes by escaping the desired variables
bash -l -c "var=MyVariable; echo \$var"
The second option lets you expand some variables while leaving others unexpanded. For example:
expandVar=MyVariable1
bash -l -c "var=MyVariable; echo $expandVar; echo \$var"

Bash expands variables inside double quotes, so in effect $var in your command is replaced by the current (empty) value of var before the command is even executed. What you want can be accomplished by using single quotes:
bash -l -c 'var=MyVariable; echo $var'
Please note that it is rather unusual to invoke Bash as a login shell (-l) when passing a command string with -c, but then you may have your reasons.

Related

How to execute a command that is the arguments to the script? [duplicate]

I want a bash script, call it args, to execute a command that is the arguments to the script.
In particular, I would like this command (note multiple blanks):
$ ./args echo 'foobar   *0x0'
to execute this precise command:
echo 'foobar   *0x0'
I tried this in args:
#!/bin/bash
set -x
$*
but it doesn't work:
./args echo 'foobar   *0x0'
+ echo foobar '*0x0'
foobar *0x0
Witness the single space, as well as moved single quotes.
With $@, the result is exactly the same, so please don't close the question on account of the differences between $* and $@. Also, blanks are not my only problem; there is also the *0x0.
#!/bin/bash
"$#"
This expands to all of the command-line arguments with spacing and quoting intact. The unquoted $* (or $@), by contrast, is subject to unwanted word splitting and globbing, which is exactly what collapsed the blanks in your trace.
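For illustration, here is a sketch of the fixed script, keeping the set -x from the original attempt:
#!/bin/bash
set -x
"$@"
A run should then look roughly like this (the exact xtrace quoting may vary between bash versions):
$ ./args echo 'foobar   *0x0'
+ echo 'foobar   *0x0'
foobar   *0x0
The blanks and the *0x0 survive because "$@" hands each original argument to echo as a single, unsplit word.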

Preserve double quotes in a bash string when run with "bash -c"

I have a bash variable which is stored as
var="\"abc\""
So, when I print this variable locally, it gives me the proper output (with double quotes):
echo "$var"
"abc"
What I want to do is print it using the 'bash -c' option, but when I do the same, it prints only the value, without the double quotes:
bash -c "echo \"$var\""
abc
bash -c "echo $var"
abc
Can anyone help me with how to preserve the double quotes in my string when I use it with 'bash -c'? And what does -c actually mean?
In the statement
bash -c "echo $var"
the following happens:
(1) var is expanded (resulting in "abc", including the quotes).
(2) A bash child process is invoked, receiving -c as its first parameter and echo "abc" as its second parameter.
(3) The child process runs the command, i.e. echo "abc". According to the rules of quote removal, this is equivalent to echo abc, so you don't see any quotes in the output.
You may be tempted to do a
bash -c 'echo "$var"'
instead. In this case, the following happens:
(1) A bash child process is invoked, receiving -c as its first parameter and echo "$var" as its second parameter. Note that $var is not expanded yet, because it is between single quotes.
(2) This child process runs the command, i.e. echo "$var". However, since we are in a child process, the variable var does not exist, and the command is equivalent to echo "", i.e. only a newline is printed.
You can, however, combine both solutions:
export var
bash -c 'echo "$var"'
In this case, var is made available to the child processes of your script, and you will see the quotes being printed.
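Put together, a minimal sketch using the variable from the question:
var="\"abc\""
export var
bash -c 'echo "$var"'
This should print the value with its quotes intact:
"abc"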
From info bash
-c If the -c option is present, then commands are read from the first non-option argument command_string.
If there are arguments after the command_string, the first argument is assigned to $0
and any remaining arguments are assigned to the positional parameters.
So the -c option executes the commands within the quotes.
To preserve your double quotes while calling via bash -c, you would need to quote the variable separately:
$ bash -c "echo '"$var"'"
"abc"

Seeing shell commands as they are being executed including variables [duplicate]

I can do
bash -x mysellh.sh
to see the commands as they are being executed, but I don't see the variables replaced with their values. For instance, if I have in the shell script:
screen -d -m sh -c 'RAILS_ENV="$R_ENV" bundle exec rake sunspot:solr:reindex[500] > "$PROJECT_DIR"/solr/indexing_out.txt'
I will see:
+ screen -d -m sh -c 'RAILS_ENV="$R_ENV" bundle exec rake sunspot:solr:reindex[500] > "$PROJECT_DIR"/solr/indexing_out.txt'
Even though I've already declared earlier:
R_ENV=test
Is there an option to do what I want?
In this case, -x is doing exactly what you asked: the command-line "word" that $R_ENV sits in is the part you enclosed in single quotes, and single quotes inhibit expansion, so the -x output is showing you precisely that non-expansion. If you want those variables to be expanded, enclose that first "word" in double quotes and use single quotes inside it, like this:
screen -d -m sh -c "RAILS_ENV='$R_ENV' bundle exec rake sunspot:solr:reindex[500] > '$PROJECT_DIR'/solr/indexing_out.txt"
Then the single quotes are around the expanded text and the double quotes allow the variables to expand.
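A minimal sketch of the difference, using a plain sh -c echo in place of the full screen/rake command (the trace lines are roughly what bash -x prints; exact quoting can vary):
$ R_ENV=test; set -x
$ sh -c 'echo RAILS_ENV=$R_ENV'
+ sh -c 'echo RAILS_ENV=$R_ENV'
RAILS_ENV=
$ sh -c "echo RAILS_ENV=$R_ENV"
+ sh -c 'echo RAILS_ENV=test'
RAILS_ENV=test
With single quotes the inner sh receives a literal $R_ENV (and, since the variable is not exported here, expands it to nothing); with double quotes the outer shell substitutes the value first, and that expanded value is what the -x trace shows.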

Assign a Variable in bash login shell

I am trying to do this from a Windows command prompt.
C:\cygwin64\bin\bash --login -c "$var="<hallo>" &&
echo "$var""
and I get the error:
The system cannot find the file specified.
but this works:
C:\cygwin64\bin\bash --login -c "var="hello" && echo "$hello""
The login shell seems to cause the problem when it gets a '<'. How can I still assign the string with angle brackets to the shell variable?
When you write
C:\cygwin64\bin\bash --login -c "$var="<hallo>" && echo "$var""
You are expecting the shell to strip off the outer quotes from that argument to -c and end up with a string that looks like
$var="<hallo>" && echo "$var"
but that's not what the shell does.
The shell just matches quotes as it goes along. So the shell sees:
["$var="][<hallo>][" && echo "][$var][""].
You need to escape the inner quotes from the current shell or use different quotes to avoid this parsing problem.
C:\cygwin64\bin\bash --login -c 'var="<hallo>" && echo "$var"'
Note also that I removed the $ from the start of the variable name in the assignment and that I used single quotes on the outside so that the current shell didn't expand $var.
With double quotes on the outside you'd need to use something like this instead.
C:\cygwin64\bin\bash --login -c "var='<hallo>' && echo \"\$var\""
For a similar discussion of shell parsing and how things nest (or don't) with backticks you can see my answer here.

"bash -c" doesn't export vars from sourced scripts

I have an inclusion file test.inc:
export XXX=xxx
I use it when call bash to interpret a string:
bash -c ". test.inc; echo $XXX"
But the variable is not set at the point of the echo command. If I run 'export' I can see it, though:
bash -c ". test.inc; export"
Shows
declare -x XXX="xxx"
How do I make my first command see the exported variables from sourced files when I use bash -c syntax?
You are using double quotes. Therefore your current shell expands $XXX long before the bash -c instance sees it. Switch to single quotes, or escape the dollar sign.
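Concretely, either of the two fixes should make the echo see the variable; both print xxx:
bash -c '. test.inc; echo $XXX'
bash -c ". test.inc; echo \$XXX"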
