Bash: Echoing an echo command with a variable in bash

Ok, here is one I am struggling with as we speak: echoing an echo command with a variable.
echo "creating new script file."
echo "#!/bin/bash" > $servsfile
echo "read -p "Please enter a service: " ser " >> $servfile
echo "servicetest=`getsebool -a | grep ${ser}` " >> $servfile
echo "if [ $servicetest > /dev/null ];then " >> $servfile
echo "echo "we are now going to work with ${ser}" " >> $servfile
echo "else" >> $servfile
echo "exit 1" >> $servfile
echo "fi" >> $servfile
My goal is to create a script using echo commands and then run it later. I just need to figure out how to echo echo/read commands while maintaining my variables.
Edit: the variables need to transfer what's inside of them into the new file.

The immediate problem you have is with quoting: by using double quotes ("..."), your variable references are instantly expanded, which is probably not what you want.
Use single quotes instead - strings inside single quotes are not expanded or interpreted in any way by the shell.
(If you want selective expansion inside a string - i.e., expand some variable references, but not others - do use double quotes, but prefix the $ of references you do not want expanded with \; e.g., \$var).
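For instance (a tiny illustration; the variable name user is made up):
user="alice"
echo "expanded now: $user, kept literal for later: \$user"
# prints: expanded now: alice, kept literal for later: $user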
However, you're better off using a single here-doc[ument], which allows you to create multi-line stdin input on the spot, bracketed by two instances of a self-chosen delimiter, the opening one prefixed by <<, and the closing one on a line by itself - starting at the very first column; search for Here Documents in man bash or at http://www.gnu.org/software/bash/manual/html_node/Redirections.html.
If you quote the here-doc delimiter (EOF in the code below), variable references are also not expanded. As @chepner points out, you're free to choose the method of quoting in this case: enclose the delimiter in single quotes or double quotes, or even simply arbitrarily escape one character in the delimiter with \:
echo "creating new script file."
cat <<'EOF' > "$servfile"
#!/bin/bash
read -p "Please enter a service: " ser
servicetest=`getsebool -a | grep ${ser}`
if [ $servicetest > /dev/null ]; then
echo "we are now going to work with ${ser}"
else
exit 1
fi
EOF
As @BruceK notes, you can prefix your here-doc delimiter with - (applied to this example: <<-"EOF") in order to have leading tabs stripped, allowing for indentation that makes the actual content of the here-doc easier to discern.
Note, however, that this only works with actual tab characters, not leading spaces.
Employing this technique combined with the afterthoughts regarding the script's content below, we get the following (if you do indent the here-doc body, remember that actual tab characters must be used to lead each content line for them to get stripped):
cat <<-'EOF' > "$servfile"
#!/bin/bash
read -p "Please enter a service name: " ser
if [[ -n $(getsebool -a | grep "${ser}") ]]; then
echo "We are now going to work with ${ser}."
else
exit 1
fi
EOF
Finally, note that in bash even normal single- or double-quoted strings can span multiple lines, but you won't get the benefits of tab-stripping or line-block scoping, as everything inside the quotes becomes part of the string.
Thus, note how in the following #!/bin/bash has to follow the opening ' immediately in order to become the first line of output:
echo '#!/bin/bash
read -p "Please enter a service: " ser
servicetest=$(getsebool -a | grep "${ser}")
if [[ -n $servicetest ]]; then
echo "we are now going to work with ${ser}"
else
exit 1
fi' > "$servfile"
Afterthoughts regarding the contents of your script:
The syntax $(...) is preferred over `...` for command substitution nowadays.
You should double-quote ${ser} in the grep command, as the command will likely break if the value contains embedded spaces (alternatively, make sure that the value read contains no spaces or other shell metacharacters).
Use [[ -n $servicetest ]] to test whether $servicetest is nonempty (or perform the command substitution directly inside the conditional). [[ ... ]] - the preferred form in bash - protects you from breaking the conditional if $servicetest happens to contain embedded spaces. There is never a need to suppress stdout output inside a conditional (whether [ ... ] or [[ ... ]]), as no stdout output is passed through; thus, the > /dev/null is redundant (that said, with a command substitution inside a conditional, stderr output IS passed through). A small sketch of these points follows.
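To make those afterthoughts concrete, here is a small sketch (the value of servicetest is hypothetical):
servicetest="httpd_can_network_connect --> on"   # hypothetical value containing spaces
[ $servicetest > /dev/null ] && echo yes    # errors out: the unquoted expansion is word-split, and > /dev/null is just a redirection
[[ -n $servicetest ]] && echo yes           # prints yes: no word splitting inside [[ ... ]], no redirection needed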

You just need to use single quotes:
$ echo "$TEST"
test
$ echo '$TEST'
$TEST
Inside single quotes special characters are not special any more, they are just normal characters.

echo "echo "we are now going to work with ${ser}" " >> $servfile
Escape all " within quotes with \. Do this with variables like \$servicetest too:
echo "echo \"we are now going to work with \${ser}\" " >> $servfile
echo "read -p \"Please enter a service: \" ser " >> $servfile
echo "if [ \$servicetest > /dev/null ];then " >> $servfile

Related

Bash script loop not incrementing

I have a script that inputs 2 -> file#.txt -> newline into my program. However, the script only inputs file4.txt into my program, even though the loop is running 10 times. I'm not sure why this is happening specifically for i=4.
gnome-terminal --working-directory=/path/to/dir/ -- bash -c "{ for i in {1..10};
do echo "2"; echo "file"$i".txt"; echo $'\n'; done; } | ./program; exec bash"
The $ characters are being processed by your original shell, not the shell executed inside gnome-terminal. Also, the embedded double quotes delimit the -c argument rather than being passed through to the inner bash.
You need to escape the $ and " characters to preserve them.
gnome-terminal --working-directory=/path/to/dir/ -- bash -c "{ for i in {1..10};
do echo \"2\"; echo \"file\"\$i".txt\"; echo \$'\n'; done; } | ./program; exec bash"
You could also put the command in single quotes, but then you won't be able to embed $'\n' in it, because you can't escape a single quote inside single quotes. But you can use printf instead of echo, since it will translate escape sequences itself.
gnome-terminal --working-directory=/path/to/dir/ -- bash -c '{ for i in {1..10};
do echo "2"; printf "file%d.txt\n\n" $i; done; } | ./program; exec bash'
See Difference between single and double quotes in Bash

Strange issue resolving bash environmental variable in nested double quotes

I have a setup script that needs to be run remotely on an arbitrary machine (which can be Windows). So I had something along the lines of bash -c "do things that need environmental variables".
I found some strange things happening with nested quotes + environmental variables that I don't understand (demonstrated below).
# This worked because my environment was polluted.
bash -c "NAME=me echo $NAME"
> me
# I think this was a weird cross platform issue with how I was running.
# I couldn't reproduce it locally.
bash -c "NAME=me echo "Hi $NAME""
> Hi $NAME
# This was my workaround, and I have no clue why this works.
# I get that "Start "" end" does string concatenation in bash,
# but I have no clue why that would make this print 'Hi me' instead
# of 'Hi'.
#
# This works because echo Hi name prints "Hi name". I thought echo only
# took the first argument passed in.
bash -c "NAME=me echo Hi "" $NAME"
> Hi me
# This is the same as the first case. NAME was just empty this time.
bash -c "NAME=me echo Hi $NAME"
> Hi
Edit: A bunch of people have pointed out that the variables get expanded in double quotes before bash -c gets run. This makes sense, but I feel like it doesn't explain why case 1 works.
shouldn't bash -c "NAME=me echo $NAME" be expanded to bash -c "NAME=me echo ", since NAME isn't set before we run this?
Edit 2: A bunch of this stuff worked because my environment was polluted. I've tried to describe what mistakes I made in my assumptions
There are at least three sources of confusion here: quotes don't (generally) nest, $variable references are expanded by the shell even if they're in double-quotes, and variable references are resolved before var=value assignments are done.
Let me look at the second problem first. Here's an interactive example showing the effect:
$ NAME=Gordon
$ bash -c "NAME=me echo $NAME"
Gordon
Here, the outer (interactive) shell expanded $NAME before passing it to bash -c, so the command essentially became bash -c "NAME=me echo Gordon". There are several ways to avoid this: you can escape the $ to remove its normal effect (but the escape gets removed, so the inner shell will see it and apply it normally), or use single-quotes instead of double (which remove the special effect of all characters, except for another single-quote which ends the single-quoted string). So let's try those:
$ bash -c "NAME=me echo \$NAME"
$ bash -c 'NAME=me echo $NAME'
(You can't really see it, but there's a blank line after the second command as well, because it didn't print anything either.) What happened here is that the inner shell (the one created by bash -c) indeed got the command NAME=me echo $NAME, but when executing it expands $NAME first (giving nothing, because it's not defined in that shell), and then executes NAME=me echo which runs the echo command with NAME set to "me" in its environment. Let's try that interactively:
$ NAME=me echo $NAME
Gordon
(Remember that I set NAME=Gordon in my interactive shell earlier.) To get the intended effect, you'd need to set NAME and then as a separate command use it in an echo command:
$ bash -c "NAME=me; echo \$NAME"
me
$ bash -c 'NAME=me; echo $NAME'
me
Ok, with that out of the way let's move on to the original question about quoting. As I said, quotes don't (generally) nest. To understand what's going on, let's analyze some of the example commands. You can get a better idea how the shell interprets things by using set -x, which makes the shell print each command's equivalent just before it's executed:
$ set -x
$ bash -c "NAME=me echo "Hi $NAME""
+ bash -c 'NAME=me echo Hi' Gordon
Hi
What happened here is that the shell parsed "NAME=me echo "Hi as a double-quoted string immediately followed by two unquoted characters; since there's no gap between them, they get merged into a single argument to bash -c. It may seem a little weird having only part of an argument quoted, but it's actually entirely normal in shell syntax. It's even normal to have part of a single argument be unquoted, part single-quoted, part double-quoted, and even part in ANSI-C mode ($'ANSI-c-escaped stuff goes here').
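For instance, here is a quick demonstration that adjacent pieces using different quoting styles still form a single argument (printf just makes the argument boundaries visible):
$ printf '<%s>\n' part1' part2'" part3 "$'part4\npart5'
<part1 part2 part3 part4
part5>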
With set -x, bash will print something equivalent to the command being executed. All of these commands are equivalent in shell syntax:
bash -c "NAME=me echo "Hi Gordon
bash -c "NAME=me echo Hi" Gordon
bash -c 'NAME=me echo Hi' Gordon
bash -c NAME=me\ echo\ Hi Gordon
bash -c NAME=me' 'echo' 'Hi Gordon
bash -c 'NAME=me'\ "echo Hi" Gordon
...and lots more. With set -x, bash will print one of these equivalents, and it just happens to choose the one with single-quotes around the entire argument.
Just for completeness, what happened to $NAME""? It's treated as an unquoted variable reference (which expands to Gordon) immediately followed by a zero-length double-quoted string, which doesn't do anything at all.
But... why does that just print "Hi"? Well, bash -c treats the next argument as a command to run, and any further arguments as the argument vector ($0, $1, etc) for that command's environment. Here's an illustration:
$ bash -c 'echo "Args: $0 $1 $2"' zeroth first second third
+ bash -c 'echo "Args: $0 $1 $2"' zeroth first second third
Args: zeroth first second
("third" doesn't get printed because the command doesn't print $3.)
Thus, when you run bash -c 'NAME=me echo Hi' Gordon, it executes NAME=me echo Hi with $0 set to "Gordon".
Ok, here's the last example I'll look at:
$ bash -c "NAME=me echo Hi "" $NAME"
+ bash -c 'NAME=me echo Hi Gordon'
Hi Gordon
What's happening here is that there's a double-quoted section "NAME=me echo Hi " immediately followed by another one, " $NAME", so they get merged into a single long argument (which happens to contain two spaces in a row -- one part of the first quoted section, one part of the second). Essentially, the "" in the middle ends one double-quotes section and immediately starts another, thus having no overall effect. And again, the shell decided to print a single-quoted equivalent rather than any of the various other possible equivalents.
So how do we actually get this to work right? Here's what I'd actually recommend:
$ bash -c 'NAME=me; echo "Hi $NAME"'
+ bash -c 'NAME=me; echo "Hi $NAME"'
Hi me
Since the entire command string is in single-quotes, none of these problems occur. The double-quotes are just normal characters being passed as part of the argument (so double-quotes sort of nest inside single-quotes -- and vice versa -- but it's really just 'cause they're ignored), and the $ doesn't get its special meaning to the outer shell either. Oh, and the ; makes this two separate commands, so the NAME=me part can take effect before the echo "Hi $NAME" part uses it.
Another equivalent would be:
$ bash -c "NAME=me; echo \"Hi \$NAME\""
+ bash -c 'NAME=me; echo "Hi $NAME"'
Hi me
Here the escapes remove the special meanings of the $ and enclosed double-quotes. Note that the shell prints exactly the same thing as last time for its set -x output, indicating that this really is equivalent to the single-quoted version.

How can I write and append using echo command to a file

I am trying to write a script which will use echo and write/append to a file.
But I have " " in syntax already in strings .. say ..
echo "I am "Finding" difficult to write this to file" > file.txt
echo "I can "write" without double quotes" >> file.txt
Can anyone please help me understand this? Really appreciated.
BR,
SM
If you want to have quotes, then you must escape them using the backslash character.
echo "I am \"Finding\" difficult to write this to file" > file.txt echo
echo "I can \"write\" without double quotes" >> file.txt
The same holds true if you also want to write the \ character itself, as it can otherwise cause side effects; in that case you have to use \\.
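For example (a minimal illustration, appending to the same file.txt):
echo "A literal backslash \\ and a literal quote \" in one line" >> file.txt
# appends: A literal backslash \ and a literal quote " in one line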
Another option would be to use single quotes (') instead of double quotes.
echo 'I am "Finding" difficult to write this to file' > file.txt echo
echo 'I can "write" without double quotes' >> file.txt
However, in this case variable substitution doesn't work, so if you want to use variables you have to put them outside the single quotes.
echo "This is a test to write $PATH in my file" >> file.txt
echo 'This is a test to write '"$PATH"' in my file' >> file.txt
If you have special characters, you can escape them with a backslash to use them as needed:
echo "I am \"Finding\" difficult to write this to file" > file.txt
echo "I can \"write\" without double quotes" >> file.txt
However, you can also use the shell's here-document ("EOF") feature with the tee command, which is really nice for writing all sorts of things:
tee -a file.txt <<EOF
I am "Finding" difficult to write this to file
I can "write" without double quotes
EOF
That will write virtually ANY content you want directly to that file, with no need to escape special characters, until you get to the EOF line. (With an unquoted delimiter like this, $variables and command substitutions are still expanded; quote the delimiter to write them literally, as shown below.)
Edited to add the -a (append) switch, to prevent overwriting the file.
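If the content also contains $variables or command substitutions that you want written literally, quote the here-doc delimiter (a small sketch):
tee -a file.txt <<'EOF'
This $HOME and $(date) are written literally because the delimiter is quoted.
EOF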

how to restrain bash from removing blanks when processing file

A simple yet annoying thing:
Using a script like this:
while read x; do
echo "$x"
done<file
on a file containing a line that starts with whitespace (a tab, say):
	text
will give me an output without the whitespace:
text
The problem is I need this space before text (it's one tab mostly, but not always).
So the question is: how to obtain identical lines as are in input file in such a script?
Update: Ok, so I changed my while read x to while IFS= read x.
echo "$x" gives me correct answer without stripping first tab, but, eval "echo $x" strips this tab.
What should I do then?
read is stripping the whitespace. Wipe $IFS first.
while IFS= read x
do
echo "$x"
done < file
The entire contents of the line read are put into a variable called REPLY. If you call read without naming a variable and use REPLY instead of 'x', you won't have to worry about read's word splitting and IFS and all that.
I ran into the same trouble you are having when attempting to strip spaces off the end of filenames. REPLY came to the rescue:
find . -name '* ' -depth -print | while read; do mv -v "${REPLY}" "`echo "${REPLY}" | sed -e 's/ *$//'`"; done
I found the solution to the problem 'eval "echo $x" strips this tab.' This should fix it:
eval "echo \"$x\""
I think this causes the inner (escaped) quotes to be evaluated with the echo, whereas I think that both
eval "echo $x"
and
eval echo "$x"
cause the quotes to be evaluated before the echo, which means that the string passed to echo has no quotes, causing the white space to be lost. So the complete answer is:
while IFS= read x
do
eval "echo \"$x\""
done < file

How to keep quotes in Bash arguments? [duplicate]

I have a Bash script where I want to keep quotes in the arguments passed.
Example:
./test.sh this is "some test"
then I want to use those arguments, and re-use them, including quotes and quotes around the whole argument list.
I tried using \"$#\", but that removes the quotes inside the list.
How do I accomplish this?
using "$#" will substitute the arguments as a list, without re-splitting them on whitespace (they were split once when the shell script was invoked), which is generally exactly what you want if you just want to re-pass the arguments to another program.
Note that this is a special form and is only recognized as such if it appears exactly this way. If you add anything else in the quotes the result will get combined into a single argument.
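For example, a minimal forwarding wrapper might look like this (just a sketch; some_program is a placeholder for whatever you are wrapping):
#!/bin/bash
# Pass every argument through unchanged; "some test" stays one argument.
exec some_program "$@"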
What are you trying to do and in what way is it not working?
There are two safe ways to do this:
1. Shell parameter expansion: ${variable@Q}:
When expanding a variable via ${variable@Q}:
The expansion is a string that is the value of parameter quoted in a format that can be reused as input.
Example:
$ expand-q() { for i; do echo ${i@Q}; done; } # Same as for `i in "$@"`...
$ expand-q word "two words" 'new
> line' "single'quote" 'double"quote'
word
'two words'
$'new\nline'
'single'\''quote'
'double"quote'
2. printf %q "$quote-me"
printf supports quoting internally. The manual's entry for printf says:
%q Causes printf to output the corresponding argument in a format that can be reused as shell input.
Example:
$ cat test.sh
#!/bin/bash
printf "%q\n" "$#"
$
$ ./test.sh this is "some test" 'new
>line' "single'quote" 'double"quote'
this
is
some\ test
$'new\nline'
single\'quote
double\"quote
$
Note the 2nd way is a bit cleaner if displaying the quoted text to a human.
Related: For bash, POSIX sh and zsh: Quote string with single quotes rather than backslashes
Yuku's answer only works if you're the only user of your script, while Dennis Williamson's is great if you're mainly interested in printing the strings, and expect them to have no quotes-in-quotes.
Here's a version that can be used if you want to pass all arguments as one big quoted-string argument to the -c parameter of bash or su:
#!/bin/bash
C=''
for i in "$#"; do
i="${i//\\/\\\\}"
C="$C \"${i//\"/\\\"}\""
done
bash -c "$C"
So, all the arguments get a quote around them (harmless if it wasn't there before, for this purpose), but we also escape any escapes and then escape any quotes that were already in an argument (the syntax ${var//from/to} does global substring substitution).
You could of course only quote stuff which already had whitespace in it, but it won't matter here. One utility of a script like this is to be able to have a certain predefined set of environment variables (or, with su, to run stuff as a certain user, without that mess of double-quoting everything).
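A quick illustration of that substitution syntax (the value of s is arbitrary):
s='say "hi" to c:\temp'
echo "${s//\\/\\\\}"   # backslashes doubled:  say "hi" to c:\\temp
echo "${s//\"/\\\"}"   # quotes escaped:       say \"hi\" to c:\temp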
Update: I recently had reason to do this in a POSIX way with minimal forking, which led to this script (the last printf there outputs the command line used to invoke the script, which you should be able to copy-paste in order to invoke it with equivalent arguments):
#!/bin/sh
C=''
for i in "$#"; do
case "$i" in
*\'*)
i=`printf "%s" "$i" | sed "s/'/'\"'\"'/g"`
;;
*) : ;;
esac
C="$C '$i'"
done
printf "$0%s\n" "$C"
I switched to '' since shells also interpret things like $ and !! in ""-quotes.
If it's safe to make the assumption that an argument that contains white space must have been (and should be) quoted, then you can add them like this:
#!/bin/bash
whitespace="[[:space:]]"
for i in "$#"
do
if [[ $i =~ $whitespace ]]
then
i=\"$i\"
fi
echo "$i"
done
Here is a sample run:
$ ./argtest abc def "ghi jkl" $'mno\tpqr' $'stu\nvwx'
abc
def
"ghi jkl"
"mno pqr"
"stu
vwx"
You can also insert literal tabs and newlines using Ctrl-V Tab and Ctrl-V Ctrl-J within double or single quotes instead of using escapes within $'...'.
A note on inserting characters in Bash: If you're using Vi key bindings (set -o vi) in Bash (Emacs is the default - set -o emacs), you'll need to be in insert mode in order to insert characters. In Emacs mode, you're always in insert mode.
I needed this for forwarding all arguments to another interpreter.
What ended up right for me is:
bash -c "$(printf ' %q' "$#")"
Example (when named as forward.sh):
$ ./forward.sh echo "3 4"
3 4
$ ./forward.sh bash -c "bash -c 'echo 3'"
3
(Of course the actual script I use is more complex, involving in my case nohup and redirections etc., but this is the key part.)
Like Tom Hale said, one way to do this is with printf using %q to quote-escape.
For example:
send_all_args.sh
#!/bin/bash
if [ "$#" -lt 1 ]; then
quoted_args=""
else
quoted_args="$(printf " %q" "${#}")"
fi
bash -c "$( dirname "${BASH_SOURCE[0]}" )/receiver.sh${quoted_args}"
send_fewer_args.sh
#!/bin/bash
if [ "$#" -lt 2 ]; then
quoted_last_args=""
else
quoted_last_args="$(printf " %q" "${#:2}")"
fi
bash -c "$( dirname "${BASH_SOURCE[0]}" )/receiver.sh${quoted_last_args}"
receiver.sh
#!/bin/bash
for arg in "$#"; do
echo "$arg"
done
Example usage:
$ ./send_all_args.sh
$ ./send_all_args.sh a b
a
b
$ ./send_all_args.sh "a' b" 'c "e '
a' b
c "e
$ ./send_fewer_args.sh
$ ./send_fewer_args.sh a
$ ./send_fewer_args.sh a b
b
$ ./send_fewer_args.sh "a' b" 'c "e '
c "e
$ ./send_fewer_args.sh "a' b" 'c "e ' 'f " g'
c "e
f " g
Just use:
"${#}"
For example:
# cat t2.sh
for I in "${#}"
do
echo "Param: $I"
done
# cat t1.sh
./t2.sh "${#}"
# ./t1.sh "This is a test" "This is another line" a b "and also c"
Param: This is a test
Param: This is another line
Param: a
Param: b
Param: and also c
Changed unhammer's example to use array.
printargs() { printf "'%s' " "$@"; echo; }; # http://superuser.com/a/361133/126847
C=()
for i in "$#"; do
C+=("$i") # Need quotes here to append as a single array element.
done
printargs "${C[#]}" # Pass array to a program as a list of arguments.
My problem was similar and I used mixed ideas posted here.
We have a server with a PHP script that sends e-mails. And then we have a second server that connects to the 1st server via SSH and executes it.
The script name is the same on both servers and both are actually executed via a bash script.
On server 1 (local) bash script we have just:
/usr/bin/php /usr/local/myscript/myscript.php "$@"
This resides on /usr/local/bin/myscript and is called by the remote server. It works fine even for arguments with spaces.
But then at the remote server we can't use the same logic because the 1st server will not receive the quotes from "$@". I used the ideas from JohnMudd and Dennis Williamson to recreate the options and parameters array with the quotations. I like the idea of adding escaped quotations only when the item has spaces in it.
So the remote script runs with:
CSMOPTS=()
whitespace="[[:space:]]"
for i in "$#"
do
if [[ $i =~ $whitespace ]]
then
CSMOPTS+=(\"$i\")
else
CSMOPTS+=($i)
fi
done
/usr/bin/ssh "$USER@$SERVER" "/usr/local/bin/myscript ${CSMOPTS[@]}"
Note that I use "${CSMOPTS[@]}" to pass the options array to the remote server.
Thanks to everyone who posted here! It really helped me! :)
Quotes are interpreted by bash and are not stored in command line arguments or variable values.
If you want to use quoted arguments, you have to quote them each time you use them:
val="$3"
echo "Hello World" > "$val"
As Gary S. Weaver showed in his source code tips, the trick is to call bash with the parameter '-c' and then quote the next argument.
e.g.
bash -c "<your program> <parameters>"
or
docker exec -it <my docker> bash -c "$SCRIPT $quoted_args"
If you need to pass all arguments to bash from another programming language (for example, if you'd want to execute bash -c or emit_bash_code | bash), use this:
Escape all single quote characters you have with '\''.
Then, surround the result with single quotes.
The argument abc'def will thus be converted to 'abc'\''def'. The characters '\'' are interpreted as follows: the already existing quoting is terminated with the first single quote, then the escaped single quote \' follows, then the new quoting starts.
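A small sketch of that conversion in shell code (sed is used here, similar to the POSIX script further up, but producing the '\'' form described above):
arg="abc'def"
quoted="'$(printf "%s" "$arg" | sed "s/'/'\\\\''/g")'"
printf "%s\n" "$quoted"    # prints: 'abc'\''def'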
Yes, it seems that it is not possible to preserve the quotes, but for the issue I was dealing with it wasn't necessary.
I have a bash function that searches down a folder recursively and greps for a string; the problem is passing a string that has spaces, such as "find this string". Passing this to the bash script, it takes the bare argument $n and passes it to grep, which leads grep to believe these are separate arguments. The way I solved this was by using the fact that when you quote the argument in the call, bash groups the quoted items into a single argument. I then just needed to decorate that argument with quotes and pass it to the grep command.
If you know which argument you are receiving in bash needs quotes for its next step, you can simply decorate it with quotes; see the sketch below.
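For example, a hypothetical version of such a function (the name findstr and the use of the current directory are made up for this sketch):
findstr() {
    pattern="$1"               # "find this string" arrives here as a single argument
    grep -r -- "$pattern" .    # quoting it again keeps it a single argument to grep
}
findstr "find this string"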
Just use single quotes around the string with the double quotes:
./test.sh this is '"some test"'
This way, the double quotes inside the single quotes are also interpreted as part of the string.
But I would recommend putting the whole string between single quotes:
./test.sh 'this is "some test" '
In order to understand what the shell is doing, or rather how it interprets arguments in scripts, you can write a little script like this:
#!/bin/bash
echo $@
echo "$@"
Then you'll see, and can test, what's going on when calling the script with different strings. A variant that makes the argument grouping visible follows below.
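As a follow-up sketch (not part of the original answer), here is a variant; say it is saved as t3.sh:
#!/bin/bash
printf '<%s> ' $@  ; echo     # unquoted: the arguments are re-split on whitespace
printf '<%s> ' "$@"; echo     # quoted: the original argument grouping is preserved
Calling ./t3.sh "some test" x prints <some> <test> <x> on the first line and <some test> <x> on the second.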
