What's the difference between grep \\$ and grep \\\$? - shell

When I type grep \\\$, the shell escapes both the \ and the $, turns the argument into \$, and passes it to grep, which then finds all the lines containing a dollar sign $. That's fine!
When I type grep \\$ the result is the same, and I don't really understand why. The first backslash should escape the second one, but then the $ is unescaped and the shell should replace it with an empty string. grep should receive just \ and report an error, but instead everything works exactly as in the first example for some reason.

In UNIX shells, $x is replaced by the value of the shell variable x, but when nothing follows the $, no substitution is performed. You can test this with echo:
> echo $
$
> echo $x

>
Your two grep arguments are passed into grep as exactly the same regular expression.
> echo \\\$
\$
> echo \\$
\$
>
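You can also watch both spellings in action on some throwaway input (the sample text is just made up for illustration); either way, grep receives the pattern \$, which matches a literal dollar sign:
> printf 'price: $5\nno dollar here\n' | grep \\$
price: $5
> printf 'price: $5\nno dollar here\n' | grep \\\$
price: $5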

Related

Why is echo producing a newline?

What's the difference between
echo -n "
> "
and
echo -n ""
The first one produces a newline whereas the second doesn't.
I'm running GNU bash, version 4.2.45(1)-release (x86_64-pc-linux-gnu)
Edit: I get that the input contains a newline here. I should have been clearer with the question. Consider the following script, input.sh:
#!/bin/bash
echo -n $1
The following doesn't produce a newline.
./input.sh "
> "
The string
"
> "
has a newline in it as far as bash is concerned. The -n flag just means that echo will not print an extra newline at the end of your output. It will still reproduce your input, including newlines.
Expanding on @chepner's comment. Consider this:
$ set -- "
"
$ echo -n "$1" | od -c
0000000 \n
0000001
$ echo -n $1 | od -c
0000000
When you leave the variable unquoted, any leading or trailing sequences of whitespace are removed by the shell. So bash discards your newline when you don't quote $1. This happens before echo is invoked, so echo -n is given no arguments.
From the Word Splitting section in the manual:
If IFS is unset, or its value is exactly <space><tab><newline>, the default, then sequences of <space>, <tab>, and <newline> at the beginning and end of the results of the previous expansions are ignored, and any sequence of IFS characters not at the beginning or end serves to delimit words.
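The same trimming is easy to see with ordinary spaces, not just a newline; a quick sketch (the variable name is arbitrary):
$ x="   hello   world   "
$ printf '[%s]\n' $x
[hello]
[world]
$ printf '[%s]\n' "$x"
[   hello   world   ]
Unquoted, the value is split into two words and the surrounding whitespace disappears; quoted, it is passed through verbatim.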

Assigning a value having semicolon (';') to a variable in bash

I'm trying to escape ('\') a semicolon (';') in a string in the Unix shell (bash) with sed. It works when I do it directly, without assigning the value to a variable. That is,
$ echo "hello;" | sed 's/\([^\\]\);/\1\\;/g'
hello\;
$
However, it doesn't appear to work when the above command is assigned to a variable:
$ result=`echo "hello;" | sed 's/\([^\\]\);/\1\\;/g'`
$
$ echo $result
hello;
$
Any idea why?
I tried the value both enclosed in quotes and without quotes, but that didn't help. Any clue is greatly appreciated.
By the way, I first thought the semicolon at the end of the string was somehow acting as a terminator, so the shell didn't continue executing the sed (if that makes any sense). However, that doesn't appear to be the issue. I tried placing the semicolon somewhere in the middle of the string instead of at the end, and I still see the same result as before. That is,
$ echo "hel;lo" | sed 's/\([^\\]\);/\1\\;/g'
hel\;lo
$
$ result=`echo "hel;lo" | sed 's/\([^\\]\);/\1\\;/g'`
$
$ echo $result
hel;lo
$
You don't need sed (or any other regex engine) for this at all:
s='hello;'
echo "${s//;/\;}"
This is a parameter expansion which replaces ; with \;.
That said -- why are you trying to do this? In most cases, you don't want escape characters (which are syntax) inside scalar variables (which are data); they only matter if you're parsing your data as syntax (such as with eval), which is a bad idea for other reasons and best avoided (or done programmatically, as via printf %q).
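If you do end up needing shell-quoted output, here is a minimal sketch of the printf %q route mentioned above (the exact quoting style can vary slightly between bash versions):
$ s='hello;'
$ printf '%q\n' "$s"
hello\;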
I find it interesting that the use of back-ticks gives one result (your result) and the use of $(...) gives another result (the wanted result):
$ echo "hello;" | sed 's/\([^\\]\);/\1\\;/g'
hello\;
$ z1=$(echo "hello;" | sed 's/\([^\\]\);/\1\\;/g')
$ z2=`echo "hello;" | sed 's/\([^\\]\);/\1\\;/g'`
$ printf "%s\n" "$z1" "$z2"
hello\;
hello;
$
If ever you needed an argument for using the modern x=$(...) notation in preference to the older x=`...` notation, this is probably it. The shell does an extra round of backslash interpretation with the back-ticks. I can demonstrate this with a little program I use when debugging shell scripts called al (for 'argument list'); you can simulate it with printf "%s\n":
$ z2=`echo "hello;" | al sed 's/\([^\\]\);/\1\\;/g'`
$ echo "$z2"
sed
s/\([^\]\);/\1\;/g
$ z1=$(echo "hello;" | al sed 's/\([^\\]\);/\1\\;/g')
$ echo "$z1"
sed
s/\([^\\]\);/\1\\;/g
$ z1=$(echo "hello;" | printf "%s\n" sed 's/\([^\\]\);/\1\\;/g')
$ echo "$z1"
sed
s/\([^\\]\);/\1\\;/g
$
As you can see, the script executed by sed differs depending on whether you use x=$(...) notation or x=`...` notation.
s/\([^\]\);/\1\;/g # ``
s/\([^\\]\);/\1\\;/g # $()
Summary
Use $(...); it is easier to understand.
You need to use four backslashes (three also work). I guess it's because they are interpreted twice: first by the shell when it reads the backquoted command, and then by sed:
result=`echo "hello;" | sed 's/\([^\\]\);/\1\\\\;/g'`
And
echo "$result"
yields:
hello\;
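You can watch that extra backquote round in isolation with nothing but echo (the string here is arbitrary, just to show the collapsing):
$ echo `echo '\\\\;'`
\\;
$ echo $(echo '\\\\;')
\\\\;
Inside backquotes the shell first collapses \\\\ to \\, so the inner command only ever sees two backslashes; with $(...) nothing is stripped before the inner command runs.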

Escaping backslash in AWK in command substitution

I am trying to escape backslash in AWK. This is a sample of what I am trying to do.
Say, I have a variable
$ echo $a
hi
The following works
$ echo $a | awk '{printf("\\\"%s\"",$1)'}
\"hi"
But, when I am trying to save the output of the same command to a variable using command substitution, I get the following error:
$ q=`echo $a | awk '{printf("\\\"%s\"",$1)'}`
awk: {printf("\\"%s\"",$1)}
awk: ^ backslash not last character on line
I am unable to understand why command substitution is breaking the AWK. Thanks a lot for your help.
Try this:
q=$(echo $a | awk '{printf("\\\"%s\"",$1)}')
Test:
$ a=hi
$ echo $a
hi
$ q=$(echo $a | awk '{printf("\\\"%s\"",$1)}')
$ echo $q
\"hi"
Update:
It will work with backticks too; it just gets a little messier.
q=`echo $a | awk '{printf("\\\\\"%s\"",$1)}'`
Test:
$ b=hello
$ echo $b
hello
$ t=`echo $b | awk '{printf("\\\\\"%s\"",$1)}'`
$ echo $t
\"hello"
Reference
Quoting inside backquoted commands is somewhat complicated, mainly
because the same token is used to start and to end a backquoted
command. As a consequence, to nest backquoted commands, the backquotes
of the inner one have to be escaped using backslashes. Furthermore,
backslashes can be used to quote other backslashes and dollar signs
(the latter are in fact redundant). If the backquoted command is
contained within double quotes, a backslash can also be used to quote a
double quote. All these backslashes are removed when the shell reads
the backquoted command. All other backslashes are left intact.
The new $(...) avoids these troubles.
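The nesting point is easy to verify with trivial commands (echo stands in for anything useful):
$ echo `echo \`echo nested\``
nested
$ echo $(echo $(echo nested))
nested
With backquotes the inner pair has to be escaped as \`...\`; with $(...) the parentheses nest naturally.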
Don't get into bad habits with backticks, quoting, and passing shell variables to awk. The correct way to do this is:
$ shell_var="hi"
$ awk -v awk_var="$shell_var" -v c='\' 'BEGIN{printf "%s%s\n",c,awk_var}'
\hi
$ res=$(awk -v awk_var="$shell_var" -v c='\' 'BEGIN{printf "%s%s\n",c,awk_var}')
$ echo "$res"
\hi
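One reason to prefer the -v route is that the awk program text never changes when the data does; only the shell variable's contents do. A small sketch with arbitrary data (including quotes and a dollar sign) passed through untouched:
$ shell_var='say "hi" to $USER'
$ awk -v awk_var="$shell_var" -v c='\' 'BEGIN{printf "%s%s\n",c,awk_var}'
\say "hi" to $USER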

How to replace .. from string in bash script?

I have to remove the .. sequence from a file in a Bash script. Example:
I have some strings like:
some/../path/to/file
some/ab/path/to/file
And after the replacement, it should look like:
some/path/to/file
some/ab/path/to/file
I have used the code below:
DUMMY_STRING=/../
TEMP_FILE=./temp.txt
sed s%${DUMMY_STRING}%/%g ${SRC_FILE} > ${TEMP_FILE}
cp ${TEMP_FILE} ${SRC_FILE}
It is replacing the /../ in line 1, but it is also removing /ab/ from the second line, which is not desired. I understand it is treating /../ as a regex, and /ab/ matches that regex. But I want only the literal /../ occurrences to be replaced.
Please provide some help.
Thanks,
NN
The . is a metacharacter in sed meaning 'any character'. To suppress its special meaning, escape it with a backslash:
sed -e 's%/\.\./%/%g' $src_file > $temp_file
Note that you are referring to different files after you eliminate the /../ like that. To refer to the same name as before (in the absence of symlinks, which complicate things), you would need to remove the directory component before the /../. Thus:
some/../path/to/file
path/to/file
refer to the same file, assuming some is a directory and not a symlink somewhere else, but in general, some/path/to/file is a different file (though symlinks could be used to confound that assertion).
$ x="some/../path/to/file
> some/ab/path/to/file
> /some/path/../to/another/../file"
$ echo "$x"
some/../path/to/file
some/ab/path/to/file
/some/path/../to/another/../file
$ echo "$x" | sed -e 's%/\.\./%/%g'
some/path/to/file
some/ab/path/to/file
/some/path/to/another/file
$ echo "$x" | sed -e "s%/\.\./%/%g"
some/path/to/file
some/ab/path/to/file
/some/path/to/another/file
$ echo "$x" | sed -e s%/\.\./%/%g
some/path/file
some/path/file
/some/path/to/another/file
$ echo "$x" | sed -e s%/\\.\\./%/%g
some/path/to/file
some/ab/path/to/file
/some/path/to/another/file
$
Note the careful use of double quotes around the variable "$x" in the echo commands. I could have used either single or double quotes in the assignment and would have gotten the same result.
Tested on Mac OS X 10.7.4 with the standard sed (and shell is /bin/sh, a.k.a. bash 3.2.x), but the results would be the same on any system.
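If what you actually want is the file the path points at, rather than a textual cleanup, GNU realpath can resolve the dot-dot segments for you; a sketch assuming GNU coreutils and a made-up working directory:
$ cd /tmp/demo
$ realpath -m 'some/../path/to/file'
/tmp/demo/path/to/file
(The -m option lets realpath work on components that don't exist yet.)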

Search for single quoted grep strings?

I have a string, "$server['fish_stick']" (disregard double quotes)
I don't know how to successfully grep for an exact match for this string. I've tried many ways.
I've tried,
rgrep -i \$'server'\[\''fish'\_'stick'\'\] .
rgrep -i "\$server\[\'fish\_stick\'\]" .
rgrep -i '\$server\[\'fish\_stick\'\]' .
Is it single quotes that are causing my issue?
When I echo the first grep command, it shows exactly the string I want to search for, but the search returns garbage results, like anything with $server in it.
Please help and explain, thank you!
The main problem here is that you are not quoting the argument being passed to grep. The only characters that need to be escaped are the $ (as \$, if double quoted) and the brackets []. If you want the exact string (not using a regex), just use fgrep (grep -F), which does exact string matching:
grep -F "\$server['fish_stick']"
Works on my system:
$ foo="\$server['fish_stick']"
$ echo "$foo" | grep -F "\$server['fish_stick']"
$server['fish_stick']
Using regex:
$ echo "$foo" | grep "\$server\['fish_stick'\]"
$server['fish_stick']
Using regex and handling nested single quotes:
$ echo "$foo" | grep '\$server\['\''fish_stick'\''\]'
$server['fish_stick']
Inside single quotes, nested single quotes cannot be escaped. You have to close the quotes and then reopen them to "escape" the single quote.
http://mywiki.wooledge.org/Quotes
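Here is the close-and-reopen trick in isolation, on a trivial string (note that the $ stays literal because everything else is single quoted):
$ echo 'it'\''s a $test'
it's a $test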
I don't suppose you're asking how to get that string into a variable without having quoting issues. If you are, here's a way using a here-document:
str=$(cat <<'END'
$foo['bar']
END
)
To address your concern about escaping special characters for grep, you could use sed to put a backslash before any non-alphanumeric character:
grep "$(sed 's/[^[:alnum:]]/\\&/g' <<< "$str")" ...
When used with set -x, the grep command looks like: grep '\$foo\[\'\''bar\'\''\]' ...
