Alias with an awk script that matches a variable: quotation issues

I'm trying to write an alias for a common command to cancel a process, but I'm having issues with the single and double quotations. This is my first attempt at Bash scripting and I'm a bit stumped.
lsof -i tcp:80 | awk '$1 == "Google" {print $2}'
This works as a stand-alone command and outputs the correct PID.
When I try formatting it as an alias, though, I'm having issues. I know the command is stopping at the first single quote, judging by how this is structured, but I'm not sure how to fix it:
alias test='lsof -i tcp:80 | awk '$1=="Google" {print $2}''

There's no escape sequence for single quotes inside single quotes. You can't write \' like you might expect. So there are two options.
You can break out of single quotes, add an escaped single quote \', and then go back in, like so:
alias test='lsof -i tcp:80 | awk '\''$1 == "Google" {print $2}'\'
You can use double quotes. You then have to escape not just the double quotes inside the string but also the dollar signs.
alias test="lsof -i tcp:80 | awk '\$1 == \"Google\" {print \$2}'"

Try defining your alias like this
alias test='lsof -i tcp:80 | awk '"'"'$1=="Google" {print $2}'"'"
The single quotes ' have to be wrapped in double quotes ". To do that, the command has to be split into several parts that are quoted differently. lsof -i tcp:80 | awk '$1=="Google" {print $2}' can be split on the single quotes like this (note the trailing space after awk in the first part):
lsof -i tcp:80 | awk
'
$1=="Google" {print $2}
'
Then quote each part with the appropriate quotes:
'lsof -i tcp:80 | awk '
"'"
'$1=="Google" {print $2}'
"'"
Merge the parts back together and you have your alias:
'lsof -i tcp:80 | awk '"'"'$1=="Google" {print $2}'"'"
Note that the first part does not contain any variable to be expanded, so it can be quoted with double quotes and merged with the second part. The alias then becomes:
alias test="lsof -i tcp:80 | awk'"'$1=="Google" {print $2}'"'"

In almost every case where you find yourself trying to define an alias, define a function instead.
testing () {
    lsof -i tcp:80 | awk '$1=="Google" {print $2}'
}
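Unlike an alias, a function can also take arguments, and its body is an ordinary command, so none of the quoting gymnastics above are needed. A rough sketch (the function name and port parameter are only illustrative, not from the original question):

google_pids () {
    # print the PIDs of "Google" processes on the given TCP port (defaults to 80);
    # the awk script keeps its normal single quotes
    local port="${1:-80}"
    lsof -i "tcp:${port}" | awk '$1 == "Google" {print $2}'
}

google_pids        # port 80
google_pids 8080   # another port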

Related

Why are results different when passing an argument to a function versus piping to it?

I found this thread with two solutions for trimming whitespace: piping to xargs and defining a trim() function:
trim() {
    local var="$*"
    # remove leading whitespace characters
    var="${var#"${var%%[![:space:]]*}"}"
    # remove trailing whitespace characters
    var="${var%"${var##*[![:space:]]}"}"
    echo -n "$var"
}
I prefer the second because of one comment:
This is overwhelmingly the ideal solution. Forking one or more external processes merely to trim whitespace from a single string is fundamentally insane – particularly when most shells (including bash) already provide native string munging facilities out-of-the-box.
For example, I am getting the wifi SSID on macOS by piping to awk (once I'm comfortable with regular expressions in bash, I won't fork an awk process), and it includes a leading space:
$ /System/Library/PrivateFrameworks/Apple80211.framework/Resources/airport -I | awk -F: '/ SSID/{print $2}'
<some-ssid>
$ /System/Library/PrivateFrameworks/Apple80211.framework/Resources/airport -I | awk -F: '/ SSID/{print $2}' | xargs
<some-ssid>
$ /System/Library/PrivateFrameworks/Apple80211.framework/Resources/airport -I | awk -F: '/ SSID/{print $2}' | trim
$ wifi=$(/System/Library/PrivateFrameworks/Apple80211.framework/Resources/airport -I | awk -F: '/ SSID/{print $2}')
$ trim "$wifi"
<some-ssid>
Why does piping to the trim function fail while giving it an argument works?
It is because your trim() function expects a positional argument list to process. The $* is the argument list passed to your function. In the case you report as not working, you are connecting the read end of a pipe to the function, so inside it you need to fetch the data from the standard input file descriptor.
In that case you need to read from standard input with the read command and then process the string, i.e.:
trim() {
    # read the input received over the pipe into a single string
    IFS= read -r var
    # remove leading whitespace characters
    var="${var#"${var%%[![:space:]]*}"}"
    # remove trailing whitespace characters
    var="${var%"${var##*[![:space:]]}"}"
    echo -n "$var"
}
for which you can now do
$ echo " abc " | trim
abc
Alternatively, use command substitution to run the command that fetches the string and pass its output to trim() with your older definition:
trim "$(/System/Library/PrivateFrameworks/Apple80211.framework/Resources/airport -I | awk -F: '/ SSID/{print $2}')"
In this case, the shell expands the $(..) by running the command inside and replacing it with that command's output. The function then sees trim <args>, interprets it as a positional argument, and runs the string replacements directly on it.
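If you want a single trim() that works both ways, one option is to fall back to reading standard input when no arguments are given. This is only a sketch combining the two definitions above, not part of the original answers:

trim() {
    local var
    if [ "$#" -gt 0 ]; then
        # arguments given: treat them as one string
        var="$*"
    else
        # no arguments: read a single line from standard input (the pipe)
        IFS= read -r var
    fi
    # remove leading whitespace characters
    var="${var#"${var%%[![:space:]]*}"}"
    # remove trailing whitespace characters
    var="${var%"${var##*[![:space:]]}"}"
    echo -n "$var"
}

With this definition, both echo " abc " | trim and trim " abc " print abc.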

Reverse geocoding in BASH - error using variable as coordinates

I'm trying to use variables instead of static coordinates in the code below, but without any success.
What am I doing wrong here?
stored_address=$(curl -s "http://maps.googleapis.com/maps/api/geocode/json?latlng="'${coor1}'","'${coor2}'"&sensor=false" | grep -B 1 "route" | awk -F'"' '/short_name/ {print $4}')
My curl command works if I use literal coordinates instead of the two variables "'${coor1}'" and "'${coor2}'". Could someone please point out the error? Thanks :)
working example with static coordinates:
stored_address=$(curl -s "http://maps.googleapis.com/maps/api/geocode/json?latlng=56.433125,10.07003&sensor=false" | grep -B 1 "route" | awk -F'"' '/short_name/ {print $4}')
You're using hard quoting, i.e. you wrap your variables in '. Lose the single quotes and the variables will be expanded correctly:
stored_address=$(curl -s "http://maps.googleapis.com/maps/api/geocode/json?latlng=${coor1},${coor2}&sensor=false" | grep -B 1 "route" | awk -F'"' '/short_name/ {print $4}')
From the bash man page:
Enclosing characters in single quotes (‘'’) preserves the literal value of each character within the quotes.
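A quick way to see the difference, using the values from the working example (the echo lines are only for illustration):

coor1=56.433125
coor2=10.07003
echo 'latlng=${coor1},${coor2}'   # single quotes: prints the literal text latlng=${coor1},${coor2}
echo "latlng=${coor1},${coor2}"   # double quotes: prints latlng=56.433125,10.07003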

grep to sed: append after a string match, but it ends up at the end of the line instead

I have the following text file with the following lines:
<test="123">
<test="456">
<test="789">
My aim is to append the keyword "HELLO" after the numbers in the file above, as follows:
<test="123.HELLO">
<test="456.HELLO">
<test="789.HELLO">
With the grep command and cut, I manage to get the value between the quotation marks:
grep -o "test=".* test.txt | cut -d \" -f2
I tried to use sed on top of it, with this line:
grep -o "test=".* test.txt | cut -d \" -f2 | sed -i -- 's/$/.HELLO/' test.txt
However, the closest I manage to get is a ".HELLO" appended directly at the end of the line (and not after the numbers between the quotes):
<test="123">.HELLO
<test="456">.HELLO
<test="789">.HELLO
How can I fix my sed statement to provide me with the requested line?
You can do it with groups in sed. To create new output, you can do this:
sed 's/\(test="[^"]*\)"/\1.HELLO"/g' test.txt
To modify it in-place, you can use the -i switch:
sed -i 's/\(test="[^"]*\)"/\1.HELLO"/g' test.txt
Explanation:
() is a group. You can refer to it with \1. In sed we have to escape the parentheses: \(\)
[^"]* matches everything that's not a quote. So the match will stop before the quote
In the replacement, you have to add the quote manually, since it's outside of the group. So you can put stuff before the quote.
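With the sample file from the question, the command should produce:

$ sed 's/\(test="[^"]*\)"/\1.HELLO"/g' test.txt
<test="123.HELLO">
<test="456.HELLO">
<test="789.HELLO">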
Try this:
This is what your file looks like:
bash > cat a.txt
<test="123">
<test="456">
<test="789">
Your text piped to sed:
bash > cat a.txt |sed 's/">/.HELLO">/g'
<test="123.HELLO">
<test="456.HELLO">
<test="789.HELLO">
bash >
Let me know if this worked out for you.
awk 'sub("[0-9]+","&.HELLO")' file
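Here sub() replaces the first run of digits on each line with the matched text (&) followed by .HELLO; since sub() returns the number of substitutions made, it also acts as the pattern, so only changed lines are printed. With the sample file (here assuming it is test.txt):

$ awk 'sub("[0-9]+","&.HELLO")' test.txt
<test="123.HELLO">
<test="456.HELLO">
<test="789.HELLO">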
You can accomplish this with sed directly. Cut should not be necessary:
grep "test=" test.txt | sed 's/"\(.*\)"/"\1.HELLO"/'

How to pass a variable to the sed command in shell?

I have some variables:
$begin=10
$end=20
How do I pass them to the sed command? This is my attempt:
sed -n '$begin,$endp' filename | grep word
I want it to behave like the hard-coded version:
sed -n '10,20p' filename | grep word
The reason this doesn't work is that single quotes in shell code prevent variable expansion. The good way is to use awk:
awk -v begin="$begin" -v end="$end" 'NR == begin, NR == end' filename
It is possible with sed if you use double quotes (in which shell variables are expanded):
sed -n "$begin,$end p" filename
However, this is subject to code injection vulnerabilities because sed cannot distinguish between code and data this way (unlike the awk code above). If a user manages to set, say, end="20 e rm -Rf /;", unpleasant things can happen.
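Putting it together with the variables from the question (note that in bash the assignments are written without a leading $; filename and word are placeholders):

begin=10
end=20
# begin and end are passed to awk as data, not spliced into the script text
awk -v begin="$begin" -v end="$end" 'NR == begin, NR == end' filename | grep word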

Escape single quotes in an ssh remote command

I have read several solutions for escaping single quotes in a remote command over ssh, but none of them work.
I'm trying:
ssh root@server "ps uax|grep bac | grep -v grep | awk '{ print $2 }' > /tmp/back.tmp"
The awk part doesn't work. I also tried:
ssh root@server "ps uax|grep bac | grep -v grep | awk \'{ print $2 }\' > /tmp/back.tmp"
....
awk: '{
awk: ^ invalid character ''' in the expression
I also tried putting single quotes around the command, but that doesn't work either.
I'd appreciate any help.
The ssh command treats all text typed after the hostname as the remote command to be executed. Critically, what this means for your question is that you do not need to quote the entire command as you have done. Rather, you can just send the command through as you would type it if you were on the remote system itself.
This simplifies dealing with quoting issues, since it reduces the number of quotes you need to use. Since you won't be wrapping the whole command in quotes, all special shell characters need to be escaped with backslashes instead.
In your situation, you would type:
ssh root@server ps uax \| grep ba[c] \| awk \'{ print \$2 }\' \> /tmp/back.tmp
or you could double quote the single quotes instead of escaping them (in both cases, you need to escape the dollar sign)
ssh root@server ps uax \| grep ba[c] \| awk "'{ print \$2 }'" \> /tmp/back.tmp
Honestly this feels a little more complicated, but I have found this knowledge pretty valuable when dealing with sending commands to remote systems that involve more complex use of quotes.
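One way to sanity-check what the remote shell will actually receive is to run the same argument list through echo locally first (just a debugging trick, not from the original answer; ssh joins its arguments with spaces in the same way):

$ echo ps uax \| grep ba[c] \| awk \'{ print \$2 }\' \> /tmp/back.tmp
ps uax | grep ba[c] | awk '{ print $2 }' > /tmp/back.tmp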
In your first try you use double quotes ", so you need to escape the $ character (note the \$2 below):
ssh root@server "ps uax|grep bac | grep -v grep | awk '{ print \$2 }' > /tmp/back.tmp"
Also, you can use:
ps uax | grep 'ba[c]' | ...
so then you don't need the grep -v grep step.
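The bracket trick works because the grep process's own command line contains the literal text ba[c], which the pattern ba[c] (it matches only the text bac) does not match, so grep no longer finds itself in the ps output:

# matches lines containing "bac", but not the grep command line itself,
# which contains "ba[c]" rather than "bac"
ps uax | grep 'ba[c]' | awk '{ print $2 }'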
