How to use awk {print} inside an inline ssh command - bash

I am trying to run an inline ssh command which looks like this:
ssh user@127.0.0.1 "df / -h | awk 'FNR == 2 {print $3}'"
I would expect the output to be 3.8G (as this is the second line, third column) but instead, I am getting /dev/sda1 6.9G 3.8G 2.8G 58% / (the entire second line).
This means that the FNR == 2 is working but the {print $3} is not.
If I run that command directly on the machine that I am ssh'ing into then I get the expected result, just not when calling it through an inline ssh command as above.
This code will eventually run within a bash script. Is it not possible to use print in this way? If not, is there another method that you can suggest? I am relatively new to terminal life so please bear with me.

The problem resides in the way you pass your ssh arguments.
By calling:
ssh user@127.0.0.1 "df / -h | awk 'FNR == 2 {print $3}'"
You are passing two arguments:
user@127.0.0.1
"df / -h | awk 'FNR == 2 {print $3}'"
Since your second argument is given inside double quotes, the $3 variable will be expanded by your local shell before ssh ever runs. You can prevent this variable expansion by escaping the dollar sign:
ssh user@127.0.0.1 "df / -h | awk 'FNR == 2 {print \$3}'"
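You can watch the expansion happen before ssh is even involved by echoing the same double-quoted string locally (this assumes $3 is unset in your interactive shell, which it normally is):
echo "df / -h | awk 'FNR == 2 {print $3}'"
# prints: df / -h | awk 'FNR == 2 {print }'
echo "df / -h | awk 'FNR == 2 {print \$3}'"
# prints: df / -h | awk 'FNR == 2 {print $3}'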

The joys of shell quoting. The line:
ssh user@127.0.0.1 "df / -h | awk 'FNR == 2 {print $3}'"
Is parsed by the shell, which invokes ssh with two arguments:
user@127.0.0.1 and df / -h | awk 'FNR == 2 {print }'. The $3 was interpolated, and (I'm assuming) was empty. To prevent that, you have many options, one of which is:
ssh user@127.0.0.1 << \EOF
df / -h | awk 'FNR == 2 {print $3}'
EOF
another is:
ssh user@127.0.0.1 sh -c '"df / -h | awk '"'"'FNR == 2 {print $3}'"'"'"'
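If you want to verify what that last quoting monstrosity resolves to before running it, you can print the argument locally; this is just a sanity check, not part of the solution:
printf '%s\n' '"df / -h | awk '"'"'FNR == 2 {print $3}'"'"'"'
# prints: "df / -h | awk 'FNR == 2 {print $3}'"
# that is the argument word ssh sends across; the remote shell then strips the outer double quotes for sh -c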

Related

shell script in a here-document used as input to ssh gives no result

I am piping the result of grep to AWK and using the result as a pattern for another grep inside an EOF block (a here-document; I wasn't sure of the terminology), but the AWK gives me blank results. Below is part of the bash script that gave me issues.
ssh "$USER"#logs << EOF
zgrep $wgr $loc$env/app*$date* | awk -F":" '{print $5 "::" $7}' | awk -F"," '{print $1}' | sort | uniq | while read -r rid ; do
zgrep $rid $loc$env/app*$date*;
done
EOF
I am really drawing a blank here because there is no error and I'm out of ideas.
Samples:
I am grepping log files that look like below:
app-server.log.2020010416.gz:2020-01-04 16:00:00,441 INFO [redacted] (redacted) [rid:12345::12345-12345-12345-12345-12345,...
I am interested in rid and I can grep that in logs again:
zgrep $rid $loc$env/app*$date*
loc, env and date are working properly, but they are set outside of the EOF block.
The script as a whole connects over ssh and exits properly, but I am getting no result.
The immediate problem is that the dollar signs are evaluated by the local shell because you don't (and presumably cannot) quote the here document (because then $wgr and $loc etc. will also not be expanded by the shell).
The quick fix is to backslash the dollar signs, but in addition, I see several opportunities to get rid of inelegant or wasteful constructs.
ssh "$USER"#logs << EOF
zgrep "$wgr" "$loc$env/app"*"$date"* |
awk -F":" '{v = \$5 "::" \$7; split(v, f, /,/); print f[1]}' |
sort -u | xargs -I {} zgrep {} "$loc$env"/app*"$date"*
EOF
If you want to add decorations around the final zgrep, probably revert to the while loop you had; but of course, you need to escape the dollar sign in that, too:
ssh "$USER"#logs << EOF
zgrep "$wgr" "$loc$env/app"*"$date"* |
awk -F":" '{v = \$5 "::" \$7; split(v, f, /,/); print f[1]}' |
sort -u |
while read -r rid; do
echo Dancing hamsters "\$rid" more dancing hamsters
zgrep "\$rid" "$loc$env"/app*"$date"*
done
EOF
Again, any unescaped dollar sign is evaluated by your local shell even before the ssh command starts executing.
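A minimal local sketch of the difference between a quoted and an unquoted here-document delimiter, no ssh required:
x=local
cat <<EOF
unquoted delimiter: $x expands here
EOF
cat <<'EOF'
quoted delimiter: $x stays literal here
EOF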
Could you please try the following (fair warning: I couldn't test it for lack of samples). With this approach we don't need to escape anything inside the ssh here-document.
## Configure/define your shell variables (wgr, loc, env, date) here.
printf -v var_wgr %q "$wgr"
printf -v var_loc %q "$loc"
printf -v var_env %q "$env"
printf -v var_date %q "$date"
ssh -T user@"$host" "bash -s $var_wgr $var_loc $var_env $var_date" <<'EOF'
# retrieve the quoted values off the remote shell's command line
wgr=$1 loc=$2 env=$3 date=$4
zgrep "$wgr" "$loc$env"/app*"$date"* | awk -F":" '{print $5 "::" $7}' | awk -F"," '{print $1}' | sort | uniq | while read -r rid ; do
zgrep "$rid" "$loc$env"/app*"$date"*
done
EOF
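For reference, printf %q quotes a value so that it survives one round of shell parsing, which is what makes passing it on the remote command line safe; a quick illustration with a made-up value:
wgr='ERROR 42'              # example value containing a space
printf -v var_wgr %q "$wgr"
echo "$var_wgr"             # prints something like: ERROR\ 42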

How to extract CSV field and pass it to another command

I have a bash script that makes an API request that returns the result in CSV format.
I want to extract only the "id" value from the first line (it will always be the same in the rest of the lines) and then pass it to the wget command (for example wget http://test.com/$id).
Current bash script:
req = curl -k -d '{"returnFormat":"csv","eventid":"2"}' -H "Authorization: xxx" -H "Accept: application/json" -H "Content-type: application/json" -X POST https://test.api/restSearch
The output:
id,category,type,value,comment,date,subject_tag
1357,"activity","domain","dodskj.com","comment",1547034584,"kill-chain"
1357,"activity","ip-dst","8.8.6.6","comment example",1547034600,"maec-mal""
According to this example, I want to extract the value "1357" into a variable and send it to the wget command or any other command.
You can use the cut command in this case:
curl <params> | cut -d, -f1
or alternatively awk:
curl <params> | awk -F, '{print $1}'
For your specific example, if you only want the second line you can use:
curl <param> | awk -F, 'NR == 2 {print $1}'
If you want to select a line based on a particular field then:
curl <param> | awk -F, '$4 == "dodskj.com" {print $1}'
(you can match regular expressions using ~ operator in place of ==)
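For example, matching the domain field as a regular expression instead of an exact string:
curl <param> | awk -F, '$4 ~ /dodskj/ {print $1}'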
You can also break after the first match with exit:
curl <param> | awk -F, '$4 == "dodskj.com" {print $1; exit}'
Then you can encapsulate the whole lot in $() to assign to a variable ...
VAR=$(curl ... | awk ...)
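Putting the pieces together for the original request (the curl flags and the wget URL are the ones from the question; -s only silences curl's progress meter), a rough sketch:
id=$(curl -sk -d '{"returnFormat":"csv","eventid":"2"}' -H "Authorization: xxx" -H "Accept: application/json" -H "Content-type: application/json" -X POST https://test.api/restSearch | awk -F, 'NR == 2 {print $1; exit}')
wget "http://test.com/$id"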
Hope that helps!
You can pipe your csv into awk.
req=$(<prev_cmd> | awk -F, 'NR==2{print $1}')
-F, tells awk that fields are comma-delimited.
NR==2 will run the script in braces only for the second row.
print $1 will print the first field.
$(...) performs command substitution.
Note: remember there must be no spaces around the = in a variable assignment.
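To illustrate that note:
req = value   # wrong: bash runs a command named 'req' with the arguments '=' and 'value'
req=value     # right: assigns the string 'value' to req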

Why does awk fail when part of ssh command? [duplicate]

This question already has answers here:
How to use bash $(awk) in single ssh-command?
(6 answers)
awk remote ssh command
(1 answer)
Run a remote awk command using ssh
(1 answer)
Closed 4 years ago.
When I ssh into a server and run the command, it works:
$ ssh -q postgres@XXXXX
Last login: Mon Mar 12 12:30:16 2018 from 10.101.XXX.X
[postgres@pgsdba203 ~]$ df -hP | grep pgsql | awk '{ if ($5 >= 70) print $0 }'
/dev/mapper/pgvg-pg_log 9.9G 7.2G 2.2G 77% /var/lib/pgsql/data/pg_log
But not when it's part of an ssh command parameter:
$ ssh -q postgres@XXXXX "df -hP | grep pgsql | awk '{ if ($5 >= 70) print $0 }'"
awk: { if ( >= 70) print -bash }
awk: ^ syntax error
Since the awk command is in single quotes, I'd have expected the $5 to be preserved and passed directly to server XXXXX.
EDIT: the end goal is to make the server name into a variable and call ssh from within a bash for loop.
What I would do:
ssh -q postgres@XXXXX <<'EOF'
df -hP | awk '/pgsql/ && $5 >= 70'
EOF
or in a for loop
for server in foo bar base; do
ssh -q postgres@$server <<'EOF'
df -hP | awk '/pgsql/ && $5 >= 70'
EOF
done
As an easy-to-define pattern which works even when you need to pipe stdin into your remote code: define a shell function, and tell the shell itself to emit that function's literal text, substituted into the remote command:
myfunc() { df -hP | grep pgsql | awk '{ if ($5 >= 70) print $0 }'; }
ssh -q postgres@XXXX "$(declare -f myfunc); myfunc"
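This combines cleanly with the for loop mentioned in the edit; the server names below are placeholders:
myfunc() { df -hP | grep pgsql | awk '{ if ($5 >= 70) print $0 }'; }
for server in server1 server2 server3; do
ssh -q postgres@"$server" "$(declare -f myfunc); myfunc"
done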

How to output awk result to a variable

I need to run a hadoop command to list all live nodes, then based on the output I reformat it using an awk command, and eventually output the result to a variable; awk uses a different delimiter each time I call it:
hadoop job -list-active-trackers | sort | awk -F. '{print $1}' | awk -F_ '{print $2}'
it outputs results like this:
hadoop-dn-11
hadoop-dn-12
...
then I put the whole command in a variable to print out the result line by line:
var=$(sudo -H -u hadoop bash -c "hadoop job -list-active-trackers | sort | awk -F "." '{print $1}' | awk -F "_" '{print $2}'")
printf %s "$var" | while IFS= read -r line
do
echo "$line"
done
the awk -F didn't work; it outputs the result as:
tracker_hadoop-dn-1.xx.xsy.interanl:localhost/127.0.0.1:9990
tracker_hadoop-dn-1.xx.xsy.interanl:localhost/127.0.0.1:9390
Why doesn't the awk with -F work correctly, and how can I fix it?
var=$(sudo -H -u hadoop bash -c "hadoop job -list-active-trackers | sort | awk -F "." '{print $1}' | awk -F "_" '{print $2}'")
Because you're enclosing the whole command in double quotes, your shell is expanding the variables $1 and $2 before launching sudo. This is what the sudo command looks like (I'm assuming $1 and $2 are empty)
sudo -H -u hadoop bash -c "hadoop job -list-active-trackers | sort | awk -F . '{print }' | awk -F _ '{print }'"
So, you see, your awk commands are printing the whole line instead of just the first and second fields respectively.
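You can reproduce the mangling locally; with $1 and $2 empty in the calling shell:
echo "awk -F "." '{print $1}'"
# prints: awk -F . '{print }'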
This is merely a quoting challenge
var=$(sudo -H -u hadoop bash -c 'hadoop job -list-active-trackers | sort | awk -F "." '\''{print $1}'\'' | awk -F "_" '\''{print $2}'\')
A bash single-quoted string cannot contain a single quote, which is why you see ...'\''...: close the string, concatenate an escaped (literal) single quote, then re-open the string.
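A small demonstration of that splice:
printf '%s\n' 'a'\''b'   # prints a'b (close quote, literal quote, reopen quote)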
Another way is to escape the vars and inner double quotes:
var=$(sudo -H -u hadoop bash -c "hadoop job -list-active-trackers | sort | awk -F \".\" '{print \$1}' | awk -F \"_\" '{print \$2}'")

Calling Awk in a shell script

I have this command which executes correctly if run directly on the terminal.
awk '/word/ {print NR}' file.txt | head -n 1
The purpose is to find the line number of the line on which the word 'word' first appears in file.txt.
But when I put it in a script file, it doesn't seem to work.
#! /bin/sh
if [ $# -ne 2 ]
then
echo "Usage: $0 <word> <filename>"
exit 1
fi
awk '/$1/ {print NR}' $2 | head -n 1
So what did I do wrong?
Thanks,
Replace the single quotes with double quotes so that the $1 is evaluated by the shell:
awk "/$1/ {print NR}" $2 | head -n 1
In the shell, single-quotes prevent parameter-substitution; so if your script is invoked like this:
script.sh word
then you want to run this AWK program:
/word/ {print NR}
but you're actually running this one:
/$1/ {print NR}
and needless to say, AWK has no idea what $1 is supposed to be.
To fix this, change your single-quotes to double-quotes:
awk "/$1/ {print NR}" $2 | head -n 1
so that the shell will substitute word for $1.
You should use AWK's variable passing feature:
awk -v patt="$1" '$0 ~ patt {print NR; exit}' "$2"
The exit makes the head -1 unnecessary.
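Dropped into the original script, that gives (a sketch, keeping the usage check from the question):
#! /bin/sh
if [ $# -ne 2 ]
then
echo "Usage: $0 <word> <filename>"
exit 1
fi
awk -v patt="$1" '$0 ~ patt {print NR; exit}' "$2"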
You could also pass the value as a variable to awk:
awk -v varA="$1" '{if (match($0, varA) > 0) {print NR}}' "$2" | head -n 1
Seems more cumbersome than the above, but illustrates passing vars.
