How to get dynamic NR parameter in AWK command at shell scripting - bash

I have a txt file like this:
1 a
2 b
3 c
I want to take this data step by step: for example, first get "1" and put it in a variable, then get "a" and put it in a variable, and then run a curl command.
I mean: first row first column, then first row second column, then second row first column, and so on.
I wrote the script below, but it isn't working; it returns empty values for b and c.
#!/bin/sh
for ((i=1;i<=4;i++)); do
echo $i
b=$(awk 'NR==$i { print $1}' a.txt)
c=$(awk 'NR==$i { print $2}' a.txt)
echo $b
echo $c
curl -X POST \
-H "X-netmera-api-key: xxxxxxxxxx" \
-H "Content-Type: application/json" \
-d '[
{
"deviceToken" : "$b",
"platform" : "1",
"extId" : "$c"
}
]' \
https://xxxx.xxxxx.com/rest/3.0/xxxxx
done

Throw awk away; it isn't necessary here.
while read -r b c; do
curl -X POST \
-H "X-netmera-api-key: xxxxxxxxxx" \
-H "Content-Type: application/json" \
-d "[
{
\"deviceToken\" : \"$b\",
\"platform\" : \"1\",
\"extId\" : \"$c\"
}
]" \
https://xxxx.xxxxx.com/rest/3.0/xxxxx
done < a.txt
You shouldn't be trying to generate JSON dynamically like this, but that's an issue for another question.
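Since the question's file is tiny, this loop is easy to dry-run. The sketch below recreates the sample a.txt from the question and prints each payload instead of calling curl; for anything beyond trivial values, a tool like jq is the safe way to build the JSON.

```shell
# Recreate the sample a.txt from the question.
printf '%s\n' '1 a' '2 b' '3 c' > a.txt

while read -r b c; do
  # Double quotes let $b and $c expand inside the JSON body.
  payload="[ { \"deviceToken\" : \"$b\", \"platform\" : \"1\", \"extId\" : \"$c\" } ]"
  printf '%s\n' "$payload"   # dry run: replace printf with the real curl call
done < a.txt
```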

You will need to pass shell variables to awk with -v, like so:
b=$(awk -v varb="$i" 'NR==varb { print $1 }' a.txt)
Here we are setting the awk variable varb equal to the i variable of your bash loop. Inside single quotes the shell never expands $i, so awk never saw the row number you intended.
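A minimal, runnable sketch of the -v fix, using the sample a.txt from the question (printf stands in for the curl call; note the quotes around "$i"):

```shell
# Recreate the sample file from the question.
printf '%s\n' '1 a' '2 b' '3 c' > a.txt

for i in 1 2 3; do
  # -v copies the shell variable i into the awk variable varb before the
  # program runs; the single-quoted awk script needs no shell expansion.
  b=$(awk -v varb="$i" 'NR==varb { print $1 }' a.txt)
  c=$(awk -v varb="$i" 'NR==varb { print $2 }' a.txt)
  printf 'row %s: b=%s c=%s\n' "$i" "$b" "$c"
done
```

Note this reads the file twice per iteration; the while read answer above avoids that entirely.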

Related

Bash script greater than or equal to if statement not functioning as expected

The goal of this script is to take the output of my getCpuTemp function, which is currently 60.1, and send me a push notification via Pushbullet if the CPU temperature is over 70 degrees.
My script currently thinks that 60.1 is over 70, and I'm not sure why. Is there something wrong with my if statement?
function getCpuTemp {
sensors cpu_thermal-virtual-0 | awk 'FNR == 3 {print $2}' | tr -d '+\°\C'
}
function sendPushNotification {
curl --header 'Access-Token: <REDACTED>' \
--header 'Content-Type: application/json' \
--data-binary '{"body":"Server CPU Temp is very warm.","title":"WARNING: HIGH TEMP","type":"note"}' \
--request POST \
https://api.pushbullet.com/v2/pushes
}
if getCpuTemp -ge 70; then
echo "ABOVE 70"
sendPushNotification
else
echo "BELOW 70"
fi
As @CharlesDuffy already pointed out in the comments, your use of -ge is invalid syntax. Also, bash doesn't understand floating-point numbers, but awk does, and you're already using it, so consider:
cpuTempIsOverOrEqual() {
sensors cpu_thermal-virtual-0 |
awk -v max="$1" 'NR == 3 {exit (($2+0)>=max ? 0 : 1)}'
}
sendPushNotification() {
curl --header 'Access-Token: <REDACTED>' \
--header 'Content-Type: application/json' \
--data-binary '{"body":"Server CPU Temp is very warm.","title":"WARNING: HIGH TEMP","type":"note"}' \
--request POST \
https://api.pushbullet.com/v2/pushes
}
if cpuTempIsOverOrEqual 70; then
echo "ABOVE OR EQUAL TO 70"
sendPushNotification
else
echo "BELOW 70"
fi
I'm assuming above that the output of sensors cpu_thermal-virtual-0 has the characters you're removing with tr attached to the end of the 2nd field. If so, you don't need tr at all: awk strips non-numeric characters from the end of any string you perform a numeric operation on, e.g. +0:
$ printf 'a\nb\nx %s+°C\nd\n' 60.1
a
b
x 60.1+°C
d
$ printf 'a\nb\nx %s+°C\nd\n' 60.1 |
awk 'NR == 3 {print $2+0}'
60.1
$ printf 'a\nb\nx %s+°C\nd\n' 60.1 |
awk -v max="70" 'NR == 3 {exit (($2+0)>max ? 0 : 1)}'; echo $?
1
$ printf 'a\nb\nx %s+°C\nd\n' 70.1 |
awk -v max="70" 'NR == 3 {exit (($2+0)>max ? 0 : 1)}'; echo $?
0
If that assumption is wrong then edit your question to show us the output of sensors cpu_thermal-virtual-0 | head -n 3 so we know what you're trying to parse.
Obviously, pick whatever name you like for the function, and change >= to > if you want the comparison to be strictly greater than (as you say you want in your text) rather than greater than or equal to (as you have in your code with -ge).
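The point about bash can be shown in isolation: its arithmetic is integer-only, so the float comparison has to happen inside awk. A minimal sketch:

```shell
# bash arithmetic is integer-only; awk compares floats natively.
# Exit status 0 from awk means "at or above the limit".
temp=60.1
if awk -v t="$temp" -v max=70 'BEGIN{exit (t >= max) ? 0 : 1}'; then
  echo "ABOVE OR EQUAL TO 70"
else
  echo "BELOW 70"
fi
```

This prints BELOW 70 for 60.1 and flips once temp reaches 70.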

Put variable in bash for loop

I am making a post request in bash multiple times with a for loop like:
for n in {1..5}; do curl -X POST 'http://localhost:8080/hello' -H 'accept: */*' -H 'Content-Type: application/json' -d '{
"a": 1,
"b": 2
}';done
Now the values a and b are always the same, so I wanted to generate random values for the variables. So I created two random values:
a= awk -v min=10 -v max=20 -v num=1 'BEGIN{srand(); for (i=1;i<=num;i++) printf ("%.5f\n",min+rand()*(max-min+1))}'
b= awk -v min=50 -v max=60 -v num=1 'BEGIN{srand(); for (i=1;i<=num;i++) printf ("%.5f\n",min+rand()*(max-min+1))}'
and tried to insert them into my post request like this:
for n in {1..5}; do curl -X POST 'http://localhost:8080/hello' -H 'accept: */*' -H 'Content-Type: application/json' -d '{
"a": a,
"b": b
}';done
but it didn't work. I also tried:
for (n in {1..5};a= awk -v min=10 -v max=20 -v num=1 'BEGIN{srand(); for (i=1;i<=num;i++) printf ("%.5f\n",min+rand()*(max-min+1))}'; b= awk -v min=50 -v max=60 -v num=1 'BEGIN{srand(); for (i=1;i<=num;i++) printf ("%.5f\n",min+rand()*(max-min+1))}'); do curl -X POST 'http://localhost:8080/hello' -H 'accept: */*' -H 'Content-Type: application/json' -d '{
"a": 1,
"b": 2
}';done
but that also didn't work. What am I doing wrong?
Thanks in advance.
You need a few small corrections to make this work in a script.
Below you can see the values of a and b being substituted; adapt it to what you need.
a=`awk -v min=10 -v max=20 -v num=1 'BEGIN{srand(); for (i=1;i<=num;i++) printf ("%.5f\n",min+rand()*(max-min+1))}'`
b=`awk -v min=50 -v max=60 -v num=1 'BEGIN{srand(); for (i=1;i<=num;i++) printf ("%.5f\n",min+rand()*(max-min+1))}'`
for n in {1..5}; do
echo curl -X POST 'http://localhost:8080/hello' -H 'accept: */*' -H 'Content-Type: application/json' -d "{
\"a\": $a,
\"b\": $b
}";done
The first problem is that you want a and b to hold the output of the awk command. For that you need command substitution: enclose the command in backticks (or, better, $(...)). Note also that there must be no space after the =; a= awk ... runs awk as a command instead of assigning to a.
To have the variables substituted, you cannot put them between '...' (single quotes), because nothing is expanded there. You must use double quotes "..." so that $a and $b are replaced with their values.
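One more thing to watch: in the snippet above, a and b are computed once, before the loop, so all five requests send identical values. To get fresh values per request, move the generation inside the loop. A sketch with a hypothetical rand_between helper (bash assumed for $RANDOM; echo stands in for the real curl call):

```shell
# Hypothetical helper: one random value in [min, max). Seeding from $RANDOM
# (bash) makes repeated calls in the same second differ; under plain sh the
# seed is empty and awk falls back to a fixed seed, but the range still holds.
rand_between() {
  awk -v min="$1" -v max="$2" -v seed="$RANDOM" \
      'BEGIN{srand(seed); printf "%.5f\n", min+rand()*(max-min)}'
}

for n in 1 2 3 4 5; do
  a=$(rand_between 10 20)
  b=$(rand_between 50 60)
  # Dry run: echo prints the command; remove it to send the request.
  echo curl -X POST 'http://localhost:8080/hello' \
       -H 'Content-Type: application/json' \
       -d "{ \"a\": $a, \"b\": $b }"
done
```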

bash variable as a command: echo the command before execution and save the result to a variable

I am executing a chain of curl commands:
I need to echo the command before the execution.
Execute the command and save the result to a bash variable.
Get values from the result of the execution and execute the next curl with that values.
This is how it looks:
# -----> step 1 <-----
URL="https://web.example.com:8444/hello.html"
CMD="curl \
--insecure \
--dump-header - \
\"$URL\""
echo $CMD && eval $CMD
OUT="<result of the curl command???>"
# Set-Cookie: JSESSIONID=5D5B29689EFE6987B6B17630E1F228AD; Path=/; Secure; HttpOnly
JSESSIONID=$(echo $OUT | grep JSESSIONID | awk '{ s = ""; for (i = 2; i <= NF; i++) s = s $i " "; print s }' | xargs)
# Location: https://web.example.com:8444/oauth2/authorization/openam
URL=$(echo $OUT | grep Location | awk '{print $2}')
# -----> step 2 <-----
CMD="curl \
--insecure \
--dump-header - \
--cookie \"$JSESSIONID\" \
\"$URL\""
echo $CMD && eval $CMD
OUT="<result of the curl command???>"
...
# -----> step 3 <-----
...
I only have a problem with step 2: saving the full result of the curl command to a variable so that I can parse it.
I have tried many different ways; none of them works:
OUT="eval \$CMD"
OUT=\$$CMD
OUT=$($CMD)
...
What am I missing?
For very basic commands, OUT=$($CMD) should work. The problem is that strings stored in variables are processed differently from strings entered directly. For instance, echo "a" prints a, but var='"a"'; echo $var prints "a" (note the quotes). For that and other reasons, you shouldn't store commands in variables.
In bash, you can use arrays instead. By the way: the naming convention for regular variables is NOT ALLCAPS, as such names might accidentally collide with special variables. Also, you can probably drastically simplify your grep | awk | xargs.
url="https://web.example.com:8444/hello.html"
cmd=(curl --insecure --dump-header - "$url")
printf '%q ' "${cmd[@]}"; echo
out=$("${cmd[@]}")
# Set-Cookie: JSESSIONID=5D5B29689EFE6987B6B17630E1F228AD; Path=/; Secure; HttpOnly
jsessionid=$(awk '/JSESSIONID/ {$1=""; printf "%s%s", d, substr($0,2); d=FS}' <<< "$out")
# Location: https://web.example.com:8444/oauth2/authorization/openam
url=$(awk '/Location/ {print $2}' <<< "$out")
# -----> step 2 <-----
cmd=(curl --insecure --dump-header - --cookie "$jsessionid" "$url")
printf '%q ' "${cmd[@]}"; echo
out=$("${cmd[@]}")
# -----> step 3 <-----
...
If you have more steps than that, wrap the repeating part into a function, as suggested by Charles Duffy.
Easy Mode: Use set -x
Bash has a built-in feature, xtrace, which tells it to log every command to the file descriptor named in the variable BASH_XTRACEFD (by default, file descriptor 2, stderr).
#!/bin/bash
set -x
url="https://web.example.com:8444/hello.html"
output=$(curl \
--insecure \
--dump-header - \
"$url")
echo "Output of curl follows:"
echo "$output"
...will provide logs having the form of:
+ url=https://web.example.com:8444/hello.html
++ curl --insecure --dump-header - https://web.example.com:8444/hello.html
+ output=Whatever
+ echo 'Output of curl follows:'
+ echo Whatever
...where the + is based on the contents of the variable PS4, which can be modified to have more information. (I often use and suggest PS4=':${BASH_SOURCE}:$LINENO+' to put the source filename and line number in each logged line).
Doing It By Hand
If that's not acceptable, you can write a function.
log_and_run() {
{ printf '%q ' "$@"; echo; } >&2
"$@"
}
output=$(log_and_run curl --insecure --dump-header - "$url")
...will write your curl command line to stderr before storing its output in $output. Note when writing that output that you need to use quotes: echo "$output", not echo $output.
I guess OUT=$(eval $CMD) will do what you want.

How to automate curl POST for CSV file?

I need to write a bash script that takes CSV file and iterates row by row, sending each row to http://localhost:9999/myListener.
In other words the script should execute this code for each N-th row of CSV file:
curl -H "Content-Type: application/json" -X POST -d '{"col1":1,"col2":3,"col3":"value"}' http://localhost:9999/myListener
You could translate CSV to JSON line by line with awk:
awk -F',' '{printf("{\"col1\": %s, \"col2\": %s, \"col3\": \"%s\"}\n", $1, $2, $3)}' foo.csv |
while read -r s
do
curl -H "Content-Type: application/json" -X POST -d "$s" http://localhost:9999/myListener
done
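An alternative is to keep the row parsing in the shell rather than in awk. This sketch assumes a simple CSV (no quoted fields or embedded commas), and echo stands in for the real request so it can be dry-run:

```shell
# Sample CSV matching the question's three-column payload.
printf '%s\n' '1,3,value' '2,4,other' > foo.csv

while IFS=, read -r col1 col2 col3; do
  payload=$(printf '{"col1":%s,"col2":%s,"col3":"%s"}' "$col1" "$col2" "$col3")
  # Dry run: echo prints the command; remove it to actually POST each row.
  echo curl -H "Content-Type: application/json" -X POST -d "$payload" http://localhost:9999/myListener
done < foo.csv
```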

Unable to send large files to elasticsearch using curl: argument too long

This is the script I used to export some documents to Elasticsearch, but no luck:
#!/bin/ksh
set -v
trap read debug
date=$(date +%Y-%m-%d);
echo $date;
config_file="/home/p.sshanm/reports_elastic.cfg";
echo $config_file;
URL="http://p-acqpes-app01.wirecard.sys:9200/reports-"$date"";
echo $URL;
find /transfers/documents/*/done/ -type f -name "ABC-Record*_${date}*.csv"|
while IFS='' read -r -d '' filename
do
echo "filename : $filename"
var=$(base64 "$filename"| perl -pe 's/\n//g');
#if i use below it will fail as argument too long , so i used with curl option #
# var1= $(curl -XPUT 'http://localhost:9200/reports-'$date'/document/reports?pipeline=attachment&pretty' -d' { "data" : "'$var'" }')
var1=$(curl -X PUT -H "Content-Type: application/json" -d #- "$URL" >>CURLDATA
{ "data": "$var" }
CURL_DATA)
done;
If I use it as below:
var1=$(curl -XPUT 'http://localhost:9200/reports-'$date'/document/reports?pipeline=attachment&pretty' -d' { "data" : "'$var'" }')
it fails with:
argument too long
so I used curl's stdin option instead.
Your syntax for reading from stdin is wrong: the here-doc redirection should be << (not >>), the delimiters must match (use CURL_DATA at both places), and curl reads the body from stdin with -d @-:
curl -X PUT -H "Content-Type: application/json" -d @- "$URL" <<CURL_DATA
{ "data": "$var" }
CURL_DATA
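Another way to sidestep "argument too long" entirely is to write the body to a temporary file and let curl read it with @file, so the base64 blob never appears on the command line. A sketch with a stand-in payload, a hypothetical URL, and echo in place of the real request:

```shell
# Hypothetical target URL for illustration only.
URL="http://localhost:9200/reports-2024-01-01/_doc/1"

# Write the JSON body to a temp file instead of passing it as an argument.
body=$(mktemp)
printf '{ "data": "%s" }' "SGVsbG8gd29ybGQ=" > "$body"   # stand-in base64 blob

# Dry run: echo prints the command; drop the echo to actually send it.
echo curl -X PUT -H "Content-Type: application/json" \
     --data-binary @"$body" "$URL"

rm -f "$body"
```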
