How to handle a curl response failure in a Bash script - bash

How can I handle request failures in this example of bash curl requests? That is, if all servers respond with JSON, everything is okay and I have a JSON file at the end of the cycle. But if one of these servers does not respond with JSON, or does not respond at all, I end up with nothing in the data.json file, even though all the other servers are working perfectly. How can I catch a server failure and skip it?
#!/bin/bash
CONFIG=config.json
jsondata=data.json
i=1
echo "{ \"success\": \"OK\", \"servers\": {" > $jsondata
jq -r '.servers|keys[]' $CONFIG | while read key ; do
    if [ "$i" -ne "1" ]; then
        echo "," >> $jsondata
    fi
    echo "\"server$i\": {" >> $jsondata
    RESPONSE=$(curl -s $HTTP://$IP:$PORT/api)
    DATA1=$(echo $RESPONSE | jq '.data1')
    DATA2=$(echo $RESPONSE | jq '.data2')
    echo " \"data1\": $DATA1", >> $jsondata
    echo " \"data2\": $DATA2", >> $jsondata
    echo "}" >> $jsondata
    ((i++))
done
echo "}," >> $jsondata

First, let's assume we have a function to make a single API call.
do_api () {
    key=$1
    curl $HTTP://$IP:$PORT/api  # Presumably, key is needed here somewhere
}
I'll also assume that the intended output is a JSON object that has at least two fields, data1 and data2. Now, we can write a simple pipeline with three stages:
1. Read keys from $CONFIG
2. Make an API call for each key
3. Generate the desired output from the combined output of all the API calls.
It's not too complicated:
jq -r '.servers | keys[]' "$CONFIG" |                # stage 1
while read key; do do_api "$key"; done |             # stage 2
jq -s 'to_entries |
       map({key: "server\(.key+1)",
            value: {data1: .value.data1,
                    data2: .value.data2}
           }) |
       {success: "OK", servers: from_entries}'       # stage 3
do_api should output nothing for a particular key if curl fails, but you can modify it to produce some sort of default data if you wish:
do_api () {
    key=$1
    curl --fail ... || jq -n '{data1: null, data2: null}'
}
jq -s 'to_entries' takes an input like
{ "data1": ..., "data2": ... }
{ "data1": ..., "data2": ... }
(which is what we expect from curl), and produces as output
[ { "key": 0, "value": { "data1": ..., "data2": ... } },
{ "key": 1, "value": { "data1": ..., "data2": ... } }
]
The map(...) filter takes the preceding array and produces the keys and values we want to add to the servers object in the final result, which is created by a call to from_entries.
Here is a full example. tmp.json contains the simulated output of do_api, complete with an extra field data3 that will be filtered from the final output.
$ cat tmp.json
{
  "data1": "foo",
  "data2": "bar",
  "data3": "baz"
}
{
  "data1": "hello",
  "data2": "world",
  "data3": "bye"
}
$ jq -s 'to_entries | map({key: "server\(.key+1)", value: {data1: .value.data1, data2: .value.data2}}) | {success: "OK", servers: from_entries}' tmp.json
{
  "success": "OK",
  "servers": {
    "server1": {
      "data1": "foo",
      "data2": "bar"
    },
    "server2": {
      "data1": "hello",
      "data2": "world"
    }
  }
}
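Putting the pieces together, the whole pipeline might look like the sketch below. The endpoint URL and the way the key is passed to curl are placeholders, since the question elides them; adapt them to the real API.
#!/bin/bash
CONFIG=config.json

do_api () {
    key=$1
    # Hypothetical endpoint; substitute the real $HTTP/$IP/$PORT scheme.
    # --fail makes curl exit non-zero on HTTP errors, and the || guard
    # turns any failure into "output nothing", so that server is skipped.
    curl -sf "http://example.com/api?server=$key" || true
}

jq -r '.servers | keys[]' "$CONFIG" |
while read -r key; do do_api "$key"; done |
jq -s 'to_entries |
       map({key: "server\(.key+1)",
            value: {data1: .value.data1, data2: .value.data2}}) |
       {success: "OK", servers: from_entries}' > data.json
Note that surviving servers are numbered consecutively, since the numbering comes from each response's position in the combined output, not from the config file.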

Related

shell script: Returning wrong output

In the given script, the nested key is not getting appended to the value. I could not figure out where the script is going wrong.
#!/bin/bash
echo "Add the figma json file path"
read path
figma_json="$(echo -e "${path}" | tr -d '[:space:]')"
echo $(cat $figma_json | jq -r '.color | to_entries[] | "\(.key):\(.value| if .value == null then .[] | .value else .value end)"')
Sample input:
{
  "color": {
    "white": {
      "description": "this is just plain white color",
      "type": "color",
      "value": "#ffffffff",
      "extensions": {
        "org.lukasoppermann.figmaDesignTokens": {
          "styleId": "S:40940df38088633aa746892469dd674de8b147eb,",
          "exportKey": "color"
        }
      }
    },
    "gray": {
      "50": {
        "description": "",
        "type": "color",
        "value": "#fafafaff",
        "extensions": {
          "org.lukasoppermann.figmaDesignTokens": {
            "styleId": "S:748a0078c39ca645fbcb4b2a5585e5b0d84e5fd7,",
            "exportKey": "color"
          }
        }
      }
    }
  }
}
Actual output:
white:#ffffffff gray:#fafafaff
Expected output:
white:#ffffffff gray:50:#fafafaff
Here's a solution using tostream instead of to_entries to facilitate simultaneous access to the full path and its value:
jq -r '
  .color | tostream | select(.[0][-1] == "value" and has(1)) | .[0][:-1]+.[1:] | join(":")
' "$figma_json"
white:#ffffffff
gray:50:#fafafaff
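To see why the select works, it helps to look at what tostream actually emits. Against the sample input, the first few events pair a full path with its leaf value (single-element arrays, the closing events, are what has(1) filters out):
$ jq -c '.color | tostream' "$figma_json" | head -n 4
[["white","description"],"this is just plain white color"]
[["white","type"],"color"]
[["white","value"],"#ffffffff"]
[["white","extensions","org.lukasoppermann.figmaDesignTokens","styleId"],"S:40940df38088633aa746892469dd674de8b147eb,"]
The filter keeps only events whose path ends in "value", drops that trailing path component, appends the value, and joins everything with ":".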
An approach attempting to demonstrate bash best-practices:
#!/bin/bash
figma_json=$1  # prefer command-line arguments to prompts
[[ $figma_json ]] || {
    read -r -p 'Figma JSON file path: ' path  # read -r means backslashes not munged
    figma_json=${path//[[:space:]]/}          # parameter expansion is more efficient than tr
}
jq -r '
  def recurse_for_value($prefix):
    to_entries[]
    | .key as $current_key
    | .value?.value? as $immediate_value
    | if $immediate_value == null then
        .value | recurse_for_value(
          if $prefix != "" then
            $prefix + ":" + $current_key
          else
            $current_key
          end
        )
      else
        if $prefix == "" then
          "\($current_key):\($immediate_value)"
        else
          "\($prefix):\($current_key):\($immediate_value)"
        end
      end
  ;
  .color |
  recurse_for_value("")
' "$figma_json"

Bash looping over returned JSON using curl

So I make a curl command to a URL which returns an array of objects:
response=$(curl --location --request GET "http://.....")
I need to iterate over the returned JSON and extract a single value.
The json is as follows:
{
  "data": [
    {"name": "ABC", "value": 1},
    {"name": "EFC", "value": 4},
    {"name": "CEC", "value": 3}
  ]
}
Is there any way in Bash I can extract the second object's value, perhaps by iterating and doing an if?
Use jq
A simple example to extract the 2nd entry of the array would be:
RESPONSE='{"data":[{"name":"ABC","value": 1},{"name":"EFC","value":4},{"name":"CEC","value":3}]}';
EXTRACTED=$(echo -n "$RESPONSE" | jq '.data[1]');
echo $EXTRACTED
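If you only need the numeric value from that entry rather than the whole object, index the field as well and add -r for raw (unquoted) output:
RESPONSE='{"data":[{"name":"ABC","value": 1},{"name":"EFC","value":4},{"name":"CEC","value":3}]}'
VALUE=$(echo -n "$RESPONSE" | jq -r '.data[1].value')
echo "$VALUE"   # prints 4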
jq is the most commonly used command/tool to parse JSON data from a shell script. Here is an example with your data:
#!/usr/bin/env sh

# JSON response
response='
{
  "data": [
    {"name": "ABC", "value": 1},
    {"name": "EFC", "value": 4},
    {"name": "CEC", "value": 3}]
}'

# Name of entry
name='EFC'

# Get value of entry
value=$( jq --null-input --raw-output --arg aName "$name" \
  "$response"' | .data[] | select(.name == $aName) | .value')

# Print it out
printf 'value for %s is: %s\n' "$name" "$value"
Alternatively, jq can be used to transform the whole JSON array of name/value objects into a Bash associative array declaration:
#!/usr/bin/env bash

# JSON response
response='
{
  "data": [
    {"name": "ABC", "value": 1},
    {"name": "EFC", "value": 4},
    {"name": "CEC", "value": 3}]
}'

# Name of entry
name='EFC'

# Convert all JSON array name value entries into a Bash associative array
# shellcheck disable=SC2155 # safe generated declaration
declare -A entries="($( jq --null-input --raw-output \
  "$response"' | .data[] | ( "[" + ( .name | @sh ) + "]=" + (.value | @sh) )'))"

# Print it out
printf 'value for %s is: %s\n' "$name" "${entries[$name]}"
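Once declared, the associative array can be used like any other; for example, to loop over every entry:
# Iterate over all names in the generated associative array
for n in "${!entries[@]}"; do
    printf '%s -> %s\n' "$n" "${entries[$n]}"
done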

Loop JSON output in shell script

I have this output variable
OUTPUT=$(echo $ZONE_LIST | jq -r '.response | .data[]')
The Output:
{
  "accountId": "xyz",
  "addDate": "2020-09-05T10:57:11Z",
  "content": "\"MyContent\"",
  "id": "MyID",
  "priority": null
}
{
  "accountId": "xyz",
  "addDate": "2020-09-05T06:58:52Z",
  "content": "\"MyContent\"",
  "id": "MyID",
  "priority": null
}
How can I create a loop over these two values?
MyLoop
    echo "$content - $id"
done
I tried this, but then I get a loop through every single value
for k in $(echo $ZONE_LIST | jq -r '.response | .data[]'); do
    echo $k
done
EDIT 1:
My complete JSON:
{
  "errors": [],
  "metadata": {
    "transactionId": ""
  },
  "response": {
    "data": [
      {
        "accountId": "xyz",
        "addDate": "2020-09-05T10:57:11Z",
        "content": "\"abcd\"",
        "id": "myID1",
        "lastChangeDate": "2020-09-05T10:57:11Z"
      },
      {
        "accountId": "xyz",
        "addDate": "2020-09-05T06:58:52Z",
        "content": "\"abc\"",
        "id": "myID2",
        "lastChangeDate": "2020-09-05T07:08:15Z"
      }
    ],
    "limit": 10,
    "page": 1,
    "totalEntries": 2
  },
  "status": "success",
  "warnings": []
}
Now I need a loop over data, because I need it for a curl call.
The curl right now:
curl -s -v -X POST --data '{
  "deleteEntries": [
    Data_from_json
  ]
}' https://URL_to_Update 2>/dev/null)
Now I want to create a new variable from my JSON data. My curl should look like this at the end:
curl -s -v -X POST --data '{
  "deleteEntries": [
    {
      "readID": "myID1",
      "date": "2020-09-05T10:57:11Z",  <-- value from addDate
      "content": "abcd"
    },
    {
      "readID": "myID2",
      "date": "2020-09-05T06:58:52Z",  <-- value from addDate
      "content": "abc"
    }
  ]
}' https://URL_to_Update 2>/dev/null)
Something like:
#!/usr/bin/env bash
while IFS=$'\37' read -r -d '' id content; do
    echo "$id" "$content"
done < <(
    jq -j '.response | .data[] | .id + "\u001f" + .content + "\u0000"' \
        <<<"$ZONE_LIST"
)
jq -j: forces raw, joined output from jq, with no newlines added.
.id + "\u001f" + .content + "\u0000": assembles the fields delimited by the ASCII FS character (hexadecimal 1f, octal 37), and ends each record with a null character.
It then becomes easy and reliable to iterate over null-delimited records by having read -d '' (null delimiter).
The fields id and content are separated by ASCII FS, so just set the Internal Field Separator to the matching octal escape, IFS=$'\37', before reading.
The first step is to realize you can join the fields into a single comma-delimited string using a technique like this:
jq '(.accountId + "," + .addDate)'
Now you can update your bash loop:
for k in $(echo $ZONE_LIST | jq -r '.response | .data[]' | jq '(.content + "," + .id)'); do
    echo $k
done
There is probably a way to combine the two jq commands but I don't have your original json data for testing.
UPDATE: inside the loop you can parse the comma-delimited string into separate fields. There are more efficient ways to handle this task, but I prefer simplicity.
ID=$(echo $k | cut -d',' -f1)
PRIORITY=$(echo $k | cut -d',' -f2)
echo "ID($ID) PRIORITY($PRIORITY)"
Try this.
for k in $(echo $ZONE_LIST | jq -rc '.response | .data[]'); do
    echo "$k" | jq -r '.content + " - " + .id'
done
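Since the end goal here is a deleteEntries request body, you can also skip the shell loop entirely and have jq build the payload in one pass. This is a sketch against the posted JSON: fromjson unwraps the doubly-encoded content strings, and URL_to_Update stands in for the real endpoint, as in the question.
payload=$(jq '{deleteEntries: [.response.data[] |
                {readID: .id, date: .addDate, content: (.content | fromjson)}]}' \
              <<<"$ZONE_LIST")
curl -s -X POST --data "$payload" https://URL_to_Update 2>/dev/null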

jq how to get the last entry in an object

I'm working with jq 1.6 to get the last entry in an object. It should work like this:
data='{ "1": { "a": "1" }, "2": { "a": "2" }, "3": { "a": "3" } }'
result=`echo $data | jq 'myfilter'`
echo $result
{ "3": { "a": "3" } }
I tried these filters:
jq '. | last' # error: Cannot index object with number
How can I tell jq to quote the number?
jq '. | to_entries | last' # { "key": "3", "value": { "a": "3" } }
I guess I could munge this up by concatenating the key and value entries. Is there a simpler way?
The tutorial and the manual didn't help. No joy on SO either.
You can use the following:
jq 'to_entries | [last] | from_entries'
We can't use with_entries(last) because last returns a single element and from_entries requires an array, hence the [...] construct above.
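The same result can be had without from_entries by constructing the singleton object directly from the last entry:
jq 'to_entries | last | {(.key): .value}'
Here (.key) is a parenthesized key expression, which jq requires whenever an object key is computed rather than literal.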

cannot call bash environment variable inside jq

In the below script, I am not able to successfully call the "repovar" variable in the jq command.
cat quayrepo.txt | while read line
do
    export repovar="$line"
    jq -r --arg repovar "$repovar" '.data.Layer| .Features[] | "\(.Name), \(.Version), $repovar"' severity.json > volume.csv
done
The script uses a text file, quayrepo.txt, to loop through the repo names; in this case the file contains a single value, "Reponame1".
Sample input severity.json file:
{
  "status": "scanned",
  "data": {
    "Layer": {
      "IndexedByVersion": 3,
      "Features": [
        {
          "Name": "elfutils",
          "Version": "0.168-1",
          "Vulnerabilities": [
            {
              "NamespaceName": "debian:9",
              "Severity": "Medium",
              "Name": "CVE-2016-2779"
            }
          ]
        }
      ]
    }
  }
}
Desired output:
elfutils, 0.168-1, Medium, Reponame1
I need to retrieve the value of my environment variable as the last column in my output CSV file.
You need to surround $repovar with an interpolation, \(...), as with the other values:
repovar='qweqe'; jq -r --arg repovar "$repovar" '.data.Layer| .Features[] | "\(.Name), \(.Version), \($repovar)"' tmp.json
Result:
elfutils, 0.168-1, qweqe
There's no need for the export.
#!/usr/bin/env bash
while read line
do
    jq -r --arg repovar "$line" '.data.Layer.Features[] | .Name + ", " + .Version + ", " + $repovar' severity.json
done < quayrepo.txt > volume.csv
with quayrepo.txt as
Reponame1
and severity.json as shown in the question,
produces volume.csv containing
elfutils, 0.168-1, Reponame1
To @peak's point, changing > to >> (...severity.json >> volume.csv) will create a multi-line CSV instead of overwriting the file on every iteration and keeping only the last line.
You don't need a while read loop in bash at all; jq itself can loop over your input lines, even when they aren't JSON, letting you run jq only once, not once per line in quayrepo.txt.
jq -rR --slurpfile inJson severity.json <quayrepo.txt >volume.csv '
    ($inJson[0].data.Layer | .Features[]) as $features |
    [$features.Name, $features.Version, .] |
    @csv
'
jq -R specifies raw input, letting jq directly read lines from quayrepo.txt into .
jq --slurpfile varname filename.json reads filename.json into an array of JSON objects parsed from that file. If the file contains only one object, one needs to refer to $varname[0] to refer to it.
@csv converts an array to a CSV output line, correctly handling data with embedded quotes or other oddities that require special processing.
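As a quick illustration of @csv's output format (string fields come out double-quoted):
$ jq -nr '["elfutils", "0.168-1", "Reponame1"] | @csv'
"elfutils","0.168-1","Reponame1"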
