Convert values from array in json object using bash shell - bash

I am totally new to shell... let me put the proper use case.
Use case:-
I have written two get methods in my shell script, and when a user calls that script I perform some operation for many IDs using a for loop, like below:
test_get1(){
    value1=...   # performing some operation and storing it
    value2=...   # performing some operation and storing it
    # below line converts the output of value1 and value2 to JSON
    value=$("$JQ_TOOL" -n --arg key1 "$value1" --arg key2 "$value2" '{"key1":"\($key1)","key2":"\($key2)"}')
    echo "$value"   # return the JSON to the caller
}
test_get2(){
    arr=(1 2 3)
    local arr_values=()
    for value in "${arr[@]}"
    do
        # calling test_get1 for each iteration of this loop, like below
        val=$(test_get1 "$value")
        # below line will store the value in the array
        arr_values+=("$val")
    done
}
When I echo the above arr_values, I get the below output.
Output.
arr_values={
"key1":"value1",
"key2":"value2"
}
{
"key1":"value1",
"key2":"value2"
}
I want to convert the above value into JSON format like below.
json_value=[
{
"key1":"value1",
"key2":"value2"
},
{
"key1":"value1",
"key2":"value2"
}
]
I tried to do it with jq, but was unable to get the proper result.

Use the slurp option:
jq -s . in.json > out.json
in.json
{
"key1": "value1",
"key2": "value2"
}
{
"key1": "value1",
"key2": "value2"
}
out.json
[
  {
    "key1": "value1",
    "key2": "value2"
  },
  {
    "key1": "value1",
    "key2": "value2"
  }
]
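Applied to the script in the question, the same idea means feeding the collected objects to jq -s. A minimal sketch, assuming arr_values has been populated as in test_get2:
json_value=$(printf '%s\n' "${arr_values[@]}" | jq -s .)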

1) Your existing "value=" line can be simplified to:
value=$(jq -n --arg key1 "$value1" --arg key2 "$value2" '
  {key1: $key1, key2: $key2}')
because --arg always interprets the provided value as a string, and because jq expressions need not follow all the rules of JSON.
2) From your script, arr_values is a bash array of JSON values. To convert it into a JSON array, you should be able to use an incantation such as:
for r in "${arr_values[@]}" ; do printf '%s\n' "$r" ; done | jq -s .
3) There is almost surely a much better way to achieve your ultimate goal. Perhaps it would help if you thought about calling jq just once.
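For instance, one way to call jq just once is to collect the raw values first and let a single invocation build the whole array. A rough sketch, assuming jq 1.6+ for $ARGS.positional, with the value1/value2 computation stubbed out as placeholders:
pairs=()
for id in 1 2 3
do
    value1=...   # placeholder for the real operation on $id
    value2=...   # placeholder for the real operation on $id
    pairs+=("$value1" "$value2")
done
json_value=$(jq -n '[ $ARGS.positional as $a | range(0; $a|length; 2) | {key1: $a[.], key2: $a[.+1]} ]' --args "${pairs[@]}")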

Related

jq: iterate over every element of list and replace it with value

I've got this JSON file:
{
"name": "market",
"type": "grocery",
"shelves": {
"upper_one": [
"23423565",
"23552352",
"08789089"
]
}
}
I need to iterate over every element of a list (upper_one) and replace it with another value.
I've tried this code:
#!/bin/bash
for product in $(cat first-shop.json| jq -r '.shelves.upper_one[]')
do
cat first-shop.json| jq --arg id "$((1 + $RANDOM % 10))" --arg product "$product" -r '.shelves.upper_one[]|select(. == $product)|= $id'
done
But I got this kind of output:
1
23552352
08789089
23423565
10
08789089
23423565
23552352
7
Is it possible to iterate over a list with jq, replace values with a value from another function (like $id in the code), and print the whole final JSON with the substituted values?
I need this kind of output:
{
"name": "market",
"type": "grocery",
"shelves": {
"upper_one": [
"1",
"10",
"7"
]
}
}
not just the elements of the "upper_one" list three times.
You could try the following script:
#!/usr/bin/env bash
for product in $(jq -r '.shelves.upper_one[]' input.json)
do
id="$((1 + $RANDOM % 10))"
newIds+=("$id")
done
jq '.shelves.upper_one = $ARGS.positional' input.json --args "${newIds[@]}"
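For reference, $ARGS.positional (available since jq 1.6) is simply the array of strings passed after --args, e.g.:
$ jq -n '$ARGS.positional' --args 1 10 7
[
  "1",
  "10",
  "7"
]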
IMHO it's better to use some scripting language and manipulate objects programmatically. If bash and jq are your only option, this does the job, though it's not nice:
$ jq '.shelves.upper_one[] |= (sub("23423565";"1") | sub("23552352";"10") | sub("08789089";"7"))' your.json
{
"name": "market",
"type": "grocery",
"shelves": {
"upper_one": [
"1",
"10",
"7"
]
}
}
Consider conversion to numbers with | tonumber.
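For example, a variant of the script's last line that stores the generated ids as numbers rather than strings:
jq '.shelves.upper_one = ($ARGS.positional | map(tonumber))' input.json --args "${newIds[@]}"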

How to output key values from json to a table using jq?

I have this JSON which I get from an API.
{
"result": {
"key1": "val1",
"key2": "val2",
"key3": "val3"
}
}
I have the following shell script which creates the headers for a table in a text file. I want to extract the key/value pairs from the above result object and put them in the same text file, where keys go under KEYS and values under VALUES in the table. I am new to jq and shell and am struggling to achieve this.
echo "%table"
echo -e "KEYS\tVALUES" > outputfile.txt
KEYVALS=$(curl -u user:password "http://localhost:8080/customapi")
# here I want to split the key values using jq and write to the outputfile.txt
cat outputfile.txt
Outcome I am expecting is:
KEYS VALUES
key1 val1
key2 val2
key3 val3
How can I achieve this?
The key is to convert .result to an array of key/value pairs using to_entries, then outputting a set of strings (created using string interpolation) in raw mode.
% cat tmp.json
{
"result": {
"key1": "val1",
"key2": "val2",
"key3": "val3"
}
}
% jq -r '{"KEYS": "VALUES"} + .result | to_entries[] | "\(.key)\t\(.value)"' tmp.json
KEYS VALUES
key1 val1
key2 val2
key3 val3
I added the header to the input before conversion to the key/value list.
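For completeness, a sketch of how this could be wired into your original script (the endpoint and credentials are just the placeholders from your question):
echo "%table"
echo -e "KEYS\tVALUES" > outputfile.txt
KEYVALS=$(curl -u user:password "http://localhost:8080/customapi")
jq -r '.result | to_entries[] | "\(.key)\t\(.value)"' <<< "$KEYVALS" >> outputfile.txt
cat outputfile.txt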
By adding the column call at the end, the alignment will work for longer values as well.
Note the usage of the # character as the token separator; of course, if your data contains it, this will not work.
aws cognito-idp list-user-pools --max-results 20 | \
jq -r '.UserPools[]|to_entries[]|select (.key == "Name")|("\(.key):#\(.value)")'| column -t -s'#'
output
Name: corp_stg_user_pool
Name: corp_dev_user_pool

Handling json object with special characters in jq

I have a JSON object with the below element:
rsrecords="{
"ResourceRecords": [
{
"Value": "\"heritage=external-dns,external-dns/owner=us-east-1:sandbox,external-dns/resource=service/api""
}
],
"Type": "TXT",
"Name": "\\052.apiconsumer.alpha.sandbox.test.net.",
"TTL": 300
}"
And in my bash script, I have the below code snippet:
jq -r '.[] | .Name ,.ResourceRecords[0].Value' <<< "$rsrecords" | \
while read -r name; read -r value; do
echo $name
The output is printed as:
\052.apiconsumer.alpha.sandbox.test.net.
But I am expecting it to print \\052.apiconsumer.alpha.sandbox.test.net., as that is the "Name" from the JSON object.
How can this be done?
Before getting to the heart of the matter, please note that
the sample data as given is a bit of a mishmash, so I'll assume you meant something like:
rsrecords='
{
"ResourceRecords": [
{
"Value": "heritage=external-dns,external-dns/owner=us-east-1:sandbox,external-dns/resource=service/api"
}
],
"Type": "TXT",
"Name": "\\052.apiconsumer.alpha.sandbox.test.net.",
"TTL": 300
}
'
Your jq query does not match the above JSON, so I'll assume you intended the query to be simply:
.Name, .ResourceRecords[0].Value
In any case, with the above JSON, the bash commands:
jq -r '.Name, .ResourceRecords[0].Value' <<< "$rsrecords" |
while read -r name; read -r value; do
echo "$name"
done
yields:
\052.apiconsumer.alpha.sandbox.test.net.
This is correct, because the JSON string "\\X" is an encoding of the raw string: \X
If you want to see the JSON string, then invoke jq without the -r option. If you want to invoke jq with the -r option and want to see two backslashes, you will have to encode them as four backslashes in your JSON.
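To illustrate with the cleaned-up $rsrecords above:
$ jq '.Name' <<< "$rsrecords"
"\\052.apiconsumer.alpha.sandbox.test.net."
$ jq -r '.Name' <<< "$rsrecords"
\052.apiconsumer.alpha.sandbox.test.net.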

cannot call bash environment variable inside jq

In the below script, I am not able to successfully call the "repovar" variable in the jq command.
cat quayrepo.txt | while read line
do
export repovar="$line"
jq -r --arg repovar "$repovar" '.data.Layer| .Features[] | "\(.Name), \(.Version), $repovar"' severity.json > volume.csv
done
The script uses a text file to loop through the repo names
quayrepo.txt ---> this file has the list of repo names; in this case the file has a single value, "Reponame1"
sample input severity.json file:
{
"status": "scanned",
"data": {
"Layer": {
"IndexedByVersion": 3,
"Features": [
{
"Name": "elfutils",
"Version": "0.168-1",
"Vulnerabilities": [
{
"NamespaceName": "debian:9",
"Severity": "Medium",
"Name": "CVE-2016-2779"
}
]
}
]
}
}
}
desired output:
elfutils, 0.168-1, Medium, Reponame1
Required output: I need to retrieve the value of my environment variable as the last column in my output CSV file.
You need to surround $repovar with \( ... ), as with the other values:
repovar='qweqe'; jq -r --arg repovar "$repovar" '.data.Layer| .Features[] | "\(.Name), \(.Version), \($repovar)"' tmp.json
Result:
elfutils, 0.168-1, qweqe
There's no need for the export.
#!/usr/bin/env bash
while read -r line
do
    jq -r --arg repovar "$line" '.data.Layer.Features[] | .Name + ", " + .Version + ", " + $repovar' severity.json
done < quayrepo.txt > volume.csv
with quayrepo.txt as
Reponame1
and severity.json as
{
"status": "scanned",
"data": {
"Layer": {
"IndexedByVersion": 3,
"Features": [
{
"Name": "elfutils",
"Version": "0.168-1",
"Vulnerabilities": [
{
"NamespaceName": "debian:9",
"Severity": "Medium",
"Name": "CVE-2016-2779"
}
]
}
]
}
}
}
produces volume.csv containing
elfutils, 0.168-1, Reponame1
To @peak's point, changing > to >> in ...severity.json >> volume.csv will create a multi-line CSV instead of overwriting the file on each iteration and keeping only the last line.
You don't need a while read loop in bash at all; jq itself can loop over your input lines, even when they aren't JSON, letting you run jq only once, not once per line in quayrepo.txt.
jq -rR --slurpfile inJson severity.json <quayrepo.txt >volume.csv '
($inJson[0].data.Layer | .Features[]) as $features |
[$features.Name, $features.Version, .] |
@csv
'
jq -R specifies raw input, letting jq directly read lines from quayrepo.txt into .
jq --slurpfile varname filename.json reads filename.json into an array of the JSON objects parsed from that file. If the file contains only one object, you refer to it as $varname[0].
@csv converts an array to a CSV output line, correctly handling data with embedded quotes or other oddities that require special processing.
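With the sample quayrepo.txt and severity.json above, volume.csv should then contain something like:
"elfutils","0.168-1","Reponame1"
(note that @csv quotes string fields, unlike the comma-joined output of the earlier answers).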

How to manipulate a jq output using bash?

I have the following jq code snippet:
https://jqplay.org/s/QzOttRHoz1
I want to loop over each element of the result array using bash, as this pseudo code shows:
#!/bin/bash
foreach result
print "My name is {name}, I'm {age} years old"
print "--"
The result would be:
My name is A, I'm 1 years old.
---
My name is B, I'm 2 years old.
---
My name is C, I'm 3 years old.
---
Of course this is a trivial example, just to clarify that my goal is to manipulate each element of the jq result individually.
Any suggestions on how to turn the pseudo code into valid bash statements?
Saving the json:
{
"Names": [
{ "Name": "A", "Age": "1" },
{ "Name": "B", "Age": "2" },
{ "Name": "C", "Age": "3" }
]
}
as /tmp/input.txt I can run:
</tmp/input.txt jq --raw-output 'foreach .Names[] as $name ([];[];$name | .Name, .Age )' \
| while read -r name && read -r age; do
printf "My name is %s, I'm %d years old.\n" "$name" "$age";
printf -- "--\n";
done
The --raw-output with | .Name, .Age just prints two lines per .Names array member, one with name and another with age. Then I read two lines at a time with while read && read and use that to loop through them.
If you'd rather have:
["A","1"]
["B","2"]
["C","3"]
that's unfortunate; the best approach would be to write a full parser that takes strings like "\"" into account. Anyway, then you can:
</tmp/input2.txt sed 's/^\[//;s/\]$//;' \
| while IFS=, read name age; do
name=${name%\"};
name=${name#\"};
age=${age%\"};
age=${age#\"};
printf "My name is %s, I'm %d years old.\n" "$name" "$age";
printf -- "--\n";
done
The first sed removes the leading and trailing [ and ] on each line. Then I read two strings separated by , (so values like "a,b","c,d" will be read incorrectly). Then these two strings are stripped of their leading and trailing ". Then the usual printf is used to output the result.
I have written a simple script to achieve what you need:
My JSON file test.json, which is similar to your snippet:
{
"Names": [
{ "Name": "A", "Age": "1" },
{ "Name": "B", "Age": "2" },
{ "Name": "C", "Age": "3" }
]
}
My script:
#!/bin/bash
for i in $(cat test.json | jq -r '.Names[] | @base64'); do
    _jq() {
        echo "${i}" | base64 --decode | jq -r "${1}"
    }
    echo "My Name is $(_jq '.Name'), I'm $(_jq '.Age') years old"
done
Note that foreach .Names[] as $name ([];[];$name | .Name, .Age )
can be simplified to:
.Names[] | ( .Name, .Age )
or even in this specific case to:
.Names[][]
or for that matter to:
.[][][]
The important point, however, is that foreach is not needed to achieve simple iteration.
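Putting that together with the earlier bash loop, a sketch using the simplified filter (assuming the same /tmp/input.txt):
</tmp/input.txt jq --raw-output '.Names[] | .Name, .Age' \
| while read -r name && read -r age; do
    printf "My name is %s, I'm %d years old.\n" "$name" "$age";
    printf -- "--\n";
done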
