Iterate over an array of objects and format the string - bash

I have a json file like this:
[
{
"classname": "Test endpoint",
"name": "expect failure",
"failure_system_out": "expected 404 Not Found\nError in test endpoint\n\tat Test._assertStatus"
},
{
"classname": "Test inner functions",
"name": "expect failure",
"failure_system_out": "Example fo test\n\tExpect 4 and got 5"
}
]
As you see the value in "failure_system_out" is a string containing newline chars (\n) and tab chars (\t).
I am trying to read the file, loop around the objects and print them with this code:
jq -c '.[]' myfile.json | while read i; do
test_name=$(echo "$i" | jq -r .name)
system_error=$(echo "$i" | jq -r .failure_system_out)
printf "${system_error}"
done
The problem is that using this approach, printf doesn't print the string according to the newline & tab chars; it prints something like this: expected 404 Not FoundnError in test endpointntat Test._assertStatus
Basically, I think that jq -c removes the \ char and therefore the printf doesn't work properly.
How can I iterate over an array of objects stored in a file and keep the chars used to format the string?
Desired output for the first item:
expected 404 Not Found
Error in test endpoint
at Test._assertStatus
Desired output for the second item:
Example fo test
Expect 4 and got 5

Just use jq; it's a scripting language in its own right.
$ jq -r '.[0].failure_system_out' /tmp/1
expected 404 Not Found
Error in test endpoint
at Test._assertStatus
$ jq -r '.[1].failure_system_out' /tmp/1
Example fo test
Expect 4 and got 5
$ jq -r '.[] | .name as $test_name | .failure_system_out as $system_error | $system_error' /tmp/1
expected 404 Not Found
Error in test endpoint
at Test._assertStatus
Example fo test
Expect 4 and got 5
As for using bash, first read https://mywiki.wooledge.org/BashFAQ/001 . I like using base64 to properly transfer context from jq to bash and handle all corner cases.
jq -r '.[] | @base64' /tmp/1 |
while IFS= read -r line; do
line=$(<<<"$line" base64 -d);
test_name=$(<<<"$line" jq -r .name);
system_error=$(<<<"$line" jq -r .failure_system_out);
printf "%s\n" "$system_error";
done
but it's not needed here, just a proper while read loop should be enough:
jq -c '.[]' /tmp/1 |
while IFS= read -r line; do
test_name=$(<<<"$line" jq -r .name);
system_error=$(<<<"$line" jq -r .failure_system_out);
printf "%s\n" "$system_error";
done

The question seems to weave amongst several goals, but in any case:
there is no need for jq to be called more than once, and
there should be no need to use base64 conversions, except possibly if the values corresponding to the keys of interest contain NULs.
If the goal is simply to emit the values of .failure_system_out then:
jq -r '.[].failure_system_out' test.json
would do it.
If the values of both .name and .failure_system_out must be made available separately as bash variables, then consider:
while IFS= read -d $'\0' system_error ; do
IFS= read -d $'\0' test_name
printf "%s\n" name="$test_name"
printf "%s\n" fso="$system_error"
echo ""
done < <(jq -rj '.[] | [.name, .failure_system_out, ""] | join("\u0000")' test.json)
readarray could also be used -- see e.g.
Storing JQ NULL-delimited output in bash array
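For instance, a minimal sketch using readarray with NUL delimiters (assuming bash 4.4+ for the -d '' option):
# emit name/failure_system_out pairs separated by NULs, then read them into one flat array
readarray -d '' fields < <(jq -j '.[] | .name, "\u0000", .failure_system_out, "\u0000"' test.json)
# fields[0] is the first name, fields[1] the first failure_system_out, and so on
printf '%s\n' "${fields[1]}"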

@KamilCuk's answer works great and gives quite a bit more control.
Thought I'd still share this jq only solution:
printf "%s\n" "$(jq -r -c '.[] | .failure_system_out' test.json)"
This will produce:
expected 404 Not Found
Error in test endpoint
at Test._assertStatus
Example fo test
Expect 4 and got 5

Related

How to use variable with jq cmd in shell

I am facing an issue with the commands below. I need to use a variable, but it returns null, whereas when I hardcode its value, it returns the correct response.
Can anybody tell me the correct way of writing this command?
My intention is to pull the value of the corresponding key passed as a variable.
temp1="{ \"SSM_DEV_SECRET_KEY\": \"Smkfnkhnb48dh\", \"SSM_DEV_GRAPH_DB\": \"Prod=bolt://neo4j:Grt56#atc.preprod.test.com:7687\", \"SSM_DEV_RDS_DB\": \"sqlite:////var/local/ecosystem_dashboard/config.db\", \"SSM_DEV_SUPPERUSER_USERNAME\": \"admin\", \"SSM_DEV_SUPPERUSER_PASSWORD\": \"9dW6JE8#KH9qiO006\" }"
var_name=SSM_DEV_SECRET_KEY
echo $temp1 | jq -r '.SSM_DEV_SECRET_KEY' <----- return Smkfnkhnb48dh // output
echo $temp1 | jq -r '."$var_name"' <---- return null
echo $temp1 | jq -r --arg var_name "$var_name" '."$var_name"' <---- return null , alternative way
Update: I am adding the actual piece of code where I am trying to use the above fix. My intention is to first read all values which start with SSM_DEV_..., get their original values from AWS, and then replace them. One key pair looks like this --> SECRET_KEY=$SSM_DEV_SECRET_KEY
temp0="dev"
temp1="DEV"
result1=$(aws secretsmanager get-secret-value --secret-id "xxx-secret-$temp0" | jq '.SecretString')
while IFS= read -r line; do
if [[ "$line" == *"=\$SSM_$temp1"* ]]; then
before=${line%%"="*}
after=${line#*"="}
var_name="${after:1}"
jq -r --arg var_name "$var_name" '.[$var_name]' <<< "$result1"
fi
done < sample_file.txt
Fix: I have solved my issue, which was due to a carriage return character.
The below command helped me:
var_name=`echo ${after:1} | tr -d '\r'`
jq -r --arg var_name "$var_name" '.[$var_name]' <<< "$result1"
You'll need to use the Generic Object Index (.[$var_name]) to let jq know the variable should be seen as a key.
The command should look like:
jq -r --arg var_name "$var_name" '.[$var_name]' <<< "$temp1"
Which will output:
Smkfnkhnb48dh
Note: <<< "$temp1" instead of the echo
Let's look at the following statement:
echo $temp1 | jq -r '."$var_name"' <---- return null
Your problem is actually with the shell quoting and not jq. The single quotes tell the shell not to interpolate (do variable substitution), among other things (like escaping white space and preventing globbing). Thus, jq is receiving literally ."$var_name" as its script - which is not what you want. You simply need to remove the single quotes and you'll be good:
echo $temp1 | jq -r ."$var_name" <---- Does this work?
That said, I would never write my script that way. I would definitely want to include the '.' in the quoted string like this:
echo $temp1 | jq -r ".$var_name" <---- Does this work?
Some would also suggest that you quote "$temp1" as well (typically all variable references should be quoted to protect against white space, but this is not a problem with echo):
echo "$temp1" | jq -r ".$var_name" <---- Does this work?

Is there a way to output jq into multiple variables for bash script?

Basically I have a bash script that at one point makes an API call and a cert and key are generated and returned in json. I pipe it to jq and can select either the cert or the key and store it in a variable.
Something like this:
CERT=$(API call | jq -r '.certificate')
or
KEY=$(API call | jq -r '.key')
I want to store each in its own variable but I can't make the call twice because it will generate a new cert/key.
I know that I can just store both in a file and then manipulate after to accomplish my task but I am curious if jq offers a direct way to selectively store each value in its own variable?
You could store the original JSON in a variable (instead of a file), and then extract the key and cert from that:
apiResult=$(API call)
cert=$(jq -r '.certificate' <<<"$apiResult")
key=$(jq -r '.key' <<<"$apiResult")
Notes: I recommend using lower- or mixed-case variables, to avoid accidental conflicts with the many all-caps names with special meanings. Also, <<< is a bashism, and won't work in all other shells; if you need this be portable to e.g. dash, use something like key=$(printf '%s\n' "$apiResult" | jq -r '.key').
Unless there are NUL characters in the strings, there should be no need to call jq more than once, or to serialize the data.
In the simplest case, you could proceed along the following lines:
{ IFS= read -r certificate
IFS= read -r key
echo "certificate=$certificate"
echo "key=$key"
} < <(API call | jq -r '.certificate, .key')
If the values do not contain NUL but might contain newline characters,
then you could use NUL as the delimiter. For the sake of variety,
we could also use a while loop:
while IFS= read -r -d $'\0' certificate
do
IFS= read -r -d $'\0' key
echo "certificate=$certificate"
echo "key=$key"
done < <(API call | jq -rj '[.certificate, .key] | join("\u0000")')
Conversely ...
If the values of interest might contain literal NUL values ("\u0000"), then the question seems to be problematic, as bash variables in effect cannot contain literal NULs.
If any of the values of interest might contain literal NUL values, then here are two strategies for extracting the "raw" string equivalents into separate files:
Save the JSON output in a (temporary) file, and invoke jq -r once per value of interest in the obvious way.
Set up a bash pipeline starting with:
API call | jq -r '.certificate, .key | @base64'
and continuing with a loop in which each line is decoded, e.g. using base64 --decode or jq's @base64d.
The second strategy might make sense if the API call produces a very large JSON document.
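A minimal sketch of the second strategy, again with "API call" standing in for the actual command:
{
  IFS= read -r cert_b64
  IFS= read -r key_b64
  certificate=$(base64 --decode <<<"$cert_b64")  # decoding restores any embedded newlines
  key=$(base64 --decode <<<"$key_b64")
} < <(API call | jq -r '.certificate, .key | @base64')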
If neither of the values contain a newline, you can easily use read:
$ read cert key < <( echo '{"certificate": "foo", "key": "bar"}' | jq -r .certificate,.key | tr \\n ' ')
$ echo $cert
foo
$ echo $key
bar
or:
$ { read cert; read key; } << EOF
> $( echo '{"certificate": "foo", "key": "bar"}' | jq -r .certificate,.key )
> EOF
$ echo $cert:$key
foo:bar
The certificate almost certainly will contain newlines, so you may want to serialize that data (eg, base64 encode it). But you're probably better off using Gordon Davisson's approach.
Say you created a program that outputs shell commands.
$ data_source | jq -r '@sh "certificate=\( .certificate )\nkey=\( .key )"'
certificate='the certificate'
key='the key'
Then, all you would need to do is evaluate them.
eval "$( data_source | jq -r '#sh "certificate=\( .certificate )\nkey=\( .key )"' )"
If you were dealing with many variables, you could use the following:
eval "$(
data_source |
jq -r '
{ "certificate": "certificate", "key": "key" } as $map |
( $map | keys_unsorted[] ) as $shell_var |
$shell_var + @sh "=\( .[$map[$shell_var]] )"
'
)"
You could even use paths.
eval "$(
data_source |
jq -r '
{ "certificate": [ "certificate" ], "key": [ "key" ] } as $map |
( $map | keys_unsorted[] ) as $shell_var |
$shell_var + @sh "=\( getpath($map[$shell_var]) )"
'
)"

How to get the output using jq for a json array for each value [duplicate]

I parsed a json file with jq like this :
# cat test.json | jq '.logs' | jq '.[]' | jq '._id' | jq -s
It returns an array like this : [34,235,436,546,.....]
Using a bash script I declared an array:
# declare -a msgIds = ...
This array uses () instead of [], so when I pass the array given above to it, it won't work.
([324,32,45..]) causes a problem. If I remove the jq -s, an array forms with only 1 member in it.
Is there a way to solve this issue?
We can solve this problem in two ways:
Input string:
// test.json
{
"keys": ["key1","key2","key3"]
}
Approach 1:
1) Use jq -r (output raw strings, not JSON texts).
KEYS=$(jq -r '.keys' test.json)
echo $KEYS
# Output: [ "key1", "key2", "key3" ]
2) Use @sh (converts the input to a series of space-separated, shell-quoted strings). It removes the square brackets [] and commas (,) from the string.
KEYS=$(<test.json jq -r '.keys | @sh')
echo $KEYS
# Output: 'key1' 'key2' 'key3'
3) Use tr to remove the single quotes from the string output. To delete specific characters, use the -d option in tr.
KEYS=$(<test.json jq -r '.keys | @sh' | tr -d \')
echo $KEYS
# Output: key1 key2 key3
4) We can convert the space-separated string to an array by placing the string output in round brackets ().
This is also called compound assignment, where we declare the array with a bunch of values.
ARRAYNAME=(value1 value2 .... valueN)
#!/bin/bash
KEYS=($(<test.json jq -r '.keys | @sh' | tr -d \'\"))
echo "Array size: " ${#KEYS[@]}
echo "Array elements: " ${KEYS[@]}
# Output:
# Array size: 3
# Array elements: key1 key2 key3
Approach 2:
1) Use jq -r to get the string output, then use tr to delete characters like square brackets, double quotes and comma.
#!/bin/bash
KEYS=$(jq -r '.keys' test.json | tr -d '[],"')
echo $KEYS
# Output: key1 key2 key3
2) Then we can convert the resulting string to an array by placing the string output in round brackets ().
#!/bin/bash
KEYS=($(jq -r '.keys' test.json | tr -d '[]," '))
echo "Array size: " ${#KEYS[#]}
echo "Array elements: "${KEYS[#]}
# Output:
# Array size: 3
# Array elements: key1 key2 key3
To correctly parse values that have spaces, newlines (or any other arbitrary characters) just use jq's @sh filter and bash's declare -a. (No need for a while read loop or any other pre-processing)
// foo.json
{"data": ["A B", "C'D", ""]}
str=$(jq -r '.data | @sh' foo.json)
declare -a arr="($str)" # must be quoted like this
$ declare -p arr
declare -a arr=([0]="A B" [1]="C'D" [2]="")
The reason that this works correctly is that @sh will produce a space-separated list of shell-quoted words:
$ echo "$str"
'A B' 'C'\''D' ''
and this is exactly the format that declare expects for an array definition.
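As a quick usage check, iterating over the reconstructed array shows that the spaces, quote, and empty string survive intact:
for item in "${arr[@]}"; do
  printf '<%s>\n' "$item"   # prints <A B>, then <C'D>, then <>
done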
Use jq -r to output a string "raw", without JSON formatting, and use the @sh formatter to format your results as a string for shell consumption. Per the jq docs:
@sh:
The input is escaped suitable for use in a command-line for a POSIX shell. If the input is an array, the output will be a series of space-separated strings.
So you can do e.g.
msgids=($(<test.json jq -r '.logs[]._id | @sh'))
and get the result you want.
From the jq FAQ (https://github.com/stedolan/jq/wiki/FAQ):
Q: How can a stream of JSON texts produced by jq be converted into a bash array of corresponding values?
A: One option would be to use mapfile (aka readarray), for example:
mapfile -t array <<< $(jq -c '.[]' input.json)
An alternative that might be indicative of what to do in other shells is to use read -r within a while loop. The following bash script populates an array, x, with JSON texts. The key points are the use of the -c option, and the use of the bash idiom while read -r value; do ... done < <(jq .......):
#!/bin/bash
x=()
while read -r value
do
x+=("$value")
done < <(jq -c '.[]' input.json)
++ To resolve this, we can use a very simple approach:
++ Since I am not aware of your input file, I am creating a file input.json with the following contents:
input.json:
{
"keys": ["key1","key2","key3"]
}
++ Use jq to get the value from the above file input.json:
Command: cat input.json | jq -r '.keys | @sh'
Output: 'key1' 'key2' 'key3'
Explanation: | @sh removes the [ ] and " characters.
++ To remove the ' characters as well, we use tr:
command: cat input.json | jq -r '.keys | @sh' | tr -d \'
Explanation: use tr's delete option (-d) to remove '
++ To store this in a bash array we use () with backticks (command substitution) and print it:
command:
KEYS=(`cat input.json | jq -r '.keys | @sh' | tr -d \'`)
To print all the entries of the array: echo "${KEYS[*]}"

Using yq in for loop bash

I have a yaml array like below,
identitymappings:
- arn: "arn:aws:iam::12345567:role/AdmRole"
group: "system:masters"
user: "user1"
- arn: "arn:aws:iam::12345567:role/TestRole"
group: "system:masters"
user: "user2"
I am trying to parse this yaml in a bash script using a for loop and yq.
for identityMapping in $(yq read test.yaml "identitymappings[*]"); do
roleArn=$identityMapping["arn"]
group=$identityMapping.group
user=$identityMapping.user
done
But I am not getting the expected results: I am not able to fetch the values of roleArn, group, user.
Please let me know how to fix this.
The way I would do it is:
# load array into a bash array
# need to output each entry as a single line
readarray identityMappings < <(yq e -o=j -I=0 '.identitymappings[]' test.yml )
for identityMapping in "${identityMappings[@]}"; do
# identity mapping is a yaml snippet representing a single entry
roleArn=$(echo "$identityMapping" | yq e '.arn' -)
echo "roleArn: $roleArn"
done
output:
roleArn: arn:aws:iam::12345567:role/AdmRole
roleArn: arn:aws:iam::12345567:role/TestRole
Disclaimer: I wrote yq
I wasn't able to comment on Charles Duffy's proper answer, but this works for yq v4 without the use of jq...
while IFS=$'\t' read -r roleArn group user _; do
echo "Role: $roleArn"
echo "Group: $group"
echo "User: $user"
done < <(yq e '.identitymappings[] | [.arn, .group, .user] | @tsv' test.yaml)
The easiest way to read from jq or yq into bash is to use a BashFAQ #1 while read loop to handle line-oriented data; in the below, we use @tsv to generate line-oriented output:
while IFS=$'\t' read -r roleArn group user _; do
echo "Role: $roleArn"
echo "Group: $group"
echo "User: $user"
done < <(yq -j read test.yaml \
| jq -r '.identitymappings[] | [.arn, .group, .user] | @tsv')
Note that if you were using the Python yq rather than the Go one, you could remove the yq -j read and just use yq -r '...' in place of jq -r '...'.
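For illustration, the Python-yq version would then presumably reduce to a single invocation (assuming it accepts the jq filter directly, as described above):
while IFS=$'\t' read -r roleArn group user _; do
  echo "Role: $roleArn"
  echo "Group: $group"
  echo "User: $user"
done < <(yq -r '.identitymappings[] | [.arn, .group, .user] | @tsv' test.yaml)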
There is an improvement on @Rad4's answer that worked for me.
You can neatly loop through using the latest yq and jq via:
for im in $(yq eval -o=j test.yaml | jq -cr '.identitymappings[]'); do
arn=$(echo $im | jq -r '.arn' -)
group=$(echo $im | jq -r '.group' -)
user=$(echo $im | jq -r '.user' -)
echo $arn $group $user
done
This loops through valid one-line JSONs, which means jq still works inside the loop.
The answer by @mike.f is a good one. However, it does not work on OSX machines, because readarray is not an available command.
You can read more about this here.
Here is the equivalent that would work on a mac:
# load array into a bash array
# need to output each entry as a single line
identityMappings=( $(yq e -o=j -I=0 '.identitymappings[]' test.yml ) )
for identityMapping in "${identityMappings[@]}"; do
# identity mapping is a yaml snippet representing a single entry
roleArn=$(echo "$identityMapping" | yq e '.arn' -)
echo "roleArn: $roleArn"
done
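Note that the unquoted command substitution above word-splits on all whitespace, so JSON entries containing spaces would be broken apart. A more defensive mac-friendly sketch reuses the while read idiom quoted from the jq FAQ earlier:
# append each one-line JSON entry as a single array element, no readarray needed
identityMappings=()
while IFS= read -r entry; do
  identityMappings+=("$entry")
done < <(yq e -o=j -I=0 '.identitymappings[]' test.yml)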
Get the length of identitymappings
Use an index to access each element of identitymappings
array_length=`yq e ".identitymappings | length - 1" test.yaml`
if [ $array_length -lt 0 ] ; then
exit
fi
for element_index in `seq 0 $array_length`; do
arn=`yq e ".identitymappings[$element_index].arn" test.yml`
group=`yq e ".identitymappings[$element_index].group" test.yml`
user=`yq e ".identitymappings[$element_index].user" test.yml`
done
How to get length of array in yq?
https://mikefarah.gitbook.io/yq/operators/length
I figured it out:
for identityMapping in $(yq read test.yaml -j "identitymappings[*]"); do
echo $identityMapping
roleArn=$(echo "$identityMapping" | jq -r '.arn')
echo $roleArn
group=$(echo "$identityMapping" | jq -r '.group')
echo $group
user=$(echo "$identityMapping" | jq -r '.user')
echo $user
done

exporting environment variables with spaces using jq

So, I'm trying to export an environment variable that comes from an API that returns JSON values. I'd like to use jq to do a one-liner, but if the values have spaces I cannot get it working.
Trying without surrounding the value in quotes:
/app/src $ $(echo '{"params":[{ "Name":"KEY","Value":"value with space"}]}' | jq
-r '.params[] | "export " + .Name + "=" + .Value')
/app/src $ printenv KEY
value
/app/src $
Next, I try wrapping the value in quotes:
/app/src $ $(echo '{"params":[{ "Name":"KEY","Value":"value with space"}]}' | jq
-r '.params[] | "export " + .Name + "=\"" + .Value + "\""')
sh: export: space": bad variable name
/app/src $
For all of the below, I'm assuming that:
json='{"params":[{ "Name":"KEY","Value":"value with space"}]}'
It can be done, but ONLY IF YOU TRUST YOUR INPUT.
A solution that uses eval might look like:
eval "$(jq -r '.params[] | "export \(.Name | #sh)=\(.Value | #sh)"' <<<"$json")"
The #sh builtin in jq escapes content to be eval-safe in bash, and the eval invocation then ensures that the content goes through all parsing stages (so literal quotes in the data emitted by jq become syntactic).
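For illustration, with the $json above, the jq command alone emits something like the following (reconstructed from @sh's single-quoting behavior), which eval then executes as an export plus assignment:
export 'KEY'='value with space'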
However, all solutions that allow arbitrary shell variables to be assigned have innate security problems, as the ability to set variables like PATH, LD_LIBRARY_PATH, LD_PRELOAD and the like can be leveraged into arbitrary code execution.
Better form is to generate a NUL-delimited key/value list...
build_kv_nsv() {
jq -j '.params[] |
((.Name | gsub("\u0000"; "")),
"\u0000",
(.Value | gsub("\u0000"; "")),
"\u0000")'
}
...and either populate an associative array...
declare -A content_received=( )
while IFS= read -r -d '' name && IFS= read -r -d '' value; do
content_received[$name]=$value
done < <(build_kv_nsv <<<"$json")
# print the value of the populated associative array
declare -p content_received
...or use a namespace that's prefixed to guarantee safety.
while IFS= read -r -d '' name && IFS= read -r -d '' value; do
printf -v "received_$name" %s "$value" && export "received_$name"
done < <(build_kv_nsv <<<"$json")
# print names and values of our variables that start with received_
declare -p "${!received_#}" >&2
If the values are known not to contain (raw) newlines, and if you have access to mapfile, it may be worthwhile considering using it, e.g.
$ json='{"params":[{ "Name":"KEY","Value":"value with space"}]}'
$ mapfile -t KEY < <( jq -r '.params[] | .Value' <<< "$json" )
$ echo N=${#KEY[@]}
N=1
If the values might contain (raw) newlines, then you'd need a version of mapfile with the -d option, which could be used as illustrated below:
$ json='{"params":[{ "Name":"KEY1","Value":"value with space"}, { "Name":"KEY2","Value":"value with \n newline"}]}'
$ mapfile -d $'\0' KEY < <( jq -r -j '.params[] | .Value + "\u0000"' <<< "$json" )
$ echo N=${#KEY[@]}
N=2
