jq command throws error "cannot iterate over string" when - bash

I am writing a bash script utilizing jq to filter JSON entries given some bash variables and return some of the key values from each entry into a tab-delimited file. I think the first few lines of this command are okay, but I think the 4th line is causing the problem. I have tried piping each entry in line 4 to tostring, but to no avail.
info=`cat $FILE | jq -r \
    ' .[] \
    | map(select(.host | contains(env.A))) \
    | [."ip.A",."ts",."ip.B"] \
    | @tsv'`
JSON example entry:
{
  "ts": "2019-06-19T00:00:00.000000Z",
  "ip.A": "0.0.0.0",
  "ip.B": "0.0.0.0",
  "host": "www.google.com",
}
In these files, there are no brackets surrounding the entire text within the file.
Error Given:
jq: error (at <stdin>:0): Cannot iterate over string ("2019-06-18...)
Do I need to handle ".ts" in some special way?

This code is broken long before the 3rd line.
If there isn't an outer array or object, you can't use .[].
If your data type is an object and not a list, using map() on it throws away data (specifically, it discards the keys, leaving only the values).
...so, .[] iterates over the values in your object, and then map() tries to iterate over each of those values as if it was an iterable object itself, which it's not... hence your error.
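A minimal way to reproduce that failure (a sketch using just one field from the example entry):

jq '.[] | map(.)' <<<'{"host": "www.google.com"}'
# fails with: Cannot iterate over string ("www.google.com")

The .[] yields the string value, and map() then tries to iterate over that string, which is exactly the error reported.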
A version cut down to remove the broken parts might look like:
a="google.com" jq -r '
if (.host | contains(env.a)) then
[."ip.A",."ts",."ip.B"] | #tsv
else
empty
end
' <<'EOF'
{
"ts": "2019-06-19T00:00:00.000000Z",
"ip.A": "0.0.0.0",
"ip.B": "0.0.0.0",
"host": "www.google.com"
}
EOF
...which works fine.
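To apply the same idea to the original script, one could select matching entries directly from the stream of objects in the file and capture the tab-separated rows into the variable. A sketch, assuming A holds the host fragment to match (the value "google.com" here is just an example):

info=$(A="google.com" jq -r '
  select(.host | contains(env.A))
  | [."ip.A", .ts, ."ip.B"]
  | @tsv
' "$FILE")

Because the file is a stream of top-level objects, jq applies the filter to each object in turn; no .[] or map is needed.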

Related

How to use (read) a value from an associative array during jq map?

I am reading AWS SSM parameters via jq from JSON, and want to put them into a YAML file after some processing and alias mapping.
I have one associative array mapping parameter names to internal names (see process.sh).
My simplified example.json is
{
  "Parameters": [
    {
      "Name": "/dev/applications/web/some_great_key",
      "Value": "magicvaluenotrelevant"
    },
    {
      "Name": "/dev/applications/web/api_magic",
      "Value": "blabla"
    }
  ]
}
My bash script process.sh is
#!/usr/bin/env bash
declare -A ssmMap
ssmMap[/dev/applications/web/some_great_key]=theGreatKEYAlias
ssmMap[/dev/applications/web/api_magic]=apiKey
jq -r '
  .Parameters
  | map({
      Name: .Name,
      parameterValue: .Value
    })
  | map((${!ssmMap[.Name]} + ": \"" + .parameterValue + "\""))
  | join("\n")
' < example.json;
I want/expected the output to be:
ssm_theGreatKEYAlias: "magicvaluenotrelevant"
ssm_apiKey: "blabla"
With the process.sh script that I provided I get an error, because I cannot find a way to use the associative array ssmMap inside the jq map to transform the parameter name from the JSON into the mapped alias from ssmMap.
Any idea how that can be achieved ?
If I change the map line of process.sh (the one using ${!ssmMap[.Name]}) into
| map((.Name + ": \"" + .parameterValue + "\""))
it works fine, but without the mapping; it uses the original parameter name as it comes from the JSON file.
You're trying to access a shell var from jq.
Fixed:
ssm_map='
{
  "some_great_key": "theGreatKEYAlias",
  "api_magic": "apiKey"
}
'
jq -r --argjson ssm_map "$ssm_map" '
  .Parameters[] |
  "ssm_\( $ssm_map[ .Name ] ): \"\( .Value )\""
'
Demo on jqplay
You could convert the variable you describe into JSON with varying levels of difficulty (depending on what limitations, if any, you're willing to accept). But since you're building the variable from the output of yq, you could simply have it output JSON directly.
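For the first option (turning the existing bash associative array into an --argjson payload), one possible sketch, assuming neither keys nor values contain tab characters:

ssm_map=$(
  for k in "${!ssmMap[@]}"; do
    printf '%s\t%s\n' "$k" "${ssmMap[$k]}"
  done | jq -Rn '[inputs | split("\t") | {(.[0]): .[1]}] | add'
)

The resulting object maps each full parameter name to its alias and can be passed with --argjson ssm_map "$ssm_map" as in the answer above.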
Using mikefarah/yq or kislyuk/yq or itchyny/gojq is definitely the better approach if your actual task is to convert from YAML and to process as JSON.
However, to accomplish the task you have described in the original question, one could add the keys and values of the bash associative array using the --args option and access them within jq via the $ARGS.positional array (requires jq version 1.6). The association can then be restored by transposing both halves of that array and creating an INDEX keyed on the first half, which is then used with JOIN to build the desired text lines, output as raw text with the -r option.
jq -r '
  JOIN(
    $ARGS.positional | [.[:length/2], .[length/2:]] | transpose | INDEX(.[0]);
    .Parameters[];
    .Name;
    "ssm_\(.[1][1] // .[0].Name): \(.[0].Value)"
  )
' example.json --args "${!ssmMap[@]}" "${ssmMap[@]}"
ssm_theGreatKEYAlias: magicvaluenotrelevant
ssm_apiKey: blabla
The fallback // .[0].Name is unnecessary for the sample file, but was added to use the original key in case there's an input key not covered by the bash array.

How can I use jq 'try-catch' inside of an object construction

The general problem statement is that I have some JSON data with fields that may or may not exist. Patterns like try .foo catch "bar" or .foo? // "bar" work just fine on their own, but not inside of an object construction: cat myfile.json | jq -r '{foo: try .foo catch "bar"}'. Using an object construction is important for my use case, where I need to pass the same input through many filters and format the data nicely for later functions.
The data I'm working with is the output of kubectl get pods xxxxx -o json, and the part I'm trying to parse looks like this
{
  "items": [
    {
      "status": {
        "conditions": [
          {
            "type": "Initialized",
            "message": "foo"
          },
          {
            "type": "Ready"
          }
        ]
      }
    },
    {
      "status": {
      }
    }
  ]
}
For each item, if it has .status.conditions, I want the message from the first element in .status.conditions that has a message. The filter I used looks like this
jq -r '.items[] | {message: .status.conditions?|map(select(has("message")).message)[0]?}'
The problem is that when it gets to an item that doesn't have a .status.conditions, it returns the error Cannot iterate over null (null). This makes sense given that .status.conditions? passes null to the next filter, map, which needs to iterate over a list. My attempted solution was to use various forms of try-catch to pass an empty list to map instead of null
jq -r '.items[] | {message: .status | try .conditions catch [] |map(select(has("message")).message)[0]?}'
jq -r '.items[] | {message: .status.conditions? // []|map(select(has("message")).message)[0]?}'
or to use a ? that includes the map call
jq -r '.items[] | {message: (.status.conditions?|map(select(has("message")).message)[0])?}'
jq -r '.items[] | {message: .status.conditions?|(map(select(has("message")).message)[0])?}'
All of these attempts fail with a compile error when written inside of an object constructor as shown, and work as expected when written on their own without an object constructor (e.g. '.items[] | .status.conditions?|(map(select(has("message")).message)[0])?')
The jq man page doesn't give any warnings (that I noticed) about how object construction changes the syntax requirements of what's inside, nor how try-catch can be affected by being inside an object construction. Any ideas?
Thanks to @Fravadona, it was just a matter of parentheses. I had tried it at some point and made some mistake or other, but a working solution for my case is
jq -r '{message: .status | (.conditions? // []) | map(select(has("message")).message)[0]?}'
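For reference, applying the same parenthesized form across every item of the sample above might look like this (a sketch; myfile.json is the file from the question):

jq -r '.items[] | {message: ((.status.conditions? // []) | map(select(has("message")).message)[0]?)}' myfile.json

which yields {"message": "foo"} for the first item and {"message": null} for the item with an empty status.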

bash loop error : Get JSON Object by property with jq / bash

I would like to get the values from a JSON file, which is working.
JsonFileToTest:
{
  "permissions": [
    {
      "emailid": "test1@test.com",
      "rights": "read"
    },
    {
      "emailid": "test2@test.com",
      "rights": "read"
    }
  ]
}
readPermissions=($(jq -r '.permissions' JsonFileToTest))
# The command below works perfectly, but when I put it in a loop, it does not.
#echo ${readPermissions[@]} | jq 'values[].emailid'
for vals in ${readPermissions[@]}
do
    # I would like to extract the email id of the user. The loop is not working atm.
    echo ${vals[@]} | jq 'values[].emailid'
done
What am I missing here?
Thanks
If you really want to do it this way, that might look like:
readarray -t permissions < <(jq -c '.permissions[]' JsonFileToTest)
for permissionSet in "${permissions[@]}"; do
    jq -r '.emailid' <<<"$permissionSet"
done
Note that we're telling jq to print one line per item (with -c), and using readarray -t to read each line into an array element (unlike the array=( $(...command...) ) antipattern, which splits not just on newlines but on other whitespace as well, and expands globs in the process).
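As a quick illustration of the difference (a sketch; the JSON snippet is made up for the example):

json='{"emailid": "test1@test.com"}'
bad=( $json )                                   # unquoted expansion: word-splits into 2 elements
readarray -t good < <(printf '%s\n' "$json")    # one element containing the whole line
echo "${#bad[@]} vs ${#good[@]}"                # prints: 2 vs 1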
But there's no reason whatsoever to do any of that. You'll get the exact same result simply running:
jq -r '.permissions[].emailid' JsonFileToTest
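With the JsonFileToTest shown above, that single invocation should print:

test1@test.com
test2@test.com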

How do I concatenate dummy values in JQ based on field value, and then CSV-aggregate these concatenations?

In my bash script, when I run the following jq against my curl result:
curl -u someKey:someSecret someURL 2>/dev/null | jq -r '.schema' | jq -r -c '.fields'
I get back a JSON array as follows:
[{"name":"id","type":"int","doc":"Documentation for the id field."},{"name":"test_string","type":"string","doc":"Documentation for the test_string field"}]
My goal is to do a call with jq applied to return the following (given the example above):
{"id":1234567890,"test_string":"xxxxxxxxxx"}
NB: I am trying to automatically generate templated values that match the "schema" JSON shown above.
So just to clarify, that is:
all array objects (there could be more than 2 shown above) returned in a single comma-delimited row
doc fields are ignored
the values for "name" (including their surrounding double-quotes) are concatenated with either:
:1234567890 ...when the "type" for that object is "int"
":xxxxxxxxxx" ...when the "type" for that object is "string"
NB: these will be the only types we ever get for now
Can someone show me how I can expand upon my initial jq to return this?
NB: I tried working down the following path but am failing beyond this...
curl -u someKey:someSecret someURL 2>/dev/null | jq -r '.schema' | jq -r -c '.fields' | "\(.name):xxxxxxxxxxx"'
If it's not possible in pure JQ (my preference) I'm also happy for a solution that mixes in a bit of sed/awk magic :)
Cheers,
Stan
Given the JSON shown, you could add the following to your pipeline:
jq -c 'map({(.name): (if .type == "int" then 1234567890 else "xxxxxxxxxx" end)})|add'
With that JSON, the output would be:
{"id":1234567890,"test_string":"xxxxxxxxxx"}
However, it would be far better if you combined the three calls to jq into one.
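A sketch of what the combined call might look like, assuming .schema is a JSON-encoded string (which the chain of -r and a second jq suggests); if .schema is already an object, drop the fromjson:

curl -u someKey:someSecret someURL 2>/dev/null |
  jq -c '.schema | fromjson | .fields
         | map({(.name): (if .type == "int" then 1234567890 else "xxxxxxxxxx" end)})
         | add'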

How can I convert a "key: value" sequence into JSON?

Okay, I am trying to write a script that takes information from yum repolist all and puts it into pretty JSON for me to use in some data collecting. Right now I have my output from the yum command looking like this.
All I have for code right now is just the yum repolist command.
#!/bin/bash -x
yum -v repolist all | grep -B2 -A6 "enabled" | sed 's/[[:space:]]//g' , 's/--//g' , 's/name=name=/name=/g'
the output from that command looks like:
Repo-id: wazuh_repo
Repo-name: Wazuhrepository
Repo-status: enabled
Repo-revision: 1536348945
Repo-updated: FriSep712:35:512018
Repo-pkgs: 73
Repo-size: 920M
Repo-baseurl: https://packages.wazuh.com/3.x/yum/
Repo-expire: 21,600second(s)(last:WedOct3108:59:002018)
There are about 8 entries and the titles are always the same... Can someone explain like I am five how to convert this into JSON? I've read the jq man page, I've read about hashes; nothing seems to make sense. I know I need to have a "key"/"value" -- how do I designate these?
I just want to take the output and make it look like pretty JSON, this is part of a larger script I am writing to help keep ontop of the repos we use at work. I am just totally not getting JSON though.
edit: I would prefer not to use a wrapper function and do/learn the proper way
So, first, so people who don't have yum can test this, let's make a wrapper function:
write_output() { cat <<EOF
Repo-id: wazuh_repo
Repo-name: Wazuhrepository
Repo-status: enabled
Repo-revision: 1536348945
Repo-updated: FriSep712:35:512018
Repo-pkgs: 73
Repo-size: 920M
Repo-baseurl: https://packages.wazuh.com/3.x/yum/
Repo-expire: 21,600second(s)(last:WedOct3108:59:002018)
EOF
}
Notably, all your keys come before the string ": ", and the values come after it -- so we want to read line-by-line, split on the colon-space sequence, treat what was in front as a key, and treat what's in back as a value.
Given that:
jq -Rn '[inputs | split(": ")] | reduce .[] as $kv ({}; .[$kv[0]] = $kv[1])' < <(write_output)
...properly emits:
{
  "Repo-id": "wazuh_repo",
  "Repo-name": "Wazuhrepository",
  "Repo-status": "enabled",
  "Repo-revision": "1536348945",
  "Repo-updated": "FriSep712:35:512018",
  "Repo-pkgs": "73",
  "Repo-size": "920M",
  "Repo-baseurl": "https://packages.wazuh.com/3.x/yum/",
  "Repo-expire": "21,600second(s)(last:WedOct3108:59:002018)"
}
...so, how does that work?
jq -R turns on raw input mode; input is parsed as a sequence of raw strings, not as a sequence of JSON documents.
jq -n treats null as the only direct input, so one can then use input and inputs primitives inside the script where needed.
[ inputs ] reads all your lines of input, and puts them into a single array.
[ inputs | split(": ")] changes that from an array of strings to an array of lists -- with content both before and after the ": " sequence.
reduce .[] as $kv ( {}; ... ) starts a reducer, with an initial value of {}, and then feeds each value that .[] evaluates to (which is to say, each item in your list) into that reducer (the ... code) as the $kv variable, replacing the . value each time; here the body .[$kv[0]] = $kv[1] adds each key/value pair to the object being built.
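To make that last step concrete, here is a standalone sketch of the same reduce pattern on made-up data:

jq -n 'reduce (["a","1"], ["b","2"]) as $kv ({}; .[$kv[0]] = $kv[1])'
# produces {"a":"1","b":"2"}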
To run this with your yum command as the real input, change < <(write_output) to < <(yum -v repolist all | grep -B2 -A6 "enabled" | sed 's/[[:space:]]//g' , 's/--//g' , 's/name=name=/name=/g').
Here is a slightly more robust variation of @CharlesDuffy's answer. Since the latter provides excellent explanatory notes, further explanations are not given here.
jq -nR '
[inputs | index(": ") as $ix | {(.[:$ix]): .[$ix+2:]}]
| add'
This avoids using split in case the "value" contains ": ". It might, however, still be better not to assume that a space follows the first relevant ":".
Notice also that add is used here instead of reduce, solely for compactness and simplicity.
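A sketch of a variant that does not assume a space follows the first ":" (it strips at most one leading space from the value):

jq -nR '[inputs | index(":") as $ix | {(.[:$ix]): (.[$ix+1:] | ltrimstr(" "))}] | add' < <(write_output)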
For these sorts of problems, I would prefer to use a regular expression to match keys and values. Otherwise, I would take an approach similar to Charles's.
$ ... | jq -Rn 'reduce (inputs | capture("(?<k>[^:]+):\\s*(?<v>.+)")) as {$k, $v} ({}; .[$k] = $v)'
