Is it possible to pass and use in jq a variable of type array?
jq --arg ips "${IPs[0]}" '.nodes.app.ip = $ips[0] | .nodes.data.ip = $ips[1]' nodes.json
The general case solution is to pass that array in on stdin with NUL delimiters:
IPs=( 1.2.3.4 5.6.7.8 )
original_doc='{"nodes": { "app": {}, "data": {} }}'
jq -Rn --argjson original_doc "$original_doc" '
input | split("\u0000") as $ips
| $original_doc
| .nodes.app.ip = $ips[0]
| .nodes.data.ip = $ips[1]
' < <(printf '%s\0' "${IPs[@]}")
...emits as output:
{
"nodes": {
"app": {
"ip": "1.2.3.4"
},
"data": {
"ip": "5.6.7.8"
}
}
}
This is overkill for an array of IP addresses, but it works in the general case, even for actively-hostile arrays (ones with literal quotes, literal newlines, and other data that's intentionally hard-to-parse).
If you want to keep stdin clean, you can use a second copy of jq to convert your array to JSON:
IPs=( 1.2.3.4 5.6.7.8 )
IPs_json=$(jq -Rns 'input | split("\u0000")' < <(printf '%s\0' "${IPs[@]}"))
jq --argjson ips "$IPs_json" '
.nodes.app.ip = $ips[0]
| .nodes.data.ip = $ips[1]
' <<<'{"nodes": { "app": {}, "data": {} }}'
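If your jq is 1.6 or newer, there is a third option worth knowing: the --args flag collects every remaining command-line argument into the $ARGS.positional array, so you need neither stdin tricks nor a second jq invocation. A sketch (assumes jq >= 1.6):

```shell
IPs=( 1.2.3.4 5.6.7.8 )
# --args must come after the filter; everything following it
# is collected, as strings, into $ARGS.positional
jq '.nodes.app.ip = $ARGS.positional[0]
  | .nodes.data.ip = $ARGS.positional[1]' \
  --args "${IPs[@]}" <<<'{"nodes": { "app": {}, "data": {} }}'
```

Like --arg, this only passes strings; use --jsonargs if the array elements are themselves JSON.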
In the given script, the nested key is not getting appended with the value. I could not figure out where the script is going wrong.
#!/bin/bash
echo "Add the figma json file path"
read path
figma_json="$(echo -e "${path}" | tr -d '[:space:]')"
echo $(cat $figma_json | jq -r '.color | to_entries[] | "\(.key):\(.value| if .value == null then .[] | .value else .value end)"')
Sample input:
{
"color": {
"white": {
"description": "this is just plain white color",
"type": "color",
"value": "#ffffffff",
"extensions": {
"org.lukasoppermann.figmaDesignTokens": {
"styleId": "S:40940df38088633aa746892469dd674de8b147eb,",
"exportKey": "color"
}
}
},
"gray": {
"50": {
"description": "",
"type": "color",
"value": "#fafafaff",
"extensions": {
"org.lukasoppermann.figmaDesignTokens": {
"styleId": "S:748a0078c39ca645fbcb4b2a5585e5b0d84e5fd7,",
"exportKey": "color"
}
}
}
}
}
}
Actual output:
white:#ffffffff gray:#fafafaff
Expected output:
white:#ffffffff gray:50:#fafafaff
Here's a solution using tostream instead of to_entries to facilitate simultaneous access to the full path and its value:
jq -r '
.color | tostream | select(.[0][-1] == "value" and has(1)) | .[0][:-1]+.[1:] | join(":")
' "$figma_json"
white:#ffffffff
gray:50:#fafafaff
Demo
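To see why the path-based select works, it helps to look at the event stream tostream produces; this is the example from the jq manual:

```shell
# tostream turns a document into [path, leaf] events, plus a
# closing [path] event each time a container ends
jq -cn '{"a":[2,3]} | tostream'
# [["a",0],2]
# [["a",1],3]
# [["a",1]]
# [["a"]]
```

The select(.[0][-1] == "value" and has(1)) above keeps only the two-element leaf events whose path ends in "value", which is what gives simultaneous access to the full path and its value.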
An approach attempting to demonstrate bash best-practices:
#!/bin/bash
figma_json=$1 # prefer command-line arguments to prompts
[[ $figma_json ]] || {
read -r -p 'Figma JSON file path: ' path # read -r means backslashes not munged
figma_json=${path//[[:space:]]/} # parameter expansion is more efficient than tr
}
jq -r '
def recurse_for_value($prefix):
to_entries[]
| .key as $current_key
| .value?.value? as $immediate_value
| if $immediate_value == null then
.value | recurse_for_value(
if $prefix != "" then
$prefix + ":" + $current_key
else
$current_key
end
)
else
if $prefix == "" then
"\($current_key):\($immediate_value)"
else
"\($prefix):\($current_key):\($immediate_value)"
end
end
;
.color |
recurse_for_value("")
' "$figma_json"
I have this output variable
OUTPUT=$(echo $ZONE_LIST | jq -r '.response | .data[]')
The Output:
{
"accountId": "xyz",
"addDate": "2020-09-05T10:57:11Z",
"content": "\"MyContent\"",
"id": "MyID",
"priority": null
}
{
"accountId": "xyz",
"addDate": "2020-09-05T06:58:52Z",
"content": "\"MyContent\"",
"id": "MyID",
"priority": null
}
How can I create a loop over these two values?
MyLoop
echo "$content - $id"
done
I tried this, but then I get a loop through every single value
for k in $(echo $ZONE_LIST | jq -r '.response | .data[]'); do
echo $k
done
EDIT 1:
My complete JSON:
{
"errors": [],
"metadata": {
"transactionId": "",
},
"response": {
"data": [
{
"accountId": "xyz",
"addDate": "2020-09-05T10:57:11Z",
"content": "\"abcd\"",
"id": "myID1",
"lastChangeDate": "2020-09-05T10:57:11Z",
},
{
"accountId": "xyz",
"addDate": "2020-09-05T06:58:52Z",
"content": "\"abc\"",
"id": "myID2",
"lastChangeDate": "2020-09-05T07:08:15Z",
}
],
"limit": 10,
"page": 1,
"totalEntries": 2,
},
"status": "success",
"warnings": []
}
Now I need a loop over the data entries, because I need them for a curl call.
The curl NOW:
curl -s -v -X POST --data '{
"deleteEntries": [
Data_from_json
]
}' https://URL_to_Update 2>/dev/null)
Now I want to create a new variable from my JSON data. My CURL should look like this at the end:
curl -s -v -X POST --data '{
"deleteEntries": [
{
"readID": "myID1",
"date": "2020-09-05T10:57:11Z", <--Value from addDate
"content": "abcd"
},
{
"readID": "myID2",
"date": "2020-09-05T06:58:52Z", <--Value from addDate
"content": "abc"
}
]
}' https://URL_to_Update 2>/dev/null)
Something like:
#!/usr/bin/env bash
while IFS=$'\37' read -r -d '' id content; do
echo "$id" "$content"
done < <(
jq -j '.response | .data[] | .id + "\u001f" + .content + "\u0000"' \
<<<"$ZONE_LIST"
)
jq -j: raw output like -r, but with no newline appended after each result, so records end only at the \u0000 we add explicitly.
.id + "\u001f" + .content + "\u0000": Assemble fields delimited by ASCII FS (Hexadecimal 1f or Octal 37), and end record by a null character.
It then becomes easy and reliable to iterate over null delimited records by having read -d '' (null delimiter).
Fields id and content are separated by ASCII FS, so set the Internal Field Separator to the matching octal escape, IFS=$'\37', for the read.
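The shell side of this pattern can be exercised on its own, without jq, to see how the two delimiters interact (a minimal sketch with hard-coded sample records):

```shell
#!/usr/bin/env bash
# Two records, each laid out as: id FS content NUL.
# read -r -d '' splits the stream on NUL; IFS=$'\37' then
# splits each record into the id and content fields.
while IFS=$'\37' read -r -d '' id content; do
  printf '%s - %s\n' "$id" "$content"
done < <(printf 'myID1\37abcd\0myID2\37abc\0')
```

Because neither FS nor NUL can appear in JSON string output from jq -j, the record framing cannot be confused by the data itself.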
The first step is to realize you can join the fields you need into one comma-delimited string with a technique like this:
jq '(.accountId + "," + .addDate)'
Now you can update your bash loop:
for k in $(echo $ZONE_LIST | jq -r '.response | .data[]' | jq '(.content + "," + .id)'); do
echo $k
done
There is probably a way to combine the two jq commands but I don't have your original json data for testing.
UPDATE - inside the loop you can parse the comma-delimited string into separate fields. There are more efficient ways to handle this task, but I prefer simplicity.
ID=$(echo $k | cut -d',' -f1)
PRIORITY=$(echo $k | cut -d',' -f2)
echo "ID($ID) PRIORITY($PRIORITY)"
Try this.
for k in $(echo $ZONE_LIST | jq -rc '.response | .data[]'); do
echo $k|jq '.content + " - " + .id' -r
done
I have a CSV file that I want to convert to a JSON file with the quotes from the CSV removed using JQ in a shell script.
Here is the CSV named input.csv:
1,"SC1","Leeds"
2,"SC2","Barnsley"
Here is the JQ extract:
jq --slurp --raw-input --raw-output \
'split("\n") | .[1:] | map(split(",")) |
map({
"ListElementCode": .[0],
"ListElement": "\(.[1]) \(.[2])
})' \
input.csv > output.json
this writes to output.json:
[
{
"ListElementCode": "1",
"ListElement": "\"SC1\" \"Leeds\""
},
{
"ListElementCode": "2",
"ListElement": "\"SC2\" \"Barnsley\""
}
]
Any idea how I can remove the quotes around the 2 text values that get put into the ListElement part?
To solve only the most immediate problem, one could write a function that strips quotes if-and-when they exist:
jq -n --raw-input --raw-output '
def stripQuotes: if test("^\".*\"$") then .[1:-1] else . end;
[inputs | split(",") | map(stripQuotes) |
{
"ListElementCode": .[0],
"ListElement": "\(.[1]) \(.[2])"
}]
' <in.csv >out.json
That said, to really handle CSV correctly, you can't just split(","): you need to split only on commas that aren't inside quotes (and recognize doubled-up quotes as the escaped form of a single quote). Really, I'd use Python instead of jq for this job -- and as of this writing, the jq cookbook agrees that native jq code is only suited for "trivially simple" CSV files.
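For reference, the Python route alluded to above is short, and the standard csv module handles quoted fields, embedded commas, and doubled-up quotes correctly. A sketch, run from the shell against the question's input.csv (assumes python3 is on PATH):

```shell
python3 - input.csv <<'PY'
import csv, json, sys

# csv.reader does the quote-aware splitting that split(",") cannot
with open(sys.argv[1], newline="") as f:
    rows = list(csv.reader(f))

print(json.dumps(
    [{"ListElementCode": r[0], "ListElement": " ".join(r[1:])} for r in rows],
    indent=2,
))
PY
```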
As mentioned, a Ruby answer:
ruby -rjson -rcsv -e '
data = CSV.foreach(ARGV.shift)
.map do |row|
{
ListElementCode: row.first,
ListElement: row.drop(1).join(" ")
}
end
puts JSON.pretty_generate(data)
' input.csv
[
{
"ListElementCode": "1",
"ListElement": "SC1 Leeds"
},
{
"ListElementCode": "2",
"ListElement": "SC2 Barnsley"
}
]
Using a proper CSV/JSON parser in perl:
#!/usr/bin/env perl
use strict; use warnings;
use JSON::XS;
use Text::CSV qw/csv/;
# input.csv:
#1,"SC1","Leeds"
#2,"SC2","Barnsley"
my $vars = [csv in => 'input.csv'];
#use Data::Dumper;
#print Dumper $vars; # display the data structure
my $o = [ ];
foreach my $a (@{ $vars->[0] }) {
push @{ $o }, {
ListElementCode => $a->[0],
ListElement => $a->[1] . " " . $a->[2]
};
}
my $coder = JSON::XS->new->ascii->pretty->allow_nonref;
print $coder->encode($o);
Output
[
{
"ListElement" : "SC1 Leeds",
"ListElementCode" : "1"
},
{
"ListElement" : "SC2 Barnsley",
"ListElementCode" : "2"
}
]
Here's an uncomplicated and efficient way to solve this particular problem:
jq -n --raw-input --raw-output '
[inputs
| split(",")
| { "ListElementCode": .[0],
"ListElement": "\(.[1]|fromjson) \(.[2]|fromjson)"
} ]' input.csv
Incidentally, there are many robust command-line CSV-to-JSON tools, amongst which I would include:
any-json (https://www.npmjs.com/package/any-json)
csv2json (https://github.com/fadado/CSV)
In the below script, I am not able to successfully call the "repovar" variable in the jq command.
cat quayrepo.txt | while read line
do
export repovar="$line"
jq -r --arg repovar "$repovar" '.data.Layer| .Features[] | "\(.Name), \(.Version), $repovar"' severity.json > volume.csv
done
The script uses a text file to loop through the repo names
quayrepo.txt---> file has the list of names in this case the file has a value of "Reponame1"
sample input severity.json file:
{
"status": "scanned",
"data": {
"Layer": {
"IndexedByVersion": 3,
"Features": [
{
"Name": "elfutils",
"Version": "0.168-1",
"Vulnerabilities": [
{
"NamespaceName": "debian:9",
"Severity": "Medium",
"Name": "CVE-2016-2779"
}
]
}
]
}
}
}
desired output:
elfutils, 0.168-1, Medium, Reponame1
Required output: I need to retrieve the value of my environment variable as the last column in my output csv file
You need to wrap $repovar in \(...) inside the string, like the other values:
repovar='qweqe'; jq -r --arg repovar "$repovar" '.data.Layer| .Features[] | "\(.Name), \(.Version), \($repovar)"' tmp.json
Result:
elfutils, 0.168-1, qweqe
There's no need for the export.
#!/usr/bin/env bash
while read line
do
jq -r --arg repovar "$line" '.data.Layer.Features[] | .Name + ", " + .Version + ", " + $repovar' severity.json
done < quayrepo.txt > volume.csv
with quayrepo.txt as
Reponame1
and severity.json as
{
"status": "scanned",
"data": {
"Layer": {
"IndexedByVersion": 3,
"Features": [
{
"Name": "elfutils",
"Version": "0.168-1",
"Vulnerabilities": [
{
"NamespaceName": "debian:9",
"Severity": "Medium",
"Name": "CVE-2016-2779"
}
]
}
]
}
}
}
produces volume.csv containing
elfutils, 0.168-1, Reponame1
To @peak's point, changing > to >> in ...severity.json >> volume.csv will append each iteration's output, producing a multi-line CSV instead of overwriting the file on every pass so that only the last line survives.
You don't need a while read loop in bash at all; jq itself can loop over your input lines, even when they aren't JSON, letting you run jq only once, not once per line in quayrepo.txt.
jq -rR --slurpfile inJson severity.json '
  ($inJson[0].data.Layer | .Features[]) as $features |
  [$features.Name, $features.Version, .] |
  @csv
' <quayrepo.txt >volume.csv
jq -R specifies raw input, letting jq directly read lines from quayrepo.txt into .
jq --slurpfile varname filename.json reads filename.json into an array of JSON objects parsed from that file. If the file contains only one object, one needs to refer to $varname[0] to refer to it.
@csv converts an array to a CSV output line, correctly handling data with embedded quotes or other oddities that require special processing.
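A quick illustration of the quoting @csv takes care of:

```shell
# @csv quotes every string field and doubles any embedded quotes
jq -rn '["plain", "with,comma", "with\"quote"] | @csv'
# "plain","with,comma","with""quote"
```

This is why @csv is preferable to hand-assembling lines with string concatenation whenever a field might contain a comma or a quote.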
I have some bash code that serializes and de-serializes a single dimensional associative array in bash using jq. It does what I want for now, but I have two issues.
The first issue is, this code feels really klunky. Especially the serialization part. Is there a better way to do this? Either with jq or some other way?
The second issue is, I can deserialize nested data (e.g., with {"data":{...}}) but I can't figure out how to wrap the output in the same nested structure. How can I recreate the original structure?
Edit: Clarification. What I want to be able to do is, with the commented json, json='{"data": {"one": "1", "two": "2", "three": "3"}}' in the example code and have the final result of
json='{"data": {"four": "4", "one": "100", "two": "2"}}' dumped.
I can read in the 'data' structure and assign the key/values correctly, but I'm not having any luck in figuring out how to embed the {"four": ...} construct into the "data": {...} object.
Edit 2: The answer to my second issue, in combination with peak's answer is the following:
for key in "${!aaname[@]}"; do
printf '%s\n%s\n' "$key" "${aaname[$key]}"
done | jq -SRn '.data = ([inputs | {(.): input}] | add)'
The code is:
#!/bin/bash
#json='{"data": {"one": "1", "two": "2", "three": "3"}}'
json='{"one": "1", "two": "2", "three": "3"}'
#------------------------------------------------------------------------------
# De-serialize data
declare -A aaname
while IFS='=' read -r key value; do
aaname["$key"]="$value"
done < <(echo "$json" | jq -r '. | to_entries | .[] | .key + "=" + .value ')
#done < <(echo "$json" | jq -r '.data | to_entries | .[] | .key + "=" + .value ')
#------------------------------------------------------------------------------
# Manipulate data
# Change existing value ...
aaname['one']='100'
# Add element ...
aaname['four']='4'
# Remove element ...
unset aaname['three']
#------------------------------------------------------------------------------
# Serialize data
# Why can't I use ${#aaname[@]} in ((...))?
total="${#aaname[@]}"
count=0
{
printf '['
for key in "${!aaname[@]}"; do
printf '{"key": "%s", "value": "%s"}' "$key" "${aaname[$key]}"
((++count < total)) && printf ','
done
printf ']'
#}
#} | jq -S ' . | "data{" + from_entries + "}"'
} | jq -S ' . | from_entries'
# gives
#
#{
# "four": "4",
# "one": "100",
# "two": "2"
#}
It would be less klunky, and perhaps a bit more robust, if instead of:
jq -r '. | to_entries | .[] | .key + "=" + .value ')
you had:
jq -r 'to_entries[] | "\(.key)=\(.value)"'
And similarly you could replace the for loop used to create the JSON object with something like:
for key in "${!aaname[@]}"; do
printf "%s\n" "$key"
printf "%s\n" "${aaname[$key]}"
done | jq -Rn '[inputs | { (.): input}] | add'
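To see the pairing at work: inputs consumes a line as a key, and the nested input then consumes the following line as its value, so keys and values simply alternate on stdin. For example:

```shell
# Each pair of lines becomes one single-key object; add merges them
printf '%s\n' one 1 two 2 | jq -Rn '[inputs | {(.): input}] | add'
# {
#   "one": "1",
#   "two": "2"
# }
```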
Regarding the second issue, I'm afraid your question isn't so clear to me.
What format are you expecting for the non-JSON representation?
How generic a serialization/deserialization solution are you expecting?
In this connection, you might like to look at the output of jq --stream . <<< "$json"
for various JSON texts.
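For instance, the streamed form of a document like the one in the question looks like this; it is the same event shape that fromstream accepts if you ever want to rebuild the nested structure:

```shell
# --stream emits [path, leaf] events plus closing [path] markers,
# just like tostream, but while parsing the input
jq -c --stream . <<<'{"data": {"one": "1", "two": "2"}}'
# [["data","one"],"1"]
# [["data","two"],"2"]
# [["data","two"]]
# [["data"]]
```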