JSON processing in .sh files - bash

[
{
"id": "1636ea48-28b7-783a-48dd-5e041f10d9e6",
"name": "Test_Component1",
"desiredVersions": [],
"children": false
},
{
"id": "1636f939-136f-4609-ab93-238b1af193fe",
"name": "Test_Component2",
"desiredVersions": [],
"children": false
}
]
I am writing a command in the Execute Shell window in Jenkins. I have this JSON in a variable. I want to extract both id values so further processing can be done in the next set of commands.

Using jq:
$ echo "$var" | jq '.[].id'
"1636ea48-28b7-783a-48dd-5e041f10d9e6"
"1636f939-136f-4609-ab93-238b1af193fe"

Is it a string? If so, you can use a regular expression to extract the id values, like:
(\"id\"\:\W)\"(.+)(\"\,)

How to extract data from a JSON file into a variable

I have the following JSON format; basically it is a huge file with several such entries.
[
{
"id": "kslhe6em",
"version": "R7.8.0.00_BNK",
"hostname": "abacus-ap-hf-test-001:8080",
"status": "RUNNING",
},
{
"id": "2bkaiupm",
"version": "R7.8.0.00_BNK",
"hostname": "abacus-ap-hotfix-001:8080",
"status": "RUNNING",
},
{
"id": "rz5savbi",
"version": "R7.8.0.00_BNK",
"hostname": "abacus-ap-hf-test-005:8080",
"status": "RUNNING",
},
]
I want to fetch all the hostname values that start with "abacus-ap-hf-test", without the ":8080", into a variable and then use those values for further commands in a for loop, something like below. But I am a bit confused about how I can extract such information.
HOSTNAMES="abacus-ap-hf-test-001 abacus-ap-hf-test-005"
for HOSTNAME in $HOSTNAMES
do
  sh ./trigger.sh
done
Update the first line to this:
HOSTNAMES=$(grep -oP 'hostname": "\K(abacus-ap-hf-test[\w\d-]+)' json.file)
or, if you are sure that the hostname ends with :8080", try this:
HOSTNAMES=$(grep -oP '(?<="hostname": ")abacus-ap-hf-test[\w\d-]+(?=:8080")' json.file)
Here abacus-ap-hf-test[\w\d-]+ is the part of the regex that matches the hostnames you want, and the surrounding strings anchor the match to the "hostname" field so that only the intended values are captured.
Assuming you have valid JSON, you can get the hostname values using jq:
while read -r hname ; do printf "%s\n" "$hname" ; done < <(jq -r '.[].hostname' j.json)
Output:
abacus-ap-hf-test-001:8080
abacus-ap-hotfix-001:8080
abacus-ap-hf-test-005:8080
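If the goal from the question is specifically the abacus-ap-hf-test hosts without the :8080 suffix, a possible jq-only sketch (assuming the JSON has been made valid, i.e. the trailing commas removed) would be:
HOSTNAMES=$(jq -r '.[].hostname | select(startswith("abacus-ap-hf-test")) | sub(":8080$"; "")' j.json)
for HOSTNAME in $HOSTNAMES
do
  sh ./trigger.sh
done
select(startswith(...)) keeps only the matching hostnames and sub(":8080$"; "") strips the port, so the loop variable ends up as e.g. abacus-ap-hf-test-001.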

bash or powershell parsing JSON file content

I would like to manipulate the content of a JSON file.
I've tried with PowerShell and Linux bash, but I was unable to get what I want.
On Linux, I was thinking of using the jq tool; it obtains the data, but I cannot manipulate it the way I need.
jq '.[].pathSpec, .[].scope' jsonfilepath
Current output:
"file"
"file"
"/u01/app/grid/*/bin/oracle"
"/u01/app/oracle/product/*/db_1/bin/oracle"
My goal is to obtain something similar as:
scope pathSpec
Like:
file /u01/app/grid/*/bin/oracle
file /u01/app/oracle/product/*/db_1/bin/oracle
JSON file sample
[
{
"actions": [
"upload",
"detect"
],
"deep": false,
"dfi": true,
"dynamic": true,
"inject": false,
"monitor": false,
"pathSpec": "/u01/app/grid/*/bin/oracle",
"scope": "file"
},
{
"actions": [
"upload",
"detect"
],
"deep": false,
"dfi": true,
"dynamic": true,
"inject": false,
"monitor": false,
"pathSpec": "/u01/app/oracle/product/*/db_1/bin/oracle",
"scope": "file"
}
]
Do you have any idea how to get this kind of expected output in PowerShell and bash?
Thanks in advance.
Assuming a JSON input file named file.json:
In a Linux / Bash environment, use the following:
jq -r '.[] | .scope + " " + .pathSpec' file.json
In PowerShell, use the following (adapted from a comment by JohnLBevan):
(Get-Content -Raw file.json | ConvertFrom-Json) |
ForEach-Object { '{0} {1}' -f $_.scope, $_.pathSpec }
Note the (...) around the pipeline with the ConvertFrom-Json call, which is necessary in Windows PowerShell (but no longer in PowerShell (Core) 7+) to ensure that the parsed JSON array is enumerated in the pipeline, i.e. to ensure that its elements are sent one by one - see this post for more information.
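If you also want the scope pathSpec header line and aligned columns from the stated goal, a hedged bash variant (assuming the column utility is available) emits tab-separated rows with @tsv and aligns them afterwards:
jq -r '(["scope","pathSpec"], (.[] | [.scope, .pathSpec])) | @tsv' file.json | column -t
scope  pathSpec
file   /u01/app/grid/*/bin/oracle
file   /u01/app/oracle/product/*/db_1/bin/oracle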

jq does not show null output

I have the following code in a command-line script:
output_json=$(jq -n \
  --argjson ID "${id}" \
  --arg Title "${title}" \
  --argjson like "\"${like}\"" \
  '$ARGS.named')
I pass the id, title, and like variables into jq. I get the following output:
[
{
"ID": 6,
"Title": "ABC",
"like": ""
},
{
"ID": 22,
"Title": "ABC",
"like": "Yes"
}
]
But, I am trying to get the output in the following format, i.e. with null:
[
{
"ID": 6,
"Title": "ABC",
"like": null
},
{
"ID": 22,
"Title": "ABC",
"like": "Yes"
}
]
I don't quite get it: is it possible to do this in general, or is it a problem with my jq command?
And as far as I understand, "like": "" is not the same as "like": null. I am also a little confused now and do not really understand which is the correct choice to use.
When using --argjson you need to provide a valid JSON-encoded argument, so if you want to receive null, the value needs to be literally null. Your solution, however, adds quotes around it, so it can never evaluate to null. (Also, it will only be a valid JSON string if it follows the JSON encoding for special characters such as the quote character itself.)
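For instance, independent of the variables above, passing the bare word null through --argjson does yield a JSON null:
$ jq -n --argjson like 'null' '{like: $like}'
{
  "like": null
}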
If you want to have a JSON string in the regular case, and null in the case where it is empty, import the content of ${like} as a string using --arg and without the extra quotes (just as you do with ${title}), then use some jq logic to turn the empty string into null. An if statement would do, for example:
like=
jq -n --arg like "${like}" '{like: (if $like == "" then null else $like end)}'
{
"like": null
}
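A hedged sketch applying the same idea to the command from the question (same variable names as above; note that like is now passed with --arg and the null conversion happens inside the filter):
output_json=$(jq -n \
  --argjson ID "${id}" \
  --arg Title "${title}" \
  --arg like "${like}" \
  '$ARGS.named | .like = (if .like == "" then null else .like end)')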

Invalid numeric literal when passing a truncated JSON object to jq

I have a response from cURL which looks like this:
{"username": "bot", "verified": true, "locale": "en-US", "mfa_enabled": false, "bot": true, "id": "123", "flags": 0, "avatar": null, "discriminator": "3114", "email": null} 200
which is stored in a variable called auth.
Then I want to be able to loop over that object, doing this:
response=$(jq -c "." <<< "${auth::-3}")
Note that I remove the last 3 characters because those are the status code.
So technically it should work, but it returns: parse error: Invalid numeric literal at line 1, column 11
If I enter the raw JSON as a string, it works. But not like this. Why?
Consider:
response=$(jq -n --argjson auth "${auth% *}" '$auth')
...which works correctly with versions of bash too old to support ${auth::-3} (a 4.x-only feature), and which, when the script is run with bash -x yourscript, also logs enough detail to track down any issue caused by the content passed to jq.
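For the "loop over that object" part of the question, a possible follow-up sketch using the $response produced above (the printf is only a placeholder for the real per-field processing):
while IFS='=' read -r key value; do
  printf '%s -> %s\n' "$key" "$value"   # placeholder for the real processing
done < <(jq -r 'to_entries[] | "\(.key)=\(.value)"' <<< "$response")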

Read YAML metadata from a Pandoc markdown file

Is it possible to extract Pandoc's metadata (title, date, et al.) from a markdown file without a Haskell filter, or parsing the --to=json output?
The JSON output is particularly inconvenient for this, since a two-word title looks like:
$ pandoc -t json posts/test.md | jq '.meta | .title'
{
"t": "MetaInlines",
"c": [
{
"t": "Str",
"c": "Test"
},
{
"t": "Space"
},
{
"t": "Str",
"c": "post"
}
]
}
so even after having jq read the title, we still need to reconstruct words, and any emphasis, code, or anything else is only going to make it more complicated.
We can use the template variable $meta-json$ for this.
Stick the variable in a file (with an extension, to stop Pandoc looking in its own directories) and then use it with pandoc --template=file.ext.
Pandoc's output is a JSON object with keys "title", "date", "tags", etc. and their respective values from the markdown document, which we can easily parse, filter, and manipulate with jq.
$ echo '$meta-json$' > /tmp/metadata.pandoc-tpl
$ pandoc --template=/tmp/metadata.pandoc-tpl posts/test.md | jq '.title,.tags'
"The Title"
[
"a tag",
"another tag"
]
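To then pull a single field into a shell variable, a minimal sketch reusing the same template and the sample file from above:
$ title=$(pandoc --template=/tmp/metadata.pandoc-tpl posts/test.md | jq -r '.title')
$ echo "$title"
The Title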
