How to parse values from a JSON file and assign them to local variables in Windows bash?

I have a json file
input= {
"credentials": {
"accessKeyId": "123456789",
"secretAccessKey": "654321",
"sessionToken": "valuedummy",
"expiration": "201925"
}
}
I am trying to parse the values and assign them to local variables using Windows bash.
What I am trying to achieve is to assign the values to my local variables a, b, c, d:
a=123456789
b=654321
c=valuedummy
d=201925
How can I do this?

jq is a reasonable approach:
a=$( jq -r .credentials.accessKeyId input )
where input is this file:
{
"credentials": {
"accessKeyId": "123456789",
"secretAccessKey": "654321",
"sessionToken": "valuedummy",
"expiration": "201925"
}
}
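The same pattern extracts the remaining keys; a minimal sketch, assuming the same file name:
a=$( jq -r .credentials.accessKeyId input )
b=$( jq -r .credentials.secretAccessKey input )
c=$( jq -r .credentials.sessionToken input )
d=$( jq -r .credentials.expiration input )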

You should definitely use tools that are meant to parse JSON, such as jq. You won't be able to do it reliably with sed, awk etc.:
$ cat file
{
"credentials": {
"accessKeyId": "123456789",
"secretAccessKey": "654321",
"sessionToken": "valuedummy",
"expiration": "201925"
}
}
$ cat script
#!/usr/bin/env bash
i=0 var=(a b c d)
while IFS= read -rd '' line; do
  printf -v "${var[i++]:-tmp}" "$line"
done < <(jq -j '.credentials | to_entries[] | (.value + "\u0000")' file)
declare -p a b c d
$ ./script
declare -- a="123456789"
declare -- b="654321"
declare -- c="valuedummy"
declare -- d="201925"
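A shorter variant (a sketch; it assumes none of the values contain whitespace) reads all four values from a single jq call:
read -r a b c d < <(jq -r '.credentials | "\(.accessKeyId) \(.secretAccessKey) \(.sessionToken) \(.expiration)"' file)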

A non-jq answer; it requires only Python and its standard json module (typically available in standard distributions):
a=$( python -c 'import sys, json; print(json.load(sys.stdin)["credentials"]["accessKeyId"])' <input )
Edit: Here is a slightly more general answer (IMHO):
$ cat input
{
"credentials": {
"accessKeyId": "123456789",
"secretAccessKey": "654321",
"sessionToken": "valuedummy",
"expiration": "201925"
}
}
$ cat json2env
eval $(
  python -c '
import sys, json
for (k,v) in (json.load(sys.stdin)["'$1'"]).items():
    print(k + "=" + v)
' < $2
)
$ . json2env credentials input
$ echo $accessKeyId
123456789
$ echo $secretAccessKey
654321
It's slightly more general in that it will accept any json of a similar format ( dictionary containing a dictionary ) . I simply parameterized the dictionary attribute.
The python script basically prints all the name value pairs in the specified attribute ( in your case, "credentials" ). The name is used as the variable name. This is convenient but I suspect there are issues there with some symbols that may be acceptable in key names but not in variable names.
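If the eval makes you uneasy, a sketch of the same idea without it: have jq emit NUL-separated key/value pairs and assign them with printf -v (this still assumes every key is a valid shell variable name, and the same input file as above):
while IFS= read -r -d '' key && IFS= read -r -d '' value; do
  printf -v "$key" '%s' "$value"
done < <(jq -j '.credentials | to_entries[] | .key + "\u0000" + .value + "\u0000"' input)
echo "$accessKeyId"   # 123456789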

Related

bash loop error : Get JSON Object by property with jq / bash

I would like to get the values from a JSON file, which is working.
JsonFileToTest:
{
"permissions": [
{
"emailid": "test1#test.com",
"rights": "read"
},
{
"emailid": "test2#test.com",
"rights": "read"
}
]
}
readPermissions=($(jq -r '.permissions' JsonFileToTest))
# The command below works perfectly, but when I put it in a loop, it does not.
#echo ${readPermissions[@]} | jq 'values[].emailid'
for vals in ${readPermissions[@]}
do
  # I would like to extract the email id of the user. The loop is not working atm.
  echo ${vals[@]} | jq 'values[].emailid'
done
What am I missing here?
Thanks
If you really want to do it this way, that might look like:
readarray -t permissions < <(jq -c '.permissions[]' JsonFileToTest)
for permissionSet in "${permissions[@]}"; do
  jq -r '.emailid' <<<"$permissionSet"
done
Note that we're telling jq to print one line per item (with -c), and using readarray -t to read each line into an array element (unlike the array=( $(...command...) ) antipattern, which splits not just on newlines but on other whitespace as well, and expands globs in the process).
But there's no reason whatsoever to do any of that. You'll get the exact same result simply running:
jq -r '.permissions[].emailid' JsonFileToTest

How to replace a variable inside a string in bash

I have a string env variable which looks like below
data={\"data\":{\"sources\":\"some value\", \"destination\":\"some other value\"}}
I would like to include a date (say, YEAR) within this env variable. That is, I have another env variable called YEAR (bash: YEAR=2019) and I would like to use this variable (YEAR) inside data. Here is what I need to do:
data={\"data\":{\"sources\":\"some value ${YEAR}\", \"destination\":\"some other value\"}}
but it does not work, how can I make it work?
Use jq:
$ echo "$data" | jq --argjson y "$YEAR" '.data.sources += " \($y)"'
{
"data": {
"sources": "some value 2019",
"destination": "some other value"
}
}
or
# Note the -c argument to compress the data to a single line
$ data=$(echo "$data" | jq -c --argjson y "$YEAR" '.data.sources += " \($y)"')
$ echo "$data"
{"data":{"sources":"some value 2019","destination":"some other value"}}
An alternative, using here documents, minimizes the need to escape quotes while still supporting variable substitution:
data=$(cat <<DATA
{
  "data": {
    "sources": "some value ${YEAR}",
    "destination": "some other value"
  }
}
DATA
)
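If you prefer the result on a single line (and validated as JSON), the heredoc can be fed through jq instead of cat; a sketch:
data=$(jq -c . <<DATA
{
  "data": {
    "sources": "some value ${YEAR}",
    "destination": "some other value"
  }
}
DATA
)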

cannot call bash environment variable inside jq

In the below script, I am not able to successfully call the "repovar" variable in the jq command.
cat quayrepo.txt | while read line
do
  export repovar="$line"
  jq -r --arg repovar "$repovar" '.data.Layer| .Features[] | "\(.Name), \(.Version), $repovar"' severity.json > volume.csv
done
The script uses a text file to loop through the repo names
quayrepo.txt ---> this file has the list of repo names; in this case it contains the single value "Reponame1"
sample input severity.json file:
{
"status": "scanned",
"data": {
"Layer": {
"IndexedByVersion": 3,
"Features": [
{
"Name": "elfutils",
"Version": "0.168-1",
"Vulnerabilities": [
{
"NamespaceName": "debian:9",
"Severity": "Medium",
"Name": "CVE-2016-2779"
}
]
}
]
}
}
}
desired output:
elfutils, 0.168-1, Medium, Reponame1
Required output: I need to retrieve the value of my environment variable as the last column in my output csv file
You need to wrap $repovar in \(...), like the other values:
repovar='qweqe'; jq -r --arg repovar "$repovar" '.data.Layer| .Features[] | "\(.Name), \(.Version), \($repovar)"' tmp.json
Result:
elfutils, 0.168-1, qweqe
There's no need for the export.
#!/usr/bin/env bash
while read line
do
  jq -r --arg repovar "$line" '.data.Layer.Features[] | .Name + ", " + .Version + ", " + $repovar' severity.json
done < quayrepo.txt > volume.csv
with quayrepo.txt as
Reponame1
and severity.json as
{
"status": "scanned",
"data": {
"Layer": {
"IndexedByVersion": 3,
"Features": [
{
"Name": "elfutils",
"Version": "0.168-1",
"Vulnerabilities": [
{
"NamespaceName": "debian:9",
"Severity": "Medium",
"Name": "CVE-2016-2779"
}
]
}
]
}
}
}
produces volume.csv containing
elfutils, 0.168-1, Reponame1
To #peak's point, changing > to >> in ...severity.json >> volume.csv will create a multi-line csv instead of just overwriting until the last line
You don't need a while read loop in bash at all; jq itself can loop over your input lines, even when they aren't JSON, letting you run jq only once, not once per line in quayrepo.txt.
jq -rR --slurpfile inJson severity.json <quayrepo.txt >volume.csv '
  ($inJson[0].data.Layer | .Features[]) as $features |
  [$features.Name, $features.Version, .] |
  @csv
'
jq -R specifies raw input, letting jq directly read lines from quayrepo.txt into .
jq --slurpfile varname filename.json reads filename.json into an array of JSON objects parsed from that file. If the file contains only one object, one needs to refer to $varname[0] to refer to it.
@csv converts an array to a CSV output line, correctly handling data with embedded quotes or other oddities that require special processing.
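Run against the sample quayrepo.txt and severity.json above, volume.csv should contain a line like:
"elfutils","0.168-1","Reponame1"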

fetch the number of records from a JSON file using shell

I have a test.txt file in this format
{
"user": "sthapa",
"ticket": "LIN-5867_3",
"start_date": "2018-03-16",
"end_date": "2018-03-16",
"demo_nos": [692],
"service_names": [
"service1",
"service2",
"service3",
"service4",
"service5",
"service6",
"service7",
"service8",
"service9"
]
}
I need to look for a tag called demo_nos and provide the count of it.
For example in the above file "demo_nos": [692] which means only one demo nos...similarly if it had "demo_nos": [692,300] then the count would be 2
So what shell script can I write to fetch and print the count?
The output should say the demo nos = 1 or 2 depending on the values inside the tag [].
i.e. I have a variable in my shell script called market_nos which should hold its count.
The gold standard for manipulating JSON data from the command line is jq:
$ jq '.demo_nos | length' test.txt
1
.demo_nos returns the value associated with the demo_nos key in the object, and that array is piped to the length function which does the obvious.
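To capture that in the market_nos variable mentioned in the question, a sketch:
market_nos=$(jq '.demo_nos | length' test.txt)
echo "demo nos = $market_nos"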
I'm assuming you have python and the file is JSON :)
$ cat some.json
{
"user": "sthapa",
"ticket": "LIN-5867_3",
"start_date": "2018-03-16",
"end_date": "2018-03-16",
"demo_nos": [692],
"service_names": [
"service1",
"service2",
"service3",
"service4",
"service5",
"service6",
"service7",
"service8",
"service9"
]
}
$ python -c 'import sys,json; print(len(json.load(sys.stdin)["demo_nos"]))' < some.json
1
Not the most elegant solution, but this should do it:
cat test.txt | grep -o -P 'demo_nos.{0,200}' | cut -d'[' -f2 | cut -d']' -f1 | awk -F',' '{ print NF }'
Please note that this is a quick and dirty solution that treats the input as raw text and does not take the JSON structure into account. In the exceptional case where the "demo_nos" string also appears elsewhere in the file, the output of the command above might be incorrect.

How to manipulate a jq output using bash?

I have the following jq code snippet:
https://jqplay.org/s/QzOttRHoz1
I want to loop over each element of the result array using bash, as the following pseudocode shows:
#!/bin/bash
foreach result
print "My name is {name}, I'm {age} years old"
print "--"
The result would be:
My name is A, I'm 1 years old.
---
My name is B, I'm 2 years old.
---
My name is C, I'm 3 years old.
---
Of course this is a trivial example just to clarify that my goal is to manipulate each array from the jq result individually.
Any suggestions on how to write the pseudo code into valid bash statements?
Saving the json:
{
"Names": [
{ "Name": "A", "Age": "1" },
{ "Name": "B", "Age": "2" },
{ "Name": "C", "Age": "3" }
]
}
as /tmp/input.txt I can run:
</tmp/input.txt jq --raw-output 'foreach .Names[] as $name ([];[];$name | .Name, .Age )' \
  | while read -r name && read -r age; do
      printf "My name is %s, I'm %d years old.\n" "$name" "$age";
      printf -- "--\n";
    done
The --raw-output with | .Name, .Age just prints two lines per .Names array member, one with name and another with age. Then I read two lines at a time with while read && read and use that to loop through them.
If you rather have:
["A","1"]
["B","2"]
["C","3"]
that's sad, the best would be to write a full parser that would take strings like "\"" into account. Anyway then you can:
</tmp/input2.txt sed 's/^\[//;s/\]$//;' \
  | while IFS=, read name age; do
      name=${name%\"};
      name=${name#\"};
      age=${age%\"};
      age=${age#\"};
      printf "My name is %s, I'm %d years old.\n" "$name" "$age";
      printf -- "--\n";
    done
The sed removes the leading [ and trailing ] on each line. Then I read two strings separated by , (so values like "a,b","c,d" will be read incorrectly). These two strings are then stripped of their leading and trailing ". Finally the usual printf is used to output the result.
I have written a simple script to achieve what you need:
My JSON file test.json, which is similar to your snippet:
{
"Names": [
{ "Name": "A", "Age": "1" },
{ "Name": "B", "Age": "2" },
{ "Name": "C", "Age": "3" }
]
}
My script:
#!/bin/bash
for i in $(cat test.json | jq -r '.Names[] | @base64'); do
  _jq() {
    echo ${i} | base64 --decode | jq -r ${1}
  }
  echo "My Name is $(_jq '.Name'), I'm $(_jq '.Age') years old"
done
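Run against the test.json above, this should print something like:
My Name is A, I'm 1 years old
My Name is B, I'm 2 years old
My Name is C, I'm 3 years old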
Note that foreach .Names[] as $name ([];[];$name | .Name, .Age )
can be simplified to:
.Names[] | ( .Name, .Age )
or even in this specific case to:
.Names[][]
or for that matter to:
.[][][]
The important point, however, is that foreach is not needed to achieve simple iteration.
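Putting that simplification to work, a sketch that keeps each name and age paired on one line via @tsv:
jq -r '.Names[] | [.Name, .Age] | @tsv' /tmp/input.txt |
  while IFS=$'\t' read -r name age; do
    printf "My name is %s, I'm %d years old.\n--\n" "$name" "$age"
  done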
