I have this JSON which I get from an API:
{
"result": {
"key1": "val1",
"key2": "val2",
"key3": "val3"
}
}
I have the following shell script, which writes table headers to a text file. I want to extract the keys and values from the result object above and append them to the same file, with the keys under KEYS and the values under VALUES. I am new to jq and shell scripting and am struggling to achieve this.
echo "%table"
echo -e "KEYS\tVALUES" > outputfile.txt
KEYVALS=$(curl -u user:password \
"http://localhost:8080/customapi")
# here I want to split the key values using jq and write to the outputfile.txt
cat outputfile.txt
The outcome I am expecting is:
KEYS VALUES
key1 val1
key2 val2
key3 val3
How can I achieve this?
The key is to convert .result to an array of key/value pairs using to_entries, then output a set of strings (created using string interpolation) in raw mode.
% cat tmp.json
{
"result": {
"key1": "val1",
"key2": "val2",
"key3": "val3"
}
}
% jq -r '{"KEYS": "VALUES"} + .result | to_entries[] | "\(.key)\t\(.value)"' tmp.json
KEYS VALUES
key1 val1
key2 val2
key3 val3
I added the header to the input before conversion to the key/value list.
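Putting this together with the curl call from the question (the URL and credentials are the question's own placeholders), the whole script might look like this:

```shell
#!/usr/bin/env bash
# Fetch the JSON (placeholder credentials/URL from the question)
KEYVALS=$(curl -u user:password "http://localhost:8080/customapi")

# Prepend the header row, flatten .result to key/value pairs, and
# emit tab-separated lines in raw mode
printf '%s' "$KEYVALS" |
  jq -r '{"KEYS": "VALUES"} + .result | to_entries[] | "\(.key)\t\(.value)"' \
  > outputfile.txt

cat outputfile.txt
```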
By adding a column call at the end, the alignment will work for longer values as well.
Note the use of the # character as the token separator; of course, if your data contains it, this will not work.
aws cognito-idp list-user-pools --max-results 20 | \
jq -r '.UserPools[]|to_entries[]|select (.key == "Name")|("\(.key):#\(.value)")'| column -t -s'#'
output
Name: corp_stg_user_pool
Name: corp_dev_user_pool
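A tab makes a less collision-prone separator than #, since @tsv escapes any tab appearing inside the data; here is a sketch using sample data shaped like the first question:

```shell
# Recreate a small sample input
cat > tmp.json <<'EOF'
{ "result": { "key1": "val1", "key2": "a much longer value" } }
EOF

# @tsv emits tab-separated fields (escaping embedded tabs);
# column -t -s$'\t' then aligns on tabs only
jq -r '.result | to_entries[] | [.key, .value] | @tsv' tmp.json |
  column -t -s$'\t'
```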
I have a YAML file, say sample.yaml, with the following structure:
items:
- version: v1
data:
file1.json: |-
{
"key1": "val1",
"key2": "val2"
}
- version: v2
data:
file2.json: |-
{
"key3": "val3",
"key4": "val4"
}
In order to achieve the following output by splitting the data into separate files, what commands/operators should I be using in yq4? The documentation is rather confusing.
f1.json:
{
"key1": "val1",
"key2": "val2"
}
f2.json:
{
"key3": "val3",
"key4": "val4"
}
With yq3, I was able to run
files=$(yq read sample.yaml "items[*].data.*" -pp)
for file in $files; do
yq read sample.yaml "$file" > f1.json
done
Any pointers would be greatly appreciated. Thanks!
Using mikefarah/yq v4.14.1+ with from_json and the -oj option to convert to JSON, and the -s option to split up into files
yq -oj -s '"f" + ($index + 1)' '.items[].data[] | from_json' sample.yaml
will produce f1.json:
{
"key1": "val1",
"key2": "val2"
}
and f2.json:
{
"key3": "val3",
"key4": "val4"
}
If you want to use the filenames from the keys instead, you need to employ to_entries to have access to .key and .value. This will produce the files file1.json and file2.json:
yq -oj -s 'parent | .key | sub("\.json$","")' \
'(.items[].data | to_entries)[].value | from_json' sample.yaml
How can I use sed with jq to replace _ in a key name with the character a?
{ "product_name":"kl" }
should become
{ "productaname":"kl" }
in a bash script
No need for sed; it's easy to do in just jq:
$ jq '{ productaname: .product_name }' <<<'{ "product_name":"kl" }'
{"productaname":"kl"}
If you want to replace underscores with a's in all keys of an object:
$ jq 'with_entries(.key |= gsub("_"; "a"))' <<<'{ "product_name":"kl", "foo_bar":12 }'
{"productaname":"kl","fooabar":12}
From the documentation for with_entries:
to_entries, from_entries, with_entries
These functions convert between an object and an array of key-value pairs. If to_entries is passed an object, then for each k: v entry in the input, the output array includes {"key": k, "value": v}.
from_entries does the opposite conversion, and with_entries(foo) is a shorthand for to_entries | map(foo) | from_entries, useful for doing some operation to all keys and values of an object. from_entries accepts key, Key, name, Name, value and Value as keys.
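For instance, running to_entries on the small object from this question shows the key/value shape the documentation describes:

```shell
echo '{ "product_name":"kl" }' | jq -c 'to_entries'
# → [{"key":"product_name","value":"kl"}]
```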
I am getting the below output after executing a script, and I store it in a variable:
{
"VariaB": "DONE",
"VariaC": "DONE",
"VariaD": null,
"VariaE": true
}
I have two more variables: VariaA="ABCD", and VariaF, which contains true or false. I want to insert them and print the final variable in the below format:
{
"VariaA": "ABCD",
"VariaB": "DONE",
"VariaC": "DONE",
"VariaD": null,
"VariaE": true,
"VariaF": false
}
As others said, please do your best to use a JSON-aware tool rather than basic string manipulation; it is bound to save you some effort, if not some trouble, in the future.
Since you said you currently can't, here's a string-manipulation "solution":
printf "{
\"VariaA\": \"$VariaA\",
%s
\"VariaF\": $VariaF
}" "$(grep -v '[{}]' <<< "$input")"
printf handles the whole structure, and takes as parameter the middle block that is from your input. To get that middle block, we use grep to exclude the lines that contain brackets.
Note that this will fail in a lot of cases such as the input not being formatted as usual (a one liner would be proper JSON, but would make that script fail) or suddenly containing nested objects, the variables containing double-quotes, etc.
It looks like your output is JSON. For appending to your output object you could use jq, for example:
cat json.txt | jq --arg VariaA ABCD '. + {VariaA: $VariaA}'
In this case, if json.txt contains your input:
{
"VariaB": "DONE",
"VariaC": "DONE",
"VariaD": null,
"VariaE": true
}
By using jq --arg VariaA ABCD '. + {VariaA: $VariaA}' it will then output:
{
"VariaB": "DONE",
"VariaC": "DONE",
"VariaD": null,
"VariaE": true,
"VariaA": "ABCD"
}
If you would like to add more variables, pass --arg (or --argjson for booleans, numbers, and other non-string values) once per variable, for example:
jq --arg VariaA ABCD --argjson VariaX true '. + {VariaA: $VariaA, VariaX: $VariaX}'
The output will be:
{
"VariaB": "DONE",
"VariaC": "DONE",
"VariaD": null,
"VariaE": true,
"VariaA": "ABCD",
"VariaX": true
}
In this example cat json.txt simulates your command output, but it is worth mentioning that if you wanted to process an existing file you could use (notice the <):
jq --arg VariaA ABCD '. + {VariaA: $VariaA}' < file.txt
By doing this you do it all in a single process.
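To get exactly the output asked for (string VariaA first, boolean VariaF last), one sketch is to combine --arg and --argjson in a single call; --argjson keeps false as a JSON boolean rather than the string "false". The here-document below stands in for your json.txt:

```shell
VariaA="ABCD"
VariaF=false

# --arg passes a string, --argjson parses its value as JSON;
# the object additions put VariaA first and VariaF last
jq --arg VariaA "$VariaA" --argjson VariaF "$VariaF" \
  '{VariaA: $VariaA} + . + {VariaF: $VariaF}' <<'EOF'
{ "VariaB": "DONE", "VariaC": "DONE", "VariaD": null, "VariaE": true }
EOF
```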
This pipeline does what you need
echo "{"`echo $VariaA | sed "s/{\|}//g"`,`echo " "$VariaF | sed "s/{\|}//g"`"}" |
sed "s/,/,\n/g" | sed "s/{/{\n/" | sed "s/}/\n}/" | uniq
I have an array of json objects that I'd like to convert to an associative array in bash with a slight alteration to the key
{
"Parameters": [
{
"Name": "/path/user_management/api_key",
"Type": "SecureString",
"Value": "1234",
"Version": 1
},
{
"Name": "/path/user_management/api_secret",
"Type": "SecureString",
"Value": "5678",
"Version": 1
}
]
}
I know I need to use jq and sed but I just can't quite find the proper combination of doing what I'm looking for. Need to strip out "/path/user_management/" and set the remaining as the key, and use Value for value.
Trying to find a fairly clean one liner piping commands together. What I'd like to end up with is a bash associative array of something like:
myArray[api_key]="1234"
myArray[api_secret]="5678"
Asking for one-liner code is as good as asking for unreadable code. If you want to do this in a proper way, read the output of the jq command in a while loop and strip out unwanted characters as required.
#!/usr/bin/env bash
# declare an associative array, the -A defines the array of this type
declare -A _my_Array
# The output of jq is separated by '|' so that we have a valid delimiter
# to read our keys and values. The read command processes one line at a
# time and puts the values in the variables 'key' and 'value'
while IFS='|' read -r key value; do
# Strip out the text until the last occurrence of '/'
strippedKey="${key##*/}"
# Putting the key/value pair in the array
_my_Array["$strippedKey"]="$value"
done< <(jq -r '.Parameters[] | "\(.Name)|\(.Value)"' json)
# Print the array using the '-p' or do one by one
declare -p _my_Array
Or print the array, the traditional way
for key in "${!_my_Array[@]}"; do
printf '%s %s\n' "${key}" "${_my_Array[$key]}"
done
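The ${key##*/} expansion above is plain bash, so no jq or sed is needed for that part: ## deletes the longest prefix matching the pattern, and */ matches everything up to and including the last slash.

```shell
# '##' removes the longest prefix matching '*/', i.e. everything
# through the final slash
key="/path/user_management/api_key"
echo "${key##*/}"
# → api_key
```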
I am totally new to shell scripting, so let me put down the proper use case.
Use case:
I have written two get methods in my shell script, and when a user calls that script I perform some operation for many IDs using a for loop, like below:
test_get1(){
value1=//performing some operation and storing it
value2=//performing some operation and storing it
//below line I am converting the o/p of value1 and value2 in json
value=$JQ_TOOL -n --arg key1 "$value1" --arg key2 "$value2" '{"key1":"\($value1)","key2":"\($value2)"}'
}
test_get2(){
arr=(1,2,3)
local arr_values=()
for value in arr
do
// Calling test_get1 for each iteraion of this loop, like below
val=$(test_get1 $value)
//below line will store the values in array
arr_values+=("$val")
done
}
When I echo the above arr_values, I get the below output.
Output.
arr_values={
"key1":"value1",
"key2":"value2"
}
{
"key1":"value1",
"key2":"value2"
}
I want to convert the above value in json format like below.
json_value=[
{
"key1":"value1",
"key2":"value2"
},
{
"key1":"value1",
"key2":"value2"
}
]
I tried to do it with JQ, but unable to get the proper result.
Use the slurp option:
jq -s . in.json > out.json
in.json
{
"key1": "value1",
"key2": "value2"
}
{
"key1": "value1",
"key2": "value2"
}
out.json
[
  {
    "key1": "value1",
    "key2": "value2"
  },
  {
    "key1": "value1",
    "key2": "value2"
  }
]
1) Your existing "value=" line can be simplified to:
value=$(jq -n --arg key1 "$value1" --arg key2 "$value2" \
  '{key1: $key1, key2: $key2}')
because --arg always interprets the provided value as a string, and because jq expressions need not follow all the rules of JSON.
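A quick comparison makes the string-only behaviour of --arg visible next to --argjson, which parses its value as JSON:

```shell
jq -cn --arg n 5 '{n: $n}'       # → {"n":"5"}  (always a string)
jq -cn --argjson n 5 '{n: $n}'   # → {"n":5}    (parsed as a number)
```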
2) From your script, arr_values is a bash array of JSON values. To convert it into a JSON array, you should be able to use an incantation such as:
for r in "${arr_values[@]}" ; do printf "%s" "$r" ; done | jq -s .
3) There is almost surely a much better way to achieve your ultimate goal. Perhaps it would help if you thought about calling jq just once.
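As a sketch of that single-call idea (the printf line is a hypothetical stand-in for whatever each loop iteration computes for value1 and value2): with -R and -n, jq reads raw lines via inputs and builds the whole array itself, so no bash array or second jq pass is needed.

```shell
# Each input line carries one iteration's two values, tab-separated;
# jq splits each line and assembles the array of objects in one call
printf 'a1\tb1\na2\tb2\n' |
  jq -Rnc '[inputs | split("\t") | {key1: .[0], key2: .[1]}]'
# → [{"key1":"a1","key2":"b1"},{"key1":"a2","key2":"b2"}]
```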