Generating a JSON map containing shell variables named in a list - bash

My shell-fu is at a below-beginner level. I have a file that contains some lines that happen to be the names of environment variables.
e.g.
ENV_VAR_A
ENV_VAR_B
...
What I want to do is use this file to generate a JSON string containing the names and current values of the named variables using jq like this:
jq -n --arg arg1 "$ENV_VAR_A" --arg arg2 "$ENV_VAR_B" '{ENV_VAR_A:$arg1,ENV_VAR_B:$arg2}'
# if ENV_VAR_A=one and ENV_VAR_B=two then the preceding command would output
# {"ENV_VAR_A":"one","ENV_VAR_B":"two"}
I'm trying to create the jq command through a shell script and I have no idea what I'm doing :(

Short and sweet (if you have jq 1.5 or higher):
jq -Rn '[inputs | {(.): env[.]}] | add' tmp.txt
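For example, assuming both variables are exported (jq's env only sees exported variables) and tmp.txt lists one name per line, a run might look like this (values are just placeholders):
$ export ENV_VAR_A=one ENV_VAR_B=two
$ printf '%s\n' ENV_VAR_A ENV_VAR_B > tmp.txt
$ jq -Rn '[inputs | {(.): env[.]}] | add' tmp.txt
{
  "ENV_VAR_A": "one",
  "ENV_VAR_B": "two"
}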

What you want here is an indirect reference. Those can be done with ${!varname}. As a trivial example limited to exactly two lines:
# read arg1_varname and arg2_varname from the first two lines of file.txt
{ read -r arg1_varname; read -r arg2_varname; } <file.txt
# pass the variable named by the contents of arg1_varname as $arg1 in jq
# and the variable named by the contents of arg2_varname as $arg2 in jq
jq -n --arg arg1_name "$arg1_varname" --arg arg1_value "${!arg1_varname}" \
--arg arg2_name "$arg2_varname" --arg arg2_value "${!arg2_varname}" \
'{($arg1_name):$arg1_value, ($arg2_name):$arg2_value}'
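With file.txt holding ENV_VAR_A and ENV_VAR_B on its first two lines, and the example values from the question (ENV_VAR_A=one, ENV_VAR_B=two), this would print:
{
  "ENV_VAR_A": "one",
  "ENV_VAR_B": "two"
}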
To support an arbitrary number of key/value pairs, consider instead something like:
# Transform into NUL-separated key=value pairs (same format as /proc/*/environ)
while IFS= read -r name; do             # for each variable named in file.txt
    printf '%s=%s\0' "$name" "${!name}" # print its name and value, and a NUL
done \
    <file.txt \
    | jq -Rs 'split("\u0000")                          # split on those NULs
              | [.[] | select(. != "")                 # drop the empty string after the final NUL
              | capture("^(?<name>[^=]+)=(?<val>.*)$") # break into k/v pairs
              | {(.name): .val}]                       # make each a JSON map
              | add                                    # combine those maps
             '
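To see the intermediate format the loop emits, you can divert it into od -c instead of jq; a quick sketch with a single made-up variable:
$ export ENV_VAR_A=one
$ printf '%s=%s\0' ENV_VAR_A "$ENV_VAR_A" | od -c
0000000   E   N   V   _   V   A   R   _   A   =   o   n   e  \0
0000016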

jq can look up the values from the environment itself.
$ export A=1
$ export B=2
$ cat tmp.txt
A
B
$ jq -Rn '[inputs] | map({key: ., value: $ENV[.]}) | from_entries' tmp.txt
{
"A": "1",
"B": "2"
}
A few notes on how this works:
-R reads raw text, rather than trying to parse the input as JSON
-n prevents jq from reading input itself.
inputs reads all the input explicitly, allowing an array of names to be built.
map creates an array of objects with key and value as the keys; within the map, . is the current element (a variable name), and $ENV[.] is the value of the environment variable whose name is that element.
from_entries finally coalesces all those {"key": ..., "value": ...} objects into a single object.
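If it helps to see the intermediate structure, drop the final from_entries to inspect the entry objects it folds together (same A/B setup as above):
$ jq -Rn '[inputs] | map({key: ., value: $ENV[.]})' tmp.txt
[
  {
    "key": "A",
    "value": "1"
  },
  {
    "key": "B",
    "value": "2"
  }
]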

Try something along the lines of the following script in bash:
# array of arguments to pass to jq
jqarg=()
# the script to pass to jq
jqscript=""
# just a number for the arg$num for indexing
# suggestion: just index using variable names...
num=1
# for each variable name from the input
while IFS= read -r varname; do
    # just an assertion - check that the variable is not empty
    # the syntax ${!var} is an indirect reference
    # you could do more here, ex. see if such a variable exists
    # or if $varname is a valid variable name
    if [[ -z "${!varname}" ]]; then
        echo "ERROR: variable $varname has empty value!" >&2
        exit 50
    fi
    # add the arguments to the jqarg array
    jqarg+=(--arg "arg$num" "${!varname}")
    # update jqscript
    # if jqscript is not empty, add a comma on the end
    if [[ -n "$jqscript" ]]; then
        jqscript+=","
    fi
    # add the ENV_VAR_A:$arg<number>
    jqscript+="$varname:\$arg$num"
    # update number - one up!
    num=$((num + 1))
    # the syntax of the while read loop is that the input file comes at the end
done < input_file_with_variable_names.txt
# finally execute jq
# note the `{` and `}` in `{$jqscript}` are intentional
jq -n "${jqarg[@]}" "{$jqscript}"
Just something that hopefully will give you an easier start on your journey in bash.
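As a quick sketch of how you might run it (the script name and values here are made up), assuming the script above is saved as build_json.sh:
$ export ENV_VAR_A=one ENV_VAR_B=two
$ printf '%s\n' ENV_VAR_A ENV_VAR_B > input_file_with_variable_names.txt
$ bash build_json.sh
{
  "ENV_VAR_A": "one",
  "ENV_VAR_B": "two"
}
Note that the generated jq program uses bare keys (ENV_VAR_A:$arg1), which only works while the names are valid jq identifiers; environment variable names (letters, digits, underscores) are fine.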
I guess I would do something unreadable with xargs like:
< input_file_with_variable_names.txt xargs -d$'\n' -n1 bash -c '
    printf %s\\0%s\\0%s\\0 --arg "$1" "${!1}"
' -- |
xargs -0 sh -c 'jq -n "$@" "$0"' "{$(
    sed 's/\(.*\)/\1: $\1 /' input_file_with_variable_names.txt |
    paste -sd,
)}"

Related

How to read yaml file into bash associative array?

I want to read into bash associative array the content of one yaml file, which is a simple key value mapping.
Example map.yaml
---
a: "2"
b: "3"
api_key: "somekey:thatcancontainany#chara$$ter"
the key can contain any characters excluding space
the value can contain any characters without limitations $!:=#etc
What will always be constant is that the separator between key and value is :
the script proceses.sh
#!/usr/bin/env bash
declare -A map
# how to read here into map variable, from map.yml file
#map=populatesomehowfrommap.yaml
for key in "${!map[#]}"
do
echo "key : $key"
echo "value: ${map[$key]}"
done
I tried to play around with the yq tool, similar to the JSON tool jq, but have not had success yet.
With the following limitations:
simple YAML key: "value" in single lines
keys cannot contain :
values are always wrapped in "
#!/usr/bin/env bash
declare -A map
regex='^([^:]+):[[:space:]]+"(.*)"[[:space:]]*$'
while IFS='' read -r line
do
if [[ $line =~ $regex ]]
then
printf -v map["${BASH_REMATCH[1]}"] '%b' "${BASH_REMATCH[2]}"
else
echo "skipping: $line" 1>&2
fi
done < map.yaml
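Fed the example map.yaml from the question, the loop reports the --- document marker on stderr and captures the three pairs; adding declare -p map after the loop would print something like:
skipping: ---
declare -A map=([api_key]="somekey:thatcancontainany#chara\$\$ter" [b]="3" [a]="2" )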
Update
Here's a robust solution using yq, which would be simpler if the builtin @tsv filter implemented the lossless TSV escaping rules instead of the CSV ones.
#!/usr/bin/env bash
declare -A map
while IFS=$'\t' read -r key value
do
printf -v map["$key"] '%b' "$value"
done < <(
yq e '
to_entries | .[] |
[
(.key | sub("\\","\\") | sub("\n","\n") | sub("\r","\r") | sub("\t","\t")),
(.value | sub("\\","\\") | sub("\n","\n") | sub("\r","\r") | sub("\t","\t"))
] |
join("	")
' map.yaml
)
note: the join needs a literal Tab
One way is to let yq output each key/value pair on a single line, in the following syntax:
key#value
Then we can use bash's IFS to split those values.
The # is just an example and can be replaced with any single char
This works, but please note the following limitations:
It does not expect nested values, only a flat list
The field separator (# in the example) must not occur in the YAML keys or values
#!/bin/bash
declare -A arr
while IFS="#" read -r key value
do
arr[$key]="$value"
done < <(yq e 'to_entries | .[] | (.key + "#" + .value)' input.yaml)
for key in "${!arr[#]}"
do
echo "key : $key"
echo "value: ${arr[$key]}"
done
$ cat input.yaml
---
a: "bar"
b: "foo"
$
$
$ ./script.sh
key : a
value: bar
key : b
value: foo
$
I used @Fravadona's answer so I will mark it as the answer.
After some modification for my use case, what worked for me looks like:
DEFS_PATH="definitions"
declare -A ssmMap
for file in ${DEFS_PATH}/*
do
filename=$(basename -- "$file")
projectName="${filename%.*}"
regex='^([^:]+):[[:space:]]*"(.*)"[[:space:]]*$'
while IFS='' read -r line
do
if [[ $line =~ $regex ]]
then
value="${BASH_REMATCH[2]}"
value=${value//"{{ ssm_env }}"/$INFRA_ENV}
value=${value//"{{ ssm_reg }}"/$SSM_REGION}
value=${value//"{{ projectName }}"/$projectName}
printf -v ssmMap["${BASH_REMATCH[1]}"] '%b' "$value"
else
echo "skipping: $line" 1>&2
fi
done < "$file"
done
Basically, in the real use case I have one folder where the YAML definitions are located. I iterate over all of them to build the associative array ssmMap.

How to use variable with jq cmd in shell

I am facing an issue with the commands below. I need to use a variable, but it returns null, whereas when I hardcode its value it returns the correct response.
Can anybody tell me the correct way to write this command?
My intention is to pull the value of the corresponding key passed as a variable.
temp1="{ \"SSM_DEV_SECRET_KEY\": \"Smkfnkhnb48dh\", \"SSM_DEV_GRAPH_DB\": \"Prod=bolt://neo4j:Grt56#atc.preprod.test.com:7687\", \"SSM_DEV_RDS_DB\": \"sqlite:////var/local/ecosystem_dashboard/config.db\", \"SSM_DEV_SUPPERUSER_USERNAME\": \"admin\", \"SSM_DEV_SUPPERUSER_PASSWORD\": \"9dW6JE8#KH9qiO006\" }"
var_name=SSM_DEV_SECRET_KEY
echo $temp1 | jq -r '.SSM_DEV_SECRET_KEY' <----- return Smkfnkhnb48dh // output
echo $temp1 | jq -r '."$var_name"' <---- return null
echo $temp1 | jq -r --arg var_name "$var_name" '."$var_name"' <---- return null , alternative way
Update: I am adding the actual piece of code where I am trying to use the above fix. My intention is to first read all values which start with SSM_DEV_..., then get their original values from AWS, and then substitute them in. One key pair looks like this --> SECRET_KEY=$SSM_DEV_SECRET_KEY
temp0="dev"
temp1="DEV"
result1=$(aws secretsmanager get-secret-value --secret-id "xxx-secret-$temp0" | jq '.SecretString')
while IFS= read -r line; do
if [[ "$line" == *"=\$SSM_$temp1"* ]]; then
before=${line%%"="*}
after=${line#*"="}
var_name="${after:1}"
jq -r --arg var_name "$var_name" '.[$var_name]' <<< "$result1"
fi
done < sample_file.txt
Fix: I have solved my issue, which was caused by a carriage return character.
The command below helped me:
var_name=`echo ${after:1} | tr -d '\r'`
jq -r --arg var_name "$var_name" '.[$var_name]' <<< "$result1"
You'll need to use the Generic Object Index (.[$var_name]) to let jq know the variable should be treated as a key.
The command should look like:
jq -r --arg var_name "$var_name" '.[$var_name]' <<< "$temp1"
Which will output:
Smkfnkhnb48dh
Note: <<< "$temp1" instead off the echo
Let's look at the following statement:
echo $temp1 | jq -r '."$var_name"' <---- return null
Your problem is actually with the shell quoting, not with jq. The single quotes tell the shell not to interpolate (do variable substitution), among other things (like escaping whitespace and preventing globbing). Thus, jq receives literally ."$var_name" as its script - which is not what you want. You simply need to remove the single quotes and you'll be good:
echo $temp1 | jq -r ."$var_name" <---- Does this work?
That said, I would never write my script that way. I would definitely want to include the '.' in the quoted string like this:
echo $temp1 | jq -r ".$var_name" <---- Does this work?
Some would also suggest that you quote "$temp1" as well (typically all variable references should be quoted to protect against white space, but this is not a problem with echo):
echo "$temp1" | jq -r ".$var_name" <---- Does this work?

How to get the output using jq for a json array for each value [duplicate]

I parsed a JSON file with jq like this:
# cat test.json | jq '.logs' | jq '.[]' | jq '._id' | jq -s
It returns an array like this: [34,235,436,546,.....]
Using a bash script I declared an array:
# declare -a msgIds = ...
This array uses () instead of [], so when I pass the array given above to it, it won't work.
([324,32,45..]) causes a problem. If I remove the jq -s, an array forms with only 1 member in it.
Is there a way to solve this issue?
We can solve this problem in two ways:
Input string:
// test.json
{
"keys": ["key1","key2","key3"]
}
Approach 1:
1) Use jq -r (output raw strings, not JSON texts).
KEYS=$(jq -r '.keys' test.json)
echo $KEYS
# Output: [ "key1", "key2", "key3" ]
2) Use @sh (converts the input to a series of space-separated, shell-quoted strings). It removes the square brackets [] and commas (,) from the string.
KEYS=$(<test.json jq -r '.keys | @sh')
echo $KEYS
# Output: 'key1' 'key2' 'key3'
3) Using tr to remove single quotes from the string output. To delete specific characters use the -d option in tr.
KEYS=$( (<test.json jq -r '.keys | @sh') | tr -d \')
echo $KEYS
# Output: key1 key2 key3
4) We can convert the space-separated string to an array by placing the string output in round brackets ().
This is also called compound assignment, where we declare the array with a bunch of values.
ARRAYNAME=(value1 value2 .... valueN)
#!/bin/bash
KEYS=($( (<test.json jq -r '.keys | @sh') | tr -d \'\"))
echo "Array size: " ${#KEYS[@]}
echo "Array elements: "${KEYS[@]}
# Output:
# Array size: 3
# Array elements: key1 key2 key3
Approach 2:
1) Use jq -r to get the string output, then use tr to delete characters like square brackets, double quotes and comma.
#!/bin/bash
KEYS=$(jq -r '.keys' test.json | tr -d '[],"')
echo $KEYS
# Output: key1 key2 key3
2) Then we can convert the resulting string to an array by placing the string output in round brackets ().
#!/bin/bash
KEYS=($(jq -r '.keys' test.json | tr -d '[]," '))
echo "Array size: " ${#KEYS[#]}
echo "Array elements: "${KEYS[#]}
# Output:
# Array size: 3
# Array elements: key1 key2 key3
To correctly parse values that have spaces, newlines (or any other arbitrary characters) just use jq's @sh filter and bash's declare -a. (No need for a while read loop or any other pre-processing)
// foo.json
{"data": ["A B", "C'D", ""]}
str=$(jq -r '.data | @sh' foo.json)
declare -a arr="($str)" # must be quoted like this
$ declare -p arr
declare -a arr=([0]="A B" [1]="C'D" [2]="")
The reason that this works correctly is that @sh will produce a space-separated list of shell-quoted words:
$ echo "$str"
'A B' 'C'\''D' ''
and this is exactly the format that declare expects for an array definition.
Use jq -r to output a string "raw", without JSON formatting, and use the @sh formatter to format your results as a string for shell consumption. Per the jq docs:
@sh:
The input is escaped suitable for use in a command-line for a POSIX shell. If the input is an array, the output will be a series of space-separated strings.
So you can do, e.g.:
msgids=($(<test.json jq -r '.logs[]._id | @sh'))
and get the result you want.
From the jq FAQ (https://github.com/stedolan/jq/wiki/FAQ):
Q: How can a stream of JSON texts produced by jq be converted into a bash array of corresponding values?
A: One option would be to use mapfile (aka readarray), for example:
mapfile -t array <<< $(jq -c '.[]' input.json)
An alternative that might be indicative of what to do in other shells is to use read -r within a while loop. The following bash script populates an array, x, with JSON texts. The key points are the use of the -c option, and the use of the bash idiom while read -r value; do ... done < <(jq .......):
#!/bin/bash
x=()
while read -r value
do
x+=("$value")
done < <(jq -c '.[]' input.json)
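For instance, with a made-up input.json, the array ends up holding one JSON text per element, which declare -p can confirm; the output would look something like:
$ echo '[{"a":1},{"b":2}]' > input.json
$ declare -p x
declare -a x=([0]="{\"a\":1}" [1]="{\"b\":2}")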
To resolve this, we can use a very simple approach.
Since I am not aware of your input file, I am creating a file input.json with the following contents:
input.json:
{
"keys": ["key1","key2","key3"]
}
Use jq to get the value from the above file input.json:
Command: cat input.json | jq -r '.keys | @sh'
Output: 'key1' 'key2' 'key3'
Explanation: | @sh removes the [ ] and " characters
To remove the ' quotes as well, we use tr
Command: cat input.json | jq -r '.keys | @sh' | tr -d \'
Explanation: use tr's -d (delete) option to remove '
To store this in a bash array, we use () with backticks and print it:
command:
KEYS=(`cat input.json | jq -r '.keys | @sh' | tr -d \'`)
To print all the entries of the array: echo "${KEYS[*]}"

exporting environment variables with spaces using jq

So, I'm trying to export an environment variable that comes from an API that returns JSON values. I'd like to use jq to do it as a one-liner, but if the values have spaces I cannot get it working.
Trying without surrounding the value in quotes
/app/src $ $(echo '{"params":[{ "Name":"KEY","Value":"value with space"}]}' | jq
-r '.params[] | "export " + .Name + "=" + .Value')
/app/src $ printenv KEY
value
/app/src $
Next, I try wrapping the value in quotes
/app/src $ $(echo '{"params":[{ "Name":"KEY","Value":"value with space"}]}' | jq
-r '.params[] | "export " + .Name + "=\"" + .Value + "\""')
sh: export: space": bad variable name
/app/src $
For all of the below, I'm assuming that:
json='{"params":[{ "Name":"KEY","Value":"value with space"}]}'
It can be done, but ONLY IF YOU TRUST YOUR INPUT.
A solution that uses eval might look like:
eval "$(jq -r '.params[] | "export \(.Name | #sh)=\(.Value | #sh)"' <<<"$json")"
The #sh builtin in jq escapes content to be eval-safe in bash, and the eval invocation then ensures that the content goes through all parsing stages (so literal quotes in the data emitted by jq become syntactic).
However, all solutions that allow arbitrary shell variables to be assigned have innate security problems, as the ability to set variables like PATH, LD_LIBRARY_PATH, LD_PRELOAD and the like can be leveraged into arbitrary code execution.
Better form is to generate a NUL-delimited key/value list...
build_kv_nsv() {
jq -j '.params[] |
((.Name | gsub("\u0000"; "")),
"\u0000",
(.Value | gsub("\u0000"; "")),
"\u0000")'
}
...and either populate an associative array...
declare -A content_received=( )
while IFS= read -r -d '' name && IFS= read -r -d '' value; do
content_received[$name]=$value
done < <(build_kv_nsv <<<"$json")
# print the value of the populated associative array
declare -p content_received
...or to use a namespace that's prefixed to guarantee safety.
while IFS= read -r -d '' name && IFS= read -r -d '' value; do
printf -v "received_$name" %s "$value" && export "received_$name"
done < <(build_kv_nsv <<<"$json")
# print names and values of our variables that start with received_
declare -p "${!received_#}" >&2
If the values are known not to contain (raw) newlines, and if you have access to mapfile, it may be worthwhile considering using it, e.g.
$ json='{"params":[{ "Name":"KEY","Value":"value with space"}]}'
$ mapfile -t KEY < <( jq -r '.params[] | .Value' <<< "$json" )
$ echo N=${#KEY[@]}
N=1
If the values might contain (raw) newlines, then you'd need a version of mapfile with the -d option, which could be used as illustrated below:
$ json='{"params":[{ "Name":"KEY1","Value":"value with space"}, { "Name":"KEY2","Value":"value with \n newline"}]}'
$ mapfile -d $'\0' KEY < <( jq -r -j '.params[] | .Value + "\u0000"' <<< "$json" )
$ echo N=${#KEY[@]}
N=2

How get variable value from properties file in Shell Script?

I have a properties file test.properties and it contents are as follows:
x.T1 = 125
y.T2 = 256
z.T3 = 351
How can I assign the value of y.T2 (256) to some variable within shell script and echo this value?
Check this; it will extract exactly that value:
expVal=`cat test.properties | grep "y.T2" | cut -d'=' -f2`
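Note that the unescaped dot in "y.T2" is a regex metacharacter (so the pattern would also match a line like yxT2), and cut keeps the space that follows the =. A slightly tighter sketch using sed, assuming the key = value layout shown above:
expVal=$(sed -n 's/^y\.T2[[:space:]]*=[[:space:]]*//p' test.properties)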
You want to use read in a loop in your script. While you can source a file, it doesn't work if there are spaces surrounding the = sign. Here is a way to handle reading the file:
#!/bin/sh
# test for required input filename
if [ ! -r "$1" ]; then
printf "error: insufficient input or file not readable. Usage: %s property_file\n" "$0"
exit 1
fi
# read each line into 3 variables: name, es, value
# (the es is just a junk variable to read the equal sign)
# test if '$name=y.T2' if so use '$value'
while read -r name es value; do
if [ "$name" == "y.T2" ]; then
myvalue="$value"
fi
done < "$1"
printf "\n myvalue = %s\n\n" "$myvalue"
Output
$ sh read_prop.sh test.properties
myvalue = 256
I know this is an old question, but I just came across this, and if I understand correctly, the question is how to get the value for a specific key from the properties file.
Why not just use grep to find the key and awk to get the value?
Use grep and awk to extract the value from test.properties
export yT2=$(grep -iR "^y.T2" test.properties | awk -F "=" '{print $2}')
echo y.T2=$yT2
This value will contain a space if there is one after the '='. Trim the leading whitespace:
yT2="${yT2#"${yT2%%[![:space:]]*}"}"
Reference for the trim:
How to trim whitespace from a Bash variable?
Explanation available at reference link.
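As a quick illustration of the trim idiom with a made-up value:
$ yT2=' 256'
$ yT2="${yT2#"${yT2%%[![:space:]]*}"}"
$ printf '[%s]\n' "$yT2"
[256]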
