Export multiple environment variables extracted from a single jq invocation - shell

When I use
<some commands that output a json> | jq -r '.Credentials | .AccessKeyId, .SecretKey, .SessionToken'
I get the following output:
ABCDEF
123456
AAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBB
Three distinct lines with various keys.
I would like to export these values as environment variables:
export AWS_ACCESS_KEY_ID=<the first line of the output>
export AWS_SECRET_KEY=<the second line of the output>
export AWS_SESSION_TOKEN=<the third line of the output>
How do I do that while keeping it a one-liner?
I tried doing the following:
<some commands that output a json> | jq -r '.Credentials | .AccessKeyId, .SecretKey, .SessionToken' | export AWS_ACCESS_KEY_ID=`awk 'NR==1'`
and it works, but
<some commands that output a json> | jq -r '.Credentials | .AccessKeyId, .SecretKey, .SessionToken' | export AWS_ACCESS_KEY_ID=`awk 'NR==1'; export AWS_SECRET_KEY=`awk 'NR==2'`
hangs.
I'm using zsh.

An option not yet discussed by other answers is using jq's @sh filter to generate code that is directly safe to eval.
eval "$(printf '%s\n' "$json" | jq -r '
.Credentials | ("export AWS_ACCESS_KEY_ID=\(.AccessKeyId | @sh)",
"export AWS_SECRET_KEY=\(.SecretKey | @sh)",
"export AWS_SESSION_TOKEN=\(.SessionToken | @sh)")')"
Note that the above could trivially be one line; it was broken up into three separate shell commands only for the sake of readability:
eval "$(printf '%s\n' "$json" | jq -r '.Credentials | "export AWS_ACCESS_KEY_ID=\(.AccessKeyId | @sh) AWS_SECRET_KEY=\(.SecretKey | @sh) AWS_SESSION_TOKEN=\(.SessionToken | @sh)"')"
One advantage of this approach, which no other answer currently provides as-written, is that it correctly handles keys or tokens that contain literal newlines.
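As a quick check of that claim, here is a minimal sketch (the JSON here is made up, with a hypothetical session token containing a literal newline) showing the value round-tripping intact through @sh and eval:

```shell
#!/usr/bin/env bash
# Hypothetical credentials JSON; the token spans two lines (\n in JSON).
json='{"Credentials":{"AccessKeyId":"ABCDEF","SecretKey":"123456","SessionToken":"line1\nline2"}}'

# @sh single-quotes each value, so the generated assignments are eval-safe.
eval "$(jq -r '.Credentials
  | "export AWS_ACCESS_KEY_ID=\(.AccessKeyId | @sh)",
    "export AWS_SECRET_KEY=\(.SecretKey | @sh)",
    "export AWS_SESSION_TOKEN=\(.SessionToken | @sh)"' <<<"$json")"

printf '%s\n' "$AWS_SESSION_TOKEN"   # the embedded newline survives
```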

As suggested by Charles Duffy, you can use something like this:
{ read -r AWS_ACCESS_KEY_ID && read -r AWS_SECRET_KEY && read -r AWS_SESSION_TOKEN; } << EOF
$(<some commands that output a json> | jq -r '.Credentials | .AccessKeyId, .SecretKey, .SessionToken')
EOF
export AWS_ACCESS_KEY_ID AWS_SECRET_KEY AWS_SESSION_TOKEN
Also, as suggested by Charles, you can use a here string, like this:
{ read -r AWS_ACCESS_KEY_ID && read -r AWS_SECRET_KEY && read -r AWS_SESSION_TOKEN; } <<<"$(some commands that output a json | jq -r '.Credentials | .AccessKeyId, .SecretKey, .SessionToken')"
export AWS_ACCESS_KEY_ID AWS_SECRET_KEY AWS_SESSION_TOKEN
And here is a proof of concept:
$ unset a b c
$ { read -r a && read -r b && read -r c; }<< EOF
$(cat t.txt)
EOF
$ echo $a $b $c
ABCDEF 123456 AAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBB
$ unset a b c
$ { read -r a && read -r b && read -r c; }<<<"$(cat t.txt)"
$ echo $a $b $c
ABCDEF 123456 AAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBB

I'd do it like this:
for i in AWS_ACCESS_KEY_ID AWS_SECRET_KEY AWS_SESSION_TOKEN; do
  read -r "$i" &&
  export "$i"
done \
  < <(json commands |
      jq -r '...')
The variables are only exported if something is successfully read. If you want them exported regardless (even if empty), just remove the "and" operator (&&).
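For instance, the unconditional variant behaves like this (a sketch with a stand-in two-line input, so the third read hits EOF):

```shell
for i in AWS_ACCESS_KEY_ID AWS_SECRET_KEY AWS_SESSION_TOKEN; do
  read -r "$i"      # fails on EOF, but still assigns an empty string
  export "$i"       # so the variable is exported (empty) anyway
done < <(printf '%s\n' ABCDEF 123456)
```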


How to extract two attribute values from a json output and assign each of them to a dedicated variable in bash?

Please, observe:
~$ az ad sp show --id $spn_id --query '[appId, displayName]'
[
"c...1",
"xyz"
]
~$
I would like to assign each returned value to its own bash variable, namely to APP_ID and APP_DISPLAY_NAME respectively. My current solution is this:
~$ x=(`az ad sp show --id $spn_id --query '[appId, displayName]' -o tsv | tr -d '\r'`)
~$ APP_ID=${x[0]}
~$ APP_DISPLAY_NAME=${x[1]}
~$
Which works fine:
~$ echo $APP_ID
c...1
~$ echo $APP_DISPLAY_NAME
xyz
~$
I am curious if it is possible to do it in a more concise way?
Yes, using jq would be the cleanest way:
str='[
"c...1",
"xyz"
]'
# The '.[N]' syntax gets the Nth item in the array.
# jq reads its input from standard input; the '<<<' here-string feeds it in.
app_id=$(jq -r '.[0]' <<< "$str")
app_display_name=$(jq -r '.[1]' <<< "$str")
printf '%s\n' "id: $app_id"
printf '%s\n' "name: $app_display_name"
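If conciseness is the goal, both variables can also be filled from a single jq invocation; a sketch using @tsv and read (assuming neither value contains tabs or newlines):

```shell
str='["c...1", "xyz"]'
# @tsv joins the array elements with tabs; read splits them back apart.
IFS=$'\t' read -r APP_ID APP_DISPLAY_NAME < <(jq -r '@tsv' <<< "$str")
echo "$APP_ID"
echo "$APP_DISPLAY_NAME"
```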
You can use the following with sed and xargs:
export $( az ad sp show --id $spn_id --query '[appId, displayName]' | tr -d "\n" | sed -E "s/\[\s+(.+)\,\s+(.+)\]/APP_ID=\"\1\" \nAPP_DISPLAY_NAME=\"\2\"/" | xargs -L 1)
~$ echo $APP_ID
c...1
~$ echo $APP_DISPLAY_NAME
xyz

Writing a comparison BATCH file to verify sha256sum to released code

Trying to write a script that takes 2 arguments ($1 and $2), one for the $hash and one for the $file_name.
I am trying to utilize jq to parse the data required to download the file and report PASS or FAIL on comparison.
I seem to be stuck trying to think this out.
Here is my code
#!/usr/bin/env sh
#
# Sifchain shasum check (revised).
#
# $1
hash_url=$( curl -R -s https://api.github.com/repos/Sifchain/sifnode/releases | jq '.[] | select(.name=="v0.10.0-rc.4")' | jq '.assets[]' | jq 'select(.name=="sifnoded-v0.10.0-rc.4-linux-amd64.zip.sha256")' | jq '.browser_download_url' | xargs $1 $2 )
echo $hash_url
# $2
hash=$( curl -s -L $hash_url | jq'.$2')
file_name=$(curl -R -s https://api.github.com/repos/Sifchain/sifnode/releases | jq '.[] | .name')
#
#
echo $hash | sha256sum
echo $file_name | sha256sum #null why?
echo "\n"
## version of the release $1, and the hash $2
## sha256 <expected_sha_256_sum> <name_of_the_file>
sha256() {
if echo "$1 $2" #| sha256sum -c --quiet
then
echo pass $1 $2
exit 0
else
echo FAIL $1 $2
exit 1
fi
}
# Invoke sha256
sha256 $hash_url $file_name
Ideally this should work for any comparison of a hash with the correct file, pulling the 2 parameters when the bash script is invoked.
I can suggest the following corrections/modifications:
#!/bin/bash
#sha file
SHA_URL=$(curl -R -s https://api.github.com/repos/Sifchain/sifnode/releases | \
jq --arg VERSION v0.10.0-rc.4 -r \
'.[] | select(.name==$VERSION) | .assets[] | select(.name |test("\\.sha256$")) | .browser_download_url')
SHA_VALUE=$(curl -s -L $SHA_URL| tr 1 2)
FILENAME=$(curl -R -s https://api.github.com/repos/Sifchain/sifnode/releases | \
jq --arg VERSION v0.10.0-rc.4 -r \
'.[] | select(.name==$VERSION) | .assets[] | select(.content_type =="application/zip") | .name')
#added just for testing; I'm assuming you already have the files locally
FILEURL=$(curl -R -s https://api.github.com/repos/Sifchain/sifnode/releases | \
jq --arg VERSION v0.10.0-rc.4 -r \
'.[] | select(.name==$VERSION) | .assets[] | select(.content_type =="application/zip") | .browser_download_url')
wget --quiet $FILEURL -O $FILENAME
echo $SHA_VALUE $FILENAME | sha256sum -c --quiet >/dev/null 2>&1
RESULT=$?
if [ $RESULT -eq 0 ]; then
echo -n "PASS "
else
echo -n "FAIL "
fi
echo $SHA_VALUE $FILENAME
exit $RESULT
Notes:
jq
--arg VERSION v0.10.0-rc.4 creates a "variable" to be used in the script
-r - raw output, strings are not quoted
test("\\.sha256$") - regular expression, used to match any generic .sha256 asset so you don't have to hardcode the full name
select(.content_type =="application/zip") - I'm assuming that's the file you are searching for
wget is used just for demo purpose, to download the file, I'm assuming you already have the file on your machine
sha256sum -c --quiet >/dev/null 2>&1 - redirecting to /dev/null is necessary because in case of error sha256sum is not quiet
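To illustrate the "<hash>  <name>" input format that sha256sum -c expects, here is a standalone sketch with a throwaway file (not the Sifchain release):

```shell
# Record a file's checksum in the format sha256sum -c understands.
printf 'hello\n' > demo.txt
sha256sum demo.txt > demo.txt.sha256

# -c re-reads that line and verifies the file; the exit status drives PASS/FAIL.
if sha256sum -c --quiet demo.txt.sha256 >/dev/null 2>&1; then
  echo PASS
else
  echo FAIL
fi
```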

Using yq in for loop bash

I have a yaml array like below,
identitymappings:
- arn: "arn:aws:iam::12345567:role/AdmRole"
group: "system:masters"
user: "user1"
- arn: "arn:aws:iam::12345567:role/TestRole"
group: "system:masters"
user: "user2"
I am trying to parse this yaml in a bash script using for loop and yq.
for identityMapping in $(yq read test.yaml "identitymappings[*]"); do
roleArn=$identityMapping["arn"]
group=$identityMapping.group
user=$identityMapping.user
done
But I am not getting the expected results: I am not able to fetch the values of roleArn, group and user.
Please let me know how to fix this.
The way I would do it is:
# load array into a bash array
# need to output each entry as a single line
readarray identityMappings < <(yq e -o=j -I=0 '.identitymappings[]' test.yml )
for identityMapping in "${identityMappings[@]}"; do
# identity mapping is a yaml snippet representing a single entry
roleArn=$(echo "$identityMapping" | yq e '.arn' -)
echo "roleArn: $roleArn"
done
output:
roleArn: arn:aws:iam::12345567:role/AdmRole
roleArn: arn:aws:iam::12345567:role/TestRole
Disclaimer: I wrote yq
I wasn't able to comment on Charles Duffy's proper answer, but this works for yq v4 without the use of jq...
while IFS=$'\t' read -r roleArn group user _; do
echo "Role: $roleArn"
echo "Group: $group"
echo "User: $user"
done < <(yq e '.identitymappings[] | [.arn, .group, .user] | @tsv' test.yaml)
The easiest way to read from jq or yq into bash is to use a BashFAQ #1 while read loop to handle line-oriented data; in the below, we use @tsv to generate line-oriented output:
while IFS=$'\t' read -r roleArn group user _; do
echo "Role: $roleArn"
echo "Group: $group"
echo "User: $user"
done < <(yq -j read test.yaml \
| jq -r '.identitymappings[] | [.arn, .group, .user] | @tsv')
Note that if you were using the Python yq rather than the Go one, you could remove the yq -j read and just use yq -r '...' in place of jq -r '...'.
There is an improvement of @Rad4's answer that worked for me.
You can neatly loop through using latest yq and jq via:
for im in $(yq eval -o=j test.yaml | jq -cr '.identitymappings[]'); do
arn=$(echo $im | jq -r '.arn' -)
group=$(echo $im | jq -r '.group' -)
user=$(echo $im | jq -r '.user' -)
echo $arn $group $user
done
This loops over valid single-line JSON documents, so jq still works inside the loop.
The answer by @mike.f is a good one. However, it does not work on OSX machines, because readarray is not an available command.
You can read more about this here.
Here is the equivalent that would work on a mac:
# load array into a bash array
# need to output each entry as a single line
identityMappings=( $(yq e -o=j -I=0 '.identitymappings[]' test.yml ) )
for identityMapping in "${identityMappings[@]}"; do
# identity mapping is a yaml snippet representing a single entry
roleArn=$(echo "$identityMapping" | yq e '.arn' -)
echo "roleArn: $roleArn"
done
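A further portable variant is a plain while read loop, which needs neither readarray nor an array, and also avoids the word splitting that the unquoted array assignment above is subject to. A sketch (emit_entries is a hypothetical stand-in for the yq command, so the loop can be shown self-contained):

```shell
# stand-in for: yq e -o=j -I=0 '.identitymappings[]' test.yml
emit_entries() {
  printf '%s\n' \
    '{"arn":"arn:aws:iam::12345567:role/AdmRole","group":"system:masters","user":"user1"}' \
    '{"arn":"arn:aws:iam::12345567:role/TestRole","group":"system:masters","user":"user2"}'
}

# each line is one single-line JSON entry
while IFS= read -r identityMapping; do
  roleArn=$(echo "$identityMapping" | jq -r '.arn')
  echo "roleArn: $roleArn"
done < <(emit_entries)
```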
Another option: get the length of identitymappings, then use an index to access each element:
array_length=`yq e ".identitymappings | length - 1" test.yml`
if [ $array_length -lt 0 ] ; then
exit
fi
for element_index in `seq 0 $array_length`; do
arn=`yq e ".identitymappings[$element_index].arn" test.yml`
group=`yq e ".identitymappings[$element_index].group" test.yml`
user=`yq e ".identitymappings[$element_index].user" test.yml`
done
How to get length of array in yq?
https://mikefarah.gitbook.io/yq/operators/length
I figured it out:
for identityMapping in $(yq read test.yaml -j "identitymappings[*]"); do
echo "$identityMapping"
roleArn=$(echo "$identityMapping" | jq -r '.arn')
echo "$roleArn"
group=$(echo "$identityMapping" | jq -r '.group')
echo "$group"
user=$(echo "$identityMapping" | jq -r '.user')
echo "$user"
done

exporting environment variables with spaces using jq

So, I'm trying to export an environment variable that comes from an API that returns JSON values. I would like to use jq to do it as a one-liner, but if the values have spaces I cannot get it working.
Trying without surrounding the value in quotes
/app/src $ $(echo '{"params":[{ "Name":"KEY","Value":"value with space"}]}' | jq
-r '.params[] | "export " + .Name + "=" + .Value')
/app/src $ printenv KEY
value
/app/src $
Next, I try wrapping the value in quotes
/app/src $ $(echo '{"params":[{ "Name":"KEY","Value":"value with space"}]}' | jq
-r '.params[] | "export " + .Name + "=\"" + .Value + "\""')
sh: export: space": bad variable name
/app/src $
For all of the below, I'm assuming that:
json='{"params":[{ "Name":"KEY","Value":"value with space"}]}'
It can be done, but ONLY IF YOU TRUST YOUR INPUT.
A solution that uses eval might look like:
eval "$(jq -r '.params[] | "export \(.Name | @sh)=\(.Value | @sh)"' <<<"$json")"
The @sh filter in jq escapes content to be eval-safe in bash, and the eval invocation then ensures that the content goes through all parsing stages (so literal quotes in the data emitted by jq become syntactic).
However, all solutions that allow arbitrary shell variables to be assigned have innate security problems, as the ability to set variables like PATH, LD_LIBRARY_PATH, LD_PRELOAD and the like can be leveraged into arbitrary code execution.
Better form is to generate a NUL-delimited key/value list...
build_kv_nsv() {
jq -j '.params[] |
((.Name | gsub("\u0000"; "")),
"\u0000",
(.Value | gsub("\u0000"; "")),
"\u0000")'
}
...and either populate an associative array...
declare -A content_received=( )
while IFS= read -r -d '' name && IFS= read -r -d '' value; do
content_received[$name]=$value
done < <(build_kv_nsv <<<"$json")
# print the value of the populated associative array
declare -p content_received
...or to use a namespace that's prefixed to guarantee safety.
while IFS= read -r -d '' name && IFS= read -r -d '' value; do
printf -v "received_$name" %s "$value" && export "received_$name"
done < <(build_kv_nsv <<<"$json")
# print names and values of our variables that start with received_
declare -p "${!received_@}" >&2
If the values are known not to contain (raw) newlines, and if you have access to mapfile, it may be worthwhile considering using it, e.g.
$ json='{"params":[{ "Name":"KEY","Value":"value with space"}]}'
$ mapfile -t KEY < <( jq -r '.params[] | .Value' <<< "$json" )
$ echo N=${#KEY[@]}
N=1
If the values might contain (raw) newlines, then you'd need a version of mapfile with the -d option, which could be used as illustrated below:
$ json='{"params":[{ "Name":"KEY1","Value":"value with space"}, { "Name":"KEY2","Value":"value with \n newline"}]}'
$ mapfile -d $'\0' KEY < <( jq -r -j '.params[] | .Value + "\u0000"' <<< "$json" )
$ echo N=${#KEY[@]}
N=2

Bash Parse Variable Values

I have a command that returns 108 week/enumeration pairs:
Command:
impala-shell -B -f query.sql
Results:
20180203 1
20180127 2
20180120 3
...
I parsed the results and read the week and enumeration into two variables. However, I have to use a variable wk to store intermediate results first:
wk="$(impala-shell -B -f query.sql)"
echo "$wk" | while read -r a b; do echo $a--$b; done
I tried to avoid using additional variable wk:
"$(impala-shell -B -f query.sql)" | while read -r a b; do echo $a--$b; done
But it returned:
...
20160213 104
20160206 105
20160130 106
20160123 107
20160116 108: command not found
I understand you can use wk="$(impala-shell -B -f query.sql)" && echo "$wk" | while read -r a b; do echo $a--$b; done but that doesn't avoid using a variable in the middle. How to compose a one-liner without using the variable wk?
awk to the rescue!
$ impala-shell -B -f query.sql | awk '{print $1"--"$2}'
You can execute commands first (inline) when using special quotes ``
Try this (untested, as I have neither your shell nor that script):
`impala-shell -B -f query.sql` | while read -r a b; do echo $a--$b; done
Most elegant answer goes to choroba in the question comments! You just need to remove the quotes!
impala-shell -B -f query.sql | while read -r a b ; do echo $a--$b; done
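One caveat worth noting about the pipe version: in bash, each side of a pipe runs in a subshell, so variables set inside the loop vanish afterwards. If the loop body needs to set variables that survive, feed the command in via process substitution instead. A sketch (emit_weeks is a hypothetical stand-in, as I don't have impala-shell):

```shell
# stand-in for: impala-shell -B -f query.sql
emit_weeks() { printf '%s %s\n' 20180203 1 20180127 2; }

count=0
while read -r a b; do
  echo "$a--$b"
  count=$((count + 1))
done < <(emit_weeks)

echo "rows seen: $count"   # still visible here, unlike with a pipe
```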
