How to extract two attribute values from a JSON output and assign each of them to a dedicated variable in bash?

Please, observe:
~$ az ad sp show --id $spn_id --query '[appId, displayName]'
[
"c...1",
"xyz"
]
~$
I would like to assign each returned value to its own bash variable, namely to APP_ID and APP_DISPLAY_NAME respectively. My current solution is this:
~$ x=(`az ad sp show --id $spn_id --query '[appId, displayName]' -o tsv | tr -d '\r'`)
~$ APP_ID=${x[0]}
~$ APP_DISPLAY_NAME=${x[1]}
~$
Which works fine:
~$ echo $APP_ID
c...1
~$ echo $APP_DISPLAY_NAME
xyz
~$
I am curious if it is possible to do it in a more concise way?

Yes, using jq would be the cleanest way:
str='[
"c...1",
"xyz"
]'
# The '.[N]' syntax gets the Nth item in the array
app_id=$(jq -r '.[0]' <<< "$str")
# 'jq' reads from standard input; the '<<<' here-string supplies it
app_display_name=$(jq -r '.[1]' <<< "$str")
printf '%s\n' "id: $app_id"
printf '%s\n' "name: $app_display_name"
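If you'd rather keep the original `-o tsv` approach but avoid array indexing, a single `read` can split the line directly. This is a sketch with printf standing in for the az call; the sample values are made up.

```shell
# Simulated `az ... -o tsv` output: both fields on one tab-separated line.
tsv_line=$(printf 'c...1\txyz')

# Split on the tab; the last variable absorbs any remaining fields,
# so a display name containing spaces still comes through intact.
IFS=$'\t' read -r APP_ID APP_DISPLAY_NAME <<< "$tsv_line"

echo "$APP_ID"
echo "$APP_DISPLAY_NAME"
```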

You can use the following with sed and xargs:
export $( az ad sp show --id $spn_id --query '[appId, displayName]' | tr -d "\n" | sed -E "s/\[\s+(.+)\,\s+(.+)\]/APP_ID=\"\1\" \nAPP_DISPLAY_NAME=\"\2\"/" | xargs -L 1)
~$ echo $APP_ID
c...1
~$ echo $APP_DISPLAY_NAME
xyz


Export multiple environment variables extracted from a single jq invocation

When I use
<some commands that output a json> | jq -r '.Credentials | .AccessKeyId, .SecretKey, .SessionToken'
I get the following output:
ABCDEF
123456
AAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBB
Three distinct lines with various keys.
I would like to export these outputs as exports:
export AWS_ACCESS_KEY_ID=<the first line of the output>
export AWS_SECRET_KEY=<the second line of the output>
export AWS_SESSION_TOKEN=<the third line of the output>
How do I do that (and still remain with oneliner)?
I tried doing the following:
<some commands that output a json> | jq -r '.Credentials | .AccessKeyId, .SecretKey, .SessionToken' | export AWS_ACCESS_KEY_ID=`awk 'NR==1'`
and it works but
<some commands that output a json> | jq -r '.Credentials | .AccessKeyId, .SecretKey, .SessionToken' | export AWS_ACCESS_KEY_ID=`awk 'NR==1'; export AWS_SECRET_KEY=`awk 'NR==2'`
hangs.
I'm using zsh.
An option not yet discussed by other answers is using the jq @sh filter to generate code that's directly safe to eval.
eval "$(printf '%s\n' "$json" | jq -r '
.Credentials | ("export AWS_ACCESS_KEY=\(.AccessKeyId | @sh)",
               "export AWS_SECRET_KEY=\(.SecretKey | @sh)",
               "export AWS_SESSION_TOKEN=\(.SessionToken | @sh)")')"
Note that the above could trivially be one line, and was broken up to generate three separate shell commands only for the sake of readability:
eval "$(printf '%s\n' "$json" | jq -r '.Credentials | "export AWS_ACCESS_KEY=\(.AccessKeyId | @sh) AWS_SECRET_KEY=\(.SecretKey | @sh) AWS_SESSION_TOKEN=\(.SessionToken | @sh)"')"
One advantage of this approach, which no other answers currently provide as-written, is that it will correctly handle keys or tokens that contain literal newlines.
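The quoting that @sh performs can be mimicked in pure bash with `printf %q`, which also produces eval-safe output; this sketch uses a made-up secret containing spaces and a quote.

```shell
value="it's got spaces"

# %q escapes the value so it survives exactly one round of shell parsing,
# playing the same role here that jq's @sh filter plays above.
printf -v quoted '%q' "$value"

# eval re-parses the generated assignment; the escaping keeps it one word.
eval "export AWS_SECRET_KEY=$quoted"

echo "$AWS_SECRET_KEY"
```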
As suggested by Charles Duffy, you can use something like this:
{ read -r AWS_ACCESS_KEY_ID && read -r AWS_SECRET_KEY && read -r AWS_SESSION_TOKEN; } << EOF
$(<some commands that output a json> | jq -r '.Credentials | .AccessKeyId, .SecretKey, .SessionToken')
EOF
export AWS_ACCESS_KEY_ID AWS_SECRET_KEY AWS_SESSION_TOKEN
Also, as suggested by Charles, you can use a here string, like this:
{ read -r AWS_ACCESS_KEY_ID && read -r AWS_SECRET_KEY && read -r AWS_SESSION_TOKEN; } <<<"$(some commands that output a json | jq -r '.Credentials | .AccessKeyId, .SecretKey, .SessionToken')"
export AWS_ACCESS_KEY_ID AWS_SECRET_KEY AWS_SESSION_TOKEN
And here is a proof of concept:
$ unset a b c
$ { read -r a && read -r b && read -r c; }<< EOF
$(cat t.txt)
EOF
$ echo $a $b $c
ABCDEF 123456 AAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBB
$ unset a b c
$ { read -r a && read -r b && read -r c; }<<<"$(cat t.txt)"
$ echo $a $b $c
ABCDEF 123456 AAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBBAAAAAABBBBB
I'd do it like this:
for i in AWS_ACCESS_KEY_ID AWS_SECRET_KEY AWS_SESSION_TOKEN; do
read "$i" &&
export "$i"
done \
< <(json commands |
jq -r '...')
The variables are only exported if something is successfully read. If you want them exported regardless (empty), just remove the "and" operator (&&).
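The loop works because `read "$i"` assigns to the variable *named* by `$i`. Here is the same pattern runnable as-is, with printf standing in for the json-producing pipeline and dummy values:

```shell
for i in AWS_ACCESS_KEY_ID AWS_SECRET_KEY AWS_SESSION_TOKEN; do
  # read "$i" stores the next input line in the variable whose name is in $i
  read -r "$i" && export "$i"
done < <(printf '%s\n' ABCDEF 123456 AAAABBBB)

echo "$AWS_ACCESS_KEY_ID $AWS_SECRET_KEY $AWS_SESSION_TOKEN"
```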

Using yq in for loop bash

I have a yaml array like below,
identitymappings:
- arn: "arn:aws:iam::12345567:role/AdmRole"
group: "system:masters"
user: "user1"
- arn: "arn:aws:iam::12345567:role/TestRole"
group: "system:masters"
user: "user2"
I am trying to parse this yaml in a bash script using for loop and yq.
for identityMapping in $(yq read test.yaml "identitymappings[*]"); do
roleArn=$identityMapping["arn"]
group=$identityMapping.group
user=$identityMapping.user
done
But I am not getting the expected results; I am not able to fetch the values of roleArn, group, and user.
Please let me know how to fix this.
The way I would do it is:
# load array into a bash array
# need to output each entry as a single line
readarray identityMappings < <(yq e -o=j -I=0 '.identitymappings[]' test.yml )
for identityMapping in "${identityMappings[@]}"; do
# identity mapping is a yaml snippet representing a single entry
roleArn=$(echo "$identityMapping" | yq e '.arn' -)
echo "roleArn: $roleArn"
done
output:
roleArn: arn:aws:iam::12345567:role/AdmRole
roleArn: arn:aws:iam::12345567:role/TestRole
Disclaimer: I wrote yq
I wasn't able to comment on Charles Duffy's proper answer, but this works for yq v4 without the use of jq...
while IFS=$'\t' read -r roleArn group user _; do
echo "Role: $roleArn"
echo "Group: $group"
echo "User: $user"
done < <(yq e '.identitymappings[] | [.arn, .group, .user] | @tsv' test.yaml)
The easiest way to read from jq or yq into bash is to use a BashFAQ #1 while read loop to handle line-oriented data; in the below, we use @tsv to generate line-oriented output:
while IFS=$'\t' read -r roleArn group user _; do
echo "Role: $roleArn"
echo "Group: $group"
echo "User: $user"
done < <(yq -j read test.yaml \
| jq -r '.identitymappings[] | [.arn, .group, .user] | @tsv')
Note that if you were using the Python yq rather than the Go one, you could remove the yq -j read and just use yq -r '...' in place of jq -r '...'.
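The read loop itself is plain bash and can be tried without yq or jq at all; below, printf fabricates the same tab-separated records the @tsv filter would emit.

```shell
roles=()
# Each record is arn<TAB>group<TAB>user; printf reuses its format string,
# so the six arguments produce two lines.
while IFS=$'\t' read -r roleArn group user _; do
  roles+=("$roleArn")
  echo "Role: $roleArn Group: $group User: $user"
done < <(printf '%s\t%s\t%s\n' \
  'arn:aws:iam::12345567:role/AdmRole'  'system:masters' 'user1' \
  'arn:aws:iam::12345567:role/TestRole' 'system:masters' 'user2')
```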
Here is an improvement on @Rad4's answer that worked for me.
You can neatly loop through using latest yq and jq via:
for im in $(yq eval -o=j test.yaml | jq -cr '.identitymappings[]'); do
arn=$(echo $im | jq -r '.arn' -)
group=$(echo $im | jq -r '.group' -)
user=$(echo $im | jq -r '.user' -)
echo $arn $group $user
done
This loops over compact one-line JSON objects, so jq still works inside the loop.
The answer by @mike.f is a good one. However, it does not work on macOS machines, because readarray is not an available command there.
You can read more about this here.
Here is the equivalent that would work on a mac:
# load array into a bash array
# need to output each entry as a single line
identityMappings=( $(yq e -o=j -I=0 '.identitymappings[]' test.yml ) )
for identityMapping in "${identityMappings[@]}"; do
# identity mapping is a yaml snippet representing a single entry
roleArn=$(echo "$identityMapping" | yq e '.arn' -)
echo "roleArn: $roleArn"
done
Get the length of identitymappings, then use an index to access each element:
array_length=`yq e ".identitymappings | length" test.yaml`
if [ "$array_length" -eq 0 ] ; then
exit
fi
for element_index in `seq 0 $((array_length - 1))`; do
arn=`yq e ".identitymappings[$element_index].arn" test.yaml`
group=`yq e ".identitymappings[$element_index].group" test.yaml`
user=`yq e ".identitymappings[$element_index].user" test.yaml`
done
How to get the length of an array in yq:
https://mikefarah.gitbook.io/yq/operators/length
I figured it out:
for identityMapping in $(yq read test.yaml -j "identitymappings[*]"); do
echo $identityMapping
roleArn=$(echo $identityMapping | jq -r '.arn')
echo $roleArn
group=$(echo $identityMapping | jq -r '.group')
echo $group
user=$(echo $identityMapping | jq -r '.user')
echo $user
done

Bash script with long command as a concatenated string

Here is a sample bash script:
#!/bin/bash
array[0]="google.com"
array[1]="yahoo.com"
array[2]="bing.com"
pasteCommand="/usr/bin/paste -d'|'"
for val in "${array[@]}"; do
pasteCommand="${pasteCommand} <(echo \$(/usr/bin/dig -t A +short $val)) "
done
output=`$pasteCommand`
echo "$output"
Somehow it shows an error:
/usr/bin/paste: invalid option -- 't'
Try '/usr/bin/paste --help' for more information.
How can I fix it so that it works fine?
//EDIT:
Expected output is to get result from the 3 dig executions in a string delimited with | character. Mainly I am using paste that way because it allows to run the 3 dig commands in parallel and I can separate output using a delimiter so then I can easily parse it and still know the dig output to which domain (e.g google.com for first result) is assigned.
First, you should read BashFAQ/050 to understand why your approach failed. In short, do not put complex commands inside variables.
A simple bash script to give intended output could be something like that:
#!/bin/bash
sites=(google.com yahoo.com bing.com)
iplist=
for site in "${sites[@]}"; do
# Capture command's output into ips variable
ips=$(/usr/bin/dig -t A +short "$site")
# Prepend a '|' character, replace each newline character in ips variable
# with a space character and append the resulting string to the iplist variable
iplist+=\|${ips//$'\n'/' '}
done
iplist=${iplist:1} # Remove the leading '|' character
echo "$iplist"
outputs
172.217.18.14|98.137.246.7 72.30.35.9 98.138.219.231 98.137.246.8 72.30.35.10 98.138.219.232|13.107.21.200 204.79.197.200
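The heavy lifting above is the expansion `${ips//$'\n'/' '}`; here is that step in isolation, with canned addresses instead of live dig output:

```shell
iplist=
# Two fake dig results; the second has one address per line.
for ips in $'172.217.18.14' $'98.137.246.7\n72.30.35.9'; do
  # Replace every newline in $ips with a space, append behind a '|'.
  iplist+=\|${ips//$'\n'/' '}
done
iplist=${iplist:1}   # drop the leading '|'

echo "$iplist"
```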
It's easier to answer a question when you specify the input and desired output in your question, and then show your attempt and why it doesn't work.
What I want is https://i.postimg.cc/13dsXvg7/required.png
$ array=("google.com" "yahoo.com" "bing.com")
$ printf "%s\n" "${array[#]}" | xargs -n1 sh -c '/usr/bin/dig -t A +short "$1" | paste -sd" "' _ | paste -sd '|'
172.217.16.14|72.30.35.9 98.138.219.231 98.137.246.7 98.137.246.8 72.30.35.10 98.138.219.232|204.79.197.200 13.107.21.200
I might try a recursive function like the following instead.
array=(google.com yahoo.com bing.com)
paster () {
dn=$1
shift
if [ "$#" -eq 0 ]; then
dig -t A +short "$dn"
else
paster "$#" | paste -d "|" <(dig -t A +short "$dn") -
fi
}
output=$(paster "${array[@]}")
echo "$output"
Now finally clear with expected output:
domains_arr=("google.com" "yahoo.com" "bing.com")
out_arr=()
for domain in "${domains_arr[@]}"
do
mapfile -t ips < <(dig -tA +short "$domain")
IFS=' '
# Join the ips array into a string with space as delimiter
# and add it to the out_arr
out_arr+=("${ips[*]}")
done
IFS='|'
# Join the out_arr array into a string with | as delimiter
echo "${out_arr[*]}"
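The join relies on `"${arr[*]}"` inserting the first character of IFS between elements; that step alone, with made-up values:

```shell
out_arr=('172.217.18.14' '98.137.246.7 72.30.35.9' '204.79.197.200')

IFS='|'               # first character of IFS becomes the join separator
joined="${out_arr[*]}"
unset IFS

echo "$joined"
```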
If the array is big (and not just 3 sites) you may benefit from parallelization:
array=("google.com" "yahoo.com" "bing.com")
parallel -k 'echo $(/usr/bin/dig -t A +short {})' ::: "${array[@]}" |
paste -sd '|'

exporting environment variables with spaces using jq

So, I'm trying to export an environment variable that comes from an API that returns JSON values. I'd like to use jq to do a one-liner, but if the values have spaces I cannot get it working.
Trying without surrounding the value in quotes
/app/src $ $(echo '{"params":[{ "Name":"KEY","Value":"value with space"}]}' | jq
-r '.params[] | "export " + .Name + "=" + .Value')
/app/src $ printenv KEY
value
/app/src $
Next, I try wrapping the value in quotes
/app/src $ $(echo '{"params":[{ "Name":"KEY","Value":"value with space"}]}' | jq
-r '.params[] | "export " + .Name + "=\"" + .Value + "\""')
sh: export: space": bad variable name
/app/src $
For all of the below, I'm assuming that:
json='{"params":[{ "Name":"KEY","Value":"value with space"}]}'
It can be done, but ONLY IF YOU TRUST YOUR INPUT.
A solution that uses eval might look like:
eval "$(jq -r '.params[] | "export \(.Name | @sh)=\(.Value | @sh)"' <<<"$json")"
The @sh builtin in jq escapes content to be eval-safe in bash, and the eval invocation then ensures that the content goes through all parsing stages (so literal quotes in the data emitted by jq become syntactic).
However, all solutions that allow arbitrary shell variables to be assigned have innate security problems, as the ability to set variables like PATH, LD_LIBRARY_PATH, LD_PRELOAD and the like can be leveraged into arbitrary code execution.
Better form is to generate a NUL-delimited key/value list...
build_kv_nsv() {
jq -j '.params[] |
((.Name | gsub("\u0000"; "")),
"\u0000",
(.Value | gsub("\u0000"; "")),
"\u0000")'
}
...and either populate an associative array...
declare -A content_received=( )
while IFS= read -r -d '' name && IFS= read -r -d '' value; do
content_received[$name]=$value
done < <(build_kv_nsv <<<"$json")
# print the value of the populated associative array
declare -p content_received
...or to use a namespace that's prefixed to guarantee safety.
while IFS= read -r -d '' name && IFS= read -r -d '' value; do
printf -v "received_$name" %s "$value" && export "received_$name"
done < <(build_kv_nsv <<<"$json")
# print names and values of our variables that start with received_
declare -p "${!received_@}" >&2
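The NUL-delimited handoff can be exercised without jq: `printf '%s\0'` emits each argument followed by a NUL byte, much like the -j output of build_kv_nsv. The key/value pairs below are invented.

```shell
declare -A content_received=( )

# read -r -d '' uses the NUL byte as the record delimiter, so names and
# values may safely contain spaces or even newlines.
while IFS= read -r -d '' name && IFS= read -r -d '' value; do
  content_received[$name]=$value
done < <(printf '%s\0' KEY 'value with space' OTHER $'line1\nline2')

echo "${content_received[KEY]}"
```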
If the values are known not to contain (raw) newlines, and if you have access to mapfile, it may be worth considering, e.g.
$ json='{"params":[{ "Name":"KEY","Value":"value with space"}]}'
$ mapfile -t KEY < <( jq -r '.params[] | .Value' <<< "$json" )
$ echo N=${#KEY[@]}
N=1
If the values might contain (raw) newlines, then you'd need a version of mapfile with the -d option, which could be used as illustrated below:
$ json='{"params":[{ "Name":"KEY1","Value":"value with space"}, { "Name":"KEY2","Value":"value with \n newline"}]}'
$ mapfile -d $'\0' KEY < <( jq -r -j '.params[] | .Value + "\u0000"' <<< "$json" )
$ echo N=${#KEY[@]}
N=2
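A self-contained version of the same idea, with printf standing in for jq (-t strips the trailing NUL from each element; -d '' requires bash 4.4 or later):

```shell
# mapfile -d '' splits the input on NUL bytes instead of newlines.
mapfile -t -d '' KEY < <(printf '%s\0' 'value with space' $'value with \n newline')

echo "N=${#KEY[@]}"
```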

Redirect output to a bash array

I have a file containing the string
ipAddress=10.78.90.137;10.78.90.149
I'd like to place these two IP addresses in a bash array. To achieve that I tried the following:
n=$(grep -i ipaddress /opt/ipfile | cut -d'=' -f2 | tr ';' ' ')
This results in extracting the values alright but for some reason the size of the array is returned as 1 and I notice that both the values are identified as the first element in the array. That is
echo ${n[0]}
returns
10.78.90.137 10.78.90.149
How do I fix this?
Thanks for the help!
Do you really need an array?
$ ipAddress="10.78.90.137;10.78.90.149"
$ IFS=";"
$ set -- $ipAddress
$ echo $1
10.78.90.137
$ echo $2
10.78.90.149
$ unset IFS
$ echo $@ # this is the "array"
If you want to put it into an array:
$ a=( "$@" )
$ echo ${a[0]}
10.78.90.137
$ echo ${a[1]}
10.78.90.149
@OP, regarding your method: set your IFS to a space
$ IFS=" "
$ n=( $(grep -i ipaddress file | cut -d'=' -f2 | tr ';' ' ' | sed 's/"//g' ) )
$ echo ${n[1]}
10.78.90.149
$ echo ${n[0]}
10.78.90.137
$ unset IFS
Also, there is no need to use so many tools; you can just use awk, or simply the bash shell:
#!/bin/bash
declare -a arr
while IFS="=" read -r caption addresses
do
case "$caption" in
ipAddress*)
addresses=${addresses//[\"]/}
arr=( ${arr[@]} ${addresses//;/ } )
esac
done < "file"
echo ${arr[@]}
output
$ more file
foo
bar
ipAddress="10.78.91.138;10.78.90.150;10.77.1.101"
foo1
ipAddress="10.78.90.137;10.78.90.149"
bar1
$./shell.sh
10.78.91.138 10.78.90.150 10.77.1.101 10.78.90.137 10.78.90.149
gawk
$ n=( $(gawk -F"=" '/ipAddress/{gsub(/\"/,"",$2);gsub(/;/," ",$2) ;printf $2" "}' file) )
$ echo ${n[@]}
10.78.91.138 10.78.90.150 10.77.1.101 10.78.90.137 10.78.90.149
This one works:
n=(`grep -i ipaddress filename | cut -d"=" -f2 | tr ';' ' '`)
EDIT: (improved, nestable version as per Dennis)
n=($(grep -i ipaddress filename | cut -d"=" -f2 | tr ';' ' '))
A variation on a theme:
$ line=$(grep -i ipaddress /opt/ipfile)
$ saveIFS="$IFS" # always save it and put it back to be safe
$ IFS="=;"
$ n=($line)
$ IFS="$saveIFS"
$ echo ${n[0]}
ipAddress
$ echo ${n[1]}
10.78.90.137
$ echo ${n[2]}
10.78.90.149
If the file has no other contents, you may not need the grep and you could read in the whole file.
$ saveIFS="$IFS"
$ IFS="=;"
$ n=$(</opt/ipfile)
$ IFS="$saveIFS"
A Perl solution:
n=($(perl -ne 's/ipAddress=(.*);/$1 / && print' filename))
which tests for and removes the unwanted characters in one operation.
You can do this by using IFS in bash:
First, read the first line from the file.
Second, split that into an array with = as the delimiter.
Third, split the value into an array with ; as the delimiter.
That's it!
#!/bin/bash
IFS=$'\n' read -r lstr < "a.txt"
IFS='=' read -r -a lstr_arr <<< "$lstr"
IFS=';' read -r -a ip_arr <<< "${lstr_arr[1]}"
echo ${ip_arr[0]}
echo ${ip_arr[1]}
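Since IFS may hold several single-character delimiters, the last two reads can also collapse into one; a sketch assuming exactly two addresses:

```shell
line='ipAddress=10.78.90.137;10.78.90.149'

# With IFS='=;', read splits on either '=' or ';' in a single pass.
IFS='=;' read -r key ip1 ip2 <<< "$line"

echo "$key $ip1 $ip2"
```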
