I have the command below:
ExpirationDate=$(date -d '+60 days' +'%Y-%m-%d')
VaultName="abc"
getapp=$(az keyvault secret list --vault-name $VaultName --query "[].{SecretName:name,ExpiryDate:attributes.expires} [?ExpiryDate<='$ExpirationDate']" | jq '.[].SecretName' | tr -d '"')
getserviceprincipal=$(az keyvault secret list --vault-name $VaultName --query "[].{Type:contentType,ExpiryDate:attributes.expires} [?ExpiryDate<='$ExpirationDate']" | jq '.[].Type' | tr -d '"')
## get length of $getapp array
len=${#getapp[@]}
## Use bash for loop
for (( i=0; i<=$len-1; i++ ))
do
echo "${getapp[$i]}"
./resetpassword.sh -a ${getapp[$i]} -s ${getserviceprincipal[$i]} -y
echo "${getserviceprincipal[$i]}"
done
In this command I want to store all the values from the vault in getapp, and similarly in getserviceprincipal. For example, if I have more than 2 secrets, the script does not work because $getapp is not storing the values as an array.
Can anyone help me with a simple solution? Thanks in advance.
readarray -t getapp < <( az keyvault ... | tr -d '"' ) should do the trick here.
Note that this requires newlines to be valid delimiters. If there can be newlines in your data then you'll have to pick a different delimiter with the -d delim option. If there isn't any single delimiter that works everywhere then bash may not be the best choice for this.
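For example, applied to the commands in the question (a sketch; jq -r replaces the tr -d '"' step, and the second array is filled the same way):
readarray -t getapp < <(
    az keyvault secret list --vault-name "$VaultName" \
      --query "[].{SecretName:name,ExpiryDate:attributes.expires} [?ExpiryDate<='$ExpirationDate']" \
      | jq -r '.[].SecretName'
)
readarray -t getserviceprincipal < <(
    az keyvault secret list --vault-name "$VaultName" \
      --query "[].{Type:contentType,ExpiryDate:attributes.expires} [?ExpiryDate<='$ExpirationDate']" \
      | jq -r '.[].Type'
)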
Since you are using jq, I think you could do something like this:
declare -a getapp=()
declare -a getserviceprincipal=()
# note: be sure to check that the resulting bash is valid!
eval "$(az keyvault secret list \
--vault-name $VaultName \
--query "[].{SecretName:name,Type:contentType,ExpiryDate:attributes.expires} [?ExpiryDate<='$ExpirationDate']" \
| jq --raw-output '.[] | @sh "getapp+=( \(.SecretName) ) ; getserviceprincipal+=( \(.Type) )"')"
If all goes well, this will result in getapp and getserviceprincipal being filled as arrays: https://jqplay.org/s/BbHMn9i79KB
Note:
as you can see, you don't need to invoke your command (az) twice.
you can also extract the jq expression to a file using the --from-file option, which may help when reading it and handling shell quotes.
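For completeness, a sketch of how the filled arrays could then drive the loop from the question, iterating both in lockstep over the array indices:
for i in "${!getapp[@]}"; do
    echo "${getapp[$i]}"
    ./resetpassword.sh -a "${getapp[$i]}" -s "${getserviceprincipal[$i]}" -y
done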
Related
Background
I want to be able to pass a json file to WP CLI, to iteratively create posts.
So I thought I could create a JSON file:
[
{
"post_type": "post",
"post_title": "Test",
"post_content": "[leaflet-map][leaflet-marker]",
"post_status": "publish"
},
{
"post_type": "post",
"post_title": "Number 2",
"post_content": "[leaflet-map fitbounds][leaflet-circle]",
"post_status": "publish"
}
]
and iterate the array with jq:
cat posts.json | jq --raw-output .[]
I want to be able to iterate these to execute a similar function:
wp post create \
--post_type=post \
--post_title='Test Map' \
--post_content='[leaflet-map] [leaflet-marker]' \
--post_status='publish'
Is there a way I can do this with jq, or similar?
The closest I've gotten so far is this:
> for i in $(cat posts.json | jq -c .[]); do echo $i; done
But this seems to take issue with the (valid) spaces in the strings. Output:
{"post_type":"post","post_title":"Test","post_content":"[leaflet-map][leaflet-marker]","post_status":"publish"}
{"post_type":"post","post_title":"Number
2","post_content":"[leaflet-map
fitbounds][leaflet-circle]","post_status":"publish"}
Am I way off with this approach, or can it be done?
Use a while to read entire lines, rather than iterating over the words resulting from the command substitution.
while IFS= read -r obj; do
...
done < <(jq -c '.[]' posts.json)
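Inside the loop, each field can then be pulled out of $obj with jq -r; a sketch (not the only way) of how the wp call from the question might be filled in:
while IFS= read -r obj; do
    wp post create \
      --post_type="$(jq -r '.post_type' <<< "$obj")" \
      --post_title="$(jq -r '.post_title' <<< "$obj")" \
      --post_content="$(jq -r '.post_content' <<< "$obj")" \
      --post_status="$(jq -r '.post_status' <<< "$obj")"
done < <(jq -c '.[]' posts.json)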
Maybe this would work for you:
Make a bash executable, maybe call it wpfunction.sh
#!/bin/bash
wp post create \
--post_type="$1" \
--post_title="$2" \
--post_content="$3" \
--post_status="$4"
Then run jq on your posts.json and pipe it into xargs
jq -M -c '.[] | [.post_type, .post_title, .post_content, .post_status][]' \
posts.json | xargs -n4 ./wpfunction.sh
I am experimenting to see how this would handle post_content that contained quotes...
First generate an array of the arguments you wish to pass, then convert it to a shell-compatible form using @sh. Then you could pass it to xargs to invoke the command.
$ jq -r '.[] | ["post", "create", (to_entries[] | "--\(.key)=\(.value|tojson)")] | @sh' input.json | xargs wp
I'm using git, then posting the commit message and other bits as a JSON payload to a server.
Currently I have:
MSG=`git log -n 1 --format=oneline | grep -o ' .\+'`
which sets MSG to something like:
Calendar can't go back past today
then
curl -i -X POST \
-H 'Accept: application/text' \
-H 'Content-type: application/json' \
-d "{'payload': {'message': '$MSG'}}" \
'https://example.com'
My real JSON has another couple of fields.
This works fine, but of course when I have a commit message such as the one above with an apostrophe in it, the JSON is invalid.
How can I escape the characters required in bash? I'm not familiar with the language, so am not sure where to start. Replacing ' with \' would do the job at minimum I suspect.
jq can do this.
Lightweight, free, and written in C, jq enjoys widespread community support with over 15k stars on GitHub. I personally find it very speedy and useful in my daily workflow.
Convert string to JSON
echo -n '猫に小判' | jq -Rsa .
# "\u732b\u306b\u5c0f\u5224"
To explain,
-R means "raw input"
-s means "slurp": read the whole input into a single string (so linebreaks are kept)
-a means "ascii output" (optional)
. means "output the root of the JSON document"
Git + Grep Use Case
To fix the code example given by the OP, simply pipe through jq.
MSG=`git log -n 1 --format=oneline | grep -o ' .\+' | jq -Rsa .`
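Because $MSG now carries its own surrounding double quotes (jq emits a JSON string), it can be spliced straight into the payload; a sketch of the curl call, with the keys switched to valid double-quoted JSON:
curl -i -X POST \
  -H 'Accept: application/text' \
  -H 'Content-type: application/json' \
  -d "{\"payload\": {\"message\": $MSG}}" \
  'https://example.com'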
Using Python:
This solution is not pure bash, but it's non-invasive and handles unicode.
json_escape () {
printf '%s' "$1" | python -c 'import json,sys; print(json.dumps(sys.stdin.read()))'
}
Note that JSON is part of the standard python libraries and has been for a long time, so this is a pretty minimal python dependency.
Or using PHP:
json_escape () {
printf '%s' "$1" | php -r 'echo json_encode(file_get_contents("php://stdin"));'
}
Use like so:
$ json_escape "ヤホー"
"\u30e4\u30db\u30fc"
Instead of worrying about how to properly quote the data, just save it to a file and use the @ construct that curl allows with the --data option. To ensure that the output of git is correctly escaped for use as a JSON value, use a tool like jq to generate the JSON, instead of creating it manually.
jq -n --arg msg "$(git log -n 1 --format=oneline | grep -o ' .\+')" \
'{payload: { message: $msg }}' > git-tmp.txt
curl -i -X POST \
-H 'Accept: application/text' \
-H 'Content-type: application/json' \
-d @git-tmp.txt \
'https://example.com'
You can also read directly from standard input using -d @-; I leave that as an exercise for the reader to construct the pipeline that reads from git and produces the correct payload message to upload with curl.
(Hint: it's jq ... | curl ... -d @- 'https://example.com' )
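One possible shape for that pipeline (a sketch, not necessarily the exact command the author had in mind; jq -R reads the raw line as a string):
git log -n 1 --format=oneline | grep -o ' .\+' | \
  jq -R '{payload: {message: .}}' | \
  curl -i -X POST \
    -H 'Accept: application/text' \
    -H 'Content-type: application/json' \
    -d @- \
    'https://example.com'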
I was also trying to escape characters in Bash, for transfer using JSON, when I came across this. I found that there is actually a larger list of characters that must be escaped – particularly if you are trying to handle free form text.
There are two tips I found useful:
Use the Bash ${string//substring/replacement} syntax described in this thread.
Use the actual control characters for tab, newline, carriage return, etc. In vim you can enter these by typing Ctrl+V followed by the actual control code (Ctrl+I for tab for example).
The resultant Bash replacements I came up with are as follows:
JSON_TOPIC_RAW=${JSON_TOPIC_RAW//\\/\\\\} # \
JSON_TOPIC_RAW=${JSON_TOPIC_RAW//\//\\\/} # /
JSON_TOPIC_RAW=${JSON_TOPIC_RAW//\'/\\\'} # ' (not strictly needed ?)
JSON_TOPIC_RAW=${JSON_TOPIC_RAW//\"/\\\"} # "
JSON_TOPIC_RAW=${JSON_TOPIC_RAW// /\\t} # \t (tab)
JSON_TOPIC_RAW=${JSON_TOPIC_RAW//
/\\\n} # \n (newline)
JSON_TOPIC_RAW=${JSON_TOPIC_RAW//^M/\\\r} # \r (carriage return)
JSON_TOPIC_RAW=${JSON_TOPIC_RAW//^L/\\\f} # \f (form feed)
JSON_TOPIC_RAW=${JSON_TOPIC_RAW//^H/\\\b} # \b (backspace)
I have not at this stage worked out how to escape Unicode characters correctly which is also (apparently) required. I will update my answer if I work this out.
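If it helps, the same replacements can be wrapped in a small helper that uses $'...' quoting instead of literal control characters (a sketch; the function name json_escape_bash is made up here):
json_escape_bash() {
    local s=$1
    s=${s//\\/\\\\}      # backslash (must come first)
    s=${s//\"/\\\"}      # double quote
    s=${s//$'\t'/\\t}    # tab
    s=${s//$'\n'/\\n}    # newline
    s=${s//$'\r'/\\r}    # carriage return
    s=${s//$'\f'/\\f}    # form feed
    s=${s//$'\b'/\\b}    # backspace
    printf '%s' "$s"
}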
OK, found out what to do. Bash supports this natively as expected, though as always, the syntax isn't really very guessable!
Essentially ${string//substring/replacement} returns what you'd imagine, so you can use
MSG=${MSG//\'/\\\'}
To do this. The next problem is that the first regex doesn't work anymore, but that can be replaced with
git log -n 1 --pretty=format:'%s'
In the end, I didn't even need to escape them. Instead, I just swapped all the ' in the JSON to \". Well, you learn something every day.
git log -n 1 --format=oneline | grep -o ' .\+' | jq --slurp --raw-input .
The above line works for me. Refer to
https://github.com/stedolan/jq for more jq tools.
I found something like this:
MSG=`echo $MSG | sed "s/'/\\\\\'/g"`
The simplest way is using jshon, a command line tool to parse, read and create JSON.
jshon -s 'Your data goes here.' 2>/dev/null
[...] with an apostrophe in it, the JSON is invalid.
Not according to https://www.json.org. A single quote is allowed in a JSON string.
How can I escape the characters required in bash?
You can use xidel to properly prepare the JSON you want to POST.
As https://example.com can't be tested, I'll be using https://api.github.com/markdown (see this answer) as an example.
Let's assume 'çömmít' "mêssågè" as the exotic output of git log -n 1 --pretty=format:'%s'.
Create the (serialized) JSON object with the value of the "text"-attribute properly escaped:
$ git log -n 1 --pretty=format:'%s' | \
xidel -se 'serialize({"text":$raw},{"method":"json","encoding":"us-ascii"})'
{"text":"'\u00E7\u00F6mm\u00EDt' \"m\u00EAss\u00E5g\u00E8\""}
Curl (variable)
$ eval "$(
git log -n 1 --pretty=format:'%s' | \
xidel -se 'msg:=serialize({"text":$raw},{"method":"json","encoding":"us-ascii"})' --output-format=bash
)"
$ echo $msg
{"text":"'\u00E7\u00F6mm\u00EDt' \"m\u00EAss\u00E5g\u00E8\""}
$ curl -d "$msg" https://api.github.com/markdown
<p>'çömmít' "mêssågè"</p>
Curl (pipe)
$ git log -n 1 --pretty=format:'%s' | \
xidel -se 'serialize({"text":$raw},{"method":"json","encoding":"us-ascii"})' | \
curl -d#- https://api.github.com/markdown
<p>'çömmít' "mêssågè"</p>
Actually, there's no need for curl if you're already using xidel.
Xidel (pipe)
$ git log -n 1 --pretty=format:'%s' | \
xidel -s \
-d '{serialize({"text":read()},{"method":"json","encoding":"us-ascii"})}' \
"https://api.github.com/markdown" \
-e '$raw'
<p>'çömmít' "mêssågè"</p>
Xidel (pipe, in-query)
$ git log -n 1 --pretty=format:'%s' | \
xidel -se '
x:request({
"post":serialize(
{"text":$raw},
{"method":"json","encoding":"us-ascii"}
),
"url":"https://api.github.com/markdown"
})/raw
'
<p>'çömmít' "mêssågè"</p>
Xidel (all in-query)
$ xidel -se '
x:request({
"post":serialize(
{"text":system("git log -n 1 --pretty=format:'\''%s'\''")},
{"method":"json","encoding":"us-ascii"}
),
"url":"https://api.github.com/markdown"
})/raw
'
<p>'çömmít' "mêssågè"</p>
This is an escaping solution using Perl that escapes backslash (\), double-quote (") and control characters U+0000 to U+001F:
$ echo -ne "Hello, 🌵\n\tBye" | \
perl -pe 's/(\\(\\\\)*)/$1$1/g; s/(?!\\)(["\x00-\x1f])/sprintf("\\u%04x",ord($1))/eg;'
Hello, 🌵\u000a\u0009Bye
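Applied to the question's setup, this might look like the following sketch (using the --pretty=format:'%s' form mentioned elsewhere in this thread, which avoids a trailing newline; the surrounding double quotes are added in the payload itself):
MSG=$(git log -n 1 --pretty=format:'%s' | \
  perl -pe 's/(\\(\\\\)*)/$1$1/g; s/(?!\\)(["\x00-\x1f])/sprintf("\\u%04x",ord($1))/eg;')
curl -i -X POST \
  -H 'Accept: application/text' \
  -H 'Content-type: application/json' \
  -d "{\"payload\": {\"message\": \"$MSG\"}}" \
  'https://example.com'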
I struggled with the same problem. I was trying to add a variable on the payload of cURL in bash and it kept returning as invalid_JSON. After trying a LOT of escaping tricks, I reached a simple method that fixed my issue. The answer was all in the single and double quotes:
curl --location --request POST 'https://hooks.slack.com/services/test-slack-hook' \
--header 'Content-Type: application/json' \
--data-raw '{"text":'"$data"'}'
Maybe it comes in handy for someone!
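Note that this only yields valid JSON if $data already holds a JSON-encoded string, quotes included. A sketch of preparing it that way with jq -Rsa (as shown earlier in this thread), assuming the raw text sits in $MSG:
data=$(printf '%s' "$MSG" | jq -Rsa .)
curl --location --request POST 'https://hooks.slack.com/services/test-slack-hook' \
  --header 'Content-Type: application/json' \
  --data-raw '{"text":'"$data"'}'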
I had the same idea to send a message with the commit message after a commit.
First I tried a similar way as the author here.
But later I found a better and simpler solution:
I just created a PHP file which sends the message, and I call it with wget.
in hooks/post-receive :
wget -qO - "http://localhost/git.php"
in git.php:
chdir("/opt/git/project.git");
$git_log = exec("git log -n 1 --format=oneline | grep -o ' .\+'");
And then create the JSON and call cURL in PHP.
Integrating a JSON-aware tool in your environment is sometimes a no-go, so here's a POSIX solution that should work on every UNIX/Linux:
json_stringify() {
[ "$#" -ge 1 ] || return 1
LANG=C awk '
BEGIN {
for ( i = 1; i <= 127; i++ )
repl[ sprintf( "%c", i) ] = sprintf( "\\u%04x", i )
for ( i = 1; i < ARGC; i++ ) {
s = ARGV[i]
printf("%s", "\"")
while ( match( s, /[\001-\037\177"\\]/ ) ) {
printf("%s%s", \
substr(s,1,RSTART-1), \
repl[ substr(s,RSTART,RLENGTH) ] \
)
s = substr(s,RSTART+RLENGTH)
}
print s "\""
}
exit
}
' "$@"
}
Or using the widely available perl:
json_stringify() {
[ "$#" -ge 1 ] || return 1
LANG=C perl -le '
for (@ARGV) {
s/[\x00-\x1f\x7f"\\]/sprintf("\\u%04x",ord($&))/ge;
print "\"$_\""
}
' -- "$@"
}
Then you can do:
json_stringify '"foo\bar"' 'hello
world'
"\u0022foo\bar\u0022"
"hello\u000aworld"
limitations:
Doesn't handle NUL bytes.
Doesn't validate the input for UNICODE, it only escapes the mandatory ASCII characters specified in the RFC 8259.
Replying to OP's question:
MSG=$(git log -n 1 --format=oneline | grep -o ' .\+')
curl -i -X POST \
-H 'Accept: application/text' \
-H 'Content-type: application/json' \
-d '{"payload": {"message": '"$(json_stringify "$MSG")"'}}' \
'https://example.com'
I have a yq read command as below,
groups=$(yq read generated/identity-mapping.yaml "iamIdentityMappings.[0].groups")
It reads iamIdentityMappings from the yaml below:
iamIdentityMappings:
- groups:
- Appdeployer
- Moregroups
It stores groups as below:
- Appdeployer
- Moregroups
But I want to store groups as below (comma-separated values):
groups="Appdeployer","Moregroups"
How to do this in bash?
yq is just a wrapper for jq, which supports CSV output:
$ groups="$(yq -r '.iamIdentityMappings[0].groups | @csv' generated/identity-mapping.yaml)"
$ echo "$groups"
"Appdeployer","Moregroups"
The yq invocation in your question just causes an error. Note the fixed version.
Use mapfile and format a null delimited list with yq:
mapfile -d '' -t groups < <(
yq -j '.iamIdentityMappings[0].groups[]+"\u0000"' \
generated/identity-mapping.yaml
)
typeset -p groups
Output:
declare -a groups=([0]="Appdeployer" [1]="Moregroups")
And now you can fulfill this second part of your question:
Construct a command based upon a count variable in bash
# Prepare eksctl's arguments into an array
declare -a eksctl_args=(create iamidentitymapping --cluster "$name" --region "$region" --arn "$rolearn" )
# Read the groups from the yml into an array
mapfile -d '' -t groups < <(
yq -j '.iamIdentityMappings[0].groups[]+"\u0000"' \
generated/identity-mapping.yaml
)
# Add arguments per group
for group in "${groups[@]}"; do
eksctl_args+=(--group "$group")
done
# add username argument
eksctl_args+=(--username "$username")
# call eksctl with its arguments
eksctl "${eksctl_args[@]}"
yq 4.16+ now has a built in #csv operator:
yq e '.iamIdentityMappings.[0].groups | @csv' file.yaml
Note that @csv will only wrap values in quotes if needed (e.g. they have a comma).
If you want quotes, then sub them in and join with commas:
yq e '
.iamIdentityMappings.[0].groups |
(.[] |= sub("(.*)", "\"${1}\""))
| join(",")'
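For the sample document in the question, that should yield something like:
"Appdeployer","Moregroups"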
Disclaimer: I wrote yq.
yq version 3 is deprecated now, and you can achieve the same output using version 4:
#!/bin/bash
while IFS= read -r value; do
groups_array+=("$value")
done < <(yq eval '.iamIdentityMappings.[0].groups.[]' generated/identity-mapping.yaml)
printf -v comma_separated '%s,' "${groups_array[@]}"
echo "${comma_separated%,}"
This code prints the comma-separated values as you wanted.
I am using jq to parse some of the data and then running a final loop over it to parse a few more fields.
cluster_list=`databricks --profile hq_dev clusters list --output JSON | jq 'select(.clusters != null) | .clusters[] | [.cluster_name,.autotermination_minutes,.state,.cluster_id] | @csv' | grep -v "job-"`
for cluster in ${cluster_list[@]}
do
cluster_id=`echo $cluster| cut -d "," -f 4 | sed 's/\"//g' | sed 's/\\\//g'`
cluster_name=`echo "${cluster}"| cut -d "," -f 1| sed 's/\"//g' | sed 's/\\\//g'`
echo $cluster_name
done
cluster_list contains the following values:
"\"Test Space Cluster\",15,\"TERMINATED\",\"ddd-dese23-can858\""
"\"GatewayCluster\",15,\"TERMINATED\",\"ddd-ddsd-ddsds\""
"\"delete_later\",15,\"TERMINATED\",\"1120-195800-93839\""
"\"GatewayCluster_old\",15,\"TERMINATED\",\"0108-2y7272-393893\""
It prints the following:
Test
Space
Cluster
GatewayCluster
delete_later
GatewayCluster_old
Desired output
It shouldn't break to a new line when there is a space; I am doing a few more actions with the name I get here.
Test Space Cluster
GatewayCluster
delete_later
GatewayCluster_old
Your script seems a bit overly complex for what you want to achieve. Better to use read to store each value in a separate variable, and set a comma as the input field separator IFS:
databricks --profile hq_dev clusters list --output JSON |
jq 'select(.clusters != null) | .clusters[] |
[.cluster_name,.autotermination_minutes,.state,.cluster_id] | @csv' |
grep -v "job-" |
sed 's/\\\?"//g' |
while IFS=, read name autotermination_minutes state id ; do
echo $name
done
Note: I didn't touch your jq command. The sed line I added aims to remove the quotes, escaped or not. You can tune jq to remove these quotes with -r, as said in the man page:
INVOKING JQ
[...]
--raw-output / -r::
With this option, if the filter's result is a string then it will be written directly to standard output rather than being formatted as a JSON string with quotes. This can be useful for making jq filters talk to non-JSON-based systems.
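As a sketch of that combination: with -r only the @csv-level quoting remains, so the sed can shrink to removing plain quotes (this assumes no cluster name itself contains commas or embedded quotes):
databricks --profile hq_dev clusters list --output JSON |
jq -r 'select(.clusters != null) | .clusters[] |
[.cluster_name,.autotermination_minutes,.state,.cluster_id] | @csv' |
grep -v "job-" |
sed 's/"//g' |
while IFS=, read -r name autotermination_minutes state id ; do
    echo "$name"
done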
I have used jq with the AWS CLI to print the instances.
Eg:
Retrieve instances list
aws ec2 describe-instances --filters "Name=tag:bld_env,Values=test" --output json > all-inst.json
jq to print instance IDs:
jq -r '.Reservations[].Instances[].InstanceId' all-inst.json
Output of Jq:
i-09e0d805cc
i-091a61038
i-07d3022
i-0428ac7c4c
i-970dc5c4d99
i-014c4ea
i-0ac924df
i-031f6 and so on..
I want to print them on one line like this:
i-09e0d805cc,i-091a61038,i-07d3022,i-0428ac7c4c,i-970dc5c4d99,i-014c4ea,i-0ac924df,i-031f6 and so on..
Are the angle bracket characters really there? Otherwise you can simply tr '\n' ','.
Here are some jq-only approaches.
It's often simplest just to "join" the lines (e.g. using join(",")). This is typically done with the -r command-line option.
In cases where this is impractical or inefficient, one can use the --join-output (or -j) command-line option. Here are two illustrations using this approach. In neither of the examples does the output include a newline.
With a terminating comma
jq -n -j 'range(0;5) | "\(.),"'
Without a terminating comma
oneline.jq:
def oneline(f):
foreach f as $i (null;
if . == null then "\($i)" else ",\($i)" end;
.);
oneline( range(0;5) )
Invocation: jq -n -j -f oneline.jq
Output:
0,1,2,3,4
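For the question's data, the simpler join approach mentioned above might look like this sketch (collect the IDs into an array, then join them):
jq -r '[.Reservations[].Instances[].InstanceId] | join(",")' all-inst.json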