I am using the following curl command:
curl -s -v --user admin:orca --insecure -X GET https://insecure.registry.com/api/v0/repositories/authi-api/tags
Getting the following output:
{
"name": "Dev_ReleaseRollout_Lane-3",
"inRegistry": true,
"hashMismatch": false,
"inNotary": false
},
{
"name": "Dev_ReleaseRollout_Lane-latest",
"inRegistry": true,
"hashMismatch": false,
"inNotary": false
},
{
"name": "Payments_Dev_Lane-267",
"inRegistry": true,
"hashMismatch": false,
"inNotary": false
}
I want to get only the name values into a variable.
I need only Dev_ReleaseRollout_Lane-3 Dev_ReleaseRollout_Lane-latest Payments_Dev_Lane-267 in a variable
Assuming you actually have an array around the three objects:
$ curl ... | jq -r '.[].name'
Dev_ReleaseRollout_Lane-3
Dev_ReleaseRollout_Lane-latest
Payments_Dev_Lane-267
It's fairly simple: . is the array, [].name takes name from each element of the array, and -r is raw output.
--raw-output / -r:
With this option, if the filter’s result is a string then it will be written directly to standard output rather than being formatted as a JSON string with quotes. This can be useful for making jq filters talk to non-JSON-based systems.
If the curl output is actually as shown in the question (objects without a surrounding array), the following will work:
jq -rRs '"[\(.)]" | fromjson[].name' file.json
However, I think there is a better way to wrap an array around the input.
-R is raw input and -s is slurp. \(...) is string interpolation.
--slurp/-s:
Instead of running the filter for each JSON object in the input, read the entire input stream into a large array and run the filter just once.
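To get those names into a shell variable, wrap either pipeline in command substitution; a minimal sketch, assuming the URL and credentials from the question and that the response really is an array:
# all names in one whitespace-separated string
names=$(curl -s --user admin:orca --insecure https://insecure.registry.com/api/v0/repositories/authi-api/tags | jq -r '.[].name')
echo "$names"
# or, safer, one array element per tag name
readarray -t nameArr < <(curl -s --user admin:orca --insecure https://insecure.registry.com/api/v0/repositories/authi-api/tags | jq -r '.[].name')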
Given the following (simplified) JSON file:
{
"data": [
{
"datum": "2023-01-11 00:00:00",
"prijs": "0.005000",
"prijsZP": "0.161550",
"prijsEE": "0.181484",
"prijsTI": "0.160970",
},
{
"datum": "2023-01-11 01:00:00",
"prijs": "0.000000",
"prijsZP": "0.155500",
"prijsEE": "0.175434",
"prijsTI": "0.154920",
}
]
}
I want to specify in my jq command which fields to retrieve, i.e. only "datum" and "prijsTI", but at another moment this selection will be different.
I use the following command to gather all the fields, but would like to set the field selection via a variable:
cat data.json |jq -r '.data[]|[.datum, .prijsTI]|@csv'
I already tried using arguments, but this did not work :-(
myJQselect=".datum, .prijsTI"
cat data.json |jq -r --arg myJQselect "$myJQselect" '.data[$myHour |tonumber]|[$myJQselect]|@csv'
gives the following result: ".datum, .prijs" instead of the correct values.
Would this be possible?
Thanks,
Jeroen
You can use the --args option to provide a variable number of fields to query, then use the $ARGS.positional array to retrieve them:
jq -r '.data[] | [.[$ARGS.positional[]]] | @csv' data.json --args datum prijsTI
"2023-01-11 00:00:00","0.160970"
"2023-01-11 01:00:00","0.154920"
I am reading AWS SSM parameters (as JSON) via jq, and want to put them into a YAML file after some processing and alias mapping.
I have an associative array mapping each parameter name to an internal name (see process.sh).
My simplified example.json is
{
"Parameters": [
{
"Name": "/dev/applications/web/some_great_key",
"Value": "magicvaluenotrelevant"
},
{
"Name": "/dev/applications/web/api_magic",
"Value": "blabla"
}
]
}
My bash script process.sh is
#!/usr/bin/env bash
declare -A ssmMap
ssmMap[/dev/applications/web/some_great_key]=theGreatKEYAlias
ssmMap[/dev/applications/web/api_magic]=apiKey
jq -r '
.Parameters
| map({
Name: .Name,
parameterValue: .Value
})
| map((${!ssmMap[.Name]} + ": \"" + .parameterValue + "\""))
| join("\n")
' < example.json;
I want/expected the output to be:
ssm_theGreatKEYAlias: "magicvaluenotrelevant"
ssm_apiKey: "blabla"
With the process.sh script that I provided I get an error, because I cannot find a way to use the associative array ssmMap inside the jq map to transform the parameter name from the JSON into the mapped alias from ssmMap.
Any idea how that can be achieved?
If I change the mapping line of process.sh into
| map((.Name + ": \"" + .parameterValue + "\""))
it works fine, but without the mapping: it uses the original parameter name as it comes from the JSON file.
You're trying to access a shell var from jq.
Fixed:
ssm_map='
{
"some_great_key": "theGreatKEYAlias",
"api_magic": "apiKey"
}
'
jq -r --argjson ssm_map "$ssm_map" '
.Parameters[] |
"ssm_\( $ssm_map[ .Name ] ): \"\( .Value )\""
'
Demo on jqplay
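If you'd rather derive that JSON object from the bash associative array already declared in process.sh, one way to build it (a sketch, assuming jq 1.6+ for $ARGS.positional) is:
ssm_map=$(jq -n '$ARGS.positional
  | [.[:length/2], .[length/2:]]
  | transpose
  | map({key: .[0], value: .[1]})
  | from_entries' --args "${!ssmMap[@]}" "${ssmMap[@]}")
and then pass it with --argjson exactly as above.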
You could convert the variable you describe into JSON with varying levels of difficulty (depending on what limitations, if any, you're willing to accept). But since you're building the variable from the output of yq, you could simply have it output JSON directly.
Using mikefarah/yq or kislyuk/yq or itchyny/gojq is definitely the better approach if your actual task is to convert from YAML and to process as JSON.
However, to accomplish the task described in the original question, one could pass the keys and values of the bash associative array using the --args option and access them within jq via the $ARGS.positional array (requires jq version 1.6). The association is then restored by transposing the two halves of that array and creating an INDEX keyed on the first half, which can be used with JOIN to build the desired text lines, output as raw text using the -r option.
jq -r '
JOIN(
$ARGS.positional | [.[:length/2], .[length/2:]] | transpose | INDEX(.[0]);
.Parameters[];
.Name;
"ssm_\(.[1][1] // .[0].Name): \(.[0].Value)"
)
' example.json --args "${!ssmMap[@]}" "${ssmMap[@]}"
ssm_theGreatKEYAlias: magicvaluenotrelevant
ssm_apiKey: blabla
The fallback // .[0].Name is unnecessary for the sample file, but was added to use the original key in case there's an input key not covered by the bash array.
I would like to get the values from a JSON file, which is working so far.
JsonFileToTest:
{
"permissions": [
{
"emailid": "test1#test.com",
"rights": "read"
},
{
"emailid": "test2#test.com",
"rights": "read"
}
]
}
readPermissions=($(jq -r '.permissions' JsonFileToTest))
# The command below works perfectly, but when I put it in a loop, it does not.
#echo ${readPermissions[@]} | jq 'values[].emailid'
for vals in ${readPermissions[@]}
do
# I would like o extract the email id of the user. The loop is not working atm.
echo ${vals[@]} | jq 'values[].emailid'
done
What am I missing here?
thanks
If you really want to do it this way, that might look like:
readarray -t permissions < <(jq -c '.permissions[]' JsonFileToTest)
for permissionSet in "${permissions[@]}"; do
jq -r '.emailid' <<<"$permissionSet"
done
Note that we're telling jq to print one line per item (with -c), and using readarray -t to read each line into an array element (unlike the array=( $(...command...) ) antipattern, which splits not just on newlines but on other whitespace as well, and expands globs in the process).
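For illustration, compare how the two approaches handle a value that contains whitespace:
bad=( $(printf '%s\n' 'one value' 'two') )    # the word-splitting antipattern
echo "${#bad[@]}"     # 3 elements, not 2
readarray -t good < <(printf '%s\n' 'one value' 'two')
echo "${#good[@]}"    # 2 elements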
But there's no reason whatsoever to do any of that. You'll get the exact same result simply running:
jq -r '.permissions[].emailid' JsonFileToTest
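If you do still want the addresses in a bash array for further shell processing, readarray pairs naturally with that one-liner:
readarray -t emails < <(jq -r '.permissions[].emailid' JsonFileToTest)
printf '%s\n' "${emails[@]}"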
In my bash script, when I run the following jq against my curl result:
curl -u someKey:someSecret someURL 2>/dev/null | jq -r '.schema' | jq -r -c '.fields'
I get back a JSON array as follows:
[{"name":"id","type":"int","doc":"Documentation for the id field."},{"name":"test_string","type":"string","doc":"Documentation for the test_string field"}]
My goal is to do a call with jq applied to return the following (given the example above):
{"id":1234567890,"test_string":"xxxxxxxxxx"}
NB: I am trying to automatically generate templated values that match the "schema" JSON shown above.
So just to clarify, that is:
all array objects (there could be more than the 2 shown above) returned in a single comma-delimited row
doc fields are ignored
the values for "name" (including their surrounding double-quotes) are concatenated with either:
:1234567890 ...when the "type" for that object is "int"
":xxxxxxxxxx" ...when the "type" for that object is "string"
NB: these will be the only types we ever get for now
Can someone show me how I can expand upon my initial jq to return this?
NB: I tried working down the following path but am failing beyond this...
curl -u someKey:someSecret someURL 2>/dev/null | jq -r '.schema' | jq -r -c '.fields' | "\(.name):xxxxxxxxxxx"'
If it's not possible in pure JQ (my preference) I'm also happy for a solution that mixes in a bit of sed/awk magic :)
Cheers,
Stan
Given the JSON shown, you could add the following to your pipeline:
jq -c 'map({(.name): (if .type == "int" then 1234567890 else "xxxxxxxxxx" end)})|add'
With that JSON, the output would be:
{"id":1234567890,"test_string":"xxxxxxxxxx"}
However, it would be far better if you combined the three calls to jq into one.
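For instance, something along these lines (a sketch; which variant applies depends on whether .schema in your API response is a JSON-encoded string, as the jq -r '.schema' | jq pipeline suggests, or a plain object):
# if .schema is a JSON-encoded string:
curl -u someKey:someSecret someURL 2>/dev/null | jq -c '.schema | fromjson | .fields | map({(.name): (if .type == "int" then 1234567890 else "xxxxxxxxxx" end)}) | add'
# if .schema is already an object, drop the fromjson:
curl -u someKey:someSecret someURL 2>/dev/null | jq -c '.schema.fields | map({(.name): (if .type == "int" then 1234567890 else "xxxxxxxxxx" end)}) | add'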
I have a problem with assigning curl's output to a variable and then using that variable:
#get results url, format json
URL=$(curl https://api.apifier.com/xy)
#jq is a cli json interpreter
#resultUrl contains the final URL which we want download
OK= "$URL" | jq '.resultsUrl'
#api probably is running
sleep 5
curl "$OK"
Maybe it is trivial, but I don't know where the problem is.
My guess is:
jq '.resultsUrl'
outputs the field resultsUrl with quotes, so curl does not process it correctly. Furthermore, $URL | ... does not work; you would have to pipe it through echo, or pipe curl into jq directly.
Try
OK=$(curl -s https://api.apifier.com/v1/xHbBnrZ9rxF4CdKjo/crawlers/Example_Alcatraz_Cruises/execute?token=nJ9ohCHZPaJRFEb7nFqtzm76u | jq -r '.resultsUrl')
curl -s "$OK"
which results for me in
[{ "id": 2, "url": "https://www.alcatrazcruises.com/SearchEventDaySpan.aspx?date=02-25-2016&selected=", "loadedUrl": "https://www.alcatrazcruises.com/SearchEventDaySpan.aspx?date=02-25-2016&selected=", "requestedAt": "2016-02-25T23:24:52.611Z", "loadingStartedAt": "2016-02-25T23:24:54.663Z", "loadingFinishedAt": "2016-02-25T23:24:55.642Z", "loadErrorCode": null, "pageFunctionStartedAt": "2016-02-25T23:24:55.839Z", "pageFunctionFinishedAt": "2016-02-25T23:24:55.841Z", "uniqueKey": "https://www.alcatrazcruises.com/SearchEventDaySpan.aspx?date=02-25-2016&selected=", "type": "UserEnqueued", ...
This should be what you expect.
However, sometimes the first API call yields an error:
{
"type": "ALREADY_RUNNING",
"message": "The act is already running and concurrent execution is not allowed"
}
so resultsUrl will be null; you will have to handle this error case.
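One way to guard against that case (a sketch; $API_URL stands in for the execute URL above):
OK=$(curl -s "$API_URL" | jq -r '.resultsUrl // empty')
if [ -z "$OK" ]; then
  echo "no resultsUrl in the response (the crawler may already be running)" >&2
  exit 1
fi
curl -s "$OK"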
Your line
OK= "$URL" | jq '.resultsURL'
sets the environment variable OK to an empty string, then tries to execute "$URL" as a command and pipe its output to jq. If you want to setOK to the result of a command, you have to use $OK=(...), just like you did when setting URL. The correct syntax is:
OK=$(echo "$URL" | jq '.resultsURL')
And to remove the quotes from the output of jq, you can do:
OK=$(echo "$URL" | jq '.resultsUrl' | tr -d '"')