How do I pass an empty string as a value?
-D <property=value> use value for given property
I use Hadoop Pipes.
I tried:
-D prop1=
-D prop2=value2
but it doesn't work
# HadoopPipes::JobConf
# jobconf.hasKey(prop1) is false
In my bash script, when I run the following jq against my curl result:
curl -u someKey:someSecret someURL 2>/dev/null | jq -r '.schema' | jq -r -c '.fields'
I get back a JSON array as follows:
[{"name":"id","type":"int","doc":"Documentation for the id field."},{"name":"test_string","type":"string","doc":"Documentation for the test_string field"}]
My goal is to do a call with jq applied to return the following (given the example above):
{"id":1234567890,"test_string":"xxxxxxxxxx"}
NB: I am trying to automatically generate templated values that match the "schema" JSON shown above.
So just to clarify, that is:
all array objects (there could be more than the 2 shown above) are returned in a single comma-delimited row
doc fields are ignored
the values for "name" (including their surrounding double-quotes) are concatenated with either:
:1234567890 ...when the "type" for that object is "int"
":xxxxxxxxxx" ...when the "type" for that object is "string"
NB: these will be the only types we ever get for now
Can someone show me how I can expand upon my initial jq to return this?
NB: I tried working down the following path but am failing beyond this...
curl -u someKey:someSecret someURL 2>/dev/null | jq -r '.schema' | jq -r -c '.fields' | "\(.name):xxxxxxxxxxx"'
If it's not possible in pure JQ (my preference) I'm also happy for a solution that mixes in a bit of sed/awk magic :)
Cheers,
Stan
Given the JSON shown, you could add the following to your pipeline:
jq -c 'map({(.name): (if .type == "int" then 1234567890 else "xxxxxxxxxx" end)})|add'
With that JSON, the output would be:
{"id":1234567890,"test_string":"xxxxxxxxxx"}
However, it would be far better if you combined the three calls to jq into one.
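For example, assuming .schema is a JSON-encoded string (which the double jq invocation suggests), the whole pipeline could be collapsed into a single call along these lines:
curl -u someKey:someSecret someURL 2>/dev/null | jq -c '.schema | fromjson | .fields | map({(.name): (if .type == "int" then 1234567890 else "xxxxxxxxxx" end)}) | add'
If .schema turns out to already be a JSON object rather than a string, simply drop the fromjson step.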
So, I am getting a response from an API that I am calling in a shell script in the following form
[{"id":100000004,"name":"Customs Clearance Requested"},{"id":100000005,"name":"Customs Cleared"},{"id":100000006,"name":"Cargo Loaded to Vessel"}]
I want to create a map out of it that will help me look up the ids by name and use them in the shell script. So something like map["Customs Clearance Requested"] would give me 100000004, which I can use further. Can this be done using jq? I am pretty new to shell scripting and jq and got stuck on the above.
json='[{"id":100000004,"name":"Customs Clearance Requested"},{"id":100000005,"name":"Customs Cleared"},{"id":100000006,"name":"Cargo Loaded to Vessel"}]'
declare -A map
while IFS= read -r -d '' name && IFS= read -r -d '' value; do
    map[$name]=$value
done < <(jq -j '.[] | "\(.name)\u0000\(.id)\u0000"' <<<"$json")
declare -p map # demo purposes: print the map we created as output
...emits as output:
declare -A map=(["Cargo Loaded to Vessel"]="100000006" ["Customs Clearance Requested"]="100000004" ["Customs Cleared"]="100000005" )
...which you can query exactly as requested:
$ echo "${map['Cargo Loaded to Vessel']}"
100000006
You could use the select function, e.g.:
data='[{"id":100000004,"name":"Customs Clearance Requested"},{"id":100000005,"name":"Customs Cleared"},{"id":100000006,"name":"Cargo Loaded to Vessel"}]'
jq 'map(select(.["name"] == "Customs Clearance Requested"))' <<< $data
It will get all elements whose name equals "Customs Clearance Requested", e.g.:
[
  {
    "id": 100000004,
    "name": "Customs Clearance Requested"
  }
]
If you want to get the id field:
jq 'map(select(.["name"] == "Customs Clearance Requested")["id"])' <<< $data
This will output:
[
  100000004
]
Please note that it will return an array and not a single element because the search does not know how many results will be found.
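If you would rather get back a single scalar than an array, one option (just a sketch, assuming you only care about the first match) is to combine first with raw output:
jq -r 'first(.[] | select(.name == "Customs Clearance Requested") | .id)' <<< "$data"
100000004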
If you want to generalize this in a shell function, you could write:
function get_id_from_name
{
    # $1=name to search for
    local filter=$(printf 'map(select(.["name"] == "%s")["id"])' "$1")
    jq "$filter"
}
Then call it like that:
get_id_from_name "Customs Clearance Requested" <<< $data
If your data is stored in a file, you could call it this way:
get_id_from_name "Customs Clearance Requested" < /path/to/file.json
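Note that building the filter with printf will misbehave if the name contains double quotes or backslashes. A safer variant of the same function (a sketch) passes the name in with jq's --arg, which handles the escaping for you:
function get_id_from_name
{
    # $1=name to search for
    jq --arg name "$1" 'map(select(.name == $name)["id"])'
}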
The following is very similar to @CharlesDuffy's excellent answer but does not assume that the .name and .id values are NUL-free (i.e., do not have any "\u0000" characters):
declare -A map
while read -r name
do
    name=$(sed -e 's/^"//' -e 's/"$//' <<< "$name")
    read -r id
    map[$name]="$id"
done < <(echo "$json" | jq -c '.[]|.name,.id')
The point is that the -j option is like -r (i.e., produces "raw output"), whereas the -c option produces JSON.
This means that if you don't want the .id values as JSON strings, then the above won't be a solution; also, if the .name values contain double-quotes, then you might want to deal with the occurrences of \".
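For completeness, if you are happy to assume the .name values contain no newlines (and no NULs), you can let jq do the unquoting with -r and drop the sed step entirely (a sketch of that variant):
declare -A map
while IFS= read -r name && IFS= read -r id; do
    map[$name]=$id
done < <(jq -r '.[] | .name, .id' <<< "$json")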
There is a parameter $1; it's an email address. I want it to become the value of the inner string in this curl command:
curl 'xy.com' --data-binary '{"email":"variableValueHere!!!"}'
So a command including the parameter $1 should result in this command...
curl 'xy.com' --data-binary '{"email":"xy#z.de"}'
if $1 equals xy#z.de.
How can I put it in there?
What I tried so far:
curl 'xy.com' --data-binary '{"email":"$1"}'
curl 'xy.com' --data-binary '{"email":"`echo $1`"}'
Try this:
curl 'xy.com' --data-binary '{"email":"'"$1"'"}'
It's the concatenation of
'{"email":"'
"$1"
'"}'
and inserts your parameter as a quoted string.
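To see what the shell actually hands to curl, you can assemble the same argument with a sample address and print it (a quick check, using printf purely for illustration):
set -- 'xy@z.de'                      # pretend $1 holds the email address
printf '%s\n' '{"email":"'"$1"'"}'    # prints {"email":"xy@z.de"}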
Better use a proper JSON parser like jq:
curl 'xy.com' --data-binary "$(
jq \
--arg email "$1" \
--null-input \
--compact-output \
'.email = $email'
)"
"$(jq ...)": Captures the output of jq as a string.
jq: A command-line JSON processor.
--arg email "$1": Passes the value of the shell argument $1 as the jq variable $email.
--null-input: Tells jq there is no JSON input stream to parse.
--compact-output: Tells jq to compact its output by putting each JSON object on a single line.
'.email = $email': The jq filter that assigns the value of the $email variable as the JSON string value of the email key on the root JSON object (.).
This can also be written more compactly:
curl 'xy.com' --data-binary "$(jq -cn --arg e "$1" '.email=$e')"
What do you see when you try this:
curl 'xy.com' --data-binary '{"email":"`$1`"}'
How about
curl xy.com --data-binary "{email:$1}"
?
If you want the 3rd argument to be in JSON Format, write it as:
curl xy.com --data-binary '{"email":"'"$1"'"}'
You can simplify this to
curl xy.com --data-binary '{"email":"'$1'"}'
if you are sure that your email address ($1) does not contain spaces or other troublesome characters.
If you really want to pass additional single quotes to curl (as you stated in your comment), just prepend and append a "'" to your argument.
I want to use this command in a bash script where each time I will have a different array input containing the parameters.
The array (supplied by the user) has one "parameter_i=value_i" entry per element.
I want to get rid of the hardcoded way of introducing the name and the value of each parameter.
For instance, with this input:
"id=123,verbosity=high"
I will eventually get this final instruction:
curl -X POST JENKINS_URL/job/JOB_NAME/build \
--user USER:TOKEN \
--data-urlencode json='{"parameter": [{"name":"id", "value":"123"}, {"name":"verbosity", "value":"high"}]}'
What is a clean way to do so?
You can do it the sexy way, building jsonParameters from the specified key=value parameters:
#!/bin/bash
jsonParameters=""
while IFS=',' read -r -a parameterEntries; do
    for parameterEntry in "${parameterEntries[@]}"; do
        IFS='=' read -r key value <<< "$parameterEntry"
        [ ! -z "$jsonParameters" ] && jsonParameters="$jsonParameters,"
        jsonParameters="$jsonParameters {\"name\":\"$key\", \"value\": \"$value\"}"
    done
done <<< "$@"
Explanations:
the first loop creates the array named parameterEntries from your specified parameters; each element contains key=value
then the second loop, which iterates over each element of this array, extracts its key and value
finally, it is only string assembly to get the JSON output you want
the [ ! -z "$jsonParameters" ] && jsonParameters="$jsonParameters," is just here to add a separating comma, but only when there is more than one element
Then you simply have to use the $jsonParameters where you want:
curl -X POST JENKINS_URL/job/JOB_NAME/build \
--user USER:TOKEN \
--data-urlencode json="{\"parameter\": [$jsonParameters]}"
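If you would rather not hand-roll the JSON escaping, a jq-based sketch of the same idea (assuming the same "id=123,verbosity=high" input format, and that names and values contain no "," or "=") could look like this:
input="id=123,verbosity=high"
json=$(jq -cn --arg s "$input" '{parameter: ($s | split(",") | map(split("=") | {name: .[0], value: .[1]}))}')
curl -X POST JENKINS_URL/job/JOB_NAME/build \
  --user USER:TOKEN \
  --data-urlencode json="$json"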
I want to parse JSON data using jq (as described here) and delete any newline characters from the resulting string.
I've already tried to use tr, but this approach also removes all the white space between the parsed values.
My code:
IP=$(curl -s https://ipinfo.io/ip) # Get ip address
curl -s https://ipinfo.io/${IP}/geo | jq -r '.ip, .city, .country' | tr -d '\n' # parse only a few values from the JSON data and remove newlines.
What I get with the code above is the following string:
XXX.XXX.XXX.XXXCity_NameCountry_Name, but I want something like this:
XXX.XXX.XXX.XXX City_Name Country_Name
You could craft a single string from the three pieces of data so that it appears on a single line (one result per input):
IP=$(curl -s https://ipinfo.io/ip)
curl -s https://ipinfo.io/${IP}/geo | jq -r '.ip + " " + .city + " " + .country'
> myIp myCity myCountryCode
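Equivalently, you could collect the values into an array and let join insert the separator:
curl -s https://ipinfo.io/${IP}/geo | jq -r '[.ip, .city, .country] | join(" ")'
> myIp myCity myCountryCode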
Another option, for a different but similar output format, would be to use the @csv output format, where you would want to output an array of cells for each input:
IP=$(curl -s https://ipinfo.io/ip)
curl -s https://ipinfo.io/${IP}/geo | jq -r '[.ip, .city, .country] | @csv'
> "myIp","myCity","myCountryCode"
This result could be easily worked on from any spreadsheet software.
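Similarly, the @tsv format produces tab-separated output without the surrounding quotes, which is often easier to cut or to read back in a shell loop:
curl -s https://ipinfo.io/${IP}/geo | jq -r '[.ip, .city, .country] | @tsv'
> myIp	myCity	myCountryCode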