I'm trying to write an sh script that reads data from a JSON file and, after parsing it, passes the data as variables to a curl (GET) request.
The parsing works correctly, but the curl request keeps failing.
After opening the JSON, I parse out the variables with the help of jq:
ruleId=$($JSON | jq '.[].ruleId')
alert=$($JSON | jq '.[].alertName')
...
I'm passing the variables in this way:
curl --data-urlencode "ruleId=$ruleId" --data-urlencode "newLevel=$newLevel" --data-urlencode "url=$url" --data-urlencode "urlIsRegex=$urlIsRegex" --data-urlencode "enabled=$enabled" --data-urlencode "parameter=$parameter" --data-urlencode "evidence=$evidence" "http://localhost:8090/JSON/alertFilter/action/addGlobalAlertFilter"
The error that I get back is:
curl: (3) URL using bad/illegal format or missing URL
Can you help me?
Here is my test script:
#!/bin/sh
#set -e
# Info to test this script
# 1. Start docker
# 2. Run this command from terminal: docker run -u zap -p 8080:8090 -i owasp/zap2docker-stable zap.sh -daemon -host 0.0.0.0 -port 8080 -config api.disablekey=true -config api.addrs.addr.name=.* -config api.addrs.addr.regex=true
# 3. Get the JSON from api (test for now)
# 4. bash filter.sh
# 5. Check for filter to be set with browser http://localhost:8080/UI/alertFilter/ => view global filter
#open example JSON file (API) and check for alert filter list
curl -s 'https://api.npoint.io/c29e3a68be632f73fc22' > whitelist_tmp.json
whitelist=whitelist_tmp.json
#Parser WITHOUT loop
ruleId=$($whitelist | jq '.[].ruleId')
alert=$($whitelist | jq '.[].alertName')
newLevel=$($whitelist | jq '.[].newLevel')
url=$($whitelist | jq '.[].url')
urlIsRegex=$($whitelist | jq '.[].urlIsRegex')
enabled=$($whitelist | jq '.[].enabled')
parameter=$($whitelist | jq '.[].parameter')
evidence=$($whitelist | jq '.[].evidence')
#First test-query WITHOUT url-encoding => not working
#curl -X -4 --retry 2 --retry-connrefused --retry-delay 3 "http://localhost:8080/JSON/alertFilter/action/addGlobalAlertFilter/?ruleId=${ruleId}\&newLevel=${newLevel}\&url=${url}\&urlIsRegex=${urlIsRegex}\&parameter=${parameter}\&enabled=${enabled}\&evidence=${evidence}"
#Second test-query WITH url-encoding
curl --data-urlencode "ruleId=$ruleId" --data-urlencode "newLevel=$newLevel" --data-urlencode "url=$url" --data-urlencode "urlIsRegex=$urlIsRegex" --data-urlencode "enabled=$enabled" --data-urlencode "parameter=$parameter" --data-urlencode "evidence=$evidence" "http://localhost:8080/JSON/alertFilter/action/addGlobalAlertFilter"
jq accepts a filename argument, and it would be better to use that than to change whitelist to include a call to cat:
jq -r 'MYFILTER' "$whitelist"
Using filters such as .[].evidence makes your script very fragile, as it assumes the top-level array will always have just one entry. If you want just the first entry in the array, you could specify that, though maybe using first would be even more robust:
first(.[].evidence)
Although jq is fairly light-weight, it is worth noting that all those calls to jq could be reduced to one, though of course your shell script would then have to be modified as well. See e.g.
Attempting to assign multiple variables using jq
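For what it's worth, here is one way those eight calls could collapse into a single jq invocation. This is a sketch only: it assumes the first entry of the top-level array is the one you want, and tostring keeps @sh happy with booleans and numbers.
# one jq call emits name='value' lines (shell-quoted by @sh), which eval turns into variables
eval "$(jq -r 'first(.[])
  | {ruleId, alert: .alertName, newLevel, url, urlIsRegex, enabled, parameter, evidence}
  | to_entries[] | "\(.key)=\(.value|tostring|@sh)"' "$whitelist")"
Here "$whitelist" holds the filename (whitelist_tmp.json), as suggested above.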
Try changing
whitelist=whitelist_tmp.json
to
whitelist="cat whitelist_tmp.json"
I think the error was occurring because the shell was trying and failing to execute "whitelist_tmp.json" as a command of its own, instead of piping the contents of the file to jq.
You also want to add the -r option to jq for the variables you are going to pass to curl, like so:
ruleId=$($whitelist | jq -r '.[].ruleId')
alert=$($whitelist | jq -r '.[].alertName')
newLevel=$($whitelist | jq -r '.[].newLevel')
url=$($whitelist | jq -r '.[].url')
urlIsRegex=$($whitelist | jq -r '.[].urlIsRegex')
enabled=$($whitelist | jq -r '.[].enabled')
parameter=$($whitelist | jq -r '.[].parameter')
evidence=$($whitelist | jq -r '.[].evidence')
Without the -r option, any strings that jq outputs will be wrapped in quotes, which can interfere with the quoting in your curl command. The -r option tells jq to output the raw string data without any quotation.
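For example, here is the difference the -r option makes on a small sample:
$ echo '[{"url":"https://example.com"}]' | jq '.[].url'
"https://example.com"
$ echo '[{"url":"https://example.com"}]' | jq -r '.[].url'
https://example.com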
Related
I am trying to run the following cURL command (taken from the ArgoCD documentation).
$ curl $ARGOCD_SERVER/api/v1/session -d $'{"username":"admin","password":"password"}'
With the proper credentials entered manually, it runs fine, but I have my password stored in an environment variable ($PASSWORD), and with both the ' and " quotes it does not insert the password correctly.
$ curl $ARGOCD_SERVER/api/v1/session -d $'{"username":"admin","password":"$PASSWORD"}'
I suspect it uses the string literal $PASSWORD as the actual password, rather than the variable's content.
How would I insert this variable correctly?
You can use jq to create the JSON:
json=$(jq -c -n --arg username admin --arg password "$password" '$ARGS.named')
curl $ARGOCD_SERVER/api/v1/session -d "$json"
Using jq ensures that the JSON is properly formulated, no matter the contents of the variable:
$ password=$'with\'both"quotes'
$ declare -p password
declare -- password="with'both\"quotes"
$ jq -cn --arg username admin --arg password "$password" '$ARGS.named'
{"username":"admin","password":"with'both\"quotes"}
Like this:
curl $ARGOCD_SERVER/api/v1/session -d '{"username":"admin","password":"'$PASSWORD'"}'
or this:
curl $ARGOCD_SERVER/api/v1/session -d "{\"username\":\"admin\",\"password\":\"$PASSWORD\"}"
this will probably work too:
curl $ARGOCD_SERVER/api/v1/session -d "{'username':'admin','password':'$PASSWORD'}"
or:
printf -v data '{"username":"admin","password":"%s"}' "$PASSWORD"
curl $ARGOCD_SERVER/api/v1/session -d "$data"
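One caveat worth adding (implied by the jq-based answer above): all of these interpolation variants assume $PASSWORD contains nothing that is special inside JSON. With the test password used earlier, the interpolated payload comes out malformed:
$ PASSWORD=$'with\'both"quotes'
$ printf '{"username":"admin","password":"%s"}\n' "$PASSWORD"
{"username":"admin","password":"with'both"quotes"}
The embedded double quote ends the JSON string early, which is exactly the failure mode the jq construction avoids.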
I wasn't able to find an answer, and I'm not sure if it's possible at all. I know that I can pass the whole payload via @file.json, something like
$ curl -X POST -H "Content-Type: application/json" -d @test.json \
http://localhost/api/v3/endpoint
$ cat test.json
{
"key1": "value1",
"key2": "value2"
}
But now I need to upload a GPG public key, and I can't store the key inside test.json. What I need is something like
$ curl -X POST -H "Content-Type: application/json" \
http://localhost/api/v3/endpoint \
-d '{"key1": "value1", "key2": "#gpg-public.key"}'
Thanks in advance
As an option in bash I can use the following
$ curl -X POST -H "Content-Type: application/json" \
http://localhost/api/v3/endpoint \
-d "{\"key1\": \"value1\", \"key2\": \"$(cat gpg-public.key)\"}"
but I'm looking for a more elegant way.
I can't say this is more elegant, but at least it's different.
jq -Rn '{key2: input}' <pgp-public.key |
jq -s '.[0] * .[1]' test.json - |
curl -d @- ...
Credit for the jq hacks: Simo Kinnunen's answer here and Charles Duffy's answer here.
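To make the two jq stages less opaque, here is what they produce with a hypothetical one-line key file. (A real ASCII-armored key spans multiple lines; with -R, input only reads the first line, so that case would need something like [inputs] | join("\n") instead.)
$ printf 'AAAAB3Nza...' > pgp-public.key    # hypothetical one-line key, for illustration only
$ jq -Rn '{key2: input}' <pgp-public.key
{
  "key2": "AAAAB3Nza..."
}
$ jq -Rn '{key2: input}' <pgp-public.key | jq -s '.[0] * .[1]' test.json -
{
  "key1": "value1",
  "key2": "AAAAB3Nza..."
}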
The first rule of JSON is to use a tool that knows JSON to generate dynamic values. A good tool is jq. Now, instead of storing a JSON file on disk, store a jq filter that can act as a template for your JSON file.
# test.jq
{
key1: "value1",
key2: $gpg
}
Now, you can run jq using this template as a filter, and supply a value for it to replace $gpg:
$ jq -n -f test.jq --arg gpg hi
{
"key1": "value1",
"key2": "hi"
}
Once you have this command, you can use it to pass the generated JSON directly to curl via standard input; the -d option can take @- as an argument to read from standard input.
jq -n -f test.jq --arg gpg hi | curl ... -d @-
For simple jobs, you can specify the filter on the command line instead of reading it from a file.
jq -n '{key1: "value1", key2: $gpg}' --arg gpg hi | curl -d @-
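To feed in the actual key file from the question instead of the literal hi, the variable can come straight from the file. A sketch (--rawfile needs jq 1.6+; the $(cat ...) variant works on older jq but strips the file's trailing newline):
jq -n -f test.jq --rawfile gpg gpg-public.key | curl -X POST -H "Content-Type: application/json" -d @- http://localhost/api/v3/endpoint
# or, with older jq:
jq -n -f test.jq --arg gpg "$(cat gpg-public.key)" | curl -X POST -H "Content-Type: application/json" -d @- http://localhost/api/v3/endpoint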
Use multipart/form-data with the -F parameter; it supports uploading multiple files at once.
curl -F json=@test.json -F pgp=@pgp-public.key http://localhost/api/v3/endpoint
Of course, the API endpoint will also need to support multipart/form-data for this to work.
Edit: Going by your comments, if you have to send a single JSON file, then edit the JSON as needed and send the edited version. Unfortunately, bash is not a suitable language for editing JSON (although I'm sure it's possible), so I'd recommend using some other scripting language for that. Here's an example with php-cli to edit the JSON, then send the edited JSON to curl via stdin:
php -r '$p=json_decode(file_get_contents("test.json"),true);$p["key2"]=file_get_contents("pgp-public.key");echo json_encode($p);' |
curl -d @- -X POST -H "Content-Type: application/json" http://localhost/api/v3/endpoint
(But any language with JSON support should suffice, like Python/Perl/PHP.)
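For completeness, the same edit-and-send can also be done with jq, which is already in use elsewhere on this page (a sketch; again assumes jq 1.6+ for --rawfile):
# read test.json, overwrite key2 with the raw contents of the key file, send the result
jq --rawfile key2 gpg-public.key '.key2 = $key2' test.json |
curl -d @- -X POST -H "Content-Type: application/json" http://localhost/api/v3/endpoint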
I have a simple bash file as below
#!/bin/bash
net=$(curl -s -H "Content-Type: application/json" -H "X-Auth-Token: $token" -d '{"network": {"name": "net1"}}' http://10.1.10.146:18090/network/v2.0/networks 2>&1 | awk '/id/{print $1}' | jq -r .network.id)
echo $net
Running this file gives me an error as below
parse error: Invalid numeric literal at line 2, column 0
I tried making the changes according to these links:
https://unix.stackexchange.com/questions/354943/setting-jq-output-to-a-bash-variable
Working with Bash and cURL
but nothing helped, and I'm unable to figure out where I am going wrong. Let me know the reason for the error and possible changes.
The curl output for the command
curl -s -H "Content-Type: application/json" -H "X-Auth-Token: $token" -d '{"network": {"name": "net1"}}' http://10.1.10.146:18090/network/v2.0/networks
output:
{"network":{"status":"ACTIVE","router:external":false,"availability_zone_hints":[],"availability_zones":[],"description":"","subnets":[],"shared":false,"tenant_id":"d0e75710820c401db3291ac6278f326f","created_at":"2018-05-15T07:37:42Z","tags":[],"ipv6_address_scope":null,"mtu":1450,"updated_at":"2018-05-15T07:37:42Z","admin_state_up":true,"revision_number":2,"ipv4_address_scope":null,"is_default":false,"port_security_enabled":true,"project_id":"d0e75710820c401db3291ac6278f326f","id":"1548df56-a35b-4232-9550-54a3c2266d60","name":"net1"}}
The idea is to get only the id from the output and store it in a bash variable. To get the id, I used the command below:
curl -s -H "Content-Type: application/json" -H "X-Auth-Token: $token" -d '{"network": {"name": "net1"}}' http://10.1.10.146:18090/network/v2.0/networks 2>&1 | awk '/id/{print $1}' | jq -r .network.id
output:
be831582-90c1-499c-875f-9c0b0d1969a6
I have also tried removing the awk and parsing the curl JSON response directly, but the same error shows up.
Thanks in advance.
The "parse error" message appears because the 2>&1 redirects any STDERR message into jq, which cannot parse it. Compare the output from these commands:
> curl http://no.such.host/network/v2.0/networks 2>&1
curl: (6) Could not resolve host: no.such.host
> curl http://no.such.host/network/v2.0/networks 2>&1 | jq '.'
parse error: Invalid numeric literal at line 1, column 4
Here are some ideas:
Separate all of the piped commands into separate commands.
Try adding set -x near the top of the script to "debug" all of the commands.
Remove the 2>&1 because it is NOT helping you!
Add error handling to the separate commands.
We could help more if you provided the curl output. (We do not have HTTP access to the 10.1.10.146 host.)
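Putting those suggestions together, the original assignment might look something like this (a sketch only: it drops the 2>&1 and the awk stage, since jq can pull the id straight out of the JSON shown above):
#!/bin/bash
net=$(curl -s -H "Content-Type: application/json" \
           -H "X-Auth-Token: $token" \
           -d '{"network": {"name": "net1"}}' \
           http://10.1.10.146:18090/network/v2.0/networks |
      jq -r '.network.id')
echo "$net"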
I am trying to assign a string I get from curl and jq to a variable. This is my code below, but it doesn't work. I am a Mac user.
value=$(curl -X GET curl -X GET https://apitest.onkore.com/onkore/api/v1/storeCategories | jq '.[2] | ._id')
$ value=$(curl -X GET https://apitest.onkore.com/onkore/api/v1/storeCategories |
jq '.[2] | ._id')
$ echo "$value"
"59178d2a4ca53714085a0903"
In other words, "curl -X GET curl -X GET" is incorrect.
P.S. You might want to use the -r command-line option of jq.
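With -r, the surrounding quotes are dropped, which is usually what you want when the value feeds into other commands:
$ value=$(curl -X GET https://apitest.onkore.com/onkore/api/v1/storeCategories |
  jq -r '.[2] | ._id')
$ echo "$value"
59178d2a4ca53714085a0903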
I have a curl request like this:
curl -s -u $user:$password -X GET -H "Content-Type: application/json" $url
This returns a JSON response, so I parse the response using jq to get some specific data, like this:
curl -s -u $user:$password -X GET -H "Content-Type: application/json" $url | jq '<expression>'
Now, if the curl request fails, the parsing operation obviously throws an ugly error, which I want to avoid. How can I store the response first and then parse it only if the request was successful? I don't want to display the whole JSON response. Also, if I add -w "%{http_code}" to my request, it appends the status code to the JSON response, which messes up the parsing. How do I solve this? Basically, I want to first check whether the curl request succeeded, and only then parse the JSON response. I also want the status code so that I can display it on failure, but right now the status code gets mixed in with the JSON response.
You can combine the --write and --fail options:
# separating the (verbose) curl options into an array for readability
curl_args=(
--write "%{http_code}\n"
--fail
--silent
--user "$user:$password"
--request GET
--header "Content-Type: application/json"
)
if ! output=$(curl "${curl_args[@]}" "$url"); then
echo "Failure: code=$output"
else
# remove the "http_code" line from the end of the output, and parse it
sed '$d' <<<"$output" | jq '...'
fi
Also note: quote your variables!
I found glenn jackman's answer good, but a bit confusingly written, so I rewrote it, and altered it so I can use it as a safer alternative to curl | jq.
#!/bin/bash
# call this with normal curl arguments, especially url argument, e.g.
# safecurl.sh "http://example.com:8080/something/"
# separating the (verbose) curl options into an array for readability
curl_args=(
-H 'Accept:application/json'
-H 'Content-Type:application/json'
--write '\n%{http_code}\n'
--fail
--silent
)
echo "${curl_args[#]}"
# prepend some arguments, but pass on whatever arguments this script was called with
output=$(curl "${curl_args[@]}" "$@")
return_code=$?
if [ 0 -eq $return_code ]; then
# remove the "http_code" line from the end of the output, and parse it
echo "$output" | sed '$d' | jq .
else
# echo to stderr so further piping to jq will process empty output
>&2 echo "Failure: code=$output"
fi
Note: This code does not test for services that ignore the requested content type and respond with HTML. You'd need to test with grep -l '</html>' for that.
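A sketch of that extra check (using grep -q rather than -l, since only a yes/no answer is needed here):
# after the curl call: bail out if the body looks like an HTML page rather than JSON
if grep -q '</html>' <<<"$output"; then
  >&2 echo "Failure: response looks like HTML, not JSON"
  exit 1
fi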