Escaping multiple layers of mixed quotes for a curl command executed inside a bash script - bash

I have the following bash script that uses its arguments to hit a RESTful web service (via curl) and prints out both the curl request made as well as the response:
#! /bin/bash
# arguments:
# $1 - username
# $2 - password
#
# outputs:
# if the script exits with a non-zero status then something went wrong
# verify that we have both required arguments and fail otherwise
if [ "$#" -ne 2 ]; then
echo "Required arguments not provided"
exit 1
fi
# set the script arguments to meaningful variable names
username=$1
password=$2
# login and fetch a valid auth token
req='curl -k -i -H "Content-Type: application/json" -X POST -d ''{"username":"$username","password":"$password"}'' https://somerepo.example.com/flimflam'
resp=$(curl -k -i -H "Content-Type: application/json" -X POST -d ''{"username":"$username","password":"$password"}'' https://somerepo.example.com/flimflam)
# echo the request for troubleshooting
echo "req = $req"
if [ -z "$resp" ]; then
echo "Login failed; unable to parse response"
exit 1
fi
echo "resp = $resp"
When I run this I get:
$ sh myscript.sh myUser 12345#45678
curl: (3) Port number ended with '"'
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0curl: (6) Could not resolve host: 12345#45678"
100 1107 100 1093 100 14 2849 36 --:--:-- --:--:-- --:--:-- 2849
req = curl -k -i -H "Content-Type: application/json" -X POST -d {"username":"$username","password":"$password"} https://somerepo.example.com/flimflam
resp = HTTP/1.1 400 Bad Request...(rest omitted for brevity)
Obviously, I'm not escaping the various layers of single- and double-quotes inside the curl statement correctly, as is indicated by outputs like:
curl: (6) Could not resolve host: 12345#45678"
and:
req = curl -k -i -H "Content-Type: application/json" -X POST -d {"username":"$username","password":"$password"} https://somerepo.example.com/flimflam
where the username/password variables are not being expanded.
In reality my script takes a lot more than 2 arguments, which is why I'm changing them to meaningful variable names (such as $username instead of $1) so it's more understandable and readable.
Can anyone spot where I'm going awry? Thanks in advance!
Update
I tried the suggestion which turns the req into:
curl -k -i -H 'Content-Type: application/json' -X POST -d "{'username':'myUser','password':'12345#45678'}" https://somerepo.example.com/flimflam
However this is still an illegal curl command and instead needs to be:
curl -k -i -H 'Content-Type: application/json' -X POST -d '{"username":"myUser","password":"12345#45678"}' https://somerepo.example.com/flimflam

First, as I said in a comment, storing commands in variables just doesn't work right. Variables are for data, not executable code. Second, you have two levels of quoting here: quotes that're part of the shell syntax (which are parsed, applied, and removed by the shell before the arguments are passed to curl), and quotes that're part of the JSON syntax.
But the second problem is actually worse than that, because simply embedding an arbitrary string into some JSON may result in JSON syntax errors if the string contains characters that're part of JSON syntax. Which passwords are likely to do. To get the password (and username for that matter) embedded correctly in your JSON, use a tool that understands JSON syntax, like jq:
userinfo=$(jq -n -c --arg u "$username" --arg p "$password" '{"username":$u,"password":$p}')
Explanation: this uses --arg to set the jq variables u and p to the shell variables $username and $password respectively (and the double-quotes around the shell variables will keep the shell from doing anything silly to the values), and creates a JSON snippet with them embedded. jq will automatically add appropriate quoting/escaping/whatever is needed.
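For illustration, with made-up credentials the command above produces a compact, correctly escaped JSON string even when the password contains characters that are special in JSON:
$ jq -n -c --arg u 'myUser' --arg p '12345#"tricky"' '{"username":$u,"password":$p}'
{"username":"myUser","password":"12345#\"tricky\""}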
Then, to use it with curl, use something like this:
resp=$(curl -k -i -H "Content-Type: application/json" -X POST -d "$userinfo" https://somerepo.example.com/flimflam)
Again, the double-quotes around $userinfo keep the shell from doing anything silly. You should almost always put double-quotes around variable references in the shell.
Note that I never used the req variable to store the command. If you need to print the command (or its equivalent), use something like this:
printf '%q ' curl -k -i -H "Content-Type: application/json" -X POST -d "$userinfo" https://somerepo.example.com/flimflam
echo
The %q format specifier tells printf to add appropriate quoting/escaping so that you could run the result as a shell command, and it'd work properly. (And the echo is there because printf doesn't automatically add a newline at the end of its output.)
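To tie this back to the original script's "req = ..." logging, one option (just a sketch) is to capture the printable form in a variable and echo that, while still running the command directly:
# build a safely quoted, human-readable form of the command for logging only
# (the real command is run as-is and never re-parsed from this string)
req_printable=$(printf '%q ' curl -k -i -H "Content-Type: application/json" -X POST -d "$userinfo" https://somerepo.example.com/flimflam)
echo "req = $req_printable"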

try changing this:
req='curl -k -i -H "Content-Type: application/json" -X POST -d ''{"username":"$username","password":"$password"}'' https://somerepo.example.com/flimflam'
to this
req="curl -k -i -H 'Content-Type: application/json' -X POST -d \"{'username':'$username','password':'$password'}\" https://somerepo.example.com/flimflam"
and similarly for the resp

ah those pesky "curly" thingies...
how 'bout...
req="curl -k -i -H 'Content-Type: application/json' -X POST -d '{\"username\":\"$username\",\"password\":\"$password\"}' https://somerepo.example.com/flimflam"

This needs even more escaping. Try:
resp=$(curl -k -i -H "Content-Type: application/json" -X POST -d "{\"username\":\"$username\",\"password\":\"$password\"}" https://somerepo.example.com/flimflam)
In bash, the variables are still expanded when they're inside single quotes that are inside double quotes.
And you'll need the \" double quotes in the payload as per the JSON definition.
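A quick way to see this (a minimal demonstration with a made-up variable):
$ password='12345#45678'
$ echo "-d '{\"password\":\"$password\"}'"
-d '{"password":"12345#45678"}'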
EDIT: I re-ran the curl through an HTTP proxy and corrected the script line (see above; removed the single quotes). Results (in raw HTTP) are now:
POST /flimflam HTTP/1.1
Host: somerepo.example.com
User-Agent: curl/7.68.0
Accept: */*
Content-Type: application/json
Content-Length: 44
Connection: close
{"username":"user","password":"12345#abcde"}
(which should be fine)

Related

Using Bash to make a POST request

I have 100 Jetpacks that I have to sign in to configure. I am trying to do it in a bash script but I am having no luck. I can connect to the wifi no problem, but my POST requests are not achieving anything. Any advice? Here is a link to my github; I have copies of what I captured in Burp Suite: https://github.com/Jdelgado89/post_Script
TYIA
#!/bin/bash
nmcli device wifi rescan
nmcli device wifi list
echo "What's they last four?"
read last4
echo "What's the Key?"
read key
nmcli device wifi connect Ellipsis\ \Jetpack\ $last4 password $key
echo "{"Command":"SignIn","Password":"$key"}" > sign_on.json
echo "{"CurrentPassword":"$key","NewPassword":"G1l4River4dm1n","SecurityQuestion":"NameOfStreet","SecurityAnswer":"Allison"}" > change_admin.json
echo "{"SSID":"GRTI Jetpack","WiFiPassword":"G1l4River3r","WiFiMode":0,"WiFiAuthentication":6,"WiFiEncription":4,"WiFiChannel":0,"MaxConnectedDevice":8,"PrivacySeparator":false,"WMM":true,"Command":"SetWifiSetting"}" > wifi.json
cat sign_on.json
cat change_admin.json
cat wifi.json
sleep 5
curl -X POST -H "Cookie: jetpack=6af5e293139d989bdcfd66257b4f5327" -H "Content-Type: application/json" -d #sign_on.json http://192.168.1.1/cgi-bin/sign_in.cgi
sleep 5
curl -X POST -H "Cookie: jetpack=6af5e293139d989bdcfd66257b4f5327" -H "Content-Type: application/json" -d #change_admin.json http://192.168.1.1/cgi-bin/settings_admin_password.cgi
sleep 5
curl -X POST -H "Cookie: jetpack=6af5e293139d989bdcfd66257b4f5327" -H "Content-Type: application/json" -d #wifi.json http://192.168.1.1/cgi-bin/settings_admin_password.cgi
This is not correct:
echo "{"Command":"SignIn","Password":"$key"}" > sign_on.json
The double quotes are not being put literally into the file, they're just terminating the shell string beginning with the previous double quote. So this is writing
{Command:SignIn,Password:keyvalue}
into the file, with no double quotes. You need to escape the nested double quotes.
echo "{\"Command\":\"SignIn\",\"Password\":\"$key\"}" > sign_on.json
However, it would be best if you used the jq utility instead of formatting JSON by hand. See Create JSON file using jq.
jq -nc --arg key "$key" '{"Command":"SignIn","Password":$key}' >sign_on.json
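The same pattern extends to the other payloads in the script; for example, the change_admin.json line could be generated like this (a sketch that reuses the literal values from the question):
jq -nc --arg key "$key" \
  '{"CurrentPassword":$key,"NewPassword":"G1l4River4dm1n","SecurityQuestion":"NameOfStreet","SecurityAnswer":"Allison"}' > change_admin.json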

Optionally include a user and password in a curl request?

I am optionally including a user and password in a curl request as follows:
declare creds=""
if [ -n "$user" ] && [ -n "$password" ]; then
creds="-u ${user}:${password}"
fi
output=$(curl ${creds} -X PUT -v --write-out "%{http_code}" "$url" \
-H 'Content-Type: application/json' -s -o /dev/null --data "${payload}")
This seems to work fine, but I'm getting this shellcheck warning:
Double quote to prevent globbing and word splitting
https://github.com/koalaman/shellcheck/wiki/SC2086
Putting quotes around it doesn't work, e.g. if I do this:
output=$(curl "${creds}" -X PUT -v --write-out "%{http_code}" "$url" \
-H 'Content-Type: application/json' -s -o /dev/null --data "${payload}")
then when the username and password are not supplied, this results in empty double quotes in the curl request curl "" -X PUT ..., which generates a <url> malformed error.
I could use an if-else for the curl command, but I'd rather avoid the duplication. Is the above approach acceptable despite the shellcheck warning?
You were right to put quotes around the variable, but shellcheck doesn't catch the issue of storing commands in a variable, which has its own downfalls. Since this is an issue with how the shell works, shellcheck can't quite catch it out of the box. When you did
creds="-u ${user}:${password}"
and quoted "$creds", it was passed as one single argument word to curl instead of being broken down into -u and "${user}:${password}" separately. The right approach is to use an array to store the arguments and expand it, so that the words are preserved and not split by the shell (the foremost reason to quote the variable, as indicated by shellcheck):
creds=(-u "${user}:${password}")
and invoke
curl "${creds[#]}" <rest-of-the-cmd>
Also explore the following
I'm trying to put a command in a variable, but the complex cases always fail!
How to store a command in a variable in a shell script?

Use a variable between two quotation marks in a command option in a bash script

Here are two samples:
This is a sample that would cause a 400 error:
curl -i -k -u $account:password -H "Content-Type: application/json" -X PUT -d '{"source-path": "http://${ip}/LTMBlackList_Postbody${filename_extension}","type":"ip"}' https://$ip2$api2
and this is a normal one, which gets a 200 OK response:
curl -i -k -u $account:$password -H "Content-Type: application/json" -X PUT -d '{"source-path": "http://127.0.0.1/LTMBlackList_Postbody-test.log","type":"ip"}' https://$ip2$api2
How can I call the curl command in a script with variables?
The $ip is not expanded because it is in single quotes. First close the single quotes, then open double quotes, expand the variable, close the double quotes, and continue single quoting.
Remember to always quote your variable expansions to disable word splitting:
curl -i -k -u "$account:$password" -H "Content-Type: application/json" -X PUT \
-d '{"source-path": "http:/'"$ip"'/LTMBlackList_Postbody'"$filename_extension"'","type":"ip"}' "https://$ip2$api"

"Unexpected end of JSON input" error when trying to do curl POST command [duplicate]

This question already has answers here:
When to wrap quotes around a shell variable?
(5 answers)
Closed 2 years ago.
I am having issues with sending the proper JSON data in a curl -X POST command. I have successfully run the POST command locally on my Mac by copying and pasting the hardcoded values in, but I am now trying to create a .sh script to automate this process. Upon running the code below I get this error:
{"message":"Unexpected end of JSON input"}
Here is the output json from JSON_STRING with names made generic and nothing else changed:
{ "url": "api_url", "tileset": "username.filename" }
Once I can figure out how to properly format the JSON in the POST command I know it will work, but I can't seem to get the syntax right. Hoping a set of fresh, experienced bash eyes will be able to catch my mistake :). Also, all the variables I have are correct; their values have already been confirmed in the Mac terminal. Thanks in advance for the help!
curl_url="http://${bucket}.s3.amazonaws.com/${key}"
echo $curl_url
tileset_id="username.filename"
JSON_STRING=$(jq -n \
--arg bn "$curl_url" \
--arg on "$tileset_id" \
'{url: $bn, tileset: $on}')
echo $JSON_STRING
curl -X POST -H "Content-Type: application/json" -H "Cache-Control: no-cache" -d $JSON_STRING 'apiurl'
It is absolutely required to quote the shell variable:
curl ... -d "$JSON_STRING" http://example.com/end/point
Otherwise, the shell will do word splitting, and the argument to -d becomes just the first whitespace-separated piece of the JSON (here, {).
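You can see the splitting directly (a minimal demonstration using the generic JSON from the question):
$ JSON_STRING='{ "url": "api_url", "tileset": "username.filename" }'
$ printf '<%s>\n' $JSON_STRING
<{>
<"url":>
<"api_url",>
<"tileset":>
<"username.filename">
<}>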
As a side note, bash arrays can help with readability:
curl_url="http://${bucket}.s3.amazonaws.com/${key}"
tileset_id="username.filename"
JSON_STRING=$(
jq -n \
--arg bn "$curl_url" \
--arg on "$tileset_id" \
'{url: $bn, tileset: $on}'
)
curl_opts=(
-X POST
-H "Content-Type: application/json"
-H "Cache-Control: no-cache"
-d "$JSON_STRING"
)
curl "${curl_opts[#]}" 'apiurl'

How to verify a curl request in a bash script?

I have a curl request like this:
curl -s -u $user:$password -X GET -H "Content-Type: application/json" $url
which returns a JSON response. I then parse the response using jq to get some specific data, like this:
curl -s -u $user:$password -X GET -H "Content-Type: application/json" $url | jq '<expression>'
Now, if the curl request fails, the parsing operation obviously throws an ugly error, which I want to avoid. How can I store the response first and then parse it only if the request was successful? I don't want to display the whole JSON response. Also, if I add -w "%{http_code}" to my request, it appends the status code to the JSON response, which messes up the parsing. How do I solve this? Basically, I want to first check whether the curl request succeeded, then get the JSON response and parse it. I also want the status code, so that if the request fails I can display it, but right now the status code gets mixed in with the JSON response.
You can combine the --write and --fail options:
# separating the (verbose) curl options into an array for readability
curl_args=(
--write "%{http_code}\n"
--fail
--silent
--user "$user:$password"
--request GET
--header "Content-Type: application/json"
)
if ! output=$(curl "${curl_args[@]}" "$url"); then
echo "Failure: code=$output"
else
# remove the "http_code" line from the end of the output, and parse it
sed '$d' <<<"$output" | jq '...'
fi
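For illustration, assuming the endpoint returns a small JSON body, the captured output ends with the status code on its own line, which is why sed '$d' strips the last line before handing the rest to jq:
# hypothetical contents of $output on success:
#   {"name":"example","id":1}
#   200
# on an HTTP error, --fail makes curl exit non-zero and suppress the body,
# so $output holds just the status code (e.g. 404) for the error message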
Also note: quote your variables!
I found glenn jackman's answer good but a bit confusingly written, so I rewrote it and altered it so I can use it as a safer alternative to curl | jq.
#!/bin/bash
# call this with normal curl arguments, especially url argument, e.g.
# safecurl.sh "http://example.com:8080/something/"
# separating the (verbose) curl options into an array for readability
curl_args=(
-H 'Accept:application/json'
-H 'Content-Type:application/json'
--write '\n%{http_code}\n'
--fail
--silent
)
echo "${curl_args[#]}"
# prepend some arguments, but pass on whatever arguments this script was called with
output=$(curl "${curl_args[@]}" "$@")
return_code=$?
if [ 0 -eq $return_code ]; then
# remove the "http_code" line from the end of the output, and parse it
echo "$output" | sed '$d' | jq .
else
# echo to stderr so further piping to jq will process empty output
>&2 echo "Failure: code=$output"
fi
Note: this code does not test for services that ignore the requested content type and respond with HTML. You'd need to test for that with something like grep -l '</html>'.
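A rough version of that guard might look like this (a sketch; it uses grep -q for a silent check rather than -l, and assumes you run it before parsing $output):
if grep -qi '</html>' <<<"$output"; then
  >&2 echo "Failure: service returned HTML instead of JSON"
  exit 1
fi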
