I'm trying to run a curl command such as:
curl -Sks -XPOST https://localhost:9999/some/local/service -d 'some text arguments'
The above works fine. However, when I place
'some text arguments'
in a file and call the command as:
curl -Sks -XPOST https://localhost:9999/some/local/service -d `cat file_with_args`
I get many exceptions from the service. Does anyone know why this is?
Thanks!
If the file contains 'some text arguments', then your command is equivalent to this:
curl -Sks -XPOST https://localhost:9999/some/local/service -d \'some text arguments\'
That is, it's passing 'some, text, and arguments' as three separate arguments to curl, with a literal single-quote character attached to the front of the first argument and the end of the last.
Instead, you should put just some text arguments in the file (no single-quotes), and run this command:
curl -Sks -XPOST https://localhost:9999/some/local/service -d "`cat file_with_args`"
(wrapping the `cat file_with_args` part in double-quotes so that the resulting some text arguments doesn't get split into separate arguments).
Incidentally, I recommend writing $(...) rather than `...`, because it's more robust in general (though in your specific command it doesn't make a difference).
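To see the word-splitting difference directly, here is a small demonstration (the file holds the unquoted text, as recommended above; printf just makes each resulting argument visible):
printf 'some text arguments\n' > file_with_args
# Unquoted substitution: the contents are split into three separate arguments
printf '<%s> ' $(cat file_with_args); echo     # <some> <text> <arguments>
# Quoted substitution: the contents are passed as a single argument
printf '<%s> ' "$(cat file_with_args)"; echo   # <some text arguments>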
I'm trying to hit a REST API with a token in the header.
apikeyName="$(date '+%s')"
key=$(curl -k -X POST -H "Content-Type: application/json" \
-d '{"name":"'$apikeyName'", "role": "Admin"}' \
http://admin:admin@localhost:3000/api/auth/keys | jq '.key')
echo $key
# Alerting API
curl -k -X GET 'http://localhost:3000/api/alert-notifications' -H 'Authorization: Bearer '$key'';
Terminal output
"eyJrIjoiaWJPaDFFZXZMeW1RYU90NUR4d014T3hYUmR6NDVUckoiLCJuIjoiMTY3NTM1OTc4OCIsImlkIjoxfQ=="
{"message":"invalid API key","traceID":""}
The first line is the key being printed; the second is the API response. If I hardcode the key, it works.
Short answer: Use jq -r '.key' to extract the key from the json response without adding quotes to it.
Long answer: There is a difference between quotes on the command line and quotes embedded in a variable. Consider:
key='"abcd"'
printf '%s\n' $key "abcd"
# prints:
# "abcd"
# abcd
Quotes on the command line are bash syntax. Bash notes what is being quoted and then removes the quotes from the command line when it's done, thus printf only prints abcd in the second case above.
Quotes inside a variable are plain old data. Bash doesn't do anything with them, so they get passed through to the command like any other data and printf prints "abcd" in the first case.
In your curl case the receiver doesn't expect the key to have quotes embedded in the data. So, curl -blah "keydata" works fine because bash takes the quotes out, but curl -blah $key fails because bash does NOT remove the embedded quotes.
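A minimal sketch of the fix, reusing the endpoints from the question (only the jq invocation changes, so the key is stored without surrounding quotes):
apikeyName="$(date '+%s')"
# -r makes jq print the raw string instead of a JSON-quoted one
key=$(curl -k -X POST -H "Content-Type: application/json" \
    -d '{"name":"'"$apikeyName"'", "role": "Admin"}' \
    http://admin:admin@localhost:3000/api/auth/keys | jq -r '.key')
curl -k -X GET 'http://localhost:3000/api/alert-notifications' -H "Authorization: Bearer $key"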
See also: BashParser
When writing bash scripts, I want to store my whole curl command in a heredoc to get a better layout. The following works fine:
#!/bin/bash
read -r -d '' command1 <<- MULTI_STRING_SCOPE
curl -v www.stackoverflow.com
MULTI_STRING_SCOPE
But when I add some JSON data with the -d option, the command executes strangely. For example:
#!/bin/bash
read -r -d '' command2 <<- MULTI_STRING_SCOPE
curl -v www.stackoverflow.com
-d '{
"hello":"world"
}'
MULTI_STRING_SCOPE
response2=$(${command2})
The error output from the terminal:
curl: (3) URL using bad/illegal format or missing URL
curl: (3) unmatched close brace/bracket in URL position 1:
}'
It seems that curl treats the line }' as a separate URL, so the JSON data is not sent as a unit.
How to solve the problem? Any suggestions will be highly appreciated.
I learned from this post how to make a heredoc work with the curl command.
As @Gordon Davisson commented on this post, we should not mix command and data. The JSON given to the -d option is data, while everything else is the command, so I decided to use the heredoc to hold only the JSON data and keep the rest as an ordinary command, rather than storing the whole thing in a string via a heredoc.
The resulting bash script looks something like this:
#!/bin/bash
response3=$(curl -v www.stackoverflow.com \
-d @- <<- MULTI_STRING_SCOPE
{
"hello":"world"
}
MULTI_STRING_SCOPE
)
Note: the <<- heredoc indentation only works with tabs, not spaces. Be careful, especially when working with editors like Visual Studio Code, which may be configured to indent with spaces.
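If your editor insists on spaces, one workaround (my own sketch, not part of the answer above) is to drop the dash from <<- and leave the closing delimiter unindented; leading spaces inside the JSON body itself are harmless:
#!/bin/bash
response3=$(curl -v www.stackoverflow.com \
  -d @- << MULTI_STRING_SCOPE
{
  "hello":"world"
}
MULTI_STRING_SCOPE
)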
I have a script which needs to pass a string containing a space.
Basically, I want to collect the error return of calling curl "https://my-api.com" -H'X-API-Key: API key here'
The following works, but I'd really rather avoid using eval. I'm sure there is a cleaner way of writing this, but I can't find it.
CURL="curl -s --fail $URL -H\"X-API-Key:$API_KEY\""
return $(eval "$CURL" >> /dev/null)
This is a duplicate of Dynamically building a command in bash, but here is the fix for your code.
Please take the time to read the answers from the parent question.
# Build the curl command with its arguments in an array
curl_cmd=(curl -s --fail "$URL" -H "X-API-Key:$API_KEY")
# Execute the curl command with its arguments from the curl_cmd array
"${curl_cmd[@]}"
I have a curl command in a script, when the script is run the command isn't able to fetch a resource (the command itself works, it's the response that's incorrect), but if I copy and paste the same command into the terminal I get the expected response.
After reading this my script looks like this:
jsess=`awk '/\sJSESSION/ { print "\x27"$6"="$7"\x27" }' cookies.txt`
ARGS=( -k -v -b $jsess $url7)
echo "curl ${ARGS[*]}"
curl "${ARGS[@]}"
and the last echo looks like this:
curl -k -v -b 'JSESSIONID=hexystuff' https://secretstuff.com
The last curl doesn't work, but copy-pasting that echo works. Any ideas what could be wrong? Thanks.
The problem seems to be the two single quotes; try this:
jsess="$(awk '/\sJSESSION/ { print $6"="$7 }' cookies.txt)"
ARGS=( -k -v -b "$jsess" "$url7")
echo "curl ${ARGS[*]}"
curl "${ARGS[@]}"
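If you want to verify what will really be executed (the echo above joins the array with plain spaces, which can hide quoting differences), bash's printf %q prints each argument shell-quoted:
# Each array element is printed exactly as the shell will pass it to curl
printf '%q ' curl "${ARGS[@]}"; echo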
args="-k -v -b"
jsess=$(awk '/\sJSESSION/ { print "\x27"$6"="$7"\x27" }' cookies.txt)
url7="https://secretstuff.com"
curl "${args}" "${jsess}" "${url7}"
The use of arrays is not my personal preference, and I believe the current situation demonstrates why. I believe that, as much as possible, every individual item of data should be contained in its own variable. This makes accessing the variables much simpler and also greatly increases flexibility: I can choose exactly which pieces of information go into a given command line.
I am having some trouble figuring out the expansion of a variable in my shell script. If I replace the variable with the desired string it works.
#!/bin/zsh
KEY="$(curl -Ivs -X GET "http://admin:admin@192.168.1.1" &> >(awk '/^> Authorization/{ print $3 " " $4 }'))"
# The string returned by the curl and awk command is Basic "YWRtaW46YWRtaW4=" without double quotes.
curl -H "Authorization: $KEY" "http://192.168.1.1/userRpm/WlanMacFilterRpm.htm?Page=1&exclusive=1"
# This doesn't work
curl -H "Authorization: Basic YWRtaW46YWRtaW4=" "http://192.168.1.1/userRpm/WlanMacFilterRpm.htm?Page=1&exclusive=1"
# This works
The only thing that's different between the two lines is:
-H "Authorization: $KEY"
-H "Authorization: Basic YWRtaW46YWRtaW4="
HTTP generally uses DOS style CR+LF line terminators. Whenever you parse data out of curl, you have to account for this.
To check if this is the problem, run your script with bash -x yourscript or zsh -x yourscript to see trace output that shows otherwise invisible carriage returns:
var=$'value\C-M' # zsh
var=$'value\r' # bash
(Dash and ash/busybox unfortunately don't highlight this problem, so try one of the shells above.)
To strip them, pipe your data through tr -d '\r'.
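Applied to the script above, a minimal sketch of the fix (same router URL and awk pattern; the tr at the end is the only new part, and the process substitution is replaced with a plain pipe, which should behave the same here):
KEY="$(curl -Ivs -X GET "http://admin:admin@192.168.1.1" 2>&1 \
    | awk '/^> Authorization/{ print $3 " " $4 }' \
    | tr -d '\r')"
curl -H "Authorization: $KEY" "http://192.168.1.1/userRpm/WlanMacFilterRpm.htm?Page=1&exclusive=1"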