How to check if curl was successful and print a message? - bash

I am trying to wrap a curl call in an if/else condition: on success of the call, print a success message; otherwise print that the call failed.
My Sample Curl would look like:
curl 'https://xxxx:1234xxxx#abc.dfghj.com/xl_template.get_web_query?id=1035066' > HTML_Output.html
I want to do the same thing in a shell script. In JavaScript it would be:
if(res.status === 200){console.log("Yes!! The request was successful")}
else {console.log("CURL Failed")}
Also, I can see curl's progress percentage, but I do not know how to check that percentage from the script. Please help.
[screenshot: curl progress output]

You can use the -w (--write-out) option of curl to print the HTTP code:
curl -s -w '%{http_code}\n' 'https://xxxx:1234xxxx#abc.dfghj.com/xl_template.get_web_query?id=1035066'
It will show the HTTP code the site returns.
Also, curl provides a whole set of exit codes for various scenarios; check man curl.
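For example, you could capture that code in a variable and branch on it; a minimal sketch using the asker's URL as a placeholder:
http_code=$(curl -s -o HTML_Output.html -w '%{http_code}' 'https://xxxx:1234xxxx#abc.dfghj.com/xl_template.get_web_query?id=1035066')
if [ "$http_code" -eq 200 ]; then
    echo "Yes!! The request was successful"
else
    echo "CURL Failed (HTTP $http_code)"
fi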

One way of achieving this:
HTTPS_URL="https://xxxx:1234xxxx#abc.dfghj.com/xl_template.get_web_query?id=1035066"
CURL_CMD="curl -w httpcode=%{http_code}"
# -m, --max-time <seconds> for the whole curl operation
CURL_MAX_CONNECTION_TIMEOUT="-m 100"
# perform curl operation
CURL_RETURN_CODE=0
CURL_OUTPUT=$(${CURL_CMD} ${CURL_MAX_CONNECTION_TIMEOUT} "${HTTPS_URL}" 2> /dev/null) || CURL_RETURN_CODE=$?
if [ ${CURL_RETURN_CODE} -ne 0 ]; then
    echo "Curl connection failed with return code - ${CURL_RETURN_CODE}"
else
    echo "Curl connection success"
    # Check the HTTP code that curl appended to CURL_OUTPUT
    httpCode=$(echo "${CURL_OUTPUT}" | sed -e 's/.*httpcode=//')
    if [ "${httpCode}" -ne 200 ]; then
        echo "Curl operation/command failed due to server return code - ${httpCode}"
    fi
fi

Like most programs, curl returns a non-zero exit status if it gets an error, so you can test it with if.
if curl 'https://xxxx:1234xxxx#abc.dfghj.com/xl_template.get_web_query?id=1035066' > HTML_Output
then echo "Request was successful"
else echo "CURL Failed"
fi
I don't know of a way to find out the percentage if the download fails in the middle.

Related

cURL: manage download errors

I tried to write a Bash script to manage cURL download errors by parsing the response header. However, with all GitHub downloads, after the redirect the header is HTTP/1.1 403 Forbidden, despite the fact that the download works.
function curldown {
    url="$1"
    code=$(curl -LI $url | awk '/^HTTP/{a=$2} END{print a}')
    if [[ "${code:0:1}" == "4" ]]; then
        echo "Error $code"
    else
        curl -JOL $url > /tmp/curldown
    fi
}
url='https://github.com/keybase/client/releases/download/v5.1.1/keybase-v5.1.1.tar.xz'
curldown $url
# Error 403
but
curl -JOL $url
gives a working output.
Any fix?
A better solution:
http_code=$(curl -Ls "$url" -w '%{http_code}\n' -o /dev/null)
if ((http_code >= 400)); then
    echo >&2 "Error detected; HTTP code $http_code"
fi
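Putting the two together, curldown could be reworked along these lines (a sketch; not verified against GitHub's redirect behaviour specifically):
function curldown {
    url="$1"
    # probe with a body-less GET that follows redirects, keeping only the status code
    http_code=$(curl -Ls "$url" -o /dev/null -w '%{http_code}')
    if ((http_code >= 400)); then
        echo >&2 "Error $http_code"
        return 1
    fi
    curl -JOL "$url"
}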

How to get success count, failure count and failure reason when testing rest webservices from file using shell script

Hi, I am testing web services from a shell script using multiple if conditions. With the code below I get the success count, failure count and failure reason:
success=0
failure=0
if curl -s --head --request DELETE http://localhost/bimws/delete/deleteUser?email=pradeepkumarhe1989#gmail.com | grep "200 OK" > /dev/null; then
    success=$((success+1))
else
    echo "DeleteUser is not working"$'\r' >> serverLog.txt
    failure=$((failure+1))
fi
if curl -s --head --request GET http://localhost/bimws/get/getUserDetails?email=anusha4saju#gmail.com | grep "200 OK" > /dev/null; then
    success=$((success+1))
else
    curl -s --head --request GET http://localhost/bimws/get/getUserDetails?email=anusha4saju#gmail.com > f1.txt
    echo "getUserDetails is not working"$'\r' >> serverLog.txt
    failure=$((failure+1))
fi
if curl -s -i -X POST -H "Content-Type:application/json" http://localhost/bimws/post/addProjectLocationAddress -d '{"companyid":"10","projectid":"200","addresstypeid":"5","address":"1234 main st","city":"san jose","state":"CA","zip":"989898","country":"United States"}' | grep "200 OK" > /dev/null; then
    success=$((success+1))
else
    echo "addProjectLocationAddress is not working"$'\r' >> serverLog.txt
    failure=$((failure+1))
fi
echo $success Success
echo $failure failure
But now I want to test the web services from a file: I have a file called web_services.txt which contains all my web services. Using a shell script, how do I execute them and get the success count, failure count and failure reason?
web_services.txt
All are different calls: DELETE, GET and POST.
http://localhost/bimws/delete/deleteUser?email=pradeepkumarhe1989#gmail.com
http://localhost/bimws/get/getUserDetails?email=anusha4saju#gmail.com
http://localhost/bimws/post/addProjectLocationAddress -d '{"companyid":"10","projectid":"200","addresstypeid":"5","address":"1234 main st"
,"city":"san jose","state":"CA","zip":"989898","country":"United States"}'
First of all, your current code does not correctly deal with empty lines. You need to skip those.
Your lines already contain shell commands. Running curl on them makes no sense. Instead, you should evaluate these commands.
Then, you need to modify curl so that it reports whether the request was successful by adding -f:
FILE=D:/WS.txt
success=0
failure=0
while read LINE; do
    if test -z "$LINE"; then
        continue
    fi
    if eval $(echo "$LINE" | sed 's/^curl/curl -f -s/') > /dev/null; then
        success=$((success+1))
    else
        echo $LINE >> aNewFile.txt
        failure=$((failure+1))
    fi
done < $FILE
echo $success Success
echo $failure failure
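If you also need the failure reason (the HTTP code) rather than just the counts, one variant of the same loop (a sketch, assuming as above that each non-empty line is a complete curl command):
while read LINE; do
    if test -z "$LINE"; then
        continue
    fi
    # append flags so curl prints only the HTTP code; the response body is discarded
    http_code=$(eval "$LINE -s -o /dev/null -w '%{http_code}'")
    if [ "$http_code" = "200" ]; then
        success=$((success+1))
    else
        echo "$LINE failed with HTTP $http_code" >> serverLog.txt
        failure=$((failure+1))
    fi
done < $FILE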

How to deal with HTTP 102 in bash?

From bash/curl I am consuming an API that receives a POST, performs heavy, long-running tasks and returns a 200 on success. As we were facing some timeouts in the WAF, the API has been improved to accept the header:
--header "Expect:102-Processing"
If the API receives that header it sends an HTTP 102 every 20 seconds until the process finishes, then sends an HTTP 200. This should be enough to prevent timeouts.
What do I have to do to deal with those HTTP 102 responses?
I added that header to my curl command, but as soon as it receives the first 102, the curl command finishes.
I was thinking that maybe there is a curl parameter to wait until a 200 or an error.
Another option I have in mind is waiting in a loop, querying for status, but I don't know how to instruct curl to monitor that connection.
This is a test version of my bash script.
#!/bin/bash
clear
function_triggerFooAdapter()
{
    pFooUrl=$1
    pPayload=$2
    pCURLConnectTimeout=$3
    pWaitForFooResponse=$4
    pAddExpect102Header=$5
    rm ./tmpResult.html 2>/dev/null
    rm ./tmpResult.txt 2>/dev/null
    echo "Triggering internal Foo adapter $pFooAdapterName"
    echo "Full URL=$pFooUrl"
    echo "Payload to send=$pPayload"
    echo "Curl connect-timeout=$pCURLConnectTimeout"
    echo "WaitForFooResponse=$pWaitForFooResponse"
    echo "AddExpect102Header=$pAddExpect102Header"
    if [[ "$pAddExpect102Header" = true ]]; then
        text102Header="Expect:102-Processing"
    else
        text102Header="NoExpect;" # send inoffensive custom header
    fi
    if [[ "$pWaitForFooResponse" = true ]]; then
        echo "So DO wait..."
        Response=$(curl -k --write-out %{http_code} --header "$text102Header" --header "Content-Type:application/json" --silent --connect-timeout $pCURLConnectTimeout --output ./tmpResult.html -X POST --data "$pPayload" "$pFooUrl" 2>&1 | tee ./tmpResult.txt)
        echo "HTTP Response=$Response"
        echo "$(cat ./tmpResult.txt)"
        if [ "${Response:0:1}" -eq "1" ] || [ "${Response:0:1}" -eq "2" ]; then # if HTTP response starts with 1 or 2 (10x - 20x)...
            echo "Adapter successfully triggered."
            return 0
        else
            # cat ./tmpResult.html 2>/dev/null
            # cat ./tmpResult.txt 2>/dev/null
            echo
            echo "HTTP error trying to trigger adapter."
            return 1
        fi
    else
        echo "So DO NOT wait..."
        curl -k --write-out %{http_code} --header "$text102Header" --header "Content-Type:application/json" --silent --connect-timeout $pCURLConnectTimeout --output ./tmpResult.html -X POST --data "$pPayload" "$pFooUrl" > /dev/null 2>&1 &
        echo "Adapter successfully (hopefully) triggered. NOT reading HTTP response until Foo code is upgraded to respond directly with an HTTP 200 Successfully queued or similar."
        return 0
    fi
}
clear
export http_proxy="http://1.1.1.256:3128/"
export https_proxy="http://1.1.1.256:3128/"
export no_proxy="foo.com"
# Main
clear
echo "STEP 09- Triggering Foo Internal Adapters."
echo "Proxy settings:"
env | grep proxy
function_triggerFooAdapter "http://foo.com/lookups/trigger_foo" "" 600 true true
Run it manually and CHECK what curl -v is sending as the headers; I would expect to see something like
> POST /the/url HTTP/1.1
> Host: thehost.com:80
> User-Agent: curl/7.51.0
> Accept: */*
> Expect: 102-Processing
... some stuff skipped.
If you're not sending the Expect header, then curl is in fact doing the right thing.
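For example, you could check it on the command line with something like this (a sketch reusing the variables from the script above):
curl -sv --header "Expect:102-Processing" --header "Content-Type:application/json" -X POST --data "$pPayload" "$pFooUrl" -o /dev/null 2>&1 | grep '^> Expect'
If that prints a "> Expect:102-Processing" line, the header really is going out; if it prints nothing, the header is being dropped before it reaches the server.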

Getting exit status of somecommand after "if ! somecommand"

I seem to not be able to get the exit code of the command execution in a Bash if condition:
#! /bin/bash
set -eu
if ! curl -sS --fail http://not-there; then
echo "ERROR: curl failed with exit code of $?" >&2
fi
But $? always returns zero, even though my curl exits with a non-zero status.
If I don't run the curl command inside the if condition, then $? is set correctly.
Am I missing something here?
In your original code, $? is returning the exit status not of curl, but of ! curl.
To preserve the original value, choose a control structure that doesn't require that inversion:
curl -sS --fail http://not-there || {
    echo "ERROR: curl failed with exit code of $?" >&2
    exit 1
}
...or something akin to:
if curl -sS --fail http://not-there; then
    : "Unexpected success"
else
    echo "ERROR: curl failed with exit status of $?" >&2
fi
Another way to achieve what you want to do is to collect the return code first, then perform the if statement.
#!/bin/bash
set -eu
status=0
curl -sS --fail http://not-there || status=$?
if ((status)); then
    echo "ERROR: curl failed with exit code of $status" >&2
fi
I find this method especially convenient when checking for failure of several commands when you want to return an error code at the end of your script or function if any of these has failed.
Please note in the above I use an arithmetic test, which returns true (0) if the value inside is non-zero, and false (non-zero) otherwise. It is shorter (and more readable to my own taste) than using something like [[ $status != 0 ]].
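For instance, checking several requests and reporting at the end might look like this (a sketch with placeholder URLs):
status=0
curl -sSf http://example.com/one -o /dev/null || status=$?
curl -sSf http://example.com/two -o /dev/null || status=$?
if ((status)); then
    # status holds the exit code of the last request that failed
    echo "ERROR: at least one request failed (last exit code $status)" >&2
fi
exit $status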

curl exit status code issue while using authentication

I'm writing a script to check whether a file can be downloaded from a remote server using curl. The script will report if there is a problem downloading the file or the file is not present. I'm using the below code snippet:
curl --fail -u "$USERNAME:$PASSWORD" --remote-name "$HOST/$FILEPATH"
if [ $? -ne 0 ]
then
    echo "There is some error in Downloading file from $HOST";
else
    echo "Download success";
fi
But I always get the exit status code as 0. That's because curl always downloads a file (even when the file is not present) and you can find the actual error message in that file.
So how can I get the real curl exit status code (22, file not found)?
--fail will return an exit code of 22 in the case of the HTTP protocol; the return code may be different for other protocols, say FTP. See below:
me#udistro:~$ curl --fail ftp://ftp.redhat.com/redhat/brms/6.2.0/en/source/MD5
curl: (78) RETR response: 550
me#udistro:~$ echo $?
78
But as the man page states:
This method is not fail-safe and there are occasions where
non-successful response codes will slip through, especially when
authentication is involved (response codes 401 and 407).
As a starting point you could try this:
curl --fail -u "$USERNAME:$PASSWORD" --remote-name "$HOST/$FILEPATH"
var=$?
if [ $var -ne 78 ] && [ $var -ne 22 ]
then
    echo "Success"
else
    echo "Failed"
fi
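Alternatively, since --fail is not reliable with authentication, you can ignore the exit code entirely and branch on the HTTP status instead; a sketch (the output file name here is just a placeholder):
http_code=$(curl -s -u "$USERNAME:$PASSWORD" -o downloaded_file -w '%{http_code}' "$HOST/$FILEPATH")
if [ "$http_code" -eq 200 ]; then
    echo "Download success"
else
    echo "There is some error in Downloading file from $HOST (HTTP $http_code)"
    rm -f downloaded_file  # on failure the file holds the server's error page, not the real content
fi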
