I'm writing a script to check whether a file can be downloaded from a remote server using curl. The script will report if there is a problem downloading the file or the file is not present. I'm using the below code snippet:
curl --fail -u "$USERNAME:$PASSWORD" --remote-name "$HOST/$FILEPATH"
if [ $? -ne 0 ]
then
    echo "There is some error in Downloading file from $HOST";
else
    echo "Download success";
fi
But I always get an exit status of 0. That's because curl always downloads a file (even when the remote file is not present), and you can find the actual error message inside that file.
So how can I get the real curl exit status code (22, file not found)?
--fail will return an exit code of 22 in the case of the HTTP protocol, and
the return code may be different for other protocols, say FTP. See below:
me@udistro:~$ curl --fail ftp://ftp.redhat.com/redhat/brms/6.2.0/en/source/MD5
curl: (78) RETR response: 550
me@udistro:~$ echo $?
78
But as the man page states:
This method is not fail-safe and there are occasions where
non-successful response codes will slip through, especially when
authentication is involved (response codes 401 and 407).
As a starting point you could try this:
curl --fail -u "$USERNAME:$PASSWORD" --remote-name "$HOST/$FILEPATH"
var=$?
if [ $var -eq 0 ]
then
    echo "Success"
elif [ $var -eq 22 ] || [ $var -eq 78 ]
then
    echo "Failed: file not found"
else
    echo "Failed: curl exited with code $var"
fi
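If you only need to know whether the download worked at all, a simpler sketch (my variant, not from the question) treats any non-zero exit as a failure and keeps curl's own error message by capturing stderr:
err=$(curl --fail -sS -u "$USERNAME:$PASSWORD" --remote-name "$HOST/$FILEPATH" 2>&1)
status=$?
if [ $status -eq 0 ]
then
    echo "Download success"
else
    echo "Error downloading from $HOST (curl exit code $status): $err" >&2
fi
Here -sS silences the progress meter but still prints errors, and since --remote-name sends the body to a file, the command substitution captures only the error text.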
I tried to write a Bash script to manage cURL download errors by parsing the response header. However, with all GitHub downloads, after the redirect the header is HTTP/1.1 403 Forbidden, despite the fact that the download works.
function curldown {
    url="$1"
    code=$(curl -sLI "$url" | awk '/^HTTP/{a=$2} END{print a}')
    if [[ "${code:0:1}" == "4" ]]; then
        echo "Error $code"
    else
        curl -JOL "$url" > /tmp/curldown
    fi
}
url='https://github.com/keybase/client/releases/download/v5.1.1/keybase-v5.1.1.tar.xz'
curldown "$url"
# Error 403
but
curl -JOL $url
gives a working output.
Any fix?
A better solution: check the status code of an actual GET request (the -I in curldown issues a HEAD request, which is what draws the 403 after the redirect, even though the GET download works):
http_code=$(curl -Ls "$url" -w '%{http_code}\n' -o /dev/null)
if ((http_code >= 400)); then
    echo >&2 "Error detected; HTTP code $http_code"
fi
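To fold that back into curldown with a single request, here is a minimal sketch (my own variant): with -O the body goes to the file, so the -w format string is all that lands on stdout, and with -L the %{http_code} reflects the final response after redirects.
function curldown {
    url="$1"
    # Download with the server-suggested filename; only the final HTTP status
    # reaches stdout, because -O sends the body to a file.
    http_code=$(curl -JOLs "$url" -w '%{http_code}')
    if ((http_code >= 400)); then
        echo "Error $http_code" >&2
        return 1
    fi
}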
From bash/curl I am consuming an API that receives a POST, performs heavy, long-running tasks, and returns a 200 on success. As we were facing some timeouts in the WAF, the API has been improved to accept the header:
--header "Expect:102-Processing"
If the API receives that header, it sends an HTTP 102 every 20 seconds until the process finishes, then sends an HTTP 200. This should be enough to prevent the timeouts.
What do I have to do to deal with those HTTP 102 responses?
I added that header to my curl command, but as soon as it receives the first 102, the curl command finishes.
I was thinking that maybe there is a parameter in curl to wait until the 200 or an error arrives.
Another option I have in mind is waiting in a loop, querying for status, but I don't know how to instruct curl to monitor that connection.
This is a test version of my bash script.
#!/bin/bash
clear
function_triggerFooAdapter()
{
    pFooUrl=$1
    pPayload=$2
    pCURLConnectTimeout=$3
    pWaitForFooResponse=$4
    pAddExpect102Header=$5
    rm ./tmpResult.html 2>/dev/null
    rm ./tmpResult.txt 2>/dev/null
    echo "Triggering internal Foo adapter"
    echo "Full URL=$pFooUrl"
    echo "Payload to send=$pPayload"
    echo "Curl connect-timeout=$pCURLConnectTimeout"
    echo "WaitForFooResponse=$pWaitForFooResponse"
    echo "AddExpect102Header=$pAddExpect102Header"
    if [[ "$pAddExpect102Header" = true ]]; then
        text102Header="Expect:102-Processing"
    else
        text102Header="NoExpect;" # send inoffensive custom header
    fi
    if [[ "$pWaitForFooResponse" = true ]]; then
        echo "So DO wait..."
        Response=$(curl -k --write-out "%{http_code}" --header "$text102Header" --header "Content-Type:application/json" --silent --connect-timeout $pCURLConnectTimeout --output ./tmpResult.html -X POST --data "$pPayload" "$pFooUrl" 2>&1 | tee ./tmpResult.txt)
        echo "HTTP Response=$Response"
        cat ./tmpResult.txt
        if [ "${Response:0:1}" -eq "1" ] || [ "${Response:0:1}" -eq "2" ]; then # if the HTTP response starts with 1 or 2 (1xx/2xx)...
            echo "Adapter successfully triggered."
            return 0
        else
            # cat ./tmpResult.html 2>/dev/null
            # cat ./tmpResult.txt 2>/dev/null
            echo
            echo "HTTP error trying to trigger adapter."
            return 1
        fi
    else
        echo "So DO NOT wait..."
        curl -k --write-out "%{http_code}" --header "$text102Header" --header "Content-Type:application/json" --silent --connect-timeout $pCURLConnectTimeout --output ./tmpResult.html -X POST --data "$pPayload" "$pFooUrl" > /dev/null 2>&1 &
        echo "Adapter successfully (hopefully) triggered. NOT reading HTTP response until Foo code is upgraded to respond directly with an HTTP 200 Successfully queued or similar."
        return 0
    fi
}
clear
export http_proxy="http://1.1.1.256:3128/"
export https_proxy="http://1.1.1.256:3128/"
export no_proxy="foo.com"
# Main
clear
echo "STEP 09- Triggering Foo Internal Adapters."
echo "Proxy settings:"
env | grep proxy
function_triggerFooAdapter "http://foo.com/lookups/trigger_foo" "" 600 true true
Run it manually and CHECK what curl -v is sending as the headers; I would expect to see something like
> POST /the/url HTTP/1.1
> Host: thehost.com:80
> User-Agent: curl/7.51.0
> Accept: */*
> Expect: 102-Processing
... some stuff skipped.
If you're not sending the Expect header, then curl is in fact doing the right thing.
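For example, a quick way to inspect only the outgoing request headers (curl's verbose output prefixes them with >), reusing the variables from the script above:
curl -v -k -X POST --header "Expect:102-Processing" --header "Content-Type:application/json" --data "$pPayload" "$pFooUrl" 2>&1 | grep '^>'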
I can't seem to get the exit code of a command executed inside a Bash if condition:
#! /bin/bash
set -eu
if ! curl -sS --fail http://not-there; then
    echo "ERROR: curl failed with exit code of $?" >&2
fi
But $? always returns zero, even though my curl exits with a non-zero code.
If I don't run the curl command inside the if condition, then $? returns correctly.
Am I missing something here?
In your original code, $? is returning the exit status not of curl, but of ! curl.
To preserve the original value, choose a control structure that doesn't require that inversion:
curl -sS --fail http://not-there || {
    echo "ERROR: curl failed with exit code of $?" >&2
    exit 1
}
...or something akin to:
if curl -sS --fail http://not-there; then
    : "Unexpected success"
else
    echo "ERROR: curl failed with exit status of $?" >&2
fi
Another way to achieve what you want to do is to collect the return code first, then perform the if statement.
#!/bin/bash
set -eu
status=0
curl -sS --fail http://not-there || status=$?
if ((status)); then
    echo "ERROR: curl failed with exit code of $status" >&2
fi
I find this method especially convenient when checking several commands for failure, when you want to return an error code at the end of your script or function if any of them failed (see the sketch below).
Please note in the above I use an arithmetic test, which returns true (0) if the value inside is non-zero, and false (non-zero) otherwise. It is shorter (and more readable to my own taste) than using something like [[ $status != 0 ]].
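For instance, a minimal sketch of that several-commands pattern (the URLs are placeholders):
#!/bin/bash
status=0
curl -sS --fail http://example.com/a || status=$?
curl -sS --fail http://example.com/b || status=$?
curl -sS --fail http://example.com/c || status=$?
# Non-zero if any of the downloads above failed (keeps the last failing code).
exit $status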
I am trying to do a curl with an if/else condition: on success of the call, print a success message, else print that the call failed.
My sample curl would look like:
curl 'https://xxxx:1234xxxx@abc.dfghj.com/xl_template.get_web_query?id=1035066' > HTML_Output.html
I want to do the same thing using Shell.
Using JavaScript:
if(res.status === 200){console.log("Yes!! The request was successful")}
else {console.log("CURL Failed")}
Also, I see curl's progress percentage, but I do not know how to check that percentage from the script. Please help.
You can use the -w (--write-out) option of curl to print the HTTP code:
curl -s -w '%{http_code}\n' 'https://xxxx:1234xxxx@abc.dfghj.com/xl_template.get_web_query?id=1035066'
It will show the HTTP code the site returns.
Also curl provides a whole bunch of exit codes for various scenarios, check man curl.
One way of achieving this:
HTTPS_URL="https://xxxx:1234xxxx@abc.dfghj.com/xl_template.get_web_query?id=1035066"
CURL_CMD="curl -w httpcode=%{http_code}"
# -m, --max-time <seconds> for the whole curl operation
CURL_MAX_CONNECTION_TIMEOUT="-m 100"
# perform curl operation
CURL_RETURN_CODE=0
CURL_OUTPUT=$(${CURL_CMD} ${CURL_MAX_CONNECTION_TIMEOUT} "${HTTPS_URL}" 2> /dev/null) || CURL_RETURN_CODE=$?
if [ ${CURL_RETURN_CODE} -ne 0 ]; then
    echo "Curl connection failed with return code - ${CURL_RETURN_CODE}"
else
    echo "Curl connection success"
    # Check the HTTP code appended to the response in CURL_OUTPUT
    httpCode=$(echo "${CURL_OUTPUT}" | sed -e 's/.*httpcode=//')
    if [ "${httpCode}" -ne 200 ]; then
        echo "Curl operation/command failed due to server return code - ${httpCode}"
    fi
fi
Like most programs, curl returns a non-zero exit status if it gets an error, so you can test it with if.
if curl 'https://xxxx:1234xxxx@abc.dfghj.com/xl_template.get_web_query?id=1035066' > HTML_Output.html
then echo "Request was successful"
else echo "CURL Failed"
fi
I don't know of a way to find out the percentage if the download fails in the middle.
I'm writing a script to download a bunch of files, and I want it to inform when a particular file doesn't exist.
r=`wget -q www.someurl.com`
if [ $r -ne 0 ]
then echo "Not there"
else echo "OK"
fi
But it gives the following error on execution:
./file: line 2: [: -ne: unary operator expected
What's wrong?
Others have correctly posted that you can use $? to get the most recent exit code:
wget_output=$(wget -q "$URL")
if [ $? -ne 0 ]; then
...
This lets you capture both the stdout and the exit code. If you don't actually care what it prints, you can just test it directly:
if wget -q "$URL"; then
...
And if you want to suppress the output:
if wget -q "$URL" > /dev/null; then
...
$r is the text output of wget (which you've captured with backticks). To access the return code, use the $? variable.
$r is empty, and therefore your condition becomes if [ -ne 0 ] and it seems as if -ne is used as a unary operator. Try this instead:
wget -q www.someurl.com
if [ $? -ne 0 ]
...
EDIT: As Andrew explained before me, backticks return standard output, while $? returns the exit code of the last operation.
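To illustrate the difference with the question's own command:
output=$(wget -q www.someurl.com)   # captures stdout; wget saves the page to a file and logs to stderr, so this is empty
rc=$?                               # captures wget's exit code
echo "captured stdout: '$output' / exit code: $rc"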
You could just:
wget ruffingthewitness.com && echo "WE GOT IT" || echo "Failure"
-(~)----------------------------------------------------------(07:30 Tue Apr 27)
risk@DockMaster [2024] --> wget ruffingthewitness.com && echo "WE GOT IT" || echo "Failure"
--2010-04-27 07:30:56-- http://ruffingthewitness.com/
Resolving ruffingthewitness.com... 69.56.251.239
Connecting to ruffingthewitness.com|69.56.251.239|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: `index.html.1'
[ <=> ] 14,252 72.7K/s in 0.2s
2010-04-27 07:30:58 (72.7 KB/s) - `index.html.1' saved [14252]
WE GOT IT
-(~)-----------------------------------------------------------------------------------------------------------(07:30 Tue Apr 27)
risk@DockMaster [2025] --> wget ruffingthewitness.biz && echo "WE GOT IT" || echo "Failure"
--2010-04-27 07:31:05-- http://ruffingthewitness.biz/
Resolving ruffingthewitness.biz... failed: Name or service not known.
wget: unable to resolve host address `ruffingthewitness.biz'
zsh: exit 1 wget ruffingthewitness.biz
Failure
-(~)-----------------------------------------------------------------------------------------------------------(07:31 Tue Apr 27)
risk@DockMaster [2026] -->
Best way to capture the result from wget and also check the call status:
wget -O filename URL
if [[ $? -ne 0 ]]; then
    echo "wget failed"
    exit 1
fi
This way you can check the status of wget as well as store the output data.
If the call is successful, use the stored output.
Otherwise it will exit with the error "wget failed".
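A variant of the same idea that stores the page in a variable instead of a file (placeholder URL); inside the else branch, $? still holds the exit status of the failed assignment, which is wget's own:
if output=$(wget -qO - "http://www.example.com"); then
    printf '%s\n' "$output" | head -n 3   # use the captured data
else
    echo "wget failed with exit code $?" >&2
    exit 1
fi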
I had been trying all the solutions without luck.
In my setup wget was running in the background, so I couldn't catch the return code with $?.
One solution is to use the --server-response option and search for the HTTP 200 status code.
Example:
wget --server-response -q -o wgetOut http://www.someurl.com
sleep 5
# Take the status of the last HTTP response line, in case of redirects.
_wgetHttpCode=$(gawk '/HTTP\//{ code=$2 } END{ print code }' wgetOut)
if [ "$_wgetHttpCode" != "200" ]; then
    echo "[Error] $(cat wgetOut)"
fi
Note: wget needs some time to finish its work; for that reason I put in the sleep 5. This is not the best way to do it, but it worked OK for testing the solution.