curl returns "empty reply from server" in bash due to curl failure - bash

I am writing a simple bash script to "curl get" some values. Sometimes the code works and sometimes it fails with "Empty reply from server".
How can I set up a check in bash so that if curl fails, it retries until it gets the values?

while ! curl ... # add your specific curl statement here
do
    { echo "Exit status of curl: $?"
      echo "Retrying ..."
    } 1>&2
    # you may add a "sleep 10" or similar here to retry only after ten seconds
done
In case you want the output of that curl in a variable, feel free to capture it:
output=$(
    while ! curl ... # add your specific curl statement here
    do
        { echo "Exit status of curl: $?"
          echo "Retrying ..."
        } 1>&2
        # you may add a "sleep 10" or similar here to retry only after ten seconds
    done
)
The messages about the retry are printed to stderr, so they won't mess up the curl output.
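To make the pattern concrete, here is one way the loop might look once filled in; the URL, the --fail flag and the cap of 10 attempts are assumptions for this sketch, not part of the original answer:
#!/bin/bash
# Hypothetical endpoint; substitute your real curl command and options.
url="https://example.com/api/values"

attempts=0
until output=$(curl --silent --fail "$url"); do
    { echo "Exit status of curl: $?"
      echo "Retrying ..."
    } 1>&2
    attempts=$((attempts + 1))
    if [ "$attempts" -ge 10 ]; then
        echo "Giving up after $attempts attempts" 1>&2
        exit 1
    fi
    sleep 10   # wait before the next try
done
echo "$output"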

People are overcomplicating this:
until contents=$(curl "$url")
do
    sleep 10
done

For me it sometimes happens when curl times out and gives no information about it. Try curl with --connect-timeout 600 (in seconds), like:
curl --connect-timeout 600 "https://api.morph.io/some_stuff/data.json"
Maybe this helps you.
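Note that --connect-timeout only limits the connection phase; if the transfer itself can hang, --max-time caps the whole request. A small sketch combining both (the values here are arbitrary examples):
# --connect-timeout limits only the connection phase,
# --max-time limits the entire transfer (both in seconds).
curl --connect-timeout 30 --max-time 600 "https://api.morph.io/some_stuff/data.json"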

If you wanted to try the command until it succeeded, you could say:
command_to_execute; until (( $? == 0 )); do command_to_execute; done


BASH - how to force sleep of 300s before continuing the execution, if upon sending a curl post request the server returns an unexpected result

@Socowi guided me to the perfect solution; you can see it at the bottom of the question:
(1)
Here's a practical example: a script consisting of 10 curl POST requests, each of which posts a different comment on my website.
#!/bin/bash
curl "https://mywebsite.com/comment... ...&text=Test-1"; sleep 60;
curl "https://mywebsite.com/comment... ...&text=Test-2"; sleep 60;
curl "https://mywebsite.com/comment... ...&text=Test-3"; sleep 60;
curl "https://mywebsite.com/comment... ...&text=Test-4"; sleep 60;
curl "https://mywebsite.com/comment... ...&text=Test-5"; sleep 60;
curl "https://mywebsite.com/comment... ...&text=Test-6"; sleep 60;
curl "https://mywebsite.com/comment... ...&text=Test-7"; sleep 60;
curl "https://mywebsite.com/comment... ...&text=Test-8"; sleep 60;
curl "https://mywebsite.com/comment... ...&text=Test-9"; sleep 60;
curl "https://mywebsite.com/comment... ...&text=Test-10"; sleep 60;
(2)
When that goes smoothly, here's how the terminal looks (screenshot omitted):
(3)
The problem: At random intervals something goes wrong, and instead of what's shown in the screenshot above, I start getting large amounts of text containing words like "Something went wrong". For example, it can execute the first 6 curl commands just fine, then get a bad response on the 7th... upon which the script continues, runs the 8th curl command, gets the same error in the terminal, and just goes on until the end, leaving me with partially finished work.
(4)
The desired solution: I just want the script to pause/wait for 300 seconds whenever such an error is thrown in the terminal, before proceeding with the next curl command in the script. The waiting does help, but I have to do it manually at the moment. Kindly help me modify my script to achieve this.
Thank you !
EDIT: The solution for my problem as described, thanks to @Socowi:
#!/bin/bash
for i in {1..10}; do
    if curl "https://mywebsite.com/comment... ...&text=Test-$i" | grep -qF '"status":"ERROR"'; then
        sleep 300 # there was an error
    else
        sleep 60 # no error given
    fi
done
exec $SHELL
Usually you could use if curl ... to check the exit status and adapt the sleep time accordingly. However, in your case curl succeeds in getting a response back; curl doesn't care about the content of that response, but you can check the content yourself. A JSON tool would be the proper way to parse the response (see the jq sketch after the script below), but a hacky grep does the job as well.
Since you want to print the response to the terminal, we store it in a variable so that we can both print it and grep it.
#!/bin/bash
for i in {1..10}; do
    response=$(curl "https://...&text=Test-$i")
    echo "$response"
    if grep -qF '"status":"ERROR"' <<< "$response"; then
        sleep 300 # there was an error
    else
        sleep 60 # everything ok
    fi
done
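For completeness, here is roughly what the jq variant hinted at above could look like. It assumes jq is installed and that the response really is JSON with a top-level "status" field, as the grep pattern suggests; treat it as a sketch, not part of the accepted answer:
#!/bin/bash
for i in {1..10}; do
    response=$(curl "https://...&text=Test-$i")
    echo "$response"
    # jq prints the value of .status, or nothing if the field (or valid JSON) is missing
    if [ "$(jq -r '.status // empty' <<< "$response" 2>/dev/null)" = "ERROR" ]; then
        sleep 300 # there was an error
    else
        sleep 60 # everything ok
    fi
done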

BASH If command contains 'this text' do another command?

I'm creating a bash script to check the HTTP headers on remote hosts. I'm doing this via cURL and have noted that appending http://{host} will only work for services running on tcp/80, and not tcp/443. For example, for HTTPS services you require curl -I -k {host}, as opposed to HTTP services which only require curl -I {host}. This is my script:
for host in $(cat file.txt); do
    echo " "
    echo "Current host: ${host}"
    curl -I -k https://${host}
    echo " "
    echo "=============================================="
done
Now what I want is a conditional check: if the output contains "Could not resolve host", the script should run curl -I http://{host} for those hosts.
How can I achieve this in bash?
stdout will not contain "Could not resolve host" though; that's output to stderr. While you could capture stderr and then do string matching, there is a much, much simpler solution: the exit code.
curl always exits with code 6 when it fails to resolve a host, so simply testing the exit code is sufficient:
curl -i -k http://nowaythisthingexists.test
if [[ $? -eq 6 ]]
then
    echo "oopsie, couldn't resolve host!"
fi
Alternately, if you really want to do it by matching strings, make sure to redirect stderr to stdout (and possibly also kill stdout so it doesn't interfere):
output=$(curl -i -k http://nowaythisthingexists.test 2>&1 >/dev/null)
if [[ "$output" = *"Could not resolve host"* ]]
then
    echo "oopsie, couldn't resolve host!"
fi
Obviously, you are not getting the output of your request this way, so you'd need to redirect it somewhere more useful than /dev/null — a file, or a Unix pipe. Now it's getting more complicated than it needs to be.
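One way to keep both, sketched under the assumption that writing to a file is acceptable: let -o put the headers in a file while the command substitution captures only stderr ("headers.out" is a placeholder name):
# -sS: no progress meter, but the short error message still goes to stderr
errors=$(curl -I -k -sS "https://${host}" -o headers.out 2>&1)
if [[ "$errors" == *"Could not resolve host"* ]]; then
    # fall back to plain http for this host, as the question asks
    curl -I -sS "http://${host}" -o headers.out
fi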

Capturing error message of curl in BASH script and checking status

I need 4 things from curl in a BASH script:
1. I need to capture a brief, human-readable error message from curl into a bash variable.
2. I need to be able to check whether the command completed successfully or not.
3. I need the command to run.
4. I don't want anything printed to the console unless I echo it.
m=$(curl -u "$user":AP"$pass" -T "$pathA" "$url")
if [ $? -ne 0 ] ; then
    echo "Error: $m"
fi
The problem is curl puts gibberish into $m and dumps the error to the console instead of into $m. I don't want anything printed to the console unless I echo it, and I only want to capture error descriptions. I tried many variations; nothing seemed to work for this use case... at least nothing suggested here on Stack.
curl sends its errors and its progress meter (the gibberish you mentioned) to STDERR, so they do not get captured in $m.
One solution is to redirect STDERR to STDOUT by adding 2>&1 to your curl invocation in your script:
m=$(curl -u "$user":AP"$pass" -T "$pathA" "$url" 2>&1)
if [ $? -ne 0 ] ; then
    echo "Error: $m"
fi
You could also use the --fail, --silent and --show-error flags of curl, like in this example, if all you care about are the errors:
Making curl send errors to stderr and everything else to stdout
and some information about capturing STDERR to $variable in bash scripts:
Bash how do you capture stderr to a variable?
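A sketch of that flag combination applied to the upload from the question: --silent suppresses the progress meter, --show-error keeps the short error line, and --fail turns HTTP errors (4xx/5xx) into a non-zero exit code, so $m holds either the server's response or a brief error description:
m=$(curl --fail --silent --show-error -u "$user":AP"$pass" -T "$pathA" "$url" 2>&1)
if [ $? -ne 0 ] ; then
    echo "Error: $m"
fi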
Similar to the one suggested; I'm just using the stderr output, since $? is always 0.
OUTPUT=$(curl -X POST -H 'Content-type: application/json' --data "{\"text\":\"$MSG \"}" $SLACK_URL 2>&1)
echo "$OUTPUT"
if [[ $OUTPUT == "ok" ]]; then
    echo "$FILE_NAME Msg successfully sent"
elif [[ $OUTPUT == "invalid_token" ]]; then
    echo "$FILE_NAME Slack url incorrect"
else
    echo "$FILE_NAME Some issue Sending msg"
fi

Bash script to check HTTP status before executing tests

Hi, I am trying to execute specific tests only if the application is up and running (I am using docker), and I am trying to achieve this with a bash script. I need to run a loop until I receive a 200 status from the application; once I receive 200, the script should move ahead and execute the tests. I am trying the following bash script:
#!/bin/bash
urlstatus=0
until [ $urlstatus -ne 200 ]; do
    urlstatus=$(curl -o /dev/null --silent --head --write-out '%{http_code}' "http://localhost:8000/animals")
    echo $urlstatus
done
# Execute tests if application is up & running
Please let me know what is missing in the script.
Thanks
-ne is the exact opposite of the test you actually want; to loop until the HTTP status is 200 you should have -eq, or even better (to avoid error messages from the comparison if a non-numeric value is present), =.
#!/bin/sh
fetchstatus() {
    curl \
        -o /dev/null \
        --silent \
        --head \
        --write-out '%{http_code}' \
        "http://localhost:8000/animals"
}

urlstatus=$(fetchstatus)          # initialize to actual value before we sleep even once
until [ "$urlstatus" = 200 ]; do  # until our result is success...
    sleep 1                       # wait a second...
    urlstatus=$(fetchstatus)      # then poll again.
done
But since curl can adjust its exit status to indicate whether a request was successful, you don't even need that. Use --fail, and you can branch directly:
#!/bin/sh
while :; do
    curl -sS --fail -o /dev/null "http://localhost:8000/animals" && break
    sleep 1 # actually give your server a little rest
done
The && break means that we break out of the loop only if the request was successful; the --fail argument to curl means that it only returns success if the server returned a non-error HTTP status (such as 200).

How to create a loop in bash that is waiting for a webserver to respond?

How to create a loop in bash that is waiting for a webserver to respond?
It should print a "." every 10 seconds or so, and wait until the server starts to respond.
Update, this code tests if I get a good response from the server.
if curl --output /dev/null --silent --head --fail "$url"; then
    echo "URL exists: $url"
else
    echo "URL does not exist: $url"
fi
Combining the question with chepner's answer, this worked for me:
until $(curl --output /dev/null --silent --head --fail http://myhost:myport); do
    printf '.'
    sleep 5
done
I wanted to limit the maximum number of attempts. Based on Thomas's accepted answer I made this:
attempt_counter=0
max_attempts=5

until $(curl --output /dev/null --silent --head --fail http://myhost:myport); do
    if [ ${attempt_counter} -eq ${max_attempts} ]; then
        echo "Max attempts reached"
        exit 1
    fi

    printf '.'
    attempt_counter=$(($attempt_counter+1))
    sleep 5
done
httping is nice for this. Simple, clean, quiet.
while ! httping -qc1 http://myhost:myport ; do sleep 1 ; done
while/until etc. is a personal preference.
The poster asks a specific question about printing ., but I think most people coming here are looking for the solution below, as it is a single command that supports finite retries.
curl --head -X GET --retry 5 --retry-connrefused --retry-delay 1 http://myhost:myport
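One caveat worth hedging: by default --retry only covers errors curl considers transient (timeouts and a handful of 429/5xx responses). --retry-connrefused adds refused connections, and newer curl releases also offer --retry-all-errors to retry on any failure; check your installed curl's man page before relying on either. For example:
# Assumes a curl new enough to know --retry-all-errors; drop it otherwise.
curl --silent --fail --retry 5 --retry-delay 1 --retry-all-errors --max-time 10 http://myhost:myport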
The use of backticks ` ` is outdated.
Use $( ) instead:
until $(curl --output /dev/null --silent --head --fail http://myhost:myport); do
    printf '.'
    sleep 5
done
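As a side note, neither backticks nor $( ) are really needed here: the loop can test curl's exit status directly, since the output is already discarded by --output /dev/null. The command substitution only works by accident (it expands to nothing). A plainer form:
until curl --output /dev/null --silent --head --fail http://myhost:myport; do
    printf '.'
    sleep 5
done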
You can also combine the timeout command and bash's /dev/tcp like this. It will time out after 60s instead of waiting indefinitely:
timeout 60 bash -c 'until echo > /dev/tcp/myhost/myport; do sleep 5; done'
The following snippet:
- Waits until all URLs passed as arguments return HTTP 200
- Gives up after 30 seconds if a URL is not available
- Times each curl request out after 3 seconds

Just put it into a file and use it like a generic script to wait until the required services are available.
#!/bin/bash
##############################################################################################
# Wait for URLs until return HTTP 200
#
# - Just pass as many urls as required to the script - the script will wait for each, one by one
#
# Example: ./wait_for_urls.sh "${MY_VARIABLE}" "http://192.168.56.101:8080"
##############################################################################################
wait-for-url() {
echo "Testing $1"
timeout --foreground -s TERM 30s bash -c \
'while [[ "$(curl -s -o /dev/null -m 3 -L -w ''%{http_code}'' ${0})" != "200" ]];\
do echo "Waiting for ${0}" && sleep 2;\
done' ${1}
echo "${1} - OK!"
}
echo "Wait for URLs: $#"
for var in "$@"; do
wait-for-url "$var"
done
Gist: https://gist.github.com/eisenreich/195ab1f05715ec86e300f75d007d711c
printf "Waiting for $HOST:$PORT"
until nc -z $HOST $PORT 2>/dev/null; do
printf '.'
sleep 10
done
echo "up!"
I took the idea from here: https://stackoverflow.com/a/34358304/1121497
Interesting puzzle. If you have no shell access to the server and no async API on your client side, you can try grepping your TCP sockets like this:
until grep '***IPV4 ADDRESS OF SERVER IN REVERSE HEX***' /proc/net/tcp
do
    printf '.'
    sleep 1
done
But that's a busy wait with 1 sec intervals. You probably want more resolution than that. Also this is global. If another connection is made to that server, your results are invalid.
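The "reverse hex" form is how /proc/net/tcp encodes addresses: the IPv4 bytes appear in little-endian order, followed by the port in big-endian hex. A sketch for building that string (the host, port and the little-endian assumption are mine, not part of the original answer):
# e.g. 192.168.56.101:8080 becomes 6538A8C0:1F90 on a little-endian machine
ip="192.168.56.101"; port=8080
hexip=$(printf '%02X%02X%02X%02X' $(echo "$ip" | awk -F. '{print $4, $3, $2, $1}'))
hexport=$(printf '%04X' "$port")
until grep -q "${hexip}:${hexport}" /proc/net/tcp
do
    printf '.'
    sleep 1
done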
