How to check if a URL exists with the shell and probably curl? - shell

I am looking for a simple shell (+curl) check that evaluates as true or false depending on whether a URL exists (returns 200) or not.

Using --fail makes the exit status nonzero on a failed request. Using --head avoids downloading the file contents, since we don't need them for this check. Using --silent keeps the check itself from emitting status output or errors.
if curl --output /dev/null --silent --head --fail "$url"; then
    echo "URL exists: $url"
else
    echo "URL does not exist: $url"
fi
If your server refuses HEAD requests, an alternative is to request only the first byte of the file:
if curl --output /dev/null --silent --fail -r 0-0 "$url"; then

I find wget to be a better tool for this than curl; there are fewer options to remember, and by default you can check its exit status in bash to see whether it succeeded or not.
if wget --spider http://google.com 2>/dev/null; then
    echo "File exists"
else
    echo "File does not exist"
fi
The --spider option makes wget just check for the file instead of downloading it, and 2> /dev/null silences wget's stderr output.
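If you need this in several places, the check wraps naturally into a small function (a sketch; url_exists is just a name chosen here, not anything built into wget):

```shell
#!/bin/sh
# Succeeds (exit 0) if the URL exists, fails otherwise.
# --spider checks without downloading; -q suppresses all output.
url_exists() {
    wget --spider -q "$1"
}

if url_exists "http://google.com"; then
    echo "File exists"
else
    echo "File does not exist"
fi
```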


Format URL with the system date in bash

I would run a .sh script to execute an operation from a server only if the given URL is up.
The URL I get data from updates every day (but I don't know exactly what time it updates).
A cron job would run this script every five minutes, and as soon as the updated URL exists, it runs an Rscript.
I don't know curl or bash well enough to update the date in the URL to match the system's.
I thought of writing a code in BASH that would look like this :
if curl -s --head --request GET https://example.com-2021-05-10 | grep "200 OK" > /dev/null; then
    Rscript ~/mybot.R
else
    echo "the page is not up yet"
fi
Just use the date command. Note that the month is lowercase %m; uppercase %M is the minute.
if curl -s --head --request GET https://example.com-$(date '+%Y-%m-%d') | grep "200 OK" > /dev/null; then
    Rscript ~/mybot.R
else
    echo "the page is not up yet"
fi
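One more pitfall worth noting: grepping for "200 OK" misses HTTP/2 responses, whose status line is just "HTTP/2 200" with no "OK". Relying on curl's --fail exit status avoids that. A sketch, using the same hypothetical example.com URL pattern from the question:

```shell
#!/bin/sh
# %Y-%m-%d gives e.g. 2021-05-10 (%M would be the minute, not the month).
today=$(date '+%Y-%m-%d')
url="https://example.com-$today"

# --fail turns HTTP errors into a nonzero exit status, so no grep is needed.
if curl -s --head --fail "$url" > /dev/null; then
    Rscript ~/mybot.R
else
    echo "the page is not up yet"
fi
```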

BASH If command contains 'this text' do another command?

I'm creating a bash script to check the HTTP headers on remote hosts. I'm doing this via curl and have noted that prepending http:// to {host} will only work for services running on tcp/80, and not tcp/443. For HTTPS services you require curl -I -k {host}, as opposed to HTTP services, which only require curl -I {host}. This is my script:
for host in $(cat file.txt); do
    echo " "
    echo "Current host: ${host}"
    curl -I -k "https://${host}"
    echo " "
    echo "=============================================="
done
Now what I want is a conditional: if the output is "Could not resolve host", the script should run curl -I http://{host} for those hosts whose output contained the string "Could not resolve host".
How can I achieve this in bash?
stdout will not contain "Could not resolve host", though; that message goes to stderr. While you could capture stderr and then do string matching, there is a much, much simpler solution: the exit code.
curl always exits with code 6 when it fails to resolve a host, so simply testing the exit code is sufficient:
curl -i -k http://nowaythisthingexists.test
if [[ $? -eq 6 ]]; then
    echo "oopsie, couldn't resolve host!"
fi
Alternately, if you really want to do it by matching strings, make sure to redirect stderr to stdout (and possibly also kill stdout so it doesn't interfere):
output=$(curl -i -k http://nowaythisthingexists.test 2>&1 >/dev/null)
if [[ "$output" = *"Could not resolve host"* ]]; then
    echo "oopsie, couldn't resolve host!"
fi
Obviously, you are not getting the output of your request this way, so you'd need to redirect it somewhere more useful than /dev/null — a file, or a Unix pipe. Now it's getting more complicated than it needs to be.
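Applied to the original loop over file.txt, the exit-code test gives a sketch of the https-then-http fallback (the inline file.txt created here is a stand-in for the question's real host list):

```shell
#!/bin/sh
# Stand-in host list; replace with your real file.txt.
printf '%s\n' example.com example.org > file.txt

while read -r host; do
    echo "Current host: $host"
    curl -I -k -s "https://$host"
    if [ $? -eq 6 ]; then
        # Exit code 6: could not resolve host; retry over plain http.
        curl -I -s "http://$host"
    fi
    echo "=============================================="
done < file.txt
```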

Capturing error message of curl in BASH script and checking status

I need 4 things from curl in a BASH script:
1. I need to capture a brief, human-readable error message from curl into a bash variable.
2. I need to be able to check whether the command completed successfully or not.
3. I need the command to run.
4. I don't want anything printed to the console unless I echo it.
m=$(curl -u "$user":AP"$pass" -T "$pathA" "$url")
if [ $? -ne 0 ]; then
    echo "Error: $m"
fi
The problem is that curl puts gibberish into $m and dumps the error to the console instead of into $m. I don't want anything printed to the console unless I echo it, and I only want to capture error descriptions. I tried many variations; nothing seemed to work for this use case, at least nothing suggested here on Stack.
curl sends its errors (and the progress meter, the gibberish you mentioned) to STDERR, so they will not be captured in $m.
One solution is to redirect STDERR to STDOUT by adding 2>&1 to your curl invocation in your script:
m=$(curl -u "$user":AP"$pass" -T "$pathA" "$url" 2>&1)
if [ $? -ne 0 ]; then
    echo "Error: $m"
fi
You could also use curl's --fail, --silent and --show-error flags, as in this example, if all you care about are the errors:
Making curl send errors to stderr and everything else to stdout
and some information about capturing STDERR to $variable in bash scripts:
Bash how do you capture stderr to a variable?
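For completeness, a sketch of that flag combination (the URL here is a placeholder): --fail converts HTTP errors into a nonzero exit status and suppresses the HTML error body, --silent hides the progress meter, and --show-error re-enables the one-line error message on stderr so 2>&1 can capture it:

```shell
#!/bin/sh
url="https://example.com/"   # placeholder

# On failure, $m holds curl's short error line; on success, the response body.
m=$(curl --fail --silent --show-error "$url" 2>&1)
if [ $? -ne 0 ]; then
    echo "Error: $m"
fi
```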
Similar to the one suggested; I'm just using the stderr output, as $? is always 0.
OUTPUT=$(curl -X POST -H 'Content-type: application/json' --data "{\"text\":\"$MSG \"}" "$SLACK_URL" 2>&1)
echo "$OUTPUT"
if [[ $OUTPUT == "ok" ]]; then
    echo "$FILE_NAME Msg successfully sent"
elif [[ $OUTPUT == "invalid_token" ]]; then
    echo "$FILE_NAME Slack url incorrect"
else
    echo "$FILE_NAME Some issue Sending msg"
fi

Bash script to check HTTP status before executing tests

Hi, I am trying to execute specific tests only if the application is up and running (I am using Docker), and I am trying to achieve this with a bash script. I need to run a loop until I receive a 200 status from the application; once I receive 200, the script should move ahead and execute the tests. My bash script is as follows:
#!/bin/bash
urlstatus=0
until [ $urlstatus -ne 200 ]; do
    urlstatus=$(curl -o /dev/null --silent --head --write-out '%{http_code}' "http://localhost:8000/animals")
    echo $urlstatus
done
Execute Test if application is up & running
Please let me know what is missing in the script.
Thanks
-ne is the exact opposite of the test you actually want; to loop until the HTTP status is 200 you should have -eq, or even better (to avoid error messages from the comparison if a non-numeric value is present), =.
#!/bin/sh
fetchstatus() {
curl \
-o /dev/null \
--silent \
--head \
--write-out '%{http_code}' \
"http://localhost:8000/animals"
}
urlstatus=$(fetchstatus) # initialize to actual value before we sleep even once
until [ "$urlstatus" = 200 ]; do # until our result is success...
sleep 1 # wait a second...
urlstatus=$(fetchstatus) # then poll again.
done
But since curl can adjust its exit status to indicate whether a request was successful, you don't even need that. Use --fail, and you can branch directly:
#!/bin/sh
while :; do
    curl -sS --fail -o /dev/null "http://localhost:8000/animals" && break
    sleep 1 # actually give your server a little rest
done
The && break means that we break out of the loop only if the request was successful; the --fail argument to curl means that it returns success only if the server responded with a non-error HTTP status (such as 200).
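If the application might never come up, it is worth capping the polling so a broken deployment cannot hang the test run forever. A sketch (wait_for_url and the 30-attempt default are choices made here, not anything from the question):

```shell
#!/bin/sh
# Poll a URL until curl --fail succeeds, giving up after $2 attempts
# (default 30, one second apart). Returns 0 once the URL is up, 1 on timeout.
wait_for_url() {
    url=$1
    max=${2:-30}
    attempts=0
    until curl -sS --fail -o /dev/null "$url" 2>/dev/null; do
        attempts=$((attempts + 1))
        [ "$attempts" -ge "$max" ] && return 1
        sleep 1
    done
}

# Example: wait_for_url "http://localhost:8000/animals" && echo "application is up"
```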

Error in checking file downloaded properly through curl

I am trying to run a curl command to download a file over sftp and then check whether the file downloaded or not, but my code prints "File downloaded successfully" every time, even when the file did not download.
Here is my script:
#!/bin/bash
curl -k -u "user:user" -ssl -o file.zip sftp:domain
if [[ $? -ne 0 ]]; then
    echo "Failed to download file"
    exit 1
fi
echo "File downloaded successfully"
Try using the --write-out option:
#!/bin/bash
rcode=$(curl --silent --write-out '%{response_code}' -k -u "user:user" -ssl -o file.zip sftp:domain)
if [[ "$rcode" -ne 0 ]]; then
    echo "Failed to download file"
    exit 1
fi
echo "File downloaded successfully"
For a 404 HTTP response:
rcode=$(curl --silent --write-out '%{response_code}' -o /dev/null http://localhost:8080/tyu)
echo "$rcode"
404
For sftp, curl's response code on success is 0.
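A sketch combining both signals, keeping the question's placeholder credentials and sftp:domain URL: curl's own exit status catches transport-level failures, and the response code (0 on sftp success) catches protocol-level ones.

```shell
#!/bin/sh
# Placeholders from the question: user:user and sftp:domain.
rcode=$(curl --silent --write-out '%{response_code}' -k -u "user:user" -o file.zip sftp:domain)
status=$?

# status: curl's exit code (0 = transfer completed).
# rcode: sftp response code; 0 means success, 000/empty means no response.
if [ "$status" -ne 0 ] || [ "$rcode" != "0" ]; then
    echo "Failed to download file (exit $status, response $rcode)" >&2
else
    echo "File downloaded successfully"
fi
```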
