AND exit code in BASH

I am monitoring websites for availability, using a curl command to test for a 200 status code.
All URLs are contained in a file. So far so good.
What I want to achieve: if all URLs are online, exit 0; if any URL is offline, exit 1.
How do I achieve this in bash?

Try this (where 200 is the expected status code):
ANSWER=$(curl -s -o /dev/null -w "%{http_code}" google.com | grep -c 200)
if [[ $ANSWER == "1" ]]; then
    exit 0
else
    exit 1
fi
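Since the URLs live in a file, a minimal sketch extending this to the whole list might look like the following (assuming one URL per line in a hypothetical urls.txt; adjust the filename to yours):
#!/bin/bash
# Check every URL listed in urls.txt (hypothetical filename, one URL per line);
# exit 1 as soon as any URL does not return HTTP 200, otherwise exit 0.
while read -r url; do
    [ -z "$url" ] && continue  # skip blank lines
    status=$(curl -s -o /dev/null -w "%{http_code}" "$url")
    if [ "$status" != "200" ]; then
        echo "Offline: $url (got $status)" >&2
        exit 1
    fi
done < urls.txt
exit 0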

Related

Shell script for monitoring the server

Can someone please give a shell script for this scenario? I need to check whether the server/URL below is up or not.
If my server is up, no alert needs to be triggered; if my server is down, an alert should be triggered stating that my server is down.
http://18.216.40.147:5555/
#!/bin/bash
addr=18.216.40.147
if ! curl -si -u 'Administrator:******' https://"$addr":5555 | grep -q 'HTTP/1.1 200'; then
    echo "Server is down" | mailx -s "server is down" "******@gmail.com"
    echo "Server is down" >&2
fi
curl can return just the HTTP response code, which you can then compare with the expected answer:
#!/bin/bash
addr="http://18.216.40.147:5555/"
answer=$(curl -o /dev/null -s -w '%{http_code}\n' "$addr")
if [ "$answer" -eq 200 ]; then
    echo "UP"
else
    echo "DOWN"
fi
Suggesting @Juarnir Santos is essentially correct, but the script needs some improvements:
If the server is down or the network configuration is blocking access to the server, you want to time the request out after 4 (or more) seconds.
Bash numeric tests are safer inside [[ ... ]]: an empty $answer does not cause a syntax error the way it can inside [ ... ].
Therefore the improved answer is:
#!/bin/bash
addr="http://18.216.40.147:5555/"
answer=$(timeout 4 curl -o /dev/null -s -w '%{http_code}\n' "$addr")
if [[ $answer -eq 200 ]]; then
    echo "UP"
else
    echo "DOWN"
fi

Checking whether a specific website is up in the terminal?

Is there an easy way to check Internet connectivity from the console? I am trying to play around in a shell script. One idea I have is to wget --spider http://www.google.com/ and check the HTTP response code to interpret whether the Internet connection is working fine.
This is what I am trying:
#!/bin/bash
# Capture wget's own output in a variable (not what wget fetches)
RESULT=`wget --spider http://google.com 2>&1`
FLAG=0
# Traverse the string, considering it as an array of words
for x in $RESULT; do
    if [ "$x" = '200' ]; then
        FLAG=1 # This means all good
    fi
done
Is there any way to accomplish this?
You can do it with the ping or curl commands. Check their man pages for more.
I am using this myself, and it kind of works for me! It checks the connection against a reliable website like Google, and if it gets a 200 status in the response, you probably have internet.
if curl -s --head --request GET www.google.com | grep "200 OK" > /dev/null; then
    echo "Internet is present"
else
    echo "Internet isn't present"
fi
On one line, thanks @PS:
if ping -c1 8.8.8.8 &>/dev/null; then echo Working; else echo Down; fi
An option that does not use the internet to see if it is available is to check for a default route in your routing tables. The routing daemon will remove your default route when the internet is not available and add it back when it is.
netstat -nrf inet | grep -q ^default && \
echo internet is up || \
echo internet is down
To check if a website is up, you can use netcat to see if it is listening on port 80. This helps with sites that refuse HEAD requests with '405 Method Not Allowed'.
nc -zw2 www.example.com 80 &>/dev/null && \
echo website is up || \
echo website is down
Please try this code.
#!/bin/bash
wget -q --spider http://google.com
if [ $? -eq 0 ]; then
    echo "Internet connection is OK"
else
    echo "Internet connection is FAILED"
fi
A bit more compact variant of @carlos-abraham's answer. You can have curl output just the HTTP response code and make a decision with it:
# 200 if everything is ok
http_code=$(curl -s --head -m 5 -w %{http_code} --output /dev/null www.google.com)
if [ "$http_code" -eq 200 ]; then
    echo "success"
else
    # write error to stderr
    echo "http request failed: $http_code" >&2
    exit 1
fi
-m 5: wait 5 seconds for the whole operation
--output /dev/null: suppress html site response
-w %{http_code}: write to stdout the http response code.
A bit more elaborate script to check connectivity and HTTP response:
#url="mmm.elgoog.moc"
url="www.google.com"
max_wait=5

(ping -w $max_wait -q -c 1 "$url" > /dev/null 2>&1)
response_code=$?
if [ "$response_code" -eq 0 ]; then
    # 200 if everything is ok
    response_code=$(curl -s --head -m $max_wait -w %{http_code} --output /dev/null "$url")
fi

case "$response_code" in
    1)
        echo "Connectivity failed. Host down?" >&2
        exit $response_code
        ;;
    2)
        echo "Unknown host or other problem. DNS problem?" >&2
        exit $response_code
        ;;
    200)
        echo "success"
        exit 0
        ;;
    *)
        echo "Failed to get a response: $response_code" >&2
        exit 1
esac

Script that monitors log file quits unexpectedly

I've written a simple script that monitors elasticsearch.log, looking for a specific pattern and finally sending a curl POST request.
#!/bin/bash
tail -F -n 0 elasticsearch.log | \
while read -r line
do
    echo "$line" | grep '<PATTERN>'
    if [[ "$?" -eq 0 ]]; then
        curl -X POST <URL>
    fi
done
The problem is that the script quits unexpectedly with a 0 exit status. Do you have any idea what might be the reason?
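No accepted cause appears here, but one hedged guess is that the tail pipeline dies (for example on log rotation or a transient read failure), so read returns nonzero and the loop ends with the pipeline's status. A defensive sketch that restarts the pipeline and redirects curl's stdin (so nothing in the loop can consume the tail stream) might look like this; pattern and url are placeholders standing in for <PATTERN> and <URL>:
#!/bin/bash
pattern='<PATTERN>'
url='<URL>'
# Outer loop restarts the tail pipeline if it ever exits.
while true; do
    tail -F -n 0 elasticsearch.log | while read -r line; do
        if grep -q -- "$pattern" <<< "$line"; then
            # stdin redirected so curl cannot swallow the tail stream
            curl -s -X POST "$url" < /dev/null
        fi
    done
    sleep 1
done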

How to create a loop in bash that is waiting for a webserver to respond?

How to create a loop in bash that is waiting for a webserver to respond?
It should print a "." every 10 seconds or so, and wait until the server starts to respond.
Update: this code tests whether I get a good response from the server.
if curl --output /dev/null --silent --head --fail "$url"; then
    echo "URL exists: $url"
else
    echo "URL does not exist: $url"
fi
Combining the question with chepner's answer, this worked for me:
until curl --output /dev/null --silent --head --fail http://myhost:myport; do
    printf '.'
    sleep 5
done
I wanted to limit the maximum number of attempts. Based on Thomas's accepted answer I made this:
attempt_counter=0
max_attempts=5

until curl --output /dev/null --silent --head --fail http://myhost:myport; do
    if [ ${attempt_counter} -eq ${max_attempts} ]; then
        echo "Max attempts reached"
        exit 1
    fi
    printf '.'
    attempt_counter=$(($attempt_counter+1))
    sleep 5
done
httping is nice for this. Simple, clean, quiet.
while ! httping -qc1 http://myhost:myport ; do sleep 1 ; done
while/until etc. is a personal preference.
The poster asks a specific question about printing ., but I think most people coming here are looking for the solution below, as it is a single command that supports finite retries:
curl --head -X GET --retry 5 --retry-connrefused --retry-delay 1 http://myhost:myport
The use of backticks ` ` is outdated; use $( ) for command substitution instead. In this particular loop, though, no command substitution is needed at all, because until tests curl's exit status directly:
until curl --output /dev/null --silent --head --fail http://myhost:myport; do
    printf '.'
    sleep 5
done
You can also combine the timeout command with bash's /dev/tcp like this. It will time out after 60s instead of waiting indefinitely:
timeout 60 bash -c 'until echo > /dev/tcp/myhost/myport; do sleep 5; done'
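Note that myhost and myport are literal placeholders inside the single quotes. A hedged parameterized variant swaps the quoting so shell variables expand before bash -c parses the loop:
# host and port as variables; double quotes let them expand first
host=myhost port=8080
timeout 60 bash -c "until echo > /dev/tcp/$host/$port; do sleep 5; done"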
The following snippet:
Waits until all URLs from the arguments return 200
Expires after 30 seconds if a URL is not available
Times each curl request out after 3 seconds
Just put it into a file and use it as a generic script to wait until the required services are available.
#!/bin/bash
##############################################################################################
# Wait for URLs until they return HTTP 200
#
# - Just pass as many URLs as required to the script - the script will wait for each, one by one
#
# Example: ./wait_for_urls.sh "${MY_VARIABLE}" "http://192.168.56.101:8080"
##############################################################################################

wait-for-url() {
    echo "Testing $1"
    timeout --foreground -s TERM 30s bash -c \
        'while [[ "$(curl -s -o /dev/null -m 3 -L -w ''%{http_code}'' ${0})" != "200" ]];\
        do echo "Waiting for ${0}" && sleep 2;\
        done' ${1}
    echo "${1} - OK!"
}

echo "Wait for URLs: $@"
for var in "$@"; do
    wait-for-url "$var"
done
Gist: https://gist.github.com/eisenreich/195ab1f05715ec86e300f75d007d711c
printf "Waiting for $HOST:$PORT"
until nc -z $HOST $PORT 2>/dev/null; do
printf '.'
sleep 10
done
echo "up!"
I took the idea from here: https://stackoverflow.com/a/34358304/1121497
Interesting puzzle. If you have no access to the server or an async API in your client, you can try grepping your TCP sockets like this:
until grep '***IPV4 ADDRESS OF SERVER IN REVERSE HEX***' /proc/net/tcp
do
    printf '.'
    sleep 1
done
But that's a busy wait with 1-second intervals. You probably want more resolution than that. Also, this is global: if another connection is made to that server, your results are invalid.
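For reference, the reversed-hex form that /proc/net/tcp expects can be computed with a small helper like this sketch (ip_port_to_hex is a made-up name):
# Hypothetical helper: print an IPv4 address and port in the
# little-endian hex form used by /proc/net/tcp entries.
ip_port_to_hex() {
    local ip=$1 port=$2 IFS=.
    set -- $ip
    printf '%02X%02X%02X%02X:%04X\n' "$4" "$3" "$2" "$1" "$port"
}
# Example: ip_port_to_hex 93.184.216.34 80  ->  22D8B85D:0050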

How to test an Internet connection with bash?

How can an internet connection be tested without pinging some website?
I mean, what if there is a connection but the site is down? Is there a check for a connection with the world?
Without ping
#!/bin/bash
wget -q --spider http://google.com
if [ $? -eq 0 ]; then
    echo "Online"
else
    echo "Offline"
fi
-q : Silence mode
--spider : don't get, just check page availability
$? : shell return code
0 : shell "All OK" code
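A stylistic aside: the explicit $? check can be folded away, since if tests the command's exit status directly. A minimal equivalent:
#!/bin/bash
# Same check, testing wget's exit status directly in the if.
if wget -q --spider http://google.com; then
    echo "Online"
else
    echo "Offline"
fi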
Without wget
#!/bin/bash
echo -e "GET http://google.com HTTP/1.0\n\n" | nc google.com 80 > /dev/null 2>&1
if [ $? -eq 0 ]; then
    echo "Online"
else
    echo "Offline"
fi
Ping your default gateway:
#!/bin/bash
ping -q -w 1 -c 1 `ip r | grep default | cut -d ' ' -f 3` > /dev/null && echo ok || echo error
Super thanks to user somedrew for their post here: https://bbs.archlinux.org/viewtopic.php?id=55485 on 2008-09-20 02:09:48. Looking in /sys/class/net should be one way.
Here's my script to test for a network connection other than the loopback.
I use the below in another script that I have for periodically testing whether my website is accessible. If it's NOT accessible, a popup window alerts me to the problem.
The script below prevents me from receiving popup messages every five minutes whenever my laptop is not connected to the network.
#!/usr/bin/bash
# Test for a network connection
for interface in $(ls /sys/class/net/ | grep -v lo); do
    if [[ $(cat /sys/class/net/$interface/carrier) = 1 ]]; then
        OnLine=1
    fi
done
if ! [ $OnLine ]; then
    echo "Not Online" > /dev/stderr
    exit
fi
Note for those new to bash: the final 'if' statement tests if NOT [!] online and exits if this is the case. See man bash and search for "Expressions may be combined" for more details.
P.S. I feel ping is not the best thing to use here because it aims to test a connection to a particular host, NOT to test whether there is a connection to a network of any sort.
P.P.S. The above works on Ubuntu 12.04. The /sys directory may not exist on some other distros. See below:
Modern Linux distributions include a /sys directory as a virtual filesystem (sysfs, comparable to /proc, which is a procfs), which stores and allows modification of the devices connected to the system, whereas many traditional UNIX and Unix-like operating systems use /sys as a symbolic link to the kernel source tree.[citation needed]
From Wikipedia https://en.wikipedia.org/wiki/Filesystem_Hierarchy_Standard
This works on both macOS and Linux:
#!/bin/bash
ping -q -c1 google.com &>/dev/null && echo online || echo offline
In Bash, using its network wrapper through /dev/{udp,tcp}/host/port:
if : >/dev/tcp/8.8.8.8/53; then
    echo 'Internet available.'
else
    echo 'Offline.'
fi
(: is the Bash no-op, because you just want to test the connection, not process anything.)
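One caveat worth hedging: if the port is silently filtered, the /dev/tcp redirection can hang for the full TCP timeout. Wrapping the probe in coreutils timeout makes it fail fast (a sketch):
# Fail fast if the connection cannot be established within 3 seconds.
if timeout 3 bash -c ': >/dev/tcp/8.8.8.8/53' 2>/dev/null; then
    echo 'Internet available.'
else
    echo 'Offline.'
fi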
The top answer misses the fact that you can have a perfectly stable connection to your default gateway, but that does not automatically mean you can actually reach something on the internet. The OP asks how to test a connection with the world, so I suggest altering the top answer by changing the gateway IP to a known IP (x.y.z.w) that is outside your LAN.
So the answer would become:
ping -q -w 1 -c 1 x.y.z.w > /dev/null && echo ok || echo error
Also removing the unfavored backticks for command substitution[1].
If you just want to make sure you are connected to the world before executing some code you can also use:
if ping -q -w 1 -c 1 x.y.z.w > /dev/null; then
    # more code
fi
I've written scripts before that simply use telnet to connect to port 80, then transmit the text:
GET /index.html HTTP/1.0
followed by two CR/LF sequences.
Provided you get back some form of HTTP response, you can generally assume the site is functioning.
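In script form the same idea is easier with nc than with interactive telnet; a sketch (example.com is a placeholder host):
# Send a minimal HTTP/1.0 request and look for any HTTP status line.
printf 'GET /index.html HTTP/1.0\r\n\r\n' | nc -w 5 example.com 80 \
    | grep -q '^HTTP/' && echo "site is functioning"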
Make sure your network allows TCP traffic in and out; then you can get your public-facing IP back with the following command:
curl ifconfig.co
Execute the following command to check whether a web site is up, and what status message the web server is showing:
curl -Is http://www.google.com | head -1
HTTP/1.1 200 OK
Status code '200 OK' means that the request has succeeded and the website is reachable.
The top voted answer does not work for macOS, so for those on a Mac, I've successfully tested this:
GATEWAY=`route -n get default | grep gateway`
if [ -z "$GATEWAY" ]; then
    echo error
else
    ping -q -t 1 -c 1 `echo $GATEWAY | cut -d ':' -f 2` > /dev/null && echo ok || echo error
fi
Tested on macOS High Sierra 10.12.6.
If your local nameserver is down,
ping 4.2.2.1
is an easy-to-remember always-up IP (it's actually a nameserver, even).
This bash script continuously checks for Internet connectivity and makes a beep sound when the Internet is available.
#!/bin/bash
play -n synth 0.3 sine 800 vol 0.75
while :
do
    pingtime=$(ping -w 1 8.8.8.8 | grep ttl)
    if [ "$pingtime" = "" ]; then
        pingtimetwo=$(ping -w 1 www.google.com | grep ttl)
        if [ "$pingtimetwo" = "" ]; then
            clear; echo 'Offline'
        else
            clear; echo 'Online'; play -n synth 0.3 sine 800 vol 0.75
        fi
    else
        clear; echo 'Online'; play -n synth 0.3 sine 800 vol 0.75
    fi
    sleep 1
done
Similarly to @Jesse's answer, this option might be much faster than any solution using ping and perhaps slightly more efficient than @Jesse's answer.
find /sys/class/net/ -maxdepth 1 -mindepth 1 ! -name "*lo*" -exec sh -c 'cat "$0"/carrier 2>&1' {} \; | grep -q '1'
Explanation:
This command uses find with -exec to run a command on all files not named *lo* in /sys/class/net/. These should be links to directories containing information about the available network interfaces on your machine.
The command being run is an sh command that checks the contents of the file carrier in those directories. The value of $interface/carrier has 3 meanings - quoting:
It seems there are three states:
./carrier not readable (for instance when the interface is disabled in Network Manager).
./carrier contain "1" (when the interface is activated and it is connected to a WiFi network)
./carrier contain "0" (when the interface is activated and it is not connected to a WiFi network)
The first option is not taken care of in @Jesse's answer. The sh command, stripped out, is:
# Note: $0 == $interface
cat "$0"/carrier 2>&1
cat is being used to check the contents of carrier and redirect all output to standard output even when it fails because the file is not readable.
If grep -q finds "1" among those files it means there is at least 1 interface connected. The exit code of grep -q will be the final exit code.
Usage
For example, using this command's exit status, you can use it to start gnubiff in your ~/.xprofile only if you have an internet connection.
online() {
    find /sys/class/net/ -maxdepth 1 -mindepth 1 ! -name "*lo*" -exec sh -c 'cat "$0"/carrier 2>&1' {} \; | grep -q '1'
}
online && gnubiff --systemtray --noconfigure &
Reference
Help testing special file in /sys/class/net/
find -exec a shell function?
Shortest way: fping 4.2.2.1 => "4.2.2.1 is alive"
I prefer this as it's faster and has less verbose output than ping; the downside is that you will have to install it.
You can use any public DNS server rather than a specific website.
fping -q google.com && echo "do something because you're connected!"
-q returns an exit code, so I'm just showing an example of running something when you're online.
To install on a Mac: brew install fping; on Ubuntu: sudo apt-get install fping.
Ping was designed to do exactly what you're looking to do. However, if the site blocks ICMP echo, you can always telnet to port 80 of some site, or use wget or curl.
Checking Google's index page is another way to do it:
#!/bin/bash
WGET="/usr/bin/wget"
$WGET -q --tries=20 --timeout=10 http://www.google.com -O /tmp/google.idx &> /dev/null
if [ ! -s /tmp/google.idx ]; then
    echo "Not Connected..!"
else
    echo "Connected..!"
fi
For the fastest result, ping a DNS server:
ping -c1 "8.8.8.8" &> /dev/null
if [[ "${?}" -ne 0 ]]; then
    echo "offline"
elif [[ "${#args[@]}" -eq 0 ]]; then
    echo "online"
fi
Available as a standalone command: linkStatus
Pong doesn't mean the web service on the server is running; it merely means that the server is replying to ICMP echo.
I would recommend using curl and check its return value.
If your goal is to actually check for Internet access, many of the existing answers to this question are flawed. A few things you should be aware of:
It's possible for your computer to be connected to a network without that network having internet access
It's possible for a server to be down without the entire internet being inaccessible
It's possible for a captive portal to return an HTTP response for an arbitrary URL even if you don't have internet access
With that in mind, I believe the best strategy is to contact several sites over an HTTPS connection and return true if any of those sites responds.
For example:
connected_to_internet() {
    test_urls="\
        https://www.google.com/ \
        https://www.microsoft.com/ \
        https://www.cloudflare.com/ \
    "

    processes="0"
    pids=""

    for test_url in $test_urls; do
        curl --silent --head "$test_url" > /dev/null &
        pids="$pids $!"
        processes=$(($processes + 1))
    done

    while [ $processes -gt 0 ]; do
        for pid in $pids; do
            if ! ps | grep "^[[:blank:]]*$pid[[:blank:]]" > /dev/null; then
                # Process no longer running
                processes=$(($processes - 1))
                pids=$(echo "$pids" | sed --regexp-extended "s/(^| )$pid($| )/ /g")

                if wait $pid; then
                    # Success! We have a connection to at least one public site, so the
                    # internet is up. Ignore other exit statuses.
                    kill -TERM $pids > /dev/null 2>&1 || true
                    wait $pids
                    return 0
                fi
            fi
        done
        # wait -n $pids # Better than sleep, but not supported on all systems
        sleep 0.1
    done

    return 1
}
Usage:
if connected_to_internet; then
    echo "Connected to internet"
else
    echo "No internet connection"
fi
Some notes about this approach:
It is robust against all the false positives and negatives I outlined above
The requests all happen in parallel to maximize speed
It will return false if you technically have internet access but DNS is non-functional or your network settings are otherwise messed up, which I think is a reasonable thing to do in most cases
If you want to handle captive portals, you can use this one-liner:
if [[ $(curl -s -D - http://www.gstatic.com/generate_204 2>/dev/null | head -1 | cut -d' ' -f 2) == "204" ]]; then
    echo 'online'
else
    echo 'offline'
fi
Or if you want something more readable that can differentiate captive portals from lack of signal:
function is_online() {
    # Test signal
    local response
    response=$(curl --silent --dump-header - http://www.gstatic.com/generate_204 2> /dev/null)
    if (($? != 0)); then return 2; fi

    # Test captive portal
    local status=$(echo "$response" | head -1 | cut -d' ' -f 2)
    ((status == "204"))
}
is_online && echo online || echo offline
