I need a script that checks the URL '$site' twice before reporting it as inaccessible.
The script checks the URL, and if it is not reachable, checks it again after, for example, 10 seconds; if the URL is still not reachable, it echoes "no access".
I tried this:
if wget --spider -S "$site" 2>&1 | grep -w "200\|301\|302" > /dev/null ; then
    :    # reachable, nothing to do
else
    echo "no access"
fi
I ended up with the script below, because I couldn't find any shorter way:
#!/bin/bash
TOKEN='000000000'
CHAT_ID=(000000000 111111111 22222222)
URL="https://api.telegram.org/bot$TOKEN/sendMessage"
site='https://site.ru'

if wget --spider -S "$site" 2>&1 | grep -w "200\|301\|302" > /dev/null ; then
    echo "Site is up 1: $(date -d 'now + 7 hours' +'%Y-%m-%d_%H:%M')"
else
    sleep 5
    if wget --spider -S "$site" 2>&1 | grep -w "200\|301\|302" > /dev/null ; then
        echo "Site is up 2: $(date -d 'now + 7 hours' +'%Y-%m-%d_%H:%M')"
    else
        for chat in "${CHAT_ID[@]}"
        do
            # Message text: "The service is temporarily unavailable"
            curl -s -X POST "$URL" -d chat_id="$chat" -d text="Сервис временно не доступен"
            echo "Site is down twice: $(date -d 'now + 7 hours' +'%Y-%m-%d_%H:%M')"
        done
    fi
fi
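A shorter pattern is to wrap the check in a function and loop over the attempts. This is a minimal sketch, assuming the same wget/grep test; the retry count and the 10-second delay are illustrative:
#!/bin/bash
site='https://site.ru'
# Returns 0 if the site answers with 200/301/302.
is_up() {
    wget --spider -S "$site" 2>&1 | grep -qw "200\|301\|302"
}
for attempt in 1 2; do
    if is_up; then
        echo "Site is up (attempt $attempt): $(date +'%Y-%m-%d_%H:%M')"
        exit 0
    fi
    [ "$attempt" -lt 2 ] && sleep 10
done
echo "no access"
The Telegram notification loop from the script above can take the place of the final echo.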
So I have a script here that takes TCPDUMP output. We are trying to send two variables to a PHP script over the web ($SERVER). The file header is created and contains both variables: $FILETOSEND, which is the filename, and filedata. The actual data for the filedata variable comes from a file called 1 (the data is formatted as you can tell). I am having issues with the section marked # send common 10 secs dump.
I am trying to CURL the file 1, and I am doing so with curl --data "$(cat 1)" $SERVER.
The script isn't sending file 1 at all; mostly it just sends the filename and no file data. Is there a problem with the way I am sending the file? Is there a better way to format it?
while true; do
    sleep "$DATASENDFREQ"
    killall -9 tcpdump &> /dev/null
    if [ -e "$DUMP" ]; then
        mv "$DUMP" "$DUMP_READY"
    fi
    create_dump
    DATE=$(date +"%Y-%m-%d_%H-%M-%S")
    FILETOSEND=$MAC-$DATE-$VERSION
    # write the file header: 2 vars, filename and filedata.
    FILEHEADER="filename=$FILETOSEND&filedata="
    echo "$FILEHEADER" > 2
    # change all colons to underscores to avoid Windows filename issues
    sed -i 's/:/_/g' 2
    # delete all newlines \n in the file
    tr -d '\n' < 2 > 1
    # parse $DUMP_READY to awk.txt (no header in awk.txt)
    awk '{ if (NF > 18 && $10 == "signal") {print "{\"mac\": \""$16"\",\"sig\": \""$9"\",\"ver\": \""$8"\",\"ts\": \""$1"\",\"ssid\": \""$19"\"}" }}' "$DUMP_READY" > awk.txt
    sed -i 's/SA://g' awk.txt
    sed -i 's/&/%26/g' awk.txt
    cat awk.txt >> 1
    sync
    # send $OFFLINE
    if [ -e "$OFFLINE" ]; then
        # @file tells curl to read the POST body from the file itself
        if curl -d "@$OFFLINE" "$SERVER"; then
            echo "status:dump sent;msg:offline dump sent"
            rm "$OFFLINE"
        else
            echo "status:dump not sent;msg:offline dump not sent"
        fi
    fi
    # send common 10 secs dump
    if curl --data "$(cat 1)" "$SERVER"; then
        echo "status:dump sent"
    else
        cat 1 >> "$OFFLINE"
        echo "status:dump not sent"
    fi
    if [ -e "$DUMP_READY" ]; then
        rm -f "$DUMP_READY" 1 2 upload_file*
    fi
done
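One likely culprit: --data "$(cat 1)" pushes the payload through the shell, where command substitution strips trailing newlines and any & or = that survived the sed pass corrupts the form encoding. A sketch of a more robust send, assuming the awk output stays in its own file and the filename field is sent separately (curl's --data-urlencode does the escaping):
# ${FILETOSEND//:/_} swaps the MAC's colons for underscores in the shell,
# replacing the sed pass over the header file.
# --data-urlencode "filedata@awk.txt" makes curl read awk.txt and send it
# percent-encoded as the form field "filedata"; manual %26 escaping is unneeded.
curl --data "filename=${FILETOSEND//:/_}" \
     --data-urlencode "filedata@awk.txt" \
     "$SERVER"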
I'm trying to create a bash script that checks the status code of each URL in a list and echoes the server name from the response header. I'm quite new to this.
#!/bin/bash
while read LINE; do
    curl -o /dev/null --silent --head --write-out '%{http_code}' "$LINE"
    echo " $LINE" &
    curl -I /dev/null --silent --head | grep -Fi Server "$SERVER"
    echo " $SERVER"
done < dominios-https
I get the following output
301 http://example.com
grep: : No such file or directory
1) while read LINE misses the last line if the text file doesn't end with a newline.
2) You don't set "$SERVER" anywhere, and grep complains about it.
3) Not all servers return a "Server:" header.
Try this:
scriptDir=$( dirname -- "$0" )
for siteUrl in $( < "$scriptDir/myUrl.txt" )
do
    if [[ -z "$siteUrl" ]]; then continue; fi    # skip empty entries
    httpCode=$( curl -o /dev/null --silent --head --write-out '%{http_code}' "$siteUrl" )
    echo "HTTP_CODE = $httpCode"
    headServer=$( curl --silent --head "$siteUrl" | grep "Server" | awk '{print $2}' )
    echo "Server header = $headServer"
done
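As a variation, both values can come from a single request per URL. A sketch, assuming curl's --write-out marker lands on stdout after the headers; the || [ -n "$siteUrl" ] guard also covers the missing-final-newline case from point 1:
while IFS= read -r siteUrl || [ -n "$siteUrl" ]; do
    [ -z "$siteUrl" ] && continue
    # --head prints the response headers; --write-out appends a marker line;
    # tr strips the \r that HTTP header lines carry.
    headers=$( curl --silent --head --write-out 'HTTP_CODE=%{http_code}\n' "$siteUrl" | tr -d '\r' )
    httpCode=$( printf '%s\n' "$headers" | sed -n 's/^HTTP_CODE=//p' )
    server=$( printf '%s\n' "$headers" | grep -i '^Server:' | awk '{print $2}' )
    echo "$httpCode ${server:-<none>} $siteUrl"
done < dominios-https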
I am learning to customize wget in a bash function and having trouble. I would like to display Downloading (file): % instead of wget's messy default output. The function below seems close, but I am having trouble calling it for my specific needs.
For example, my standard wget is:
cd 'C:\Users\cmccabe\Desktop\wget'
wget -O getCSV.txt http://xxx.xx.xxx.xxx/data/getCSV.csv
and that downloads the .csv as a .txt into the specified directory, along with all of wget's messy output.
This function seems like it will do more or less what I need, but I cannot seem to get it to work correctly with my data. Below is what I have tried. Thank you :).
#!/bin/bash
download() {
    local url=$1       # e.g. http://xxx.xx.xxx.xxx/data/getCSV.csv
    local destin=$2    # e.g. getCSV.txt under C:\Users\cmccabe\Desktop\wget
    echo -n "    "
    if [ "$destin" ]; then
        wget --progress=dot "$url" -O "$destin" 2>&1 | grep --line-buffered "%" | \
            sed -u -e "s,\.,,g" | awk '{printf("\b\b\b\b%4s", $2)}'
    else
        wget --progress=dot "$url" 2>&1 | grep --line-buffered "%" | \
            sed -u -e "s,\.,,g" | awk '{printf("\b\b\b\b%4s", $2)}'
    fi
    echo -ne "\b\b\b\b"
    echo " DONE"
}
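With the arguments restored, the function takes the URL first and the destination second. An example call, reusing the URL and path from the question:
cd 'C:\Users\cmccabe\Desktop\wget'
download "http://xxx.xx.xxx.xxx/data/getCSV.csv" "getCSV.txt"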
EDITED CODE
#!/bin/bash
download () {
    url=http://xxx.xx.xxx.xxx/data/getCSV.csv
    destin='C:\Users\cmccabe\Desktop\wget\getCSV.txt'    # -O needs a file path, not a directory
    echo -n "    "
    if [ "$destin" ]; then
        wget --progress=dot "$url" -O "$destin" 2>&1 | grep --line-buffered "%" | \
            sed -u -e "s,\.,,g" | awk '{printf("\b\b\b\b%4s", $2)}'
    else
        wget --progress=dot "$url" 2>&1 | grep --line-buffered "%" | \
            sed -u -e "s,\.,,g" | awk '{printf("\b\b\b\b%4s", $2)}'
    fi
    echo -ne "\b\b\b\b"
    echo " DONE"
    menu
}
menu() {
    while true
    do
        printf "\n Welcome to NGS menu (v1), please make a selection from the MENU \n
==================================\n\n
\t 1 Patient QC\n
==================================\n\n"
        printf "\t Your choice: "; read menu_choice
        case "$menu_choice" in
            1) patient ;;    # patient() is defined elsewhere in the full script
            *) printf "\n Invalid choice."; sleep 2 ;;
        esac
    done
}
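If the installed wget is GNU wget 1.16 or newer, there is a simpler route than parsing the dot output; this is an alternative approach, not a fix for the function above:
# -q silences wget's normal chatter; --show-progress re-enables
# just the progress bar (GNU wget >= 1.16).
wget -q --show-progress -O getCSV.txt "http://xxx.xx.xxx.xxx/data/getCSV.csv"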
I have the following code:
#!/bin/bash
#Desc: Images downloader
#Filename: img_downloader.sh
if [ $# -ne 3 ];
then
    echo "Usage: $0 URL -d DIRECTORY"
    exit 1
fi
for i in {1..4}
do
    case $1 in
        -d) shift; directory=$1; shift ;;
        *) url=${url:-$1}; shift ;;
    esac
done
mkdir -p "$directory"
baseurl=$(echo "$url" | egrep -o "https?://[a-z.]+")
echo Downloading "$url"
curl -s "$url" | egrep -o "<img src=[^>]*>" |
    sed 's/<img src=\"\([^"]*\).*/\1/g' > /tmp/$$.list
sed -i "s|^/|$baseurl/|" /tmp/$$.list
cd "$directory"
while read filename;
do
    echo Downloading "$filename"
    curl -s -O "$filename"
done < /tmp/$$.list
And it's run as:
gavish@gavish-HP-Mini:~/Desktop$ ./img_downloader.sh http://pngimg.com/upload/tree_PNG3498.png -d ff
Then the next thing that happens is:
Downloading http://upload.wikimedia.org/wikipedia/commons/a/a9/Being_a_twin_means_you_always_have_a_pillow_or_blanket_handy.jpg
But the problem is that the folder on the desktop remains empty even after the download completes, and I have no idea where the file was downloaded.
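A quick way to locate the downloads, offered as a debugging aid rather than a fix: trace the run and then list both the target directory and the temp list the script builds:
# -x echoes every command as it runs, so the cd and curl lines show
# exactly which directory the files land in.
bash -x ./img_downloader.sh http://pngimg.com/upload/tree_PNG3498.png -d ff
ls -l ff /tmp/*.list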
I'm trying to learn some scripting; however, I can't find a solution for one piece of functionality.
Basically, I would like to ask you to evaluate my script, as it's probably possible to reduce its complexity and line count.
The purpose of this script is to download random, encrypted MySQL backups from Amazon S3, restore the dump, and run some random MySQL queries.
I'm not sure how to email the output of the printf statements - one prints the headers and the other the actual data. I've tried to format the output so it looks like the sample below, but I had to exclude the headers from the loop:
Database: Table: Entries:
database1 random_table 0
database2 random_table 0
database3 random_table 0
database4 random_table 0
I would like to include this output in the email and also change the email subject based on the success/failure of the script.
I probably use too many if blocks, and the MySQL queries are probably too complicated.
Script:
#!/usr/bin/env bash
# DB Details:
db_user="user"
db_pass="password"
db_host="localhost"
# Date
date_stamp=$(date +%d%m%Y)
# Initial Setup
data_dir="/tmp/backup"
# Checks
if [ ! -e /usr/bin/s3cmd ]; then
    echo "Required package missing (http://s3tools.org/s3cmd)"
    exit 2
fi
if [ -e /usr/bin/gpg ]; then
    gpg_key=$(gpg -K | tr -d "{<,>}" | awk '/an@example.com/ { print $4 }')
    if [ "$gpg_key" != "an@example.com" ]; then
        echo "No GPG key"
        exit 2
    fi
else
    echo "No GPG package"
    exit 2
fi
if [ -d "$data_dir" ]; then
    rm -rf "$data_dir"/* && chmod 700 "$data_dir"
else
    mkdir "$data_dir" && chmod 700 "$data_dir"
fi
# S3 buckets
bucket_1=s3://test/
# Download backup
for backup in $(s3cmd ls "$bucket_1" | awk '{ print $2 }')
do
    latest=$(s3cmd ls "$backup" | awk '{ print $2 }' | sed -n '$p')
    random=$(s3cmd ls "$latest" | shuf | awk '{ print $4 }' | sed -n '1p')
    s3cmd get "$random" "$data_dir" >/dev/null 2>&1
done
# Decrypting Files
for file in $(ls -A $data_dir)
do
    filename=$(echo $file | sed 's/\.e//')
    gpg --out $data_dir/$filename --decrypt $data_dir/$file >/dev/null 2>&1 && rm -f $data_dir/$file
    if [ $? -eq 0 ]; then
        # Decompressing Files
        bzip2 -d $data_dir/$filename
        if [ $? -ne 0 ]; then
            echo "Decompression Failed!"
        fi
    else
        echo "Decryption Failed!"
        exit 2
    fi
done
# MySQL Restore
printf "%-40s%-30s%-30s\n\n" Database: Table: Entries:
for dump in $(ls -A $data_dir)
do
    mysql -h $db_host -u $db_user -p$db_pass < $data_dir/$dump
    if [ $? -eq 0 ]; then
        # Random DBs query
        db=$(echo $dump | sed 's/\.sql//')
        random_table=$(mysql -h $db_host -u $db_user -p$db_pass $db -e "SHOW TABLES" | grep -v 'Tables' | shuf | sed -n '1p')
        db_entries=$(mysql -h $db_host -u $db_user -p$db_pass $db -e "SELECT * FROM $random_table" | grep -v 'id' | wc -l)
        printf "%-40s%-30s%-30s\n" $db $random_table $db_entries
        mysql -h $db_host -u $db_user -p$db_pass -e "DROP DATABASE $db"
    else
        echo "The system was unable to restore backups!"
        rm -rf $data_dir
        exit 2
    fi
done
#Remove backups
rm -rf $data_dir
You'll get the best answers if you ask specific questions (rather than "please review my code")... and if you limit each post to a single question. Regarding emailing the output of your printf statements:
You can group statements into a block and then pipe the output of a block into another program. For example:
{
echo "This is a header"
echo
for x in {1..10}; do
echo "This is row $x"
done
} | mail -s "Here is my output" lars@example.com
If you want to make the email subject conditional upon the success or failure of something elsewhere in the script, you can (a) save your output to a file, and then (b) email the file after building the subject line:
{
echo "This is a header"
echo
for x in {1..10}; do
echo "This is row $x"
done
} > output
if is_success; then    # is_success stands in for your own success test
    subject="SUCCESS: Here is your output"
else
    subject="FAILURE: Here are your errors"
fi
mail -s "$subject" lars@example.com < output
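is_success above is a placeholder; one way to wire it up in a script like yours, sketched with illustrative names, is to record a flag as the critical steps run:
ok=true
# Flip the flag whenever a critical step fails.
mysql -h "$db_host" -u "$db_user" -p"$db_pass" < "$data_dir/$dump" || ok=false
# The mail step then branches on the accumulated result.
is_success() { "$ok"; }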