A generic script for checking the health of an application - shell

I have written a shell script (Health_app.sh) that checks the health of the application. It takes the names of the processes from the App_Details file, checks for a PID (i.e. whether each process is running), and if a process is not running it greps for that process in the logs (field 3) and sends an email to the address mentioned in the App_Details file (field 4).
App_Details has records like:
process_Name|Process_description|logfile_path|email
abcd|main process to invoke the dataready|/123/456/log|vikas#yahoo.com
pqrs|2nd process..........................|/123/456/log|vikas#yahoo.com
Here is what my script looks like:
export App_Details=/home/123/sanity/App_Details
while read line
do
    export procname=$(echo $line | cut -d " " -f1)
    export PROCDES=$(echo $line | cut -d " " -f2)
    #if ps -ef | grep [`echo $procname|awk '{print substr($0,1,1)}'`] [`echo $procname|awk '{print substr($0,2,length($0))}'`] > /dev/null
    if ps -ef | grep -q [`echo $procname|awk '{print substr($0,1,1)}'`] `echo $procname|awk '{print substr($0,2,length($0))}'`
    then
        export part1=[`echo $procname|awk '{print substr($0,1,1)}'`]
        export part2=`echo $procname|awk '{print substr($0,2,length($0))}'`
        export PROCID=`ps -ef | grep $part1$part2 | awk -F ' ' '{print $2}'`
    else
        export PROCID="OFFLINE"
        trace_path=$(echo $line | cut -d " " -f3)
        export mail=$(echo $line | cut -d " " -f4)
        file_name=`ls -rt $trace_path/$procname*.trc 2>/dev/null | tail -1`
        #export PROCDES=$(echo `tail -10 $file_name`)
        (echo `tail -10 $file_name`) >> send.txt
        mailx -s "Please find the alerts for your application OFFLINE services" vikas#domain.com < send.txt
    fi
    echo $PROCID | awk '{ printf("%-20s", $0)}'
    echo $procname | awk '{ printf("%-20s", $0)}'
    echo $PROCDES | awk '{ printf("%-20s\n", $0)}'
done < $App_Details
Now the issue is that grep -q is not supported by the default grep on Solaris, so the script does not work on the Solaris server.
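Since the script only needs grep's exit status, one workaround is to drop -q and discard the output instead; Solaris also ships an XPG4-compliant grep that does understand -q. A minimal sketch of just the process test, reusing the [x]yz trick from the script above so grep does not match itself:
part1=$(echo $procname | awk '{print substr($0,1,1)}')
part2=$(echo $procname | awk '{print substr($0,2)}')
# Option 1: test the exit status and send the output to /dev/null (works with any grep)
if ps -ef | grep "[$part1]$part2" > /dev/null 2>&1
then
    PROCID=$(ps -ef | grep "[$part1]$part2" | awk '{print $2}')
else
    PROCID="OFFLINE"
fi
# Option 2: call the XPG4 grep explicitly, which supports -q, e.g.
# if ps -ef | /usr/xpg4/bin/grep -q "[$part1]$part2"; then ... fi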

Related

How to parse a variable in while loop multiple times

Here is my shell script, where 'i' reads lines from key.txt. The key file consists of around 8 properties and their values, separated by a space, as shown below.
Key.txt file:
bat slservice
solr slservice
tvs kimservice
ACM kimservice
product kimservice
tax kimservice
tvs kimservice
SNB taxservice
Shell script:
#!/bin/bash
while read i
do
    key=$(echo $i | awk '{print $1}')
    service=`echo $i | awk '{print $2}'`
    ip=`cat IP.txt | grep $service | awk '{print $2}' | awk 'NR==1{print $1}'`
    echo "VALIDATING $service properties"
    echo "validating $key in $service IP = $ip"
    curl -X GET -H "Content-type: application/json" -H "Accept: application/json" http://${ip}/v1/config 2>/dev/null > $service.json
    v1_value=$(jq ".\"$key\"" "$service.json" | grep -oE "\b([0-9]{1,3}\.){3}[0-9]{1,3}\b")
    echo "$key $v1_value" > file.txt
    actual_value=$(cat file.txt | grep "$key" | awk '{print $2}')
    LB_name=$(cat mapping.txt | grep "$key" | awk '{print $2}')
    LB_ip=$(cat LB.txt | grep "${LB_name}" | awk '{print $2}')
    if [ "${actual_value}" == "${LB_ip}" ]; then
        echo "$key of $service value is matching $v1_value = $LB_ip"
    else
        echo "$key of $service v1/value ${actual_value} is not matching GCP LB ${LB_ip}"
    fi
done < key.txt
The above script works fine and gives the exact output if we use mapping.txt alone as the input file for the LB_name variable.
But in my current situation I have 4 mapping files (mapping1.txt to mapping4.txt), and each mapping file should be used for one full run over all the properties in the key file, followed by files 2, 3 and 4 in turn.
Here is the exact line that we are talking about:
LB_name=$(cat mapping.txt | grep "$key" | awk '{print $2}')
Waiting for suggestions!
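One straightforward approach, shown here as a minimal sketch (it assumes the files are literally named mapping1.txt through mapping4.txt), is to wrap the existing while loop in an outer for loop over the mapping files and look $key up in the current file:
#!/bin/bash
# One full pass over key.txt per mapping file
for mapping in mapping1.txt mapping2.txt mapping3.txt mapping4.txt
do
    echo "=== Using $mapping ==="
    while read -r key service
    do
        LB_name=$(grep "$key" "$mapping" | awk '{print $2}')
        echo "$key -> $LB_name (from $mapping)"
        # ... rest of the per-key validation from the script above goes here ...
    done < key.txt
done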

unexpected output in docker container

I have the script below to get values from a config file:
#!/bin/bash
while read -u10 line
do
    if echo $line | grep -v -e "^#" -e "^$" | grep -F "=" &>/dev/null
    then
        varname=$(echo "$line" | cut -f1 -d '=')
        export $varname=$(echo "$line" | cut -f2 -d '=')
    fi
done 10< ./config
echo $a
My config file looks like this:
a="b"
On my host it prints a as b, but in the container it prints a as "b".
Does anyone know why?
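One thing worth noting: quote removal is not applied to the result of a command substitution, so the double quotes in the config file become literal characters in the exported value. A minimal sketch (an assumption about the intended result, not a diagnosis of the host/container difference) that strips one surrounding pair of quotes so the behaviour is the same everywhere:
#!/bin/bash
while read -u10 line
do
    if echo "$line" | grep -v -e "^#" -e "^$" | grep -F "=" > /dev/null 2>&1
    then
        varname=$(echo "$line" | cut -f1 -d '=')
        varvalue=$(echo "$line" | cut -f2- -d '=')
        # The quotes from the file are literal characters here; strip one
        # surrounding pair so a="b" exports a with the value b.
        varvalue=${varvalue%\"}
        varvalue=${varvalue#\"}
        export "$varname=$varvalue"
    fi
done 10< ./config
echo "$a"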

Why does my awk redirection not work?

I'm trying to redirect my output to replace the contents of my file, but if I do this it doesn't change my file at all:
#!/bin/bash
ssh_config_path="$HOME/.ssh/config"
temp_ssh_config_path="$HOME/.ssh/config_temporary"
new_primary_username=$1
curr_primary_username=`awk '/^Host github\.com$/,/#Username/{print $2}' $ssh_config_path | tail -1`
new_user_name=`awk "/^Host github-$new_primary_username$/,/#Name/{print $2}" $ssh_config_path | tail -1 | sed 's/#Name //' | sed -e 's/^[[:space:]]*//' -e 's/[[:space:]]*$//'`
new_user_email=`awk "/^Host github-$new_primary_username$/,/#Email/{print $2}" $ssh_config_path | tail -1 | sed 's/#Email //' | sed -e 's/^[[:space:]]*//' -e 's/[[:space:]]*$//'`
echo "Switching from $curr_primary_username to $new_primary_username"
echo "Setting name to $new_user_name"
echo "Setting email to $new_user_email"
awk "
!x{x=sub(/github-$new_primary_username/,\"github.com\")}
!y{y=sub(/github\.com/,\"github-$curr_primary_username\")}
1" $ssh_config_path > temp_ssh_config_path && mv temp_ssh_config_path ssh_config_path
but if I do this I get the correct output on my terminal screen
#!/bin/bash
ssh_config_path="$HOME/.ssh/config"
temp_ssh_config_path="$HOME/.ssh/config_temporary"
new_primary_username=$1
curr_primary_username=`awk '/^Host github\.com$/,/#Username/{print $2}' $ssh_config_path | tail -1`
new_user_name=`awk "/^Host github-$new_primary_username$/,/#Name/{print $2}" $ssh_config_path | tail -1 | sed 's/#Name //' | sed -e 's/^[[:space:]]*//' -e 's/[[:space:]]*$//'`
new_user_email=`awk "/^Host github-$new_primary_username$/,/#Email/{print $2}" $ssh_config_path | tail -1 | sed 's/#Email //' | sed -e 's/^[[:space:]]*//' -e 's/[[:space:]]*$//'`
echo "Switching from $curr_primary_username to $new_primary_username"
echo "Setting name to $new_user_name"
echo "Setting email to $new_user_email"
awk "
!x{x=sub(/github-$new_primary_username/,\"github.com\")}
!y{y=sub(/github\.com/,\"github-$curr_primary_username\")}
1" $ssh_config_path
It's disappointing how far you've veered from the answers you were given, but in any case here's the correct syntax for your script (untested, since you didn't provide any sample input/output):
#!/bin/bash
ssh_config_path="$HOME/.ssh/config"
temp_ssh_config_path="$HOME/.ssh/config_temporary"
new_primary_username="$1"
curr_primary_username=$(awk 'f&&/#Username/{print $2; exit} /^Host github\.com$/{f=1}' "$ssh_config_path")
new_user_name=$(awk -v npu="$new_primary_username" 'f&&/#Name/{print $2; exit} $0~"^Host github-"npu"$"{f=1}' "$ssh_config_path")
new_user_email=$(awk -v npu="$new_primary_username" 'f&&/#Email/{print $2; exit} $0~"^Host github-"npu"$"{f=1}' "$ssh_config_path")
echo "Switching from $curr_primary_username to $new_primary_username"
echo "Setting name to $new_user_name"
echo "Setting email to $new_user_email"
awk -v npu="$new_primary_username" -v cpu="$curr_primary_username" '
!x{x=sub("github-"npu,"github.com")}
!y{y=sub(/github\.com/,"github-"cpu)}
1' "$ssh_config_path" > temp_ssh_config_path && mv temp_ssh_config_path "$ssh_config_path"
By doing that I noticed that your last statement was:
mv temp_ssh_config_path ssh_config_path
when you probably meant:
mv temp_ssh_config_path "$ssh_config_path"
and that would have caused a problem with your expected output file being empty.
The whole thing should, of course, have been written as just 1 simple awk script.
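For illustration, here is a rough sketch of that single-awk idea (untested, and it assumes the same #Username comment convention as above): read the config twice, once to pick up the current username and once to swap the two Host aliases, guarding against substituting the same line twice:
#!/bin/bash
ssh_config_path="$HOME/.ssh/config"
new_primary_username="$1"
awk -v npu="$new_primary_username" '
    NR == FNR {                             # pass 1: find the current primary username
        if (f && /#Username/) { cpu = $2; f = 0 }
        if (/^Host github\.com$/) { f = 1 }
        next
    }
    {                                       # pass 2: swap the aliases, one substitution each
        if (!x && sub("github-" npu, "github.com")) { x = 1 }
        else if (!y && sub(/github\.com/, "github-" cpu)) { y = 1 }
        print
    }
' "$ssh_config_path" "$ssh_config_path" > "$HOME/.ssh/config_temporary" &&
mv "$HOME/.ssh/config_temporary" "$ssh_config_path"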

how to read words from a file using shell script

I have a file /ws/$1-rcd/temp.txt which has only one line, as follows:
198|/vob/ccm_tpl/repository/open_source/commons_collections/3_2_2/...
I have a script to get the values repository/open_source/commons_collections and 3_2_2 by reading the file and looping through it with a for loop.
My code is as follows:
grep -n "$4" /ws/$1-rcd/raw-vobs-config-spec | cut -d " " -f1,2 | sed -e 's/\:element/|/g' | sed -e 's/ //g' > /ws/$1-rcd/temp.txt
for i in `cat /ws/$1-rcd/temp.txt`
do
line=`echo $i | cut -d"|" -f1`
path=`echo $i | cut -d"|" -f2`
whoami
directory_temp=`echo $path | awk -F "/" '{ print $(NF-2)}'`
if [ "$directory_temp" == "$4" ]
then
OLD_VERSION=`sed -n "${line}p" /ws/$1-rcd/raw-vobs-config-spec | cut -d " " -f2 | awk -F "/" '{ print $(NF-1)}'`
total_fields=`sed -n "${line}p" /ws/$1-rcd/raw-vobs-config-spec | cut -d " " -f2 | awk -F "/" '{ print NF }'`
dir_path=`expr ${total_fields} - 2`
loc=`sed -n "${line}p" /ws/$1-rcd/raw-vobs-config-spec | cut -d " " -f2 | cut -d"/" -f1-"${dir_path}"`
location=`echo $loc | cut -d"/" -f4,5,6`
fi
done
But when I run this code it gives me an error:
-bash: line 45: syntax error near unexpected token `|'
-bash: line 45: `for i in 198|/vob/ccm_tpl/repository/open_source/commons_collections/3_2_2/...'
Can anyone please suggest what I am doing wrong?
If you want to iterate through each line of a file, use a while loop like the one below:
while read -r line; do
    echo $line
done < file.txt
So your code can be rewritten as:
grep -n "$4" /ws/$1-rcd/raw-vobs-config-spec | cut -d " " -f1,2 | sed -e 's/\:element/|/g' | sed -e 's/ //g' > /ws/$1-rcd/temp.txt
while read i ; do
line=`echo $i | cut -d"|" -f1`
path=`echo $i | cut -d"|" -f2`
whoami
directory_temp=`echo $path | awk -F "/" '{ print $(NF-2)}'`
if [ "$directory_temp" == "$4" ]
then
OLD_VERSION=`sed -n "${line}p" /ws/$1-rcd/raw-vobs-config-spec | cut -d " " -f2 | awk -F "/" '{ print $(NF-1)}'`
total_fields=`sed -n "${line}p" /ws/$1-rcd/raw-vobs-config-spec | cut -d " " -f2 | awk -F "/" '{ print NF }'`
dir_path=`expr ${total_fields} - 2`
loc=`sed -n "${line}p" /ws/$1-rcd/raw-vobs-config-spec | cut -d " " -f2 | cut -d"/" -f1-"${dir_path}"`
location=`echo $loc | cut -d"/" -f4,5,6`
fi
done < /ws/$1-rcd/temp.txt
You may be better served relying on parameter expansion and substring removal. For example:
#!/bin/sh
a=$(<dat/lline.txt)        ## read file into a
a=${a##*ccm_tpl/}          ## remove from left to ccm_tpl/
num=${a##*collections/}    ## remove from left to collections/
num=${num%%/*}             ## remove from right to /
a=${a%%${num}*}            ## remove from right to $num
echo "a : $a"              ## print the values shown in the output below
echo "num : $num"
Input File
$ cat dat/lline.txt
198|/vob/ccm_tpl/repository/open_source/commons_collections/3_2_2/..
Output
$ sh getvals.sh
a : repository/open_source/commons_collections/
num : 3_2_2
If you need to trim in some other way, just let me know and I'm happy to help further.
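For completeness, since temp.txt is pipe-delimited, you can also let read split the record directly by setting IFS, avoiding the cut calls that split on '|'. A minimal sketch against the sample record above (not a drop-in replacement for the full script):
#!/bin/bash
# Split each record of temp.txt on '|' into the line number and the path
while IFS='|' read -r line path
do
    location=$(echo "$path" | cut -d"/" -f4,5,6)                  # repository/open_source/commons_collections
    OLD_VERSION=$(echo "$path" | awk -F "/" '{print $(NF-1)}')    # 3_2_2
    echo "line=$line location=$location version=$OLD_VERSION"
done < /ws/$1-rcd/temp.txt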

BASH better way to monitor files

I've made a Bash script to monitor some server log files for certain data, and my method probably isn't the most efficient.
One section that specifically bugs me is that I have to write a newline to the monitored log so that the same line won't be read over and over.
Feedback would be greatly appreciated!
#!/bin/bash
serverlog=/home/skay/NewWorld/server.log
onlinefile=/home/skay/website/log/online.log
offlinefile=/home/skay/website/log/offline.log
index=0
# Creating the file
if [ ! -f "$onlinefile" ]; then
    touch $onlinefile
    echo "Name Date Time" >> "$onlinefile"
fi
if [ ! -f "$offlinefile" ]; then
    touch $offlinefile
    echo "Name Date Time" >> "$offlinefile"
fi
# Functions
function readfile {
    # Login Variables
    loginplayer=`tail -1 $serverlog | grep "[INFO]" | grep "joined the game" | awk '{print $4}'`
    logintime=`tail -1 $serverlog | grep "[INFO]" | grep "joined the game" | awk '{print $2}'`
    logindate=`tail -1 $serverlog | grep "[INFO]" | grep "joined the game" | awk '{print $1}'`
    # Logout Variables
    logoutplayer=`tail -1 $serverlog | grep "[INFO]" | grep "left the game" | awk '{print $4}'`
    logouttime=`tail -1 $serverlog | grep "[INFO]" | grep "left the game" | awk '{print $2}'`
    logoutdate=`tail -1 $serverlog | grep "[INFO]" | grep "left the game" | awk '{print $1}'`
    # Check for Player Login
    if [ ! -z "$loginplayer" ]; then
        echo "$loginplayer $logindate $logintime" >> "$onlinefile"
        echo "Player $loginplayer login detected" >> "$serverlog"
        line=`grep -rne "$loginplayer" $offlinefile | cut -d':' -f1`
        if [ "$line" > 1 ]; then
            sed -i "$line"d $offlinefile
            unset loginplayer
            unset line
        fi
    fi
    # Check for Player Logout
    if [ ! -z "$logoutplayer" ]; then
        echo "$logoutplayer $logoutdate $logouttime" >> "$offlinefile"
        echo "Player $loginplayer logout detected" >> "$serverlog"
        line=`grep -rne "$logoutplayer" $onlinefile | cut -d':' -f1`
        if [ "$line" > 1 ]; then
            sed -i "$line"d $onlinefile
            unset logoutplayer
            unset line
        fi
    fi
}
# Loop
while [ $index -lt 100 ]; do
    readfile
done
Thanks!
Instead of using multiple
tail -n 1 file
try the following construct:
tail -f file | while read line; do
    echo "read: $line"
done
It will be much more reliable... and it won't read the same line twice ;)
Note: by spawning new grep/awk/etc. processes for every check you are burning away processes... it's not that it is critical, but process creation is usually expensive; if new lines occur only rarely it's perfectly fine.
Where I wanted to get to is: if you are interested, take a look at bash's built-in string manipulation, ${x/aa}, ${x//aa} and friends, or try to use extended regexes with grep.
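Putting those suggestions together, here is a minimal sketch of the monitor built around tail -f and bash pattern matching instead of repeated tail/grep/awk calls. The field layout (date, time, [INFO], player, message) and the file paths are assumptions taken from the script in the question:
#!/bin/bash
serverlog=/home/skay/NewWorld/server.log
onlinefile=/home/skay/website/log/online.log
offlinefile=/home/skay/website/log/offline.log
# Follow the log; each new line is handed to the loop exactly once,
# so nothing has to be written back into the monitored file.
tail -n 0 -f "$serverlog" | while read -r logdate logtime level player rest
do
    case "$rest" in
        *"joined the game"*)
            echo "$player $logdate $logtime" >> "$onlinefile"
            sed -i "/^$player /d" "$offlinefile"   # drop the player from the offline list
            ;;
        *"left the game"*)
            echo "$player $logdate $logtime" >> "$offlinefile"
            sed -i "/^$player /d" "$onlinefile"    # drop the player from the online list
            ;;
    esac
done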
