BASH script checking log files for current and previous month - bash

I have been working on this on and off for the last two months, and despite how many times I look at it, I can't make it work.
This script checks daily log files for a user-defined string, so the user doesn't have to look through every one manually. It worked great checking the current month, but if the user wants to check back 20 days and today is the 12th of this month, I wanted to be able to go back into the previous month (not look for a log file dated 20150399 and so on). I have checked the logic for my date/day computations, and it seems okay (if there is a better way to do that in BASH, I am open to suggestions). What happens when I try to run it is an "unexpected end of file" error. I am somewhat new to writing scripts that contain more than 20 or so lines, but I just can't come up with what I am missing.
I have tried various fixes, to no avail, but I think this is the last iteration.
Ideas?
#!/bin/bash
########################################################
# multi_log_chk.sh
# This script will take input from the user and report which
# CyberFusion MFT logs contain what the user is looking for.
# Hopefully this will save the user having to search through every
# stinking log file to find what they are looking for.
# 20150406 pxg007 started typing
# 20150413 pxg007 added && comparison for back out (line 28)
# added message for no entries found (line 32, 38, 48-52)
# Added some further description (line 16)
# 20150424 pxg007 Added logic to calculate previous month and if necessary, year. (Lines 16-24, 60-78 )
#
########################################################
currDate=`date +%d%B%C%y`
currDay=`date +%d`
currMnth=`date +%m`
currYear=`date +%C%y`
case $currMnth in #Let's establish number of days for previous month
05 | 07 | 10 | 12 ) lastMnthD=30;;
01 |02 | 04 | 06 | 09 | 08 | 11 ) lastMnthD=31;;
03 ) lastMnthD=28;; ##and screw leap year
esac
if [ $currMnth -eq 01 ]; then ##accounting for January
lastMnth=12
else
lastMnth=$((currMnth-1))
fi
if [ $lastMnth -eq 12 ]; then ## accounting for Dec of previous year
lastMnthYr=$((currYear-1))
else
lastMnthYr=$currYear
fi
echo "This script will find entries for your query in whatever available MFT logs you request."
echo " "
echo "For instance - how many log files have transfer entries with \"DOG\" in them?"
echo " "
echo "I also will also give an estimate of how many transfers per log file contain your query, give or take a couple."
echo " "
echo "This search is case sensitive, so \"DOG\" is *** NOT *** the same as \"dog\""
echo " "
read -p "What text you are looking for? Punctuation is okay, but no spaces please. " looking ### what we want to find
echo " "
echo "Today's date is: $currDate."
echo " "
read -p "How many days back do you want to search(up to 25)? " daysBack ### How far back we are going to look
if [ "$daysBack" == 0 ] && [ "$daysBack" >> 25 ]; then
echo "I said up to 25 days. We ain't got more than that!"
exit 1
fi
echo " "
echo "I am going to search through the last $daysBack days of log files for:\"$looking\" "
echo " "
read -p "Does this look right? Press N to quit, or any other key to continue: " affirm
if [ "$affirm" = N ] && [ "$affirm" = n ]; then ###Yes, anything other than "N" or "n" is a go
echo "Quitter!"
exit 1
else
nada=0 ### Used to test for finding anything
backDate=$((currDay-daysBack)) ### current month iterator (assuming query covers only current month)
if (("$daysBack" => "$currDay")); then ## If there are more logs requested than days in the month...
lastMnthCnt=$((daysBack-currDay)) ### how many days to check last month
lastMnthStrt=$((lastMnthD-lastMnthCnt)) ## last month start and iterator
backDate=$(currDay-(daysBack-lastMnthCnt)) # Setting the iterator if we have to go back a month
while (("$lastMnthStrt" <= "$lastMnthD" )); do
foundIt=$(grep "$looking" /CyberFusion/log/Log.txt."$lastMnthYr$lastMnth$lastMnthStrt" | parsecflog | wc -l )
howMany=$((foundIt/40+1)) ### Add one in case there are less than 40 lines in the record.
if (("$foundIt" > 0))
then
nada=$((nada+1))
echo "Log.txt.$lastMnthYr$lastMnth$lastMnthStrt contains $looking in approximately $howMany transfer records."
lastMnthStrt=$((lastMnthStrt+1))
echo " "
else
lastMnthStrt=$((lastMnthStrt+1))
fi
fi
backDate=$((currDay-daysBack)) ### current month iterator (assuming query covers only current month)
while (("$backDate" <= "$currDay")); do
foundIt=$(grep "$looking" /CyberFusion/log/Log.txt."$backDate" | parsecflog | wc -l )
howMany=$((foundIt/40+1)) ### Add one in case there are less than 40 lines in the record.
if (("$foundIt" > 0))
then
nada=$((nada+1))
echo "Log.txt.$backDate contains $looking in approximately $howMany transfer records."
backDate=$((backDate+1))
echo " "
else
backDate=$((backDate+1))
fi
if [ "$nada" \< 1 ]
then
echo " "
echo "I found no entries for $looking in any log file."
fi

You are missing a 'done' to close each of your two while loops (the previous-month loop and the current-month loop), and a final 'fi' on the last line to close the if that starts the search.
Also, as others have suggested, you can do
date -d "20 days ago" +"%d%B%C%y"
to easily get dates in the past.
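For example, with GNU date you can let date handle the month and year rollover instead of the case statement. A rough sketch, assuming GNU date, logs named Log.txt.YYYYMMDD, and the $looking/$daysBack values read earlier in the script:

read -p "How many days back do you want to search (up to 25)? " daysBack
for ((i=0; i<=daysBack; i++)); do
    stamp=$(date -d "$i days ago" +%Y%m%d)   # month and year roll over automatically
    logFile="/CyberFusion/log/Log.txt.$stamp"
    [ -f "$logFile" ] && echo "would search $logFile for \"$looking\""
done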

Related

Cannot understand why our function calls return twice?

We have a 15-year-old (or so) script we are trying to figure out and document. We have found some errors in it, but one specific log file gives us a headache, and I would love some help figuring it out.
First, the function in question:
#=========================================================#
# Define function removeOldBackupFile. #
#=========================================================#
removeOldBackupFile()
{
#set -x
echo "Removing old backups if they exists." >> "${report}"
local RCLOC=0
spaceBefore=$(getAvailableSpace ${backupDirectory})
timesToWait=60 # Wait a maximum of 10 minutes before bailing
cat ${oldDbContainer} | while read fileName
do
echo "Old file exists. Removing ${fileName}." >> "${report}"
removeFileIfExist "${fileName}"
RC=$?
echo "Resultcode for removing old backup is: RC=$RC." >> "${report}"
RCLOC=$(($RC+$RCLOC))
spaceAfter=$(getAvailableSpace ${backupDirectory})
# Wait for the OS to register that the file is removed
cnt=0
while [ $spaceAfter -le $spaceBefore ]; do
cnt=$((cnt+1))
if [ $cnt -gt $timesToWait ]; then
echo "Waited too long for space in ${backupDirectory}" | tee -a "${report}"
RCLOC=$(($RCLOC+1))
return $RCLOC
fi
sleep 10
spaceAfter=$(getAvailableSpace ${backupDirectory})
done
done
return $RCLOC
}
The place where this function is run looks as follows:
#=========================================================#
# Remove old backupfiles if any exist. #
#=========================================================#
removeOldBackupFile
RC=$?
RCSUM=$(($RC+$RCSUM))
We have identified that the if condition is a bit wrong and the while loops would not work as intended if there are multiple files.
But what bothers us is output from a log file:
...
+ cnt=61
+ '[' 61 -gt 60 ']'
+ echo 'Waited too long for space in /<redacted>/backup'
+ tee -a /tmp/maintenanceBackupMessage.70927
Waited too long for space in /<redacted>/backup
+ RCLOC=1
+ return 1
+ return 0
+ RC=0
+ RCSUM=0
...
As seen in the log output, after the inner loop has run 60 times and ended, it returns 1 as expected... BUT it also returns 0 afterwards!? Why is it also returning 0?
We are unable to figure out the double return... Any help appreciated.
The first return executes in the subshell started by the pipe cat ${oldDbContainer} | while .... The second return is from return $RCLOC at the end of the function. Get rid of the useless use of cat:
removeOldBackupFile()
{
#set -x
echo "Removing old backups if they exists." >> "${report}"
local RCLOC=0
spaceBefore=$(getAvailableSpace ${backupDirectory})
timesToWait=60 # Wait a maximum of 10 minutes before bailing
while read fileName
do
...
done < ${oldDbContainer}
return $RCLOC
}
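To see the difference in isolation, here is a small self-contained sketch (not part of the original script): a return inside a piped while loop only exits the subshell, while a loop fed by redirection returns from the function itself.

piped() {
    printf 'a\nb\n' | while read -r x; do
        return 7            # only exits the subshell created by the pipe
    done
    return 0                # still reached, so the caller sees 0
}

redirected() {
    while read -r x; do
        return 7            # no subshell, so the function really returns 7
    done < <(printf 'a\nb\n')
    return 0
}

piped;      echo "piped:      $?"    # prints 0
redirected; echo "redirected: $?"    # prints 7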

how to process last x lines of a file

I want to analyse a logfile for specific errors.
Therefore I want to be able to loop through the last x lines of the file, check every line against a specific regex pattern, and then set a specific return value.
In the success case, the logfile looks as follows at the moment when I want to check it.
….
sftp> get blahblah/blahblah
sftp> bye
In case of an error there is something between the two sftp lines.
What I already tried is to solve the problem with a specific regex, which worked fine in some online regex testers, but I couldn't get it to work in ksh.
My current approach is the following:
LOG_FIL="test_log"
MODE="${1}"
check_log_file() {
ERRNBR=${1}
REGEX=${2}
TAIL=${3}
RETURN="0"
echo "ERRNBR = ${ERRNBR}"
echo "REGEX = ${REGEX}"
echo "TAIL = ${TAIL}"
while read line; do
echo "${line}"
if [[ "${line}" =~ ${REGEX} ]]; then
RETURN="0"
echo "bin hier"
else
RETURN=${ERRNBR}
echo "bin wo anders"
break
fi
done <<<$(tail -${TAIL} ${LOG_FIL})
echo "${RETURN}"
return ${RETURN}
}
echo "sftp> get cwi/cdk_final*" >> ${LOG_FIL}
if [ "${MODE}" == "1" ]; then
echo "Werner ist der beste" >>${LOG_FIL}
fi
check_log_file "22" "^(sftp> ).*$" "1"
echo "$?"
echo "sftp> bye" >> ${LOG_FIL}
check_log_file "21" "((sftp> ).*|(sftp> bye))" "2"
echo "$?"
The results I get are the following:
edv> sh cdk_test4sftp.sh 1
ERRNBR = 22
REGEX = ^(sftp> ).*$
TAIL = 1
Werner ist der beste
bin wo anders
22
22
ERRNBR = 21
REGEX = ((sftp> ).*|(sftp> bye))
TAIL = 2
Werner ist der beste sftp> bye
bin hier
0
0
What I hoped to achieve was that the output coming from the tail command would be separated, so that I could test each line individually.
Your second regex:
((sftp> ).*|(sftp> bye))
Matches the following line, which is why your function returns 0:
Werner ist der beste sftp> bye
Since you want to match the following pattern on each line:
sftp> get blahblah/blahblah
sftp> bye
Your regex should look more like the first one you used to match:
^(sftp> ).*$
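As a side note on why both tail lines showed up joined into one line ("Werner ist der beste sftp> bye"): the loop is fed an unquoted $(tail ...) through the here-string, and the shell appears to re-join its output with spaces. Quoting the command substitution keeps the newlines, so each line is tested individually. A minimal sketch of the loop with only that change (same variables as in the question's function, ksh93/bash):

while read -r line; do
    echo "${line}"
    if [[ "${line}" =~ ${REGEX} ]]; then
        RETURN="0"
    else
        RETURN=${ERRNBR}
        break
    fi
done <<< "$(tail -"${TAIL}" "${LOG_FIL}")"   # quoted, so tail's newlines survive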

How to create a customized log monitoring job in Java which provides reports of exception messages from a log file

It should be able to process large log files and provide exception message reports.
After the log analysis completes, a report notification should be triggered to specific mail IDs.
Also, please suggest which framework is best for processing large files (e.g. Spring Boot/Batch).
I would suggest going with the ELK stack: stream the logs to Elasticsearch and set up alerts in Kibana.
Alternatively, you can use the sendmail client on the system and run a script on that system to send an alert on any exception:
exception="Exception" # "Error", "HTTP 1.1 \" 500", etc
ignoredException="ValidationException"
# log file to scan
logFileToScan=/var/log/tomcat8/log/application.log
# file where we will keep log of this script
logFilePath=/home/ec2-user/exception.log
# a file where we store till what line the log file has been scanned
# initialize it with 0
countPath=/home/ec2-user/lineCount
# subject with which you want to receive the mail regarding Exception
subject="[ALERT] Exception"
# from whom do you want to send the mail regarding Exception
from="abc@abc.com"
# to whom do you want to send the mail
to="xyz@xyz.com"
# number of lines, before the line containing the word to be scanned, to be sent in the mail
linesBefore=1
# number of lines, after the line containing the word to be scanned, to be sent in the mail
linesAfter=4
# start line
fromLine=`cat $countPath`
# current line count in the file
toLine=`wc -l $logFileToScan | awk '{print $1}'`
#logs are rolling so if fromLine has a value greater than toLine then fromLine has to be set to 0
if [ "$fromLine" == "" ]; then
fromLine=0
echo `date` fromLine value was empty, set to 0 >> $logFilePath
elif [ $fromLine -gt $toLine ]; then
echo `date` logfile was rolled, updating fromLine from $fromLine to 0 >> $logFilePath
fromLine=0
fi
# if from n to lines are equal then no logs has been generated since last scan
if [ "$fromLine" == "$toLine" ]; then
echo `date` no logs generated after last scan >> $logFilePath
else
echo `date` updating linecount to $toLine >> $logFilePath
echo $toLine > $countPath
logContent=`tail -n +"$fromLine" $logFileToScan | head -n "$((toLine - fromLine))" | grep -v $ignoredException | grep -A $linesAfter -B $linesBefore $exception`
logContent=`echo $logContent | cut -c1-2000`
if [ "$logContent" == "" ]; then
echo `date` no exception found >> $logFilePath
else
/usr/sbin/sendmail $to <<EOF
subject: $subject
from: $from
logContent=$logContent
EOF
fi
fi
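The incremental scan works by remembering the last scanned line number in $countPath and slicing out only the new lines on the next run. A toy example of that slice, with made-up values and a hypothetical app.log (not from the answer): tail -n +N starts at line N, and head keeps the next (toLine - fromLine) lines.

fromLine=10
toLine=25
# prints lines 10 through 24 of app.log, i.e. only what appeared since the last scan
tail -n +"$fromLine" app.log | head -n "$((toLine - fromLine))"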

bash to search file based on user input and create new file

In the bash script below, a file is downloaded when the program is opened, and then that file is searched based on user input, with the result written to a new file. As of now the file downloads and the user is prompted for the input, but after it is entered nothing happens.
For example, the script is opened and download.txt is downloaded; the user then enters the id (NA04520). The id is used to search download.txt and the matching line is written to match.txt. The code runs but produces no output. Eventually I will search for specific text in the line that the id was found in, but I figured getting the search based on user input was a good start. Thank you :).
#!/bin/bash
cd 'C:\Users\cmccabe\Desktop\wget'
wget -O getCSV.txt http://xxx.xx.xxx.xxx/data/getCSV.csv --progress=bar:force 2>&1 | tail -f -n +6
{
printf "\n\n"
printf "What is the id of the NGS patient: "; read id
[ -z "$id" ] && printf "\n No ID supplied. Leaving match function." && sleep 2 && return
[ "$id" = "end" ] && printf "\n Leaving match function." && sleep 2 && return
}
input=$id
while read -r line
do
case $line in
*$id*)
echo $line " yes" >> bashgrep.txt
;;
*)
echo "no"
;;
esac
done
Contents of download.txt
Report,Status,Flows,Library,TF Name,Q10 Mean,Q17 Mean,System SNR,50Q10 Reads,50Q17 Reads,Keypass Reads,TF Key Peak Counts,Total_Num_Reads,Library_50Q10_Reads,Library_100Q10_Reads,Library_200Q10_Reads,Library_Mean_Q10_Length,Library_Q10_Coverage,Library_Q10_Longest_Alignment,Library_Q10_Mapped Bases,Library_Q10_Alignments,Library_50Q17_Reads,Library_100Q17_Reads,Library_200Q17_Reads,Library_Mean_Q17_Length,Library_Q17_Coverage,Library_Q17_Longest_Alignment,Library_Q17_Mapped Bases,Library_Q17_Alignments,Library_50Q20_Reads,Library_100Q20_Reads,Library_200Q20_Reads,Library_Mean_Q20_Length,Library_Q20_Coverage,Library_Q20_Longest_Alignment,Library_Q20_Mapped Bases,Library_Q20_Alignments,Library_Key_Peak_Counts,Library_50Q47_Reads,Library_100Q47_Reads,Library_200Q47_Reads,Library_Mean_Q47_Length,Library_Q47_Coverage,Library_Q47_Longest_Alignment,Library_Q47_Mapped Bases,Library_Q47_Alignments,Library_CF,Library_IE,Library_DR,Library_SNR,Raw Accuracy,Sample,Notes,Run Name,PGM Name,Run Date,Run Directory,Num_Washouts,Num_Dud_Washouts,Num_Washout_Ambiguous,Num_Washout_Live,Num_Washout_Test_Fragment,Num_Washout_Library,Library_Pass_Basecalling,Library_pass_Cafie,Number_Ambiguous,Nubmer_Live,Number_Dud,Number_TF,Number_Lib,Number_Bead,Library_Live,Library_Keypass,TF_Live,TF_Keypass,Keypass_All_Beads,P,s
Auto_user_MOL-95-Epilepsy70_125,Completed,500,hg19,TF_A,93.0,90.0,27.4077550007,27969.0,27031.0,28647.0,93.0,5046334,4861480,4439577,2307648,179,0.0,343,885197150,4942977,4689944,4272916,2213442,177,0.0,341,850800392,4796942,4465846,4082874,2050257,171,0.0,341,804445593,4698861,81.0,4073847,3251302,1186042,143,0.0,319,651683806,4541746,0.515698455274,0.728147709742,0.00734527275199,21.2101955615,99.4,NA04520,NA04520,R_2014_01_14_16_21_42_user_MOL-95-Epilepsy70,MolecularGenetics,2014-01-14 22:21:42+00:00,/results/MolecularGenetics/R_2014_01_14_16_21_42_user_MOL-95-Epilepsy70,0,0,0,0,0,0,0,0,0,9332288,14179,31491,9300797,9346467,0,0,9332288,0,0,"{""variantCaller"": {""hotspots"": {}, ""barcoded"": ""false"", ""Target Regions"": ""Epilepsy70"", ""Trim Reads"": true, ""Target Loci"": ""Not using"", ""variants"": {""no_call"": 0, ""homo_snps"": 50, ""het_snps"": 104, ""other"": 0, ""variants"": 163, ""het_indels"": 3, ""homo_indels"": 6}, ""Configuration"": ""Germ Line - Low Stringency"", ""Aligned Reads"": ""R_2014_01_14_16_21_42_user_MOL-95-Epilepsy70"", ""Library Type"": ""AmpliSeq""}}","{""FastqCreator"": {}}","{""coverageAnalysis"": {""Bases in target regions"": ""268545"", ""Amplicons with at least 1 read"": ""99.80%"", ""barcoded"": ""false"", ""Target base coverage at 100x"": ""97.72%"", ""Amplicons with at least 500 reads"": ""95.42%"", ""Total assigned amplicon reads"": ""4879939"", ""Reference (File)"": ""hg19"", ""Total aligned base reads"": ""884883559"", ""Target base coverage at 20x"": ""98.82%"", ""Number of amplicons"": ""1507"", ""Target bases with no strand bias"": ""84.70%"", ""Percent reads on target"": ""97.44%"", ""Amplicons with at least 100 reads"": ""98.08%"", ""Average base coverage depth"": ""3149"", ""Average reads per amplicon"": ""3238"", ""Using"": ""All Mapped Reads"", ""Amplicons reading end-to-end"": ""80.76%"", ""Non-duplicate"": """", ""Uniquely mapped"": ""No"", ""Targeted Regions"": ""Epilepsy70"", ""Uniformity of base coverage"": ""93.35%"", ""Targetted regions"": ""/results/uploads/BED/42/hg19/merged/plain/Epilepsy70.bed"", ""Target padding"": ""0"", ""Amplicons with at least 20 reads"": ""99.14%"", ""Number of mapped reads"": ""5007953"", ""Percent assigned amplicon reads"": ""97.44%"", ""Amplicons
Change the while loop to:
while IFS= read -r line
do
case "$line" in
*$id*)
echo "$line" >> bashgrep.txt
echo "yes"
#Assumed you want only $line to go to the file and print yes to stdout
;;
*)
echo "no"
;;
esac
done <download.txt
#change it to the name of the file downloaded.
#Your posted code seems to download getCSV.txt
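If the goal is simply to write every line of the downloaded file that contains the id to a new file, the loop can also be replaced by a single grep. A minimal sketch (download.txt and match.txt are the names from the question; -F treats the id as a fixed string rather than a pattern):

read -p "What is the id of the NGS patient: " id
grep -F -- "$id" download.txt >> match.txt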

Add input to an existing file with Bash

I am working on a library record app (for school).
I need to be able to collect user input and write to an existing file (add a new record). However, when I try to do so, I get the following error:
./minilib.sh: line 12: : No such file or directory
Here is my function for adding new records:
records = "/lib_records.txt"
add_book(){
echo
echo "Enter Book Name:"
read name
echo "Enter Book Author:"
read author_name
echo "$name $author_name" >> "$records" #this is my line 12
}
Any idea what may be causing the error? Any help is greatly appreciated.
Here are the file permissions:
-rwxrwxrwx. 1 GSUAD\ GSUAD\domain^users 0 Oct 30 18:04 lib_records.txt
-rwxrwxrwx. 1 GSUAD\ GSUAD\domain^users 1253 Oct 30 18:40 minilib.sh
Here are two issues with your shell script:
records="./lib_records.txt": there should be no spaces before or after the =
use "./lib_records.txt" instead of "/lib_records.txt"
Here is modified script for you.
records="./lib_records.txt"
add_book(){
echo
echo "Enter Book Name:"
read name
echo "Enter Book Author:"
read author_name
echo "$name $author_name" >> "$records" #this is my line 12
}
add_book
You shouldn't store the file in the / directory (/lib_records.txt), because you will probably get a Permission denied error. Secondly, remove the spaces around the = in the first line.
