Redirect output to multiple files with tee and grep - shell

I have spent a lot of hours trying to get this to run:
redirecting a script's STDOUT + STDERR to logfile 1, and a grep of that output to logfile 2.
The first logfile should contain the complete output, and the second logfile only the start and end lines (via grep).
I tried different syntaxes, but nothing works.
./run.sh 2>&1 | tee -a /var/log/log1.log | (grep 'START|END') > /var/log/myscripts.log
./run.sh 2>&1 | tee -a /var/log/log1.log | grep 'Start' > /var/log/myscripts.log
./run.sh 2>&1 | tee -a /var/log/log1.log | egrep 'Start' > /var/log/myscripts.log
./run.sh 2>&1 | tee -a /var/log/log1.log | grep -E 'Start' > /var/log/myscripts.log
The output is redirected only to the first log; the second log stays empty.
I don't know why; do you have any ideas?
Example lines from the output.
This should appear in full in log1.log
(the script is Java, started via a shell script):
26.09.2014 20:38:51 | start script > load_stats.sh
26.09.2014 20:38:51 | [DB DATA]
26.09.2014 20:38:51 | Host > locahost
26.09.2014 20:38:51 | User > leroy
... more ...
26.09.2014 20:39:23 | fin script > load_stats.sh
I want to grep these two lines into myscripts.log:
26.09.2014 20:38:51 | start script > load_stats.sh
26.09.2014 20:39:23 | fin script > load_stats.sh
I think the problem is the format: the timestamp and the whitespace.
I thought grep 'word' would catch both of these lines, but it doesn't.
Stupid.
./run.sh 2>&1 | tee -a /var/log/log1.log | sed -nE '/(start script|end script)/p' >> /var/log/myscripts.log
did not work either; log1.log is OK, myscripts.log is empty.
tail -f -n 500 /var/log/log1.log | sed -nE '/(start script|end script)/p'
works well on its own in the shell, but not as part of the whole combination.
What I want: execute a script > redirect everything to log 1 > redirect and filter (grep, egrep, sed, ...) to log 2.

This works fine for me:
$ cat <<_DATA | tee out1 | grep -E 'START|END' > out2
hello
START1
foo
END2
bar
_DATA
$ cat out1
hello
START1
foo
END2
bar
$ cat out2
START1
END2
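One likely reason the original attempts left the second log empty: without -E, grep treats | as a literal character, so 'START|END' only matches the literal string START|END, and 'Start' never matches the lowercase "start script" lines because grep is case-sensitive. A quick illustration with throwaway input (not your script):
printf 'START1\nEND2\n' | grep 'START|END'      # no output: looks for the literal string "START|END"
printf 'START1\nEND2\n' | grep -E 'START|END'   # prints both lines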

This should work
./run_test.sh 2>&1 | tee -a /var/log/log1.log | grep -E 'start script|fin Main-Job' > /var/log/myscripts.log
# -a appends to log1.log; grep -E is the same as egrep; > overwrites/creates myscripts.log
This prints:
26.09.2014 20:38:51 | start script > load_stats.sh
26.09.2014 20:39:23 | fin Main-Job > load_stats.sh
If you want to overwrite/create log1.log, delete the -a.
If you want to append to myscripts.log, use >>.
egrep is deprecated, so it is better to use grep -E.
Also, instead of grep you can use sed, like:
sed -nE '/(start script|fin Main-Job)/p'
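For completeness, a sketch of the full pipeline with the sed variant, using the paths and patterns from the question and answer above (appending to both logs):
./run.sh 2>&1 \
    | tee -a /var/log/log1.log \
    | sed -nE '/(start script|fin Main-Job)/p' >> /var/log/myscripts.log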

Related

How do I do a website health check using the curl command

I'm trying to monitor a website using curl, but the output doesn't seem to work; please see the commands below:
#!/bin/bash
varDate=$(date '+%Y-%m-%d %H:%M:%S')
varCurlError=$(curl -sSf https://website.com > /dev/null)
varHttpCode=$(curl -Is https://website.com | head -n 1)
varResponseTime=$(curl -s -w '%{time_total}' -o /dev/null website.com)
varOutput="$varDate | $varCurlError | $varHttpCode | $varResponseTime"
echo $varOutput
The output looks like this:
| 0.07323 18:51:40 | | HTTP/1.1 200 OK
What can I change or add to fix the output?
Much appreciated.
#!/bin/bash
varDate=$(date '+%Y-%m-%d %H:%M:%S')
varCurlError=$(curl -sSf https://website.com 2>&1 >/dev/null)
varHttpCode=$(curl -Is https://website.com | head -n 1)
varResponseTime=$(curl -s -w '%{time_total}' -o /dev/null website.com | tr -d \\r )
varOutput="$varDate | $varCurlError | $varHttpCode | $varResponseTime"
echo $varOutput
There are two corrections:
tr -d '\r' was added, as per glenn jackman: a trailing CR was causing $varResponseTime to be printed at the beginning of the line, and tr deletes that CR.
In the varCurlError assignment you need to redirect stderr to stdout before you point stdout at /dev/null. That way, errors reported by curl on stderr are sent to stdout (and captured by the $() substitution), while curl's normal output goes to the bit bucket. Order is important: >/dev/null 2>&1 does not work here, because it sends both stdout and stderr to /dev/null.
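A quick way to see why the order matters (ls on a nonexistent path is just a stand-in for any command that writes to stderr):
err=$(ls /nonexistent 2>&1 >/dev/null)   # stderr is captured, stdout is discarded
echo "$err"                              # prints the "No such file or directory" message
err=$(ls /nonexistent >/dev/null 2>&1)   # both streams go to /dev/null
echo "$err"                              # prints an empty line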
glenn jackman is correct about the need to pipe the curl output through tr -d '\r'.
That is, change your code to:
#!/bin/bash
varDate=$(date '+%Y-%m-%d %H:%M:%S' | tr -d '\r')
varCurlError=$(curl -sSf https://website.com 2>&1 >/dev/null | tr -d '\r')
varHttpCode=$(curl -Is https://website.com | tr -d '\r' | head -n 1)
varResponseTime=$(curl -s -w '%{time_total}' -o /dev/null website.com | tr -d '\r')
varOutput="$varDate | $varCurlError | $varHttpCode | $varResponseTime"
echo "$varOutput"
It can also be done with wget, so you can see whether you get any data at all; it can be as simple as this:
#!/bin/bash
dt=$(date '+%d/%m/%Y %H:%M:%S');
wget domain/yourindex
if [ -f /home/$USER/yourindex ] ; then
    #echo $dt GOOD >> /var/log/fix.log
    echo GOOD >/dev/null 2>&1
else
    #counter measures like sudo systemctl restart php7.2-fpm.service && sudo systemctl restart nginx
    echo $dt BROKEN >> /var/log/fix.log
fi
rm login*
exit
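A minimal variant of the same idea (an assumption, not part of the original answer): check wget's exit status instead of looking for the downloaded file, so there is nothing to clean up afterwards. domain/yourindex is the same placeholder URL as above.
#!/bin/bash
dt=$(date '+%d/%m/%Y %H:%M:%S')
# -q: quiet, -O /dev/null: discard the body; we only care whether the fetch succeeded
if wget -q -O /dev/null "domain/yourindex"; then
    : # site answered, nothing to log
else
    # counter measures could go here, as in the script above
    echo "$dt BROKEN" >> /var/log/fix.log
fi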

Getting common command counts from log file on macOS

I need to get info from the system on how many times I've entered the "pwd", "ls" and "cd" commands.
#!/bin/bash
#HISTFILE=~/.bash_history
#set -o history
echo "Printing number of ls commands used" > LogFile
a='history | grep -w "ls" | wc -1'
echo $a >> LogFile
echo "Printing number of cd commands used" >> LogFile
b='history | grep -w "cd" | wc -1'
echo $b >> LogFile
echo "Printing number of pwd commands used" >> LogFile
b='history | grep -w "pwd" | wc -1'
echo $c >> LogFile
cat LogFile
and my output on terminal:
Joyces-MacBook-Pro:desktop Joyce$ ./failedlogin_detect.sh
Printing number of ls commands used
history | grep -w "ls" | wc -1
Printing number of cd commands used
history | grep -w "cd" | wc -1
Printing number of pwd commands used
The sentences print fine, but it's not retrieving the count from the system.
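A minimal sketch of a likely fix (assuming bash, with its history in ~/.bash_history as the commented-out lines suggest): the single quotes should be command substitution $(...), wc -1 should be wc -l, the third count is assigned to b but echoed as $c, and a non-interactive script sees an empty history anyway, so counting matches in the history file directly is simpler.
#!/bin/bash
histfile=~/.bash_history

{
    echo "Printing number of ls commands used"
    grep -cw "ls" "$histfile"    # -c counts matching lines, -w matches whole words
    echo "Printing number of cd commands used"
    grep -cw "cd" "$histfile"
    echo "Printing number of pwd commands used"
    grep -cw "pwd" "$histfile"
} > LogFile

cat LogFile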

bash pipe and printing with multiple filters

I was wondering if something like this exists:
tail -f file1 | grep "hello" > fileHello | grep "bye" > fileBye | grep "etc" > fileEtc
echo b1bla >> file1
echo b2hello >> file1
echo b3bye >> file1
echo b4hellobye >> file1
echo b5etc >> file1
echo b6byeetc >> file1
That would produce this result:
file1:
b1bla
b2hello
b3bye
b4hellobye
b5etc
b6byeetc
fileHello:
b2hello
b4hellobye
fileBye:
b3bye
b4hellobye
b6byeetc
fileEtc:
b5etc
b6byeetc
Thanks!
Use tee with process substitution:
tail -f file1 | tee >(exec grep "hello" > fileHello) >(exec grep "bye" > fileBye) | grep "etc" > fileEtc
This works, but be aware that piping tail -f is likely to cause some unexpected buffering issues.
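If that buffering does bite, one mitigation (assuming GNU grep, which has a --line-buffered option) is to force line buffering in each filter:
tail -f file1 |
  tee >(grep --line-buffered "hello" > fileHello) >(grep --line-buffered "bye" > fileBye) |
  grep --line-buffered "etc" > fileEtc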
tail -f file1 |
awk '/hello/ { print > "fileHello"}
/bye/ { print > "fileBye"}
/etc/ { print > "fileEtc"}'
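To sanity-check the filters without waiting on tail -f, the same awk program can be fed the sample lines directly:
printf '%s\n' b1bla b2hello b3bye b4hellobye b5etc b6byeetc |
awk '/hello/ { print > "fileHello" }
     /bye/   { print > "fileBye" }
     /etc/   { print > "fileEtc" }'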

Bash stdout redirect on Solaris 10

OK, this is working:
trace -t lstat64 -v lstat64 ls "myfilename" 2>pipefile
cat pipefile | grep ct | cut -d '[' -f 2 | cut -d ' ' -f 2
But I don't want to have to use the file "pipefile"; how can I redirect the output straight to my grep and cut?
So, you want to ignore stdout and only consider stderr?
trace -t lstat64 -v lstat64 ls "myfilename" 2>&1 1>/dev/null |
grep ct | cut -d '[' -f 2 | cut -d ' ' -f 2
First, the stderr file handle is redirected to whatever the stdout file handle refers to, then the stdout file handle is redirected to /dev/null. Then grep can read from stdin whatever is emitted from trace's stderr.
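A quick way to convince yourself of that ordering (a throwaway compound command standing in for trace):
{ echo out; echo err >&2; } 2>&1 1>/dev/null | cat
# prints only "err": stderr follows stdout into the pipe, then stdout is pointed at /dev/null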
I got it. I just realized I was getting stderr confused with stdout; this was my solution:
trace -t lstat64 -v lstat64 ls "myfilename" 2>&1 | grep ct | cut -d '[' -f 2 | cut -d ' ' -f 2

How to store the output of sed command in shell script

past_date=`date +"%Y-%m-%d" -d "-60 day"`
initial_date= sed -n "/$past_date/p" 'logfile.txt' | head -1 | sed -e 's/\([0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9] [0-9][0-9]:[0-9][0-9]:[0-9][0-9]\).*/\1/'
echo $initial_date
I am trying to store the result of the sed command in the initial_date variable, but nothing is stored in initial_date.
To store a command's output in a variable, use the var=$(command) syntax, with no spaces around the =:
initial_date=$(sed -n "/$past_date/p" 'logfile.txt' | head -1 | sed -e 's/\([0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9] [0-9][0-9]:[0-9][0-9]:[0-9][0-9]\).*/\1/')
Then, to print the result, it is always recommended to quote the variable:
echo "$initial_date"
Update
If you are looking for the first date and time in logfile.txt, where the date is $past_date, then you can use:
grep -o -m1 '2013-11-14 [0-9][0-9]:[0-9][0-9]:[0-9][0-9]' logfile.txt
Given this sample file:
$ cat logfile.txt
hello 2013-11-14 11:12:33
2013-11-14 11:12:33
2013-21-14 11:12:33
2013-r2:33
2013-19-14
2013-11-10 adf
$ grep -o -m1 '2013-11-14 [0-9][0-9]:[0-9][0-9]:[0-9][0-9]' logfile.txt
2013-11-14 11:12:33
$ data=$(grep -o -m1 '2013-11-14 [0-9][0-9]:[0-9][0-9]:[0-9][0-9]' logfile.txt)
$ echo $data
2013-11-14 11:12:33
initial_time=$(echo "$line" | sed -e 's/\([0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9] [0-9][0-9]:[0-9][0-9]:[0-9][0-9]\).*/\1/')
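Tying that back to the question's own variables, a minimal sketch (same filename and date format) that stores and prints the first matching timestamp:
past_date=$(date +"%Y-%m-%d" -d "-60 day")
initial_date=$(grep -o -m1 "$past_date [0-9][0-9]:[0-9][0-9]:[0-9][0-9]" logfile.txt)
echo "$initial_date"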
