I'm running Laravel on an OpenShift server (LAMP stack). My server has been offline for the past two days. When I looked into the error log, it says "caught SIGWINCH, shutting down gracefully", but it doesn't give any more details. How can I find the reason for the shutdown? I have attached the error log to this question.
- - - [13/Dec/2014:12:06:34 -0500] "OPTIONS * HTTP/1.0" 200 - "-" "Apache/2.2.15 (Red Hat) (internal dummy connection)"
- - - [13/Dec/2014:12:06:34 -0500] "OPTIONS * HTTP/1.0" 200 - "-" "Apache/2.2.15 (Red Hat) (internal dummy connection)"
[Sat Dec 13 12:06:34 2014] [notice] caught SIGWINCH, shutting down gracefully
[Mon Dec 15 01:15:31 2014] [notice] SELinux policy enabled; httpd running as context
unconfined_u:system_r:openshift_t:s0:c6,c126
[Mon Dec 15 01:15:31 2014] [notice] Digest: generating secret for digest authentication ...
[Mon Dec 15 01:15:31 2014] [notice] Digest: done
[Mon Dec 15 01:15:31 2014] [notice] Apache/2.2.15 (Unix) configured -- resuming normal operations
- - - [15/Dec/2014:01:15:32 -0500] "GET / HTTP/1.0" 302 268 "-" "-"
- - - [15/Dec/2014:01:15:38 -0500] "GET / HTTP/1.0" 302 268 "-" "-"
- - - [15/Dec/2014:01:15:41 -0500] "GET / HTTP/1.0" 302 268 "-" "-"
- - - [15/Dec/2014:01:15:44 -0500] "GET / HTTP/1.0" 302 268 "-" "-"
- - - [15/Dec/2014:01:15:47 -0500] "GET / HTTP/1.0" 302 268 "-" "-"
- - - [15/Dec/2014:01:15:49 -0500] "GET / HTTP/1.0" 302 268 "-" "-"
- - - [15/Dec/2014:01:15:52 -0500] "GET / HTTP/1.0" 302 268 "-" "-"
- - - [15/Dec/2014:01:15:55 -0500] "GET / HTTP/1.0" 302 268 "-" "-"
- - - [15/Dec/2014:01:15:58 -0500] "GET / HTTP/1.0" 302 268 "-" "-"
- - - [15/Dec/2014:01:16:04 -0500] "GET / HTTP/1.0" 302 268 "-" "-"
- - - [15/Dec/2014:01:16:07 -0500] "GET / HTTP/1.0" 302 268 "-" "-"
- - - [15/Dec/2014:01:16:10 -0500] "GET / HTTP/1.0" 302 268 "-" "-"
- - - [15/Dec/2014:01:16:14 -0500] "GET / HTTP/1.0" 302 268 "-" "-"
(98)Address already in use: make_sock: could not bind to address 127.12.49.129:8080
no listening sockets available, shutting down
Unable to open logs
Can anyone please help me find the reason for this error? Thanks in advance.
SIGWINCH is also used by some services that need to restart Apache, for example when rotating logs or running nightly jobs.
That by itself doesn't explain why the server stayed down, but it suggests that something else running on your server is restarting Apache, or it may not be related to your problem at all.
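For what it's worth, the later lines in your log (the "Address already in use: make_sock" errors) suggest something was still holding that address when Apache tried to come back up. As a rough sketch of where to look next, assuming you have shell access (the paths below are typical system locations and may differ on a hosted OpenShift gear; adjust the port to match your make_sock line):

# See what is currently bound to the address Apache could not bind to
# (use netstat -tlnp instead on older systems without ss)
ss -tlnp | grep ':8080'

# Look for log-rotation or cron jobs that signal/restart httpd
grep -rilE 'httpd|apache' /etc/logrotate.d /etc/cron.daily 2>/dev/null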
Related
In the Apache logs, I found strange requests coming from the IP address of my VPS that I did not make. Usually, the requests from my VPS that I run through the cron scheduler look like this:
domain.ru:443 **.**.**.** - - [20/Sep/2021:19:55:01 +0300] "GET /test.php HTTP/1.0" 200 421 "-" "Wget/1.19.4 (linux-gnu)" 118650
The strange requests look like this:
domain.ru:80 **.**.**.** - - [21/Sep/2021:09:06:52 +0300] "GET / HTTP/1.0" 400 0 "-" "-" 48
domain.ru:80 **.**.**.** - - [21/Sep/2021:08:10:59 +0300] "GET / HTTP/1.0" 400 0 "-" "-" 53
domain.ru:80 **.**.**.** - - [21/Sep/2021:07:27:17 +0300] "GET /boaform/admin/formLogin?username=adminisp&psd=adminisp HTTP/1.0" 400 0 "-" "-" 51
domain.ru:80 **.**.**.** - - [21/Sep/2021:06:25:03 +0300] "GET / HTTP/1.0" 400 0 "-" "-" 145
domain.ru:80 **.**.**.** - - [21/Sep/2021:04:11:17 +0300] "GET / HTTP/1.0" 400 0 "-" "-" 41
domain.ru:80 **.**.**.** - - [21/Sep/2021:02:52:44 +0300] "GET / HTTP/1.0" 400 0 "-" "-" 41
domain.ru:80 **.**.**.** - - [21/Sep/2021:02:36:17 +0300] "GET / HTTP/1.0" 400 0 "-" "-" 41
domain.ru:80 **.**.**.** - - [21/Sep/2021:01:51:52 +0300] "GET / HTTP/1.0" 400 0 "-" "Mozilla/5.0" 38
These requests are especially alarming.
domain.ru:80 **.**.**.** - - [21/Sep/2021:07:27:17 +0300] "GET /boaform/admin/formLogin?username=adminisp&psd=adminisp HTTP/1.0" 400 0 "-" "-" 51
domain.ru:80 **.**.**.** - - [21/Sep/2021:01:51:52 +0300] "GET / HTTP/1.0" 400 0 "-" "Mozilla/5.0" 38
domain.ru:80 **.**.**.** - - [20/Sep/2021:19:51:34 +0300] "GET / HTTP/1.0" 400 0 "-" "YahooBot" 54
As you can see, the User-Agent values YahooBot and Mozilla/5.0 are sent, and there is even a strange request to the page /boaform/admin/formLogin?username=adminisp&psd=adminisp
Tell me what to do. Is it a virus?
These are bots probing for vulnerabilities. I suggest you modify LogFormat to include the client IP in case you want to block those. Here is what I use:
LogFormat "%h %l %u %t \"%r\" %>s %B \"%{Referer}i\" \"%{User-Agent}i\"" combined
CustomLog ${APACHE_LOG_DIR}/access.log combined
Here is the relevant documentation for the LogFormat variables. If you want, the next step is to block the abusive traffic; I use fail2ban for that. Also consider enabling the unique_id module, so that you can log the same ID in both the access log and the error log.
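If you want to see which clients generate most of this traffic before wiring up fail2ban, a quick tally over the access log can help. This is only a sketch assuming the combined format above ($1 = client IP, $9 = status code) and a typical log path; if your log keeps the leading virtual-host field shown in your samples, shift the field numbers by one:

# Count client IPs that produce 4xx responses, busiest first
awk '$9 ~ /^4/ { hits[$1]++ } END { for (ip in hits) print hits[ip], ip }' \
    /var/log/apache2/access.log | sort -rn | head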
I wrote a little bash script to parse Apache access logs and count POST/GET requests.
The script works fine, but I have a small display issue when I try to remove the "[" character from the date field returned by the awk command.
Here is my script:
clear
ls /var/log/httpd | egrep *access_log$ > temp.txt
while read line
do
    linecount=$(cat /var/log/httpd/"$line" | wc -l)
    #echo -e "$line"
    #echo -e "$linecount"
    if [ $linecount -gt 0 ]
    then
        echo -e "==========================================="
        echo -e "$line"
        echo -e "Log start date:"
        cat /var/log/httpd/"$line" | awk -v ligne=1 'NR == ligne, FS=":" {print $4}' | xargs -0 sed -i 's/\[//g'
        echo -e "Log end date:"
        cat /var/log/httpd/"$line" | awk 'END {print $4}'
        echo -e "Number of requests in the period:"
        egrep -i 'post|get' /var/log/httpd/"$line" | wc -l
    fi
    linecount=0
done < temp.txt
rm -f temp.txt
An example of the standard output of this script looks like this:
===========================================
xxx.xxx.xxx-ssl_access_log
Log start date:
sed: can't read [01/Jan/2021:07:34:59
: No such file or directory
Log end date:
[22/Jan/2021:07:44:44
Number of requests in the period:
22
Why can't sed use the string piped from awk?
How can I correct it?
Below is an example of an input log file:
54.36.148.55 - - [29/Dec/2020:18:05:38 +0100] "GET /robots.txt HTTP/1.1" 404 159
54.36.149.92 - - [29/Dec/2020:18:05:38 +0100] "GET / HTTP/1.1" 200 2394
54.36.148.185 - - [30/Dec/2020:17:51:06 +0100] "GET / HTTP/1.1" 200 2394
54.36.149.77 - - [31/Dec/2020:17:19:18 +0100] "GET /robots.txt HTTP/1.1" 404 159
54.36.148.97 - - [31/Dec/2020:17:19:19 +0100] "GET / HTTP/1.1" 200 2394
54.36.149.61 - - [01/Jan/2021:14:45:59 +0100] "GET / HTTP/1.1" 200 2394
54.36.148.151 - - [02/Jan/2021:16:26:22 +0100] "GET /robots.txt HTTP/1.1" 404 159
54.36.148.71 - - [02/Jan/2021:16:26:24 +0100] "GET / HTTP/1.1" 200 2394
54.36.148.108 - - [03/Jan/2021:15:21:28 +0100] "GET / HTTP/1.1" 200 2394
208.100.26.249 - - [03/Jan/2021:23:15:13 +0100] "GET / HTTP/1.1" 200 2394
54.36.149.95 - - [04/Jan/2021:15:28:31 +0100] "GET /robots.txt HTTP/1.1" 404 159
54.36.148.202 - - [04/Jan/2021:15:28:32 +0100] "GET / HTTP/1.1" 200 2394
54.36.149.24 - - [05/Jan/2021:14:44:52 +0100] "GET / HTTP/1.1" 200 2394
54.36.148.184 - - [06/Jan/2021:15:00:55 +0100] "GET /robots.txt HTTP/1.1" 404 159
54.36.149.54 - - [06/Jan/2021:15:00:55 +0100] "GET / HTTP/1.1" 200 2394
54.36.148.185 - - [07/Jan/2021:14:03:13 +0100] "GET / HTTP/1.1" 200 2394
51.158.103.247 - - [08/Jan/2021:12:31:33 +0100] "GET / HTTP/1.1" 200 2394
54.36.148.17 - - [08/Jan/2021:14:10:18 +0100] "GET /robots.txt HTTP/1.1" 404 159
54.36.148.185 - - [08/Jan/2021:14:10:19 +0100] "GET / HTTP/1.1" 200 2394
54.36.148.101 - - [09/Jan/2021:14:17:39 +0100] "GET /robots.txt HTTP/1.1" 404 159
54.36.148.94 - - [09/Jan/2021:14:17:40 +0100] "GET / HTTP/1.1" 200 2394
54.36.148.103 - - [10/Jan/2021:15:21:24 +0100] "GET /robots.txt HTTP/1.1" 404 159
54.36.148.68 - - [10/Jan/2021:15:21:24 +0100] "GET / HTTP/1.1" 200 2394
54.36.148.208 - - [11/Jan/2021:18:15:40 +0100] "GET /robots.txt HTTP/1.1" 404 159
54.36.149.78 - - [11/Jan/2021:18:15:41 +0100] "GET / HTTP/1.1" 200 2394
54.36.148.64 - - [12/Jan/2021:20:37:08 +0100] "GET /robots.txt HTTP/1.1" 404 159
54.36.149.38 - - [12/Jan/2021:20:37:09 +0100] "GET / HTTP/1.1" 200 2394
54.36.149.66 - - [13/Jan/2021:20:40:09 +0100] "GET /robots.txt HTTP/1.1" 404 159
54.36.148.203 - - [13/Jan/2021:20:40:10 +0100] "GET / HTTP/1.1" 200 2394
51.158.127.119 - - [14/Jan/2021:11:41:05 +0100] "GET / HTTP/1.1" 200 2394
51.15.251.143 - - [14/Jan/2021:11:52:04 +0100] "GET / HTTP/1.1" 200 2394
54.36.149.76 - - [14/Jan/2021:20:05:36 +0100] "GET / HTTP/1.1" 200 2394
208.100.26.243 - - [18/Jan/2021:10:20:00 +0100] "GET / HTTP/1.1" 200 2394
208.100.26.248 - - [25/Jan/2021:04:10:37 +0100] "GET / HTTP/1.1" 200 2394
Using awk as a "complete" solution
awk 'FNR==1 {
    gsub("[[]","",$4);
    sdat=$4        # When the file record number (FNR) is 1, remove [ from the 4th space-separated field with gsub and set sdat to this field
}
ENDFILE {
    gsub("[[]","",$4);
    fdat=$4;       # When we reach the end of each file, remove [ again from the 4th field and set fdat to this field
    print "==========================================="
    print FILENAME # Print the filename using awk's FILENAME variable
    print "Log start date:"   # Print the required data
    print sdat
    print "Log end date:"
    print fdat
    print "Number of requests in the period:"
    print FNR      # Print the total number of records in the file (file record number)
} ' /var/log/httpd/*access_log
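If you also want the GET/POST count that the original script printed, rather than the total number of records, a small variant could look like this (still gawk-specific because of ENDFILE; the label strings are only illustrative):

gawk 'FNR == 1 { gsub("[[]", "", $4); sdat = $4 }     # start date per file
      $6 ~ /"(GET|POST)/ { n++ }                      # count only GET/POST requests
      ENDFILE  { gsub("[[]", "", $4)                  # end date = 4th field of the last record
                 print "==========================================="
                 print FILENAME
                 print "Log start date:", sdat
                 print "Log end date:", $4
                 print "Number of GET/POST requests:", n
                 n = 0 }                              # reset the counter for the next file
     ' /var/log/httpd/*access_log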
@user15097052: you'll absolutely love the insane power afforded by AWK. It's great because of its simplicity: it doesn't come with every bell and whistle, but the building blocks it does have, it does REALLY well.
These days I pretty much avoid touching wc, sed and cut, and most of the time I prefer not having to deal with perl or python3. The URL encode/decode module in python3 slows me down compared to awk.
I have historical web log files like this:
157.15.14.19 - - 06 Sep 2016 09:13:10 +0300 "GET /index.php?id=1 HTTP/1.1" 200 16977 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
157.15.14.19 - - 06 Sep 2016 09:13:11 +0300 "GET /index.php?id=2 HTTP/1.1" 200 16977 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
157.15.14.19 - - 06 Sep 2016 09:13:12 +0300 "GET /index.php?id=3 HTTP/1.1" 200 16977 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
157.15.14.19 - - 06 Sep 2016 09:14:13 +0300 "GET /index.php?id=4 HTTP/1.1" 200 16977 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
157.15.14.19 - - 06 Sep 2016 09:14:14 +0300 "GET /index.php?id=5 HTTP/1.1" 200 16977 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
157.15.14.19 - - 06 Sep 2016 09:15:15 +0300 "GET /index.php?id=6 HTTP/1.1" 200 16977 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
157.15.14.19 - - 06 Sep 2016 09:15:16 +0300 "GET /index.php?id=7 HTTP/1.1" 200 16977 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
157.15.14.19 - - 06 Sep 2016 09:15:17 +0300 "GET /index.php?id=8 HTTP/1.1" 200 16977 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
157.15.14.19 - - 06 Sep 2016 09:16:10 +0300 "GET /index.php?id=9 HTTP/1.1" 200 16977 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
157.15.14.19 - - 06 Sep 2016 09:16:10 +0300 "GET /index.php?id=10 HTTP/1.1" 200 16977 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
8.8.8.8 - - 06 Sep 2016 09:17:10 +0300 "GET /index.php?id=11 HTTP/1.1" 200 16977 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
9.9.9.9 - - 06 Sep 2016 09:17:10 +0300 "GET /index.php?id=12 HTTP/1.1" 200 16977 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
157.15.14.19 - - 06 Sep 2016 09:18:10 +0300 "GET /index.php?id=13 HTTP/1.1" 200 16977 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
157.15.14.19 - - 06 Sep 2016 09:19:10 +0300 "GET /index.php?id=14 HTTP/1.1" 200 16977 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
157.15.14.19 - - 06 Sep 2016 09:19:10 +0300 "GET /index.php?id=15 HTTP/1.1" 200 16977 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
157.15.14.19 - - 06 Sep 2016 09:20:10 +0300 "GET /index.php?id=15 HTTP/1.1" 200 16977 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
123.123.123.123 - - 06 Sep 2016 09:21:10 +0300 "GET /index.php?id=15 HTTP/1.1" 200 16977 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
157.15.14.19 - - 06 Sep 2016 09:22:10 +0300 "GET /index.php?id=15 HTTP/1.1" 200 16977 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
I want to find the CC-attack IPs using only yesterday's web log files.
Here is how I define a CC attack for this example:
if, within any 5-minute window, the same remote IP makes more than 5 requests, that IP is a CC attacker and should be printed.
The log file covers a whole day, and I only want to use shell tools such as awk, cat, gawk, sed and so on.
Please give me some suggestions. Thanks a lot.
Update:
I tried writing a test script (per 2 minutes, same-IP request count > 5):
yy#yy:/tmp/tb$ cat 5.txt |awk '{print $7,$1}' |awk -F: '{print $1*60+int($2/2),$0}' |sort |uniq -c -f2 |awk '{if($1>5){print $0}}'
10 546 09:13:10 157.15.14.19
But the code is quite bad; it needs to be optimized.
awk -v Interval=5 -v Trig=5 -F '[[:blank:]]*|:' '
{
# assumes log lines in this format:
# 157.15.14.19 - - 06 Sep 2016 09:13:10 +0300 "GET /index.php?id=1 HTTP/1.1" 200 16977 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
# $1 2 3 4 5 6 7 8 9 10 11 ...
ThisTime = $7 * 60 + $8
# if a new cycle starts (this line is not part of the previous cycle)
if ( ThisTime > ( LastTic + Interval ) ) {
# check and print the hits of the last cycle
for( IP in IPCounts) if ( IPCounts[ IP] > Trig) print LastTime " " IP " : " IPCounts[ IP]
# reset reference
split( "", IPCounts)
LastTime = $4 " " $5 " " $6 " " $7 ":" sprintf( "%2d", ( $8 - ( $8 % Interval) )) ":00"
LastTic = $7 * 60 + ( $8 - ( $8 % Interval) )
}
# add this line to new cycle
IPCounts[ $1]++
}
END {
# print last cycle
for( IP in IPCounts) if ( IPCounts[ IP] > Trig) print LastTime " " IP " : " IPCounts[ IP]
}
' YourFile
# for format of log
# op.g.cc 124.145.36.121 - - [21/Nov/2016:03:38:02 +0800] ==> 172.11.0.238:80 "POST ...
# $1 2 3 4 5 6 7 8 9 10 11 ...
# change:
# $7 by $6, $8 by $7
# LastTime = $5 ":" $6 ":" sprintf( "%2d", ( $7 - ( $7 % Interval) )) ":00 +800]"
# IPCounts[ $2]++
Note:
works quick and dirty for the time selection (you mention one log per day); if more precision is needed, use mktime to get a real epoch time reference
Trig is the count trigger level (5 hits) and Interval is the length of the cycle (5 minutes)
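To make the note about mktime concrete: gawk's mktime() takes a "YYYY MM DD HH MM SS" string and returns an epoch timestamp, which makes the grouping safe across hour and day boundaries. The sketch below counts per fixed 5-minute bucket rather than the rolling cycle used above, and assumes the default whitespace-separated layout ($4 $5 $6 = day month year, $7 = HH:MM:SS):

gawk -v Interval=300 -v Trig=5 '
BEGIN {
    n = split("Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec", m)
    for (i = 1; i <= n; i++) mon[m[i]] = i          # month name -> number
}
{
    split($7, t, ":")                               # $7 = HH:MM:SS
    epoch  = mktime($6 " " mon[$5] " " $4 " " t[1] " " t[2] " " t[3])
    bucket = int(epoch / Interval)                  # fixed 5-minute bucket
    count[bucket, $1]++                             # per bucket and per IP
}
END {
    for (k in count)
        if (count[k] > Trig) {
            split(k, p, SUBSEP)
            print p[2], "made", count[k], "requests in bucket", p[1]
        }
}' YourFile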
I have this file:
2001:778:0:1::21 - - [16/Sep/2011:12:30:46 +0300] "GET / HTTP/1.1" 200 44
2001:778:0:1::21 - - [16/Sep/2011:12:30:46 +0300] "GET /favicon.ico HTTP/1.1" 200 1406
2001:778:0:1::21 - - [16/Sep/2011:12:32:15 +0300] "GET / HTTP/1.1" 200 66643
88.222.10.7 - - [17/Sep/2011:23:39:25 +0300] "GET / HTTP/1.1" 200 66643
88.222.10.7 - - [17/Sep/2011:23:39:25 +0300] "GET /favicon.ico HTTP/1.1" 200 1406
88.222.10.7 - - [18/Sep/2011:13:45:39 +0300] "GET / HTTP/1.1" 304 -
88.222.10.7 - - [19/Sep/2011:05:47:35 +0300] "GET / HTTP/1.1" 200 66643
88.222.10.7 - - [19/Sep/2011:05:47:36 +0300] "GET /favicon.ico HTTP/1.1" 200 1406
121.141.172.40 - - [19/Sep/2011:20:32:07 +0300] "CONNECT 64.12.202.43:443 HTTP/1.0" 405 235
And I have per-IP data (the last number in each line), for example 44, 1406, 66643, 6664, ...
I want to sum all the data that belongs to the same IP address, so my results should be:
2001:778:0:1::21 68093 (44+1406+66643)
88.222.10.7 136098 (66643+1406+66643+1406)
121.141.172.40 235 (235)
Is it possible to do that in shell?
This should give you the desired output:
# awk 'BEGIN{FS=" "}{arr[$1]+=$10}END{for(i in arr) print i,arr[i]}' file
88.222.10.7 136098
2001:778:0:1::21 68093
121.141.172.40 235
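One small note on that command: awk converts the "-" byte field of the 304 response to 0 in the addition, so those lines simply contribute nothing. If you want the busiest IPs first, pipe the result through sort, for example:

# Same summing as above, sorted by total bytes descending
awk '{ sum[$1] += $10 } END { for (ip in sum) print sum[ip], ip }' file | sort -rn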
127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/ HTTP/1.1" 200 169 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/.treeinfo HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/Fedora HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/Server HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/Client HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/RedHat HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/CentOS HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/SL HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/directory.yast HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/current/images/MANIFEST HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/images/daily/MANIFEST HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/boot/platform/i86xpv/kernel/unix HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/platform/i86xpv/kernel/unix HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/STARTUP/XNLOADER.SYS HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/images/xen/vmlinuz HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/images/boot.iso HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/boot/boot.iso HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/current/images/netboot/mini.iso HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/install/images/boot.iso HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:35 +0530] "HEAD /sk/ HTTP/1.1" 200 169 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:35 +0530] "HEAD /sk/.treeinfo HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:35 +0530] "HEAD /sk/Fedora HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:35 +0530] "HEAD /sk/Server HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:35 +0530] "HEAD /sk/Client HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:35 +0530] "HEAD /sk/RedHat HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:35 +0530] "HEAD /sk/CentOS HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:35 +0530] "HEAD /sk/SL HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:35 +0530] "HEAD /sk/directory.yast HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
127.0.0.1 - - [08/Mar/2011:00:26:35 +0530] "HEAD /sk/current/images/MANIFEST HTTP/1.1" 404 182 "-" "Python-urllib/2.6"
How can I remove the date, like [08/Mar/2011:00:26:35 +0530], from the above logs using a sed script? There are many instances like this.
You can use:
sed -r 's/\[[0-9]{2}\/[A-Z][a-z]{2}\/[0-9]{4}:[0-9]{2}:[0-9]{2}:[0-9]{2} \+[0-9]+\]//g'
See it on Ideone
Alternatively, if there are no other occurrences of [...] in the input, you can just do:
sed 's/\[.*\]//g'
If you have Ruby (1.9+):
$ ruby -i.bak -ne 'print $_.gsub(/\[.*?\]/,"")' file
If you absolutely must use sed:
$ sed -i.bak 's/\[.[^]]*\]//g' file
The general pattern would be:
sed -e 's/pattern/replacement/' filename
With:
-e command
Append the editing commands specified by the command argument
to the list of commands.
In your case, this could be e.g.:
sed -e 's/\[.*\]//' yourfilename.log
Note that \[.*\] will work correctly as long as you don't have additional ] characters later in the line, since the .* is greedy.
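If you want to sanity-check a command on a single line before running it against the whole log, something like this works (this variant also removes the space after the closing bracket so you don't end up with a double space):

echo '127.0.0.1 - - [08/Mar/2011:00:26:27 +0530] "HEAD /sk/ HTTP/1.1" 200 169 "-" "Python-urllib/2.6"' \
  | sed 's/\[[^]]*\] //'
# 127.0.0.1 - - "HEAD /sk/ HTTP/1.1" 200 169 "-" "Python-urllib/2.6"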