Create index based on message field - appname - elasticsearch

Logstash.conf code:
input {
  stdin {
    type => "stdin-type"
  }
  file {
    type => "json"
    path => [ "C:/prod/*.log", "C:/prod/*/**.log" ]
    start_position => "beginning"
    tags => "prod"
  }
  file {
    type => "json"
    path => [ "C:/dev/*.log", "C:/dev/*/**.log" ]
    start_position => "beginning"
    tags => "dev"
  }
}
filter {
  grok {
    match => {
      "message" => [ "%{JSON:payload_raw}" ]
    }
    pattern_definitions => {
      "JSON" => "{.*$"
    }
  }
  json {
    source => "payload_raw"
    target => "payload"
  }
  mutate {
    remove_field => [ "payload_raw", "message" ]
  }
  date {
    match => [ "[payload][datetime]", "yyyy-MM-dd HH:mm:ss,SSS" ]
    target => "@timestamp"
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{tags}-logs"
  }
}
Sample log
{datetime":"2021-08-10 04:11:37,825","servername":"VM-0001","serverip":"(null)","process":"2404","thread":"4","level":"DEBUG","appname":"Dev-Email","page":"Program.cs"}

Given the sample document you shared, your elasticsearch output needs to look like this:
elasticsearch {
  hosts => ["localhost:9200"]
  index => "%{appname}-logs"
}
Also know that index names are not allowed to contain uppercase letters, so Dev-Email will need to be lowercased (using the mutate/lowercase filter) before being used as the index name.
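A minimal sketch of that lowercasing step (assuming the json filter above keeps target => "payload", so the application name ends up in [payload][appname]; the [@metadata][index_name] field is just an illustrative helper name):

filter {
  # copy the app name into a metadata field, then lowercase it in a second
  # mutate block so the copy is guaranteed to happen before the lowercase
  mutate {
    copy => { "[payload][appname]" => "[@metadata][index_name]" }
  }
  mutate {
    lowercase => [ "[@metadata][index_name]" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # e.g. "dev-email-logs" for appname "Dev-Email"
    index => "%{[@metadata][index_name]}-logs"
  }
}

Keeping the helper field under [@metadata] makes it available to the index sprintf without storing it in the document itself.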

Related

Logstash doesn't report all the events

I can see that some events are missing when logs are reported to Elasticsearch. For example, when I send 5 log events, only 3 or 4 are reported.
Basically, I am using Logstash 7.4 to read my log messages and store the information in Elasticsearch 7.4. Below is my Logstash configuration:
input {
  file {
    type => "web"
    path => ["/Users/a0053/Downloads/logs/**/*-web.log"]
    start_position => "beginning"
    sincedb_path => "/tmp/sincedb_file"
    codec => multiline {
      pattern => "^(%{MONTHDAY}-%{MONTHNUM}-%{YEAR} %{TIME}) "
      negate => true
      what => previous
    }
  }
}
filter {
  if [type] == "web" {
    grok {
      match => [ "message", "(?<frontendDateTime>%{MONTHDAY}-%{MONTHNUM}-%{YEAR} %{TIME})%{SPACE}(\[%{DATA:thread}\])?( )?%{LOGLEVEL:level}%{SPACE}%{USERNAME:zhost}%{SPACE}%{JAVAFILE:javaClass} %{USERNAME:orgId} (?<loginId>[\w.+=:-]+@[0-9A-Za-z][0-9A-Za-z-]{0,62}(?:[.](?:[0-9A-Za-z][0-9A-Za-z-]{0,62}))*) %{GREEDYDATA:jsonstring}" ]
    }
    json {
      source => "jsonstring"
      target => "parsedJson"
      remove_field => ["jsonstring"]
    }
    mutate {
      add_field => {
        "actionType" => "%{[parsedJson][actionType]}"
        "errorMessage" => "%{[parsedJson][errorMessage]}"
        "actionName" => "%{[parsedJson][actionName]}"
        "Payload" => "%{[parsedJson][Payload]}"
        "pageInfo" => "%{[parsedJson][pageInfo]}"
        "browserInfo" => "%{[parsedJson][browserInfo]}"
        "dateTime" => "%{[parsedJson][dateTime]}"
      }
    }
  }
}
output {
  if "_grokparsefailure" in [tags] {
    elasticsearch {
      hosts => "localhost:9200"
      index => "grokparsefailure-%{+YYYY.MM.dd}"
    }
  }
  else {
    elasticsearch {
      hosts => "localhost:9200"
      index => "zindex"
    }
  }
  stdout { codec => rubydebug }
}
As new logs keep being written to the log files, I can see a difference in the log counts.
Any suggestions would be appreciated.

Loading a number of XML files into Logstash

I want to load a number of XML files into Logstash at the same time; what should I add to my config file?
Thanks for your support :)
This is my config file:
input {
  file {
    path => "D:/test*.xml"
    start_position => beginning
    sincedb_path => "NUL"
    codec => multiline {
      pattern => "<invoicing>|</invoicing>"
      negate => "true"
      what => "previous"
      auto_flush_interval => 1
      max_lines => 3000
    }
  }
}
filter {
  xml {
    source => "message"
    target => "message.parsed"
    store_xml => false
    force_array => false
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    index => "tizer005"
    hosts => ["localhost:9200"]
    document_type => "ChannelFiles"
  }
}

Logstash Not Reading "File" Input

I am trying to use a file as an input to Logstash. Here is my logstash.conf:
input {
  file {
    path => "/home/dxp/elb.log"
    type => "elb"
    start_position => "beginning"
    sincedb_path => "/home/dxp/log.db"
  }
}
filter {
  if [type] == "elb" {
    grok {
      match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE:loadbalancer} %{IP:client_ip}:%{NUMBER:client_port:int} %{IP:backend_ip}:%{NUMBER:backend_port:int} %{NUMBER:request_processing_time:float} %{NUMBER:backend_processing_time:float} %{NUMBER:response_processing_time:float} %{NUMBER:elb_status_code:int} %{NUMBER:backend_status_code:int} %{NUMBER:received_bytes:int} %{NUMBER:sent_bytes:int} %{QS:request}" ]
    }
  }
}
output {
  elasticsearch {
    hosts => "10.99.0.180:9200"
    manage_template => false
    index => "elblog-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
My logs show this:
[2017-10-27T13:11:31,164][DEBUG][logstash.inputs.file ]_globbed_files: /home/dxp/elb.log: glob is []
I guess my file has not been read by Logstash, so a new index is not created in Elasticsearch.
Please help me figure out what I am missing here.

How can I index a JSON file using Logstash?

I am trying to index my JSON file, shown below. I have to write a grok expression, but I could not manage it. Can you help me?
{"level":"Information","ClientIP":"10.201.21.188","Test":"10.210.21.188"}
{"level":"Information","ClientIP":"10.202.21.187","Test":"10.220.21.188"}
{"level":"Information","ClientIP":"10.203.21.186","Test":"10.230.21.188"}
{"level":"Information","ClientIP":"10.204.21.185","Test":"10.240.21.188"}
My logstash.conf is below:
input {
  file {
    type => "json"
    path => ["C:/logs/test-20170933.json"]
    start_position => "beginning"
  }
}
filter {
  grok {
    match => [ "message", "%{WORD:level} I HAVE TO WRITE OTHER ELEMENTS BUT HOW????" ]
  }
  json {
    source => "message"
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
I guess we need a grok expression to achieve that, but I am also open to other, more creative solutions.
You don't need to grok anything, your file input simply needs a JSON codec and you're good to go:
input {
  file {
    type => "json"
    path => ["C:/logs/test-20170933.json"]
    start_position => "beginning"
    codec => "json"          # <-- add this
  }
}
filter {
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
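As a side note, the json filter you already had would also work on its own, since each line of the file is a complete JSON object; a sketch of that variant (drop the grok, keep only the json filter) would be:

filter {
  # each line is already a full JSON object, so no grok is needed;
  # the parsed fields are added at the top level of the event
  json {
    source => "message"
  }
}

The end result is essentially the same as the codec; the codec just parses earlier, in the input stage, and does not keep the raw line in a message field.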

Elasticsearch - import CSV using Logstash: date is not parsed as datetime type

I am trying to import a CSV into Elasticsearch using Logstash.
I have tried two ways:
Using the csv filter
Using the grok filter
1) For the csv filter, below is my Logstash file:
input {
  file {
    path => "path_to_my_csv.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["col1","col2_datetime"]
  }
  mutate { convert => [ "col1", "float" ] }
  date {
    locale => "en"
    match => ["col2_datetime", "ISO8601"]   # also tried: match => ["col2_datetime", "yyyy-MM-dd HH:mm:ss"]
    timezone => "Asia/Kolkata"
    target => "@timestamp"                  # also tried: target => "col2_datetime"
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "my_collection"
  }
  stdout {}
}
2) Using the grok filter:
For the grok filter, below is my Logstash file:
input {
  file {
    path => "path_to_my_csv.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  grok {
    match => { "message" => "(?<col1>(?:%{BASE10NUM})),(%{TIMESTAMP_ISO8601:col2_datetime})" }
    remove_field => [ "message" ]
  }
  date {
    match => ["col2_datetime", "yyyy-MM-dd HH:mm:ss"]
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "my_collection_grok"
  }
  stdout {}
}
PROBLEM:
When I run both files individually, I am able to import the data into Elasticsearch, but my date field is not parsed as a datetime type; it is saved as a string instead, and because of that I am not able to run date filters on it.
Can someone help me figure out why this is happening?
My Elasticsearch version is 5.4.1.
Thanks in advance.
There are 2 changes I made to your config file:
1) removed the under_score in the column name, renaming col2_datetime to col2
2) added a target to the date filter
Here is what my config file looks like:
vi logstash.conf
input {
  file {
    path => "/config-dir/path_to_my_csv.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["col1","col2"]
  }
  mutate { convert => [ "col1", "float" ] }
  date {
    locale => "en"
    match => ["col2", "yyyy-MM-dd HH:mm:ss"]
    target => "col2"
  }
}
output {
  elasticsearch {
    hosts => "http://172.17.0.1:9200"
    index => "my_collection"
  }
  stdout {}
}
Here is the data file:
vi path_to_my_csv.csv
1234365,2016-12-02 19:00:52
1234368,2016-12-02 15:02:02
1234369,2016-12-02 15:02:07
