ELK - date, defined in logstash shows as string in kibana - elasticsearch

I have the following config file for logstash:
input {
  file {
    path => "/home/elk/data/visits.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}
filter {
  csv {
    separator => ","
    columns => ["estado","tiempo_demora","poblacion","id_poblacion","edad_valor","cp","latitude_corregida","longitud_corregida","patologia","Fecha","id_tipo","id_personal","nasistencias","menor","Geopoint_corregido"]
  }
  date {
    match => ["Fecha","dd-MM-YYYY HH:mm"]
    target => "Fecha"
  }
  mutate {convert => ["nasistencias", "integer"]}
  mutate {convert => ["id_poblacion", "integer"]}
  mutate {convert => ["id_personal", "integer"]}
  mutate {convert => ["id_tipo", "integer"]}
  mutate {convert => ["cp", "integer"]}
  mutate {convert => ["edad_valor", "integer"]}
  mutate {
    convert => { "longitud_corregida" => "float" }
    convert => { "latitude_corregida" => "float" }
  }
  mutate {
    rename => {
      "longitud_corregida" => "[location][lon]"
      "latitude_corregida" => "[location][lat]"
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "medicalvisits-%{+dd.MM.YYYY}"
  }
  stdout {
    # a single codec per output; rubydebug prints the full event
    codec => rubydebug
  }
}
From there, Fecha should have been sent to Elasticsearch as a date, but in Kibana, when I try to set it as the timestamp field, it doesn't appear, and it shows up as a string.
Any idea what I am doing wrong here?

The types in your index patterns are not the same as the types in your index templates (the information actually being stored).
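You can check how the field is actually stored by asking Elasticsearch for its mapping, for example with the field mapping API (the index pattern here is assumed from the output section of your config; adjust it to your real index name):
GET medicalvisits-*/_mapping/field/Fecha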
I would suggest that you overwrite the timestamp with the information sent by your Logstash. After all, what matters to you in most cases is the timestamp of the event itself, not the time at which the event was sent to your Elasticsearch.
With that said, why don't you just save your "Fecha" directly into "@timestamp" by means of the "date" filter in Logstash? Like this:
date {
  match => ["Fecha","dd-MM-YYYY HH:mm"]
  target => "@timestamp"
  tag_on_failure => ["fallo_filtro_fecha"]
}
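Optionally, once the value has been copied into @timestamp, you could also drop the original string field with a mutate filter (just a minimal optional sketch):
mutate {
  remove_field => ["Fecha"]
}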
Another option, if you really need that "Fecha" alongside "@timestamp" (not the best idea) and with Fecha being of type "date", is to modify your index mapping to change that field's type to date. Like this (adjust as necessary):
PUT /nombre_de_tu_indice/_mapping
{
  "properties": {
    "Fecha": {
      "type": "date"
    }
  }
}
Of course, this change will only affect newly indexed indices or a re-indexed one.
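If you need the corrected type for data that is already indexed, the usual approach is to create a new index with the corrected mapping first and then copy the documents over with the reindex API, roughly like this (both index names are placeholders):
POST _reindex
{
  "source": { "index": "nombre_de_tu_indice" },
  "dest": { "index": "nombre_de_tu_indice_corregido" }
}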

Related

Removing grok matched field after using it

I use Filebeat to ship log files into my Logstash and then filter out unnecessary fields. Everything works fine and I output these into Elasticsearch, but there is a field which I use as the Elasticsearch index name. I define this variable in my grok match, but I couldn't find a way to remove it once it has served its purpose. I'll share my Logstash config below:
input {
  beats {
    port => "5044"
  }
}
filter {
  grok {
    match => { "[log][file][path]" => ".*(\\|\/)(?<myIndex>.*)(\\|\/).*.*(\\|\/).*(\\|\/).*(\\|\/).*(\\|\/)" }
  }
  json {
    source => "message"
  }
  mutate {
    remove_field => ["agent"]
    remove_field => ["input"]
    remove_field => ["@metadata"]
    remove_field => ["log"]
    remove_field => ["tags"]
    remove_field => ["host"]
    remove_field => ["@version"]
    remove_field => ["message"]
    remove_field => ["event"]
    remove_field => ["ecs"]
  }
  date {
    match => ["t","yyyy-MM-dd HH:mm:ss.SSS"]
    remove_field => ["t"]
  }
  mutate {
    rename => ["l","log_level"]
    rename => ["mt","msg_template"]
    rename => ["p","log_props"]
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9222" ]
    index => "%{myIndex}"
  }
  stdout { codec => rubydebug { metadata => true } }
}
I just want to remove the "myIndex" field from my index. With this config file, I still see this field in Elasticsearch, and if possible I want to remove it. I tried removing it together with the other fields, but that gave an error. I guess it's because I removed it before Logstash could pass it to Elasticsearch.
Create the field under [@metadata]. Those fields are available for use within Logstash, but they are ignored by outputs unless the output uses a rubydebug codec with metadata enabled.
Adjust your grok filter:
match => { "[log][file][path]" => ".*(\\|\/)(?<[@metadata][myIndex]>.*)(\\|\/).*.*(\\|\/).*(\\|\/).*(\\|\/).*(\\|\/)" }
Delete [@metadata] from the mutate+remove_field and change the output configuration to have
index => "%{[@metadata][myIndex]}"

elasticsearch - import csv using logstash date is not parsed as of datetime type

I am trying to import a CSV into Elasticsearch using Logstash.
I have tried two ways:
Using the csv filter
Using a grok filter
1) For the csv filter, below is my Logstash file:
input {
  file {
    path => "path_to_my_csv.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["col1","col2_datetime"]
  }
  mutate {convert => [ "col1", "float" ]}
  date {
    locale => "en"
    match => ["col2_datetime", "ISO8601"] # also tried: match => ["col2_datetime", "yyyy-MM-dd HH:mm:ss"]
    timezone => "Asia/Kolkata"
    target => "@timestamp" # also tried: target => "col2_datetime"
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "my_collection"
  }
  stdout {}
}
2) Using grok filter:
For the grok filter, below is my Logstash file:
input {
  file {
    path => "path_to_my_csv.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  grok {
    match => { "message" => "(?<col1>(?:%{BASE10NUM})),(%{TIMESTAMP_ISO8601:col2_datetime})"}
    remove_field => [ "message" ]
  }
  date {
    match => ["col2_datetime", "yyyy-MM-dd HH:mm:ss"]
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "my_collection_grok"
  }
  stdout {}
}
PROBLEM:
When I run both files individually, I am able to import the data into Elasticsearch, but my date field is not parsed as a datetime type; it is saved as a string instead, and because of that I am not able to run date filters.
Can someone help me figure out why this is happening?
My Elasticsearch version is 5.4.1.
Thanks in advance.
There are 2 changes I made to your config file:
1) remove the underscore in the column name col2_datetime
2) add a target
Here is how my config file looks...
vi logstash.conf
input {
  file {
    path => "/config-dir/path_to_my_csv.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["col1","col2"]
  }
  mutate {convert => [ "col1", "float" ]}
  date {
    locale => "en"
    match => ["col2", "yyyy-MM-dd HH:mm:ss"]
    target => "col2"
  }
}
output {
  elasticsearch {
    hosts => "http://172.17.0.1:9200"
    index => "my_collection"
  }
  stdout {}
}
Here is the data file:
vi path_to_my_csv.csv
1234365,2016-12-02 19:00:52
1234368,2016-12-02 15:02:02
1234369,2016-12-02 15:02:07
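If you would rather have the parsed value drive the event's @timestamp instead of keeping it in col2, the same date filter can simply target @timestamp (a small variation on the answer above):
date {
  locale => "en"
  match => ["col2", "yyyy-MM-dd HH:mm:ss"]
  target => "@timestamp"
}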

Custom date time is same but not matching in grok date filter logstash

The input is comma separated values:
"2010-08-19","09:12:55","56095675"
I created a custom date_time field which appears to be in the right format (2010-08-19;09:12:55), but it is not matching.
filter {
  grok {
    match => { "message" => '"(%{GREEDYDATA:cust_date})","(%{TIME:cust_time})","(%{NUMBER:author})"'}
    add_field => {
      "date_time" => "%{cust_date};%{cust_time}"
    }
  }
}
date {
  match => ["date_time", "yyyy-MM-dd;hh:mm:ss"]
  target => "@timestamp"
  add_field => { "debug" => "timestampMatched"}
}
Output on Kibana:
cust_date August 18th 2010, 20:00:00.000
cust_time 09:12:55
date_time 2010-08-19;09:12:55
message "2010-08-19","09:12:55","56095675"
tags beats_input_codec_plain_applied, _dateparsefailure
It gives _dateparsefailure, even though the fields appear to match the pattern.
I tried different time formats such as YYYY-MM-dd;hh:mm:ss and YYYY-MM-dd;HH:mm:ss.
What am I doing wrong?
Help!
You should put the date plugin inside the filter section, right after the grok block:
filter {
  grok {
    match => { "message" => '"(%{GREEDYDATA:cust_date})","(%{TIME:cust_time})","(%{NUMBER:author})"'}
    add_field => {
      "date_time" => "%{cust_date};%{cust_time}"
    }
  }
  date {
    match => ["date_time", "yyyy-MM-dd;hh:mm:ss"]
    target => "@timestamp"
    add_field => { "debug" => "timestampMatched"}
  }
}
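Note also that hh is the 12-hour specifier; if any of your timestamps fall in the afternoon (hours 13-23), the 24-hour HH form is the safer match (a minor variation of the same filter):
date {
  match => ["date_time", "yyyy-MM-dd;HH:mm:ss"]
  target => "@timestamp"
  add_field => { "debug" => "timestampMatched" }
}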

Import CSV File as input in logstash and pass the output data to elasticsearch

I am new to Elasticsearch, and I am trying to import data from a CSV into Elasticsearch using Logstash and show the output in Kibana. I am using the Logstash conf file below.
input {
  file {
    path => ["D:/AnalyticsTool/Exe/logstash-5.2.0/logstash-5.2.0/bin/data.csv"]
    type => "core2"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Date","Open","High","Low","Close","Volume","Adj Close"]
  }
  mutate {convert => ["High", "float"]}
  mutate {convert => ["Open", "float"]}
  mutate {convert => ["Low", "float"]}
  mutate {convert => ["Close", "float"]}
  mutate {convert => ["Volume", "float"]}
}
output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "stock"
    workers => 1
    user => "elastic"
    password => "changeme"
  }
  stdout { codec => rubydebug}
}
I am not able to visualize the data in Kibana. Do I need to create the index manually in Elasticsearch, or will Kibana create the above index automatically?

How to type data input in logstash

I'm trying to load a CSV file into Elasticsearch through Logstash.
This is my configuration file:
input {
  file {
    codec => plain {
      charset => "ISO-8859-1"
    }
    path => ["PATH/*.csv"]
    sincedb_path => "PATH/.sincedb_path"
    start_position => "beginning"
  }
}
filter {
  if [message] =~ /^"ID","DATE"/ {
    drop { }
  }
  date {
    match => [ "DATE","yyyy-MM-dd HH:mm:ss" ]
    target => "DATE"
  }
  csv {
    columns => ["ID","DATE",...]
    separator => ","
    source => "message"
    remove_field => ["message","host","path","@version","@timestamp"]
  }
}
output {
  elasticsearch {
    embedded => false
    host => "localhost"
    cluster => "elasticsearch"
    node_name => "localhost"
    index => "index"
    index_type => "type"
  }
}
Now, the mapping produced in Elasticsearch types the DATE field as a string. I would like it to be typed as a date field.
In the filter section, I tried to convert the field to a date, but it doesn't work.
How can I fix that?
Regards,
Alexandre
You have your filter chain set up in the wrong order. The date{} block needs to come after the csv{} block.
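A minimal sketch of the corrected ordering, based on the filter section above (adjust the column list and date format to your file):
filter {
  if [message] =~ /^"ID","DATE"/ {
    drop { }
  }
  csv {
    columns => ["ID","DATE"]
    separator => ","
    source => "message"
  }
  # DATE now exists as a field, so it can be parsed as a date
  date {
    match => [ "DATE","yyyy-MM-dd HH:mm:ss" ]
    target => "DATE"
  }
}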
