Plot a tile map with the ELK stack

I'm trying to create a tile map with Kibana. My Logstash config file works correctly and generates everything Kibana needs to plot a tile map. This is my Logstash config:
input {
  file {
    path => "/home/ec2-user/part.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["kilo_bytes_total","ip","session_number","request_number_total","duration_minutes_total","referer_list","filter_match_count_avg","request_number_avg","duration_minutes_avg","kilo_bytes_avg","segment_duration_avg","req_by_minute_avg","segment_mix_rank_avg","offset_avg_avg","offset_std_avg","extrem_interval_count_avg","pf0_avg","pf1_avg","pf2_avg","pf3_avg","pf4_avg","code_0_avg","code_1_avg","code_2_avg","code_3_avg","code_4_avg","code_5_avg","volume_classification_filter_avg","code_classification_filter_avg","profiles_classification_filter_avg","strange_classification_filter_avg"]
  }
  geoip {
    source => "ip"
    database => "/home/ec2-user/logstash-5.2.0/GeoLite2-City.mmdb"
    target => "geoip"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
    add_tag => "geoip"
  }
  mutate {
    convert => [ "[geoip][coordinates]", "float"]
  }
}
output {
  elasticsearch {
    index => "geotrafficip"
  }
}
And this is what it generates: [screenshot omitted]
It looks good. But when I try to create my tile map, I get this message: [screenshot omitted]
What should I do?
It seems that I must enable dynamic templates somewhere. Should I create a template and add it to my Logstash config file?
Can anybody give me some feedback? Thanks!

If you look in the Kibana settings for your index, you'll need at least one field to show up with a type of geo_point to be able to get anything on a map.
If you don't already have a geo_point field, you'll need to re-index your data after setting up an appropriate mapping for the geoip.coordinates field. For example: https://stackoverflow.com/a/42004303/2785358
If you are using a reasonably recent version of Elasticsearch (2.3 or later), re-indexing is straightforward: create a new index with the correct mapping, use the reindex API to copy the data into it, delete the original index, and then reindex back to the original name.
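The round trip described above can be sketched against the Elasticsearch REST API. The new index name, the `logs` type name, and the exact mapping below are assumptions to adapt to your data:

```
# 1. Create a new index whose mapping declares the geo_point field
#    (the "logs" type name is a placeholder; use your actual type).
PUT geotrafficip_v2
{
  "mappings": {
    "logs": {
      "properties": {
        "geoip": {
          "properties": {
            "coordinates": { "type": "geo_point" }
          }
        }
      }
    }
  }
}

# 2. Copy the existing data into the new index.
POST _reindex
{
  "source": { "index": "geotrafficip" },
  "dest":   { "index": "geotrafficip_v2" }
}

# 3. Drop the old index.
DELETE geotrafficip
```

Then repeat steps 1 and 2 in the opposite direction to get the data back under the original name, or simply leave it in the new index and point Kibana there.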

You are using the geoip filter incorrectly, and you are trying to convert the longitude and latitude to float. Get rid of your mutate filter and change the geoip filter to this:
geoip {
  source => "ip"
  fields => ["latitude","longitude"]
  add_tag => "geoip"
}
This will create the appropriate fields and the required GeoJSON object.
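With that filter in place, an indexed event should carry the latitude/longitude pair under `geoip`. The values below are made up purely for illustration:

```
{
  "ip": "203.0.113.7",
  "geoip": {
    "latitude": 48.8582,
    "longitude": 2.3387
  },
  "tags": ["geoip"]
}
```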

Related

Logstash filter to identify address matches

I have a CSV file with customer addresses. I also have an Elasticsearch index with my own addresses. I use Logstash to import the CSV file. I'd like to use a Logstash filter to check in my index whether a customer address already exists. All I found is the default elasticsearch filter ("Copies fields from previous log events in Elasticsearch to current events"), which doesn't look like the right one for my problem. Does another filter exist for this?
Here is my configuration file so far:
input {
  file {
    path => "C:/import/Logstash/customer.CSV"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}
filter {
  csv {
    columns => [
      "Customer",
      "City",
      "Address",
      "State",
      "Postal Code"
    ]
    separator => ";"
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "customer-unmatched"
  }
  stdout{}
}
You don't normally have access to the data in Elasticsearch while processing a Logstash event. Consider using a pipeline on an ingest node instead.
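If you are on Elasticsearch 7.5 or later, one way to do this lookup at ingest time is the enrich processor. A sketch, assuming your own addresses live in an index called `my-addresses` with an `Address` field (all names here are assumptions):

```
# Define which index and fields to match against.
PUT _enrich/policy/address-policy
{
  "match": {
    "indices": "my-addresses",
    "match_field": "Address",
    "enrich_fields": ["City", "State"]
  }
}

# Build the enrich index from the policy.
POST _enrich/policy/address-policy/_execute

# Pipeline that copies matching address data onto incoming documents.
PUT _ingest/pipeline/match-address
{
  "processors": [
    {
      "enrich": {
        "policy_name": "address-policy",
        "field": "Address",
        "target_field": "matched_address",
        "ignore_missing": true
      }
    }
  ]
}
```

Documents that come out of the pipeline with a `matched_address` field already exist in your address index; you can then query for documents lacking that field to find the unmatched ones.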

ELK Stack - Customize autogenerated field mappings

I've got a very basic ELK stack set up and am passing logs to it via syslog. I have used built-in grok patterns to split the logs into fields, but the field mappings are auto-generated by the Logstash elasticsearch plugin and I am unable to customize them.
For instance, I create a new field named "dst-geoip" using my Logstash config file (see below):
geoip {
  database => "/usr/local/share/GeoIP/GeoLiteCity.dat" ### Change me to location of GeoLiteCity.dat file
  source => "dst_ip"
  target => "dst_geoip"
  fields => [ "ip", "country_code2", "country_name", "latitude", "longitude","location" ]
  add_field => [ "coordinates", "%{[dst_geoip][latitude]},%{[dst_geoip][longitude]}" ]
  add_field => [ "dst_country", "%{[dst_geoip][country_code2]}"]
  add_field => [ "flow_dir", "outbound" ]
}
I want to assign it the type geo_point, which I cannot edit from Kibana. Online documents mention manually updating the mapping on the respective index using the Elasticsearch APIs, but Logstash generates many indices (one per day). If I update one index, will the mapping stay the same in future indices?
What you're looking for is a "template".
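A minimal sketch of such a template for the Elasticsearch 2.x/5.x era (the template name and field name are assumptions to adapt):

```
PUT _template/dst_geoip_template
{
  "template": "logstash-*",
  "mappings": {
    "_default_": {
      "properties": {
        "coordinates": { "type": "geo_point" }
      }
    }
  }
}
```

Because Logstash creates one index per day, every new daily index matching `logstash-*` picks the template up automatically; existing indices keep their old mapping until reindexed.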

“rename” filter didn’t rename the field in events

I used the filter plugin "rename" to rename a field in my events. Logstash didn't show any errors on restart, but there don't seem to be any changes to the field name (checked in the Sense plugin). I added "rename" after indexing the file. How can I rename a field after indexing? I'm using LS 2.0 and ES 2.0. This is a piece of my logstash.conf file:
filter {
  grok {
    match => {"message" => "^(?<EventTime>[0-9 \-\:]*)\s(?<MovieName>[\w.\-\']*)\s(?<Rating>[\d.]+)\s(?<NoOfDownloads>\d+)\s(?<NoOfViews>\d+)" }
  }
  mutate {
    convert => {"Rating" => "float"}
  }
  mutate {
    convert => {"NoOfDownloads" => "integer"}
  }
  mutate {
    convert => {"NoOfViews" => "integer"}
  }
  mutate {
    rename => { "NoOfViews" => "TotalViews" }
  }
}
You need to either reindex or create an Elasticsearch alias for the already-indexed data. Directly renaming a field on indexed data is not possible.
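Note that much newer Elasticsearch versions (6.4+, so not the ES 2.0 used in the question) also offer field aliases, which expose an indexed field under a second name without reindexing. A sketch, assuming an index called `movies`:

```
PUT movies/_mapping
{
  "properties": {
    "TotalViews": {
      "type": "alias",
      "path": "NoOfViews"
    }
  }
}
```

Queries and aggregations on `TotalViews` then resolve to the stored `NoOfViews` field; the underlying field name in `_source` is unchanged.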

Logstash kibana geoip filter conflict

I have been trying to make the geo filter work for my logs, with no luck so far.
I keep recreating my Logstash index in ES and recreating the GeoIP field with the default type, double, and float, but Kibana keeps complaining that my geoip.location property has a conflict.
Any suggestion would be appreciated.
geoip {
  source => "[headers][x-forwarded-for]"
  target => "geoip"
  database => "/etc/logstash/GeoLiteCity.dat"
  add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
  add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
}
mutate {
  convert => [ "[geoip][coordinates]", "float"]
}
Resolved the issue by specifying a default mapping template; after recreating the index, geoip.location now has the geo_point data type.
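To verify that the template took effect, you can ask Elasticsearch for the mapping of that one field (the index name below is a placeholder):

```
GET logstash-2017.02.01/_mapping/field/geoip.location
```

The response should report `"type": "geo_point"`. A conflict in Kibana typically means different daily indices answer this question differently, so check more than one index.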

Data type conversion using logstash grok

Basic is a float field. The mentioned index is not present in Elasticsearch. When running the config file with logstash -f, I get no exception, yet the data that ends up in Elasticsearch shows the mapping of Basic as string. How do I rectify this? And how do I do this for multiple fields?
input {
  file {
    path => "/home/sagnik/work/logstash-1.4.2/bin/promosms_dec15.csv"
    type => "promosms_dec15"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  grok {
    match => [
      "Basic", " %{NUMBER:Basic:float}"
    ]
  }
  csv {
    columns => ["Generation_Date","Basic"]
    separator => ","
  }
  ruby {
    code => "event['Generation_Date'] = Date.parse(event['Generation_Date']);"
  }
}
output {
  elasticsearch {
    action => "index"
    host => "localhost"
    index => "promosms-%{+dd.MM.YYYY}"
    workers => 1
  }
}
You have two problems. First, your grok filter is listed prior to the csv filter, and because filters are applied in order, there won't be a "Basic" field to convert when the grok filter is applied.
Second, unless you explicitly allow it, grok won't overwrite existing fields. In other words,
grok {
  match => [
    "Basic", " %{NUMBER:Basic:float}"
  ]
}
will always be a no-op. Either specify overwrite => ["Basic"] or, preferably, use mutate's type conversion feature:
mutate {
  convert => ["Basic", "float"]
}
