Elasticsearch long and double mapping does not work

I am using Logstash to push logs from DynamoDB to ES:
filter {
  json {
    source => "message"
    target => "doc"
  }
  mutate {
    convert => {
      "[doc][dynamodb][keys][customerId][N]" => "integer"
      "[doc][dynamodb][newImage][callDate][N]" => "integer"
      "[doc][dynamodb][newImage][price][S]" => "float"
    }
  }
  date {
    match => [ "[doc][dynamodb][newImage][callDate][N]", "UNIX" ]
    target => "@timestamp"
  }
}
output {
  elasticsearch {
    hosts => ["localhost"]
    codec => "json"
    index => "cdr-%{+YYYY.MM.dd}"
    document_type => "cdr"
    document_id => "%{[doc][dynamodb][keys][uniqueId][S]}"
    template_name => "cdr"
    template => "/opt/logstash/templates/logstash_dynamodb_template.json"
    template_overwrite => true
  }
  stdout { }
}
In fact, mutate.convert makes no difference: the result is the same whether the block is present or removed.
{
  "order": 0,
  "template": "cdr*",
  "settings": {
    "index.refresh_interval": "5s"
  },
  "mappings": {
    "cdr": {
      "dynamic": "false",
      "_all": {
        "enabled": false
      },
      "properties": {
        "doc": {
          "properties": {
            "dynamodb": {
              "properties": {
                "keys": {
                  "properties": {
                    "customerId": {
                      "properties": {
                        "N": {
                          "type": "long"
                        }
                      }
                    },
                    "uniqueId": {
                      "properties": {
                        "S": {
                          "type": "string",
                          "index": "not_analyzed"
                        }
                      }
                    }
                  }
                },
                "newImage": {
                  "properties": {
                    "callDate": {
                      "properties": {
                        "N": {
                          "type": "date",
                          "format": "epoch_second"
                        }
                      }
                    },
                    "direction": {
                      "properties": {
                        "S": {
                          "type": "string",
                          "index": "not_analyzed"
                        }
                      }
                    },
                    "disposition": {
                      "properties": {
                        "S": {
                          "type": "string",
                          "index": "not_analyzed"
                        }
                      }
                    },
                    "price": {
                      "properties": {
                        "S": {
                          "type": "double"
                        }
                      }
                    },
                    "uniqueId": {
                      "properties": {
                        "S": {
                          "type": "string",
                          "index": "not_analyzed"
                        }
                      }
                    }
                  }
                }
              }
            }
          }
        }
      }
    }
  }
}
Yes, doc.message contains all the described fields, but they are not mapped. (A screenshot from ES was attached here; it shows that only the string mappings work as expected.)
Querying fails with: No mapping found for [doc.dynamodb.newImage.callDate.N] in order to sort on
Does anyone know the reason for this behavior?
Note: running Logstash in debug mode (bin/logstash -f filters/logstash-dynamodb.conf --debug) shows no errors.
Thanks in advance for any ideas.
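A quick way to narrow this down is to ask Elasticsearch what it actually has, since an index created before the template was (re)loaded keeps its old mapping. A minimal check (the daily index name below is a placeholder for one of your real indices):

GET /_template/cdr

GET /cdr-2016.01.01/_mapping

If the template looks right but a concrete index still shows the wrong types, rolling over to a fresh cdr-* index (or reindexing) should pick up the template, since templates apply only at index creation.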

Related

Can't Filter by geoip.location

Using ELK 6.x, it seems I cannot plot points because geoip.location is not populated.
I also added a template which I hope is correct. I'm not an expert, but I am pretty sure my points aren't rendered because the data is missing there.
Kibana 6.4.2
Logstash 6.4.2-1
Elasticsearch 6.4.2
Here are my configs:
input {
  udp {
    port => 9996
    codec => netflow {
      versions => [5, 7, 9, 10]
    }
    type => netflow
  }
}
filter {
  geoip {
    source => "[netflow][ipv4_src_addr]"
    target => "src_geoip"
    database => "/usr/share/GeoIP/GeoLite2-City.mmdb"
  }
  geoip {
    source => "[netflow][ipv4_dst_addr]"
    target => "dst_geoip"
    database => "/usr/share/GeoIP/GeoLite2-City.mmdb"
  }
}
output {
  if [type] == "netflow" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "logstash-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      sniffing => true
      manage_template => false
      index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  }
}
The mapping looks like this:
"geoip": {
"dynamic": "true",
"properties": {
"ip": {
"type": "ip"
},
"latitude": {
"type": "half_float"
},
"location": {
"type": "geo_point"
},
"longitude": {
"type": "half_float"
}
}
},
Template
{
  "logstash": {
    "order": 0,
    "version": 60001,
    "index_patterns": [
      "logstash-*"
    ],
    "settings": {
      "index": {
        "refresh_interval": "5s"
      }
    },
    "mappings": {
      "_default_": {
        "dynamic_templates": [
          {
            "message_field": {
              "path_match": "message",
              "match_mapping_type": "string",
              "mapping": {
                "type": "text",
                "norms": false
              }
            }
          },
          {
            "string_fields": {
              "match": "*",
              "match_mapping_type": "string",
              "mapping": {
                "type": "text",
                "norms": false,
                "fields": {
                  "keyword": {
                    "type": "keyword",
                    "ignore_above": 256
                  }
                }
              }
            }
          }
        ],
        "properties": {
          "@timestamp": {
            "type": "date"
          },
          "@version": {
            "type": "keyword"
          },
          "geoip": {
            "dynamic": true,
            "properties": {
              "ip": {
                "type": "ip"
              },
              "location": {
                "type": "geo_point"
              },
              "latitude": {
                "type": "half_float"
              },
              "longitude": {
                "type": "half_float"
              }
            }
          }
        }
      }
    },
    "aliases": {}
  }
}
My indices do come back with src/dst fields, but only the ones below, mapped as plain numbers:
dst_geoip.latitude 26.097
dst_geoip.location.lat 26.097
dst_geoip.location.lon -80.181
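One thing that stands out (a hedged observation, since only a fragment of the template is shown): the template declares geoip.location as geo_point, but the filters write into src_geoip and dst_geoip, so those trees fall through to the generic string/number dynamic templates and location never becomes a geo_point. A sketch of the extra entries the template's properties section would need under that assumption, mirroring the existing geoip block:

"src_geoip": {
  "dynamic": true,
  "properties": {
    "ip":        { "type": "ip" },
    "location":  { "type": "geo_point" },
    "latitude":  { "type": "half_float" },
    "longitude": { "type": "half_float" }
  }
},
"dst_geoip": {
  "dynamic": true,
  "properties": {
    "ip":        { "type": "ip" },
    "location":  { "type": "geo_point" },
    "latitude":  { "type": "half_float" },
    "longitude": { "type": "half_float" }
  }
}

Because templates only apply at index creation, this would take effect on the next day's logstash-* index (or after a reindex), not on existing indices.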

ElasticSearch overriding mapping from text to object

I am trying to override a mapping for a field.
There is a default index template (which I can't change) and I am overriding it with a custom one.
The default index template maps the "message" field as text, but I need it to be treated as an object, with its fields indexable/searchable.
This is the default index template, with order 10.
{
  "mappings": {
    "_default_": {
      "dynamic_templates": [
        {
          "message_field": {
            "mapping": {
              "index": true,
              "norms": false,
              "type": "text"
            },
            "match": "message",
            "match_mapping_type": "string"
          }
        },
        ...
      ],
      "properties": {
        "message": {
          "doc_values": false,
          "index": true,
          "norms": false,
          "type": "text"
        },
        ...
      }
    }
  },
  "order": 10,
  "template": "project.*"
}
And here's my override:
{
  "template": "project.*",
  "order": 100,
  "dynamic_templates": [
    {
      "message_field": {
        "mapping": {
          "type": "object"
        },
        "match": "message"
      }
    }
  ],
  "mappings": {
    "message": {
      "enabled": true,
      "properties": {
        "tag": {"type": "string", "index": "not_analyzed"},
        "requestId": {"type": "integer"},
        ...
      }
    }
  }
}
This works nicely, but I end up having to define every field (tag, requestId, ...) in the "message" object.
Is there a way to make all the fields in the "message" object indexable/searchable?
Here's a sample document:
{
  "level": "30",
  ...
  "kubernetes": {
    "container_name": "data-sync-server",
    "namespace_name": "alitest03",
    ...
  },
  "message": {
    "tag": "AUDIT",
    "requestId": 1234,
    ...
  }
}
Tried lots of things, but I can't make it work.
I am using ElasticSearch version 2.4.4.
You can use the path_match property in your dynamic mapping. Something like:
{
  "template": "project.*",
  "order": 100,
  "mappings": {
    "<your document type here>": {
      "dynamic_templates": [
        {
          "message_field": {
            "mapping": {
              "type": "object"
            },
            "match": "message"
          }
        },
        {
          "message_properties": {
            "path_match": "message.*",
            "mapping": {
              "type": "string",
              "index": "not_analyzed"
            }
          }
        }
      ]
    }
  }
}
But you may have to distinguish between string and numeric fields with match_mapping_type.
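For instance, a sketch of that distinction (untested against 2.4.4; the string/long split is assumed from the sample document, where tag is a string and requestId a number):

"dynamic_templates": [
  {
    "message_strings": {
      "path_match": "message.*",
      "match_mapping_type": "string",
      "mapping": {
        "type": "string",
        "index": "not_analyzed"
      }
    }
  },
  {
    "message_numbers": {
      "path_match": "message.*",
      "match_mapping_type": "long",
      "mapping": {
        "type": "long"
      }
    }
  }
]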

query on a date range in elasticsearch

I want to get the documents from the last 30 days in Elasticsearch, but the query returns nothing.
This is my mapping:
PUT /books
{
  "mappings": {
    "impressions": {
      "properties": {
        "booksCreated": {
          "type": "date",
          "format": "yyyy-MM-dd HH:mm:ss||yyyy-MM-dd||epoch_millis",
          "index": true
        }
      }
    }
  }
}
and this is my query:
POST /books/_search?size=0
{
  "aggs": {
    "range": {
      "date_range": {
        "field": "booksCreated",
        "format": "yyyy-MM-dd",
        "ranges": [
          { "to": "now" },
          { "from": "now-1M/M" }
        ]
      }
    }
  }
}
I've tried every variation I can think of, but it still returns nothing. I can, however, query on the @timestamp field.
The problem is that Logstash changes the field type from date to string. My JSON is:
{
  "index": "books",
  "type": "book",
  "body": {
    "impressions": {
      "_source": {
        "enabled": true
      },
      "properties": {
        "BookCreated": "2017-09-18 12:18:39"
      }
    }
  }
}
and my logstash config:
input {
  file {
    path => "E:\data2\log\logstash.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => json
  }
}
filter {
  mutate {
    strip => ["message"]
  }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "books"
    document_type => "book"
  }
}
I log the JSON to a log file, and Logstash sends it to Elasticsearch.
After adding the JSON, the mapping changes to this:
{
  "Books": {
    "mappings": {
      "Books": {
        "properties": {
          "@timestamp": {
            "type": "date"
          },
          "@version": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          },
          "BookCreated": {
            "type": "date",
            "format": "yyyy-MM-dd HH:mm:ss"
          },
          "body": {
            "properties": {
              "Books": {
                "properties": {
                  "_source": {
                    "properties": {
                      "enabled": {
                        "type": "boolean"
                      }
                    }
                  },
                  "properties": {
                    "properties": {
                      "BookCreated": {
                        "type": "text",
                        "fields": {
                          "keyword": {
                            "type": "keyword",
                            "ignore_above": 256
                          }
                        }
                      }
                    }
                  }
                }
              }
            }
          },
          "host": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          },
          "index": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          },
          "path": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          },
          "type": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          }
        }
      }
    }
  }
}
It has two BookCreated fields; one is date and the other is text.
You need to put from and to in the same range, like this:
POST /books/_search?size=0
{
  "aggs": {
    "range": {
      "date_range": {
        "field": "BookCreated",
        "format": "yyyy-MM-dd",
        "ranges": [
          {
            "from": "now-1M/M",
            "to": "now"
          }
        ]
      }
    }
  }
}
I'm pretty sure there is an issue with your mapping. First of all, make sure the bookCreated field is named consistently, in both spelling and capitalization!
Secondly, I believe the reason you have two bookCreated fields is that your mapping contains a top-level bookCreated property, while your JSON contains a nested structure: body => properties => bookCreated. Either flatten/transform the book in Logstash to the required index structure, or model your index according to your JSON, which could look something like this:
"mappings": {
"properties": {
"body": {
"type": "object",
"properties": {
"properties": {
"type": "object",
"properties": {
"bookCreated": {
"type": "date",
"format": "yyyy-MM-dd HH:mm:ss||yyyy-MM-dd||epoch_millis",
"index": true
}
}
}
}
}
}
}
Either way, I recommend setting "dynamic": "strict" so that you actually see when you make a mistake in the mapping, rather than new fields silently being created.
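For illustration, a minimal sketch of that suggestion (index and type names assumed from the question; with strict, indexing a document that contains an unmapped field is rejected with a strict_dynamic_mapping_exception instead of silently widening the mapping):

PUT /books
{
  "mappings": {
    "book": {
      "dynamic": "strict",
      "properties": {
        "BookCreated": {
          "type": "date",
          "format": "yyyy-MM-dd HH:mm:ss||yyyy-MM-dd||epoch_millis"
        }
      }
    }
  }
}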

ElasticSearch Dynamic Template for nested

I am trying to create a dynamic template for a nested object.
Here is a document to index:
{
  "title": "My title",
  "attributes": {
    "color": {
      "id": 42,
      "label": "red"
    },
    "material": {
      "id": 43,
      "label": "textile"
    }
  }
}
This is the template I tried, without success:
{
  "dynamic": "false",
  "dynamic_templates": [
    {
      "attributes": {
        "path_match": "attributes",
        "mapping": {
          "type": "nested"
        }
      }
    },
    {
      "attributes_nested": {
        "path_match": "attributes.*",
        "mapping": {
          "properties": {
            "id": {
              "type": "integer"
            },
            "value": {
              "type": "string"
            }
          }
        }
      }
    }
  ],
  "properties": {
    "title": {
      "type": "string"
    }
  }
}
I'd like to be able to run aggregations on attributes.color.id and attributes.material.id.
Never mind, the problem was that I had
{ "dynamic": false }
The correct mapping is:
{
  "dynamic": "false",
  "dynamic_templates": [
    {
      "attributes_nested": {
        "path_match": "attributes.*",
        "mapping": {
          "properties": {
            "id": {
              "type": "integer"
            },
            "value": {
              "type": "string"
            }
          }
        }
      }
    }
  ],
  "properties": {
    "title": {
      "type": "string"
    },
    "attributes": {
      "type": "nested",
      "dynamic": true
    }
  }
}
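With that mapping in place, the desired aggregations have to go through a nested aggregation, since attributes is a nested type. A sketch (the index name my_index is a placeholder):

POST /my_index/_search?size=0
{
  "aggs": {
    "attrs": {
      "nested": {
        "path": "attributes"
      },
      "aggs": {
        "color_ids": {
          "terms": {
            "field": "attributes.color.id"
          }
        }
      }
    }
  }
}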

Make a field "indexed" in Elasticsearch

I am using the ELK stack with Filebeat.
I am using a default template for the mapping.
I am not getting all the fields I need as "indexed".
Here is my mapping file:
{
  "mappings": {
    "_default_": {
      "_all": {
        "enabled": true,
        "norms": {
          "enabled": false
        }
      },
      "dynamic_templates": [
        {
          "template1": {
            "mapping": {
              "doc_values": true,
              "ignore_above": 1024,
              "index": "not_analyzed",
              "type": "{dynamic_type}"
            },
            "match": "*"
          }
        }
      ],
      "properties": {
        "@timestamp": {
          "type": "date"
        },
        "offset": {
          "type": "long",
          "doc_values": "true"
        }
      }
    }
  },
  "settings": {
    "index.refresh_interval": "5s"
  },
  "template": "filebeat-*"
}
Here is the output section of my config file:
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
Let's say I want a field named channel as an indexed field. How do I modify the template?
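One plausible route (a sketch, not a confirmed answer): declare channel explicitly under properties, next to @timestamp and offset, rather than relying on the catch-all template1:

"properties": {
  "@timestamp": {
    "type": "date"
  },
  "offset": {
    "type": "long",
    "doc_values": "true"
  },
  "channel": {
    "type": "string",
    "index": "not_analyzed",
    "doc_values": true
  }
}

Note that the output sets manage_template => false, so Logstash will not upload the modified file; the template has to be pushed to Elasticsearch separately (e.g. PUT /_template/filebeat with the file's contents), and it only applies to newly created filebeat-* indices.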
