Need info on getting the date from a filename in Logstash - logstash-configuration

Currently I have a filename with the format below:
[XXXXXXXX][YYYYYYYYYY][2016_07_21][19_21_12][160721T192103][ZZZZ]AB_RTRT.0.log.
Is there a way I can extract the datetime stamp and index it into a specific field in Elasticsearch?
Thanks,
Subbu
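One possible approach (an untested sketch; it assumes the original filename is available in an event field named "filename" — adjust to "path" or wherever your input stores it — and that the bracket positions match the sample above) is a grok filter to pull out the date and time parts, followed by a date filter to parse them into a dedicated field:

```
filter {
  # Extract the [yyyy_MM_dd] and [HH_mm_ss] sections of the filename.
  grok {
    match => {
      "filename" => "\[%{YEAR:fdate_y}_%{MONTHNUM:fdate_m}_%{MONTHDAY:fdate_d}\]\[%{HOUR:ftime_h}_%{MINUTE:ftime_m}_%{SECOND:ftime_s}\]"
    }
  }
  # Assemble the captured pieces into one temporary field.
  mutate {
    add_field => { "file_ts" => "%{fdate_y}-%{fdate_m}-%{fdate_d} %{ftime_h}:%{ftime_m}:%{ftime_s}" }
  }
  # Parse it into a proper date, stored in its own field
  # rather than overwriting @timestamp.
  date {
    match  => [ "file_ts", "yyyy-MM-dd HH:mm:ss" ]
    target => "file_datetime"
  }
}
```

The "file_datetime" field will then be indexed into Elasticsearch as a date alongside the rest of the event.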

Related

nifi: How to specify dynamic index name when sending data to elasticsearch

I am new to Apache NiFi.
I am trying to put data into Elasticsearch using NiFi.
I want to specify an index name by combining a fixed string with the value of a timestamp field converted to date format.
I was able to produce the desired shape with the expression below, but I could not build the index name from the value of the timestamp field in the content:
${now():format('yyyy-MM-dd')}
example json data
{
  "timestamp": 1625579799000,
  "data1": "abcd",
  "date2": 12345
}
I would like to get the following result:
index : "myindex-2021.07.06"
What should I do?
I know that if you use the PutElasticSearch processor, you can provide it with a specific index name to use. As long as the index name meets the proper Elasticsearch format for naming a new index, and automatic index creation is enabled in Elasticsearch, then Elastic will create the new index with the desired name when the data is sent. This has worked for me. Double-check the Elastic naming rules, which can be found at https://docs.aws.amazon.com/elasticsearch-service/latest/developerguide/es-indexing.html
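One possible way to get the timestamp value out of the content (a sketch, not tested here): first use an EvaluateJsonPath processor to copy `$.timestamp` into a flowfile attribute — the attribute name `ts` below is an assumption — then build the index name in the PutElasticSearch index property with NiFi Expression Language:

```
myindex-${ts:toNumber():format('yyyy.MM.dd')}
```

Here `format()` renders the epoch-millisecond number as a date; note the result depends on the server's time zone unless you pass one as a second argument, e.g. `format('yyyy.MM.dd', 'GMT')`.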

How to specify index and date / time field in Timelion?

I'm trying to use the Timelion app in Kibana, but I can't find where to specify the index name and the time field. Is there a way to do that on the fly, or does it have to be done in a configuration file somewhere? If so, where is that file?
.es(index=your_index_name, timefield=@timestamp, metric=count, q=whatever_field:some_matching_text)
In the .es() function you can set index and timefield.

ELK most appropriate timestamp name _ or @

What is the most appropriate name for the timestamp when utilizing Logstash to parse logs into Elasticsearch, then visualizing with Kibana?
I am defining the timestamp using date in a filter:
date {
match => [ "logtime", "yy-MM-dd HH:mm:ss" ]
}
Logstash automatically puts this into the @timestamp field. Kibana can be configured to use any correctly formatted field as the timestamp, but it seems it would be more correct to use _timestamp in Elasticsearch. To do that, you have to mutate and rename the datestamp field:
mutate {
rename => { "@timestamp" => "_timestamp" }
}
Which is slightly annoying.
This question could be entirely semantic - but is it more correct to use _timestamp, or is it just fine to use @timestamp? Are there any other considerations that should influence the naming of the timestamp field?
Elasticsearch allows you to define fields starting with an underscore; however, Kibana (since v4) will only show the ones declared outside of the _source document.
You should definitely stick with @timestamp, which is the standard way to name the timestamp field in Logstash. Kibana will not allow you to use _timestamp.
Please note that _timestamp is a reserved and deprecated special field name. In fact, any field name starting with an underscore is reserved for Elasticsearch's future internal use. AFAIK the Logstash documentation examples use @timestamp as the field name, without any renaming.

Custom field with embedded html in kibana

I have a field in an Elasticsearch index containing an mp3 filename, and I want to create a custom field that shows a player, built from a formatted URL plus the filename stored in Elasticsearch. Is it possible to do this?
You need to create a scripted field: https://www.elastic.co/blog/kibana-4-beta-3-now-more-filtery
Something like "url/" + doc["filename"].value

Elasticsearch date field: epoch millis input, string output?

Steps:
1. Define a date field in a mapping.
2. Insert an epoch-millisecond (long) value into that field.
Can Elasticsearch return a string value (yyyy-MM-dd'T'HH:mm:ss) of that field in a search?
From what I understand of the date-format documentation of Elasticsearch, it will always accept milliseconds-since-epoch input in addition to input in the format given by the format setting, and it will produce string output using the (first) format given. If you don't provide a format, then the "date_optional_time" format will be used (yyyy-MM-dd'T'HH:mm:ss.SSSZZ).
If the time zone in there is a problem for you, you'll need to give Elasticsearch your intended format.
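As a concrete illustration (a sketch using current Elasticsearch mapping syntax; the index and field names are made up), a mapping like this accepts epoch-millis input while also defining the string form:

```
PUT /my-index
{
  "mappings": {
    "properties": {
      "date_field": {
        "type": "date",
        "format": "yyyy-MM-dd'T'HH:mm:ss||epoch_millis"
      }
    }
  }
}
```

With this mapping a document indexed with "date_field": 1625579799000 is stored as a date, and the first format in the list is the one Elasticsearch uses when it formats the value as a string (for example when the field is requested explicitly, as in the curl example below).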
I don't have the code to hand, but in my testing I believe I managed to do the following:
I used the date formatter on the field together with the fields query parameter to do this:
curl -XGET 'http://localhost:9200/twitter/tweet/1?fields=title,date_field.date_time'
using the date formats specified here: http://www.elasticsearch.org/guide/reference/mapping/date-format/
If you want a full document returned, this may be onerous. In that case, is it possible to use an alias "view" mapping to get the result to return differently from your primary mapping? Admittedly this has become a half-answer.