Kibana doesn't show any results in "Discover" tab - elasticsearch

I set up Elasticsearch and Kibana to index our application (error) logs. The issue is that Kibana doesn't display any data in the "Discover" tab.
Current situation
Elasticsearch is up and running and responds to API calls
Executing a query directly against Elasticsearch, like http://elasticserver.com:9200/applogs/_search?q=*, returns lots of results (see below for what a single record looks like)
Kibana is up and running and even finds the applogs index exposed by Elasticsearch
Kibana also shows the correct properties and data types of the applogs documents
The "Discover" tab doesn't show any results... even when setting the time period to a couple of years
Any ideas??
Here's how Kibana sees the applogs index:
Elasticsearch query result object looks like this:
{
  "_index": "applogs",
  "_type": "1",
  "_id": "AUxv8uxX6xaLDVAP5Zud",
  "_score": 1,
  "_source": {
    "appUid": "esb.Idman_v4.getPerson",
    "level": "trace",
    "message": "WS stopwatch is at 111ms.",
    "detail": "",
    "url": "",
    "user": "bla bla bla",
    "additionalInfo": "some more info",
    "timestamp": "2015-03-31T15:08:49"
  }
},
...and what I see in the Discover tab:

For people who have a problem like this:
Change the time frame in the top-right corner.
By default it only shows data for the last 15 minutes.

I wanted to put this as a comment, but unfortunately I am not able to, given my insufficient reputation. So, as @Ngeunpo suggested, this is how you add a time field to an index while creating it. If you did not do that while creating your index, I suggest you delete that index and recreate it. The index name logstash-* in the gif is analogous to your index applogs. In this case, the field @timestamp is added as the time field. Let me know if this works.
EDIT: Image courtesy: This wonderful ELK setup guide
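For reference, a minimal sketch of creating the index with a date-mapped time field up front, so that Kibana can offer it as the "Time-field name" (index and field names are taken from the question; on older Elasticsearch versions, 1.x/2.x, the properties block has to be nested under the type name):

PUT applogs
{
  "mappings": {
    "properties": {
      "timestamp": { "type": "date" }
    }
  }
}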

Kibana does not understand the timestamp field if its format is incorrect. The timestamp, which you selected by clicking on "Time-field name" when configuring an index pattern, needs to be:
"timestamp": "2015-08-05 07:40:20.123"
Then you should update your index mapping like this:
curl -XPUT 'http://localhost:9200/applogs/1/_mapping' -d '
{
  "1": {
    "properties": {
      "timestamp": {
        "type": "date",
        "format": "yyyy-MM-dd HH:mm:ss.SSS",
        "store": true
      }
    }
  }
}'
See this question and answer
UPDATE
If you are using ES 2.X, you can set the "format" to "epoch_millis" like this:
curl -XPUT 'http://localhost:9200/applogs/1/_mapping' -d '
{
  "1": {
    "properties": {
      "timestamp": {
        "type": "date",
        "format": "epoch_millis",
        "store": true,
        "doc_values": true
      }
    }
  }
}'

Try this: uncheck the "Index contains time-based events" checkbox,
then provide your index name and check whether "Discover" contains data or not.

I had the same issue and this worked for me:
Delete the index from the Settings tab.
Restart Kibana.
Then re-add the index in Settings.
The issues with time series can also be a factor, but if no fields at all show up in the Discover tab, then you might have the same issue as the original reporter, which is what I had.

I had probably the same issue - I saw data in the dashboard but 0 results in Discover. Going to Management > Index Patterns > "Refresh field list" button (a button with a refresh icon only) solved it for me.

I had the same issue, and @tAn-'s comment helped me to resolve it. Changing the date field to @timestamp did the trick. Thanks!
The next step should be to find out what was wrong with my custom date field.

I had the same problem, but now it's working fine.
The problem was with the @timestamp. I had uploaded the file to Elasticsearch using Logstash, so it automatically generated an @timestamp field. Kibana compares the time range with this @timestamp, that is, with the time the actual event occurred. Even if I deselect the "Index contains time-based events" option on the add-new-index-pattern page, Kibana will automatically consider the @timestamp field. So toggling the timeframe in Kibana based on the @timestamp field worked for me.
You can also check by adding an index pattern without a timestamp and deselecting the "Index contains time-based events" option, and see what happens: there won't be any time-frame selector on the Kibana Discover page, and you will most probably get results on the Discover page.
These are all my observations; I'm not sure this solution fits your case, but you may try it.
I am using ES 1.5.x, Logstash 1.5.1 and Kibana 4.1.0.
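As a quick sanity check (a sketch, reusing the applogs index name from the question), you can pull the newest documents straight from Elasticsearch, sorted by @timestamp, to see exactly which time values Kibana filters against:

GET applogs/_search
{
  "size": 3,
  "sort": [ { "@timestamp": "desc" } ],
  "_source": [ "@timestamp", "timestamp" ]
}

If the newest @timestamp is far from the time range selected in Kibana, Discover will come back empty even though the data is there.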

I also experienced the same error. Mostly this happens because of the time format. Basically, make sure you have a valid time frame for your data (top-right filter). Anyway, in my case I used the plain epoch (seconds) format for the timestamp, but it didn't work. So I changed to epoch_millis instead and it worked like a charm.
In sum, make sure that Kibana can understand your date-time format. It requires epoch_millis, not just epoch.

In my situation, everything had been working previously, and then I couldn't see the latest data starting February 1st (actually, I could if I looked back a month). It turned out that the mapping format for my custom time field was incorrect. My mapping format was YYYY-MM-DD'T'HH:mm:ss.SSSZ. The problem was that DD is interpreted as day of the year, and I wanted day of the month, which is dd. Changing the mapping and reindexing fixed the problem.
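For illustration, a hedged sketch of the corrected mapping (the index, type and field names are placeholders, not from the original answer): dd replaces DD, and yyyy is also a safer choice than the week-based YYYY.

PUT myindex/_mapping/mytype
{
  "properties": {
    "mytimefield": {
      "type": "date",
      "format": "yyyy-MM-dd'T'HH:mm:ss.SSSZ"
    }
  }
}

Note that Elasticsearch won't let you change the format of an existing field in place, which is why the reindexing step mentioned above is needed.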

In my case, I set the time from the server log, and that time differed from UTC (the log's time was in the future compared to UTC time).
So when I searched logs with a filter of days/months/years ago, there were no logs, because their time was in the future.
When I used the Today filter, or included future time, it showed the logs.
After changing the time zone, it was fixed.

I had the same issue, so, as shown in one of the solutions above, I went to Settings, deleted the previous index and made a new one with @timestamp.
But that didn't solve the issue. So I looked into it and saw that, after a deployment, nothing was coming into Kibana.
So I went into the server and saw that the indexes were corrupted. I just stopped Logstash and Elasticsearch on the instance/server and restarted the services.
And voilà, the services restarted successfully and Kibana was back.
Why did it happen?
Someone might have stopped the server abruptly, which caused the indexes to get corrupted.

Related

Elasticsearch/Kibana Unindexed field cannot be searched

I'm having some trouble querying/filtering data in Kibana with respect to a geo_point field that is indexed.
Here is a relevant section of the mapping template:
"dstGeoLocation": {
"type": "geo_point"
},
"srcGeoLocation": {
"type": "geo_point"
},
The ingestion happens okay, since the data ends up in ES and I am able to view it in Kibana like so:
(0,0 is the default that has been given.)
However, in Kibana I still get a message that this is an unindexed field and hence is not searchable.
How do I remedy this situation?
I have already tried to:
Remove and reload the index mappings
Remove and recreate the kibana index pattern (there is no manual refresh in v7.13)
Version of ES and Kibana: 7.13.12
Hi, I just fixed the error you are showing by clicking the small refresh button at the top right in Stack Management > Kibana > Index Patterns > (select/create some pattern).
IMGUR Screenshot
So give it a try.
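Independently of the refresh, it can help to confirm that Elasticsearch itself still maps the field as geo_point, so the problem is isolated to the Kibana index pattern. A quick check via the field mapping API (the index name here is a placeholder):

GET my-index/_mapping/field/dstGeoLocation

If this comes back with "type": "geo_point", the mapping is fine and only the index pattern's field list is stale.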

Property not available for visualize in kibana

While trying to change a visualization in Kibana to use another property for the x-axis, that property doesn't appear there.
I recently changed NLog to target Elasticsearch using the Elastic Common Schema.
After that change the property is no longer called ResolvedRoute but instead _metadata.resolved_route; the problem is that it doesn't appear in the field list for the x-axis, it says "no matches found".
It is not in the available fields.
I'm still new to Elasticsearch and Kibana, so it's possible I'm missing something simple.
Don't know if it's related, but in the Discover menu, looking at the available fields, all of the _metadata fields have a question mark.
I'm already trying to map some of these fields in Index Management / Edit template
Also, if I go to the console and type
GET /logstash-2020.11.25/_search
{
  "query": {
    "match_all": {}
  }
}
I can see the fields of _metadata that I want, inside _source, which is inside hits.
I think I already had a similar problem where I had to delete all indexes matching the pattern and then the field appeared, but that doesn't make much sense.
What could be the problem?
Chances are high that you haven't refreshed the corresponding index pattern in Kibana. Therefore the data might exist as documents in Elasticsearch but not yet as a field in the index pattern, which is a Kibana Saved Object.
Please go to Settings / Stack Management (depending on your Kibana version), click on the index pattern you expect the field to be in and refresh the fields list (icon is in the upper right corner).
Please let me know if that solved your problem.
The fields in question were not correctly mapped in the template.
Since _metadata is an object, it needs to be mapped as an object first;
then, inside of it, we can map its own properties (see the sketch below).
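A minimal sketch of what that could look like in the index template (the field names follow the question; the template name, the pattern, and the keyword type are assumptions):

PUT _template/logstash
{
  "index_patterns": ["logstash-*"],
  "mappings": {
    "properties": {
      "_metadata": {
        "properties": {
          "resolved_route": { "type": "keyword" }
        }
      }
    }
  }
}

Keep in mind that a template only applies to indices created after the change, so a new daily logstash-* index (plus an index pattern refresh) is needed before the field shows up.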

geoip.location doesnot work with modified indexnames sent via logstash

geoip.location is of the geo_point datatype when an event is sent from Logstash to Elasticsearch with the default index name. As geoip.location has the geo_point datatype, I can view the plotted locations on maps in Kibana, since Kibana looks for the geo_point datatype for maps.
geoip.location becomes geoip.location.lat and geoip.location.lon with the number datatype when an event is sent from Logstash to Elasticsearch with a modified index name. Due to this I'm not able to view the plotted locations on maps in Kibana.
I don't understand why Elasticsearch would behave differently when I try to add data to a modified index name. Is this a bug in Elasticsearch?
For my use case I need a modified index name, as I need a new index for each day. The plan is to store the logs of a particular day in a single index, so if there are 7 days then I need 7 indexes, each containing the logs of one day (a new index should be created based on the current date).
I searched around for a solution, but I wasn't able to comprehend it and make it work for me. Kindly help me out on this.
Update (what I did after reading xeraa's answer)
In the Dev Tools console in Kibana:
GET _template/logstash showed the allowed patterns in the index_patterns property, along with other properties.
I included my pattern (dave*) inside index_patterns and triggered the PUT request. You have to pass the entire existing body content (which you receive in the GET request) in the PUT request along with your required index_patterns; otherwise the default settings will disappear, since the PUT API replaces the template with whatever data you pass in the body.
PUT _template/logstash
{
  ...
  "index_patterns": [
    "logstash-*", "dave*"
  ],
  ...
}
I'd guess that there is a template set for the default name, which isn't applied if you rename the index.
Check with GET _template whether any template matches your old index name, and update its setting so that it also gets applied to the new one.

Grafana - Show metric by field value

I'm currently trying to create a graph in Grafana to monitor the status of my servers; however, I can't seem to find a way to use the value of a field as the value displayed on the graph. (The datasource is Elasticsearch.)
The following "document" is sent to Graylog (which saves to Elastic) every minute, for an array of regions.
{
  "region_key": "some_key",
  "region_name": "Some Name",
  "region_count": 1610
}
By using the following settings, I can get Grafana to display the count of messages it received for each region; however, I want to display the number in the region_count field instead.
Result:
How can I accomplish this? Is this even possible using Elastic as the datasource?
1) Make sure that your document includes a timestamp in Elasticsearch (a sketch of such a document follows this list).
2) In the Query box, provide the Lucene query which narrows down the documents to only those related to this metric.
3) In the Metric line, press "Count" and change it to an aggregation which takes a specific field: for example, "Average".
4) Next to the "Average" box, a "select field" dropdown of the available fields will appear. If you see unexpected field names here, it's probably because your Lucene query isn't specific enough. (Kibana can be useful for getting this query right.)
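For step 1, a hedged sketch of what an indexed document could look like, combining the fields from the question with an explicit timestamp (the index name and the @timestamp field name are assumptions; Graylog normally adds its own timestamp on ingest):

POST region-metrics/_doc
{
  "@timestamp": "2021-01-01T00:00:00Z",
  "region_key": "some_key",
  "region_name": "Some Name",
  "region_count": 1610
}

With that in place, the metric would be Average (or Max) of region_count, narrowed down by a Lucene query such as region_key:some_key.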

Kibana keeps some fields unindexed

So I have an index in Elasticsearch, and I want to search and visualize it with Kibana. But several fields are not indexed by Kibana, and have this bubble:
This field is not indexed thus unavailable for visualization and search.
This is a snippet of one of the fields that is not indexed by Kibana:
"_event_name" : {
"type" : "string"
},
I tried to go into Kibana's index settings and click "Reload field list", but it doesn't help.
Does anyone know what the problem could be?
Thanks in advance.
The fields might not be indexed, as mentioned here.
Apparently, Kibana doesn't index fields that start with an underscore.
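If that is the cause, one way out is to rename such fields so they no longer start with an underscore. A hedged sketch using the _reindex API (available from ES 2.3 onwards; the destination index name is an assumption, and on newer versions the script key is "source" rather than "inline"):

POST _reindex
{
  "source": { "index": "my-old-index" },
  "dest": { "index": "my-new-index" },
  "script": {
    "inline": "ctx._source.event_name = ctx._source.remove('_event_name')"
  }
}

After reindexing, point the Kibana index pattern at the new index and reload the field list.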
How are you loading the data into Elasticsearch? Logstash? A Beat? curl? Please describe that, and if you can, include your config file.
You can look at your mapping in your browser with something like this:
http://localhost:9200/logstash-2016.07.20/_mapping?pretty
(change the host and index name)
