Kibana not displaying any data - elasticsearch

I created visualizations on the fly through curl. However, they do not display any data.
I created the visualizations as follows:
1. Exporting an already-built visualization as JSON
2. Modifying the index name and field names in the JSON
3. Importing the visualization back
The visualizations are created in Kibana, they have the same visualization name, and they refer to the index pattern and fields I want. However, they do not display any data (graphs).
Can anyone explain what is wrong with the way I am doing this?

The problem was with the variable I used for find and replace. The variable I used was also being used to filter the data, so replacing it left no data to show. My bad.

Related

I want to use Elasticsearch and Kibana alerts to detect line crossings

We would like to implement a system that draws a line in advance on a map displayed by Kibana and detects when a moving object (such as a boat) crosses that line.
I believe a possible way to do this is to set up rules using an Elasticsearch query from Kibana's rule creation.
But I don't know how to realize it.
I drew a line by selecting Create index in Add layer from Maps in Kibana.
A JSON file containing location, speed, and time information was imported into Elasticsearch and displayed on a map.

Elasticsearch document storing

The basic use case we are trying to solve is for users to be able to search the contents of log files.
Take a simple situation where a user searches for a keyword that is present in a log file, and I want to render that log file back to the user.
We plan to use Elasticsearch for handling this. The idea I have in mind is to use Elasticsearch as a mechanism to store the indexed log files.
With this concept in mind, I went through https://www.elastic.co/guide/en/elasticsearch/reference/current/index.html
A couple of questions I have:
1) I understand the input provided to Elasticsearch is a JSON document. It scans the JSON provided and creates/updates indexes. So do I need a mechanism to convert my input log files to JSON?
2) Elasticsearch scans the input document and creates/updates inverted indexes. These inverted indexes point to the exact document. Does that mean Elasticsearch stores these documents somewhere? Does it store them as JSON documents? Is it purely in memory, or on the file system/in a database?
3) Now, when a user searches for a keyword, Elasticsearch returns the document that contains the searched keyword. Do I need the ability to convert this JSON document back into the original log document the user expects?
Clearly I'm missing something. Sorry for asking such basic questions, but I'm trying to improve my skills and it's a work in progress.
Also, I understand that there is the ELK stack out there. For various reasons we want to use only Elasticsearch, not the Logstash and Kibana parts of the stack.
Thanks
Logs need to be parsed into JSON before they can be inserted into Elasticsearch.
All documents are stored on the filesystem; some data is kept in memory, but all data is persistent.
When you search Elasticsearch, you get back the matching JSON documents. If you want to display the original error message, you can store that original message in one of the JSON fields and display just that.
So if you just want to store log messages and not break them into fields or anything, you can simply take each row and send it to Elasticsearch like so:
{ "message": "This is my log message" }
To parse logs, break them into fields, and add some logic, you will need to use some sort of app, such as Logstash.
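To illustrate the wrapping step described above, here is a minimal sketch in Python (standard library only). The function name is ours, and the actual indexing call (via curl or a client library) is left out:

```python
import json

def log_line_to_doc(line):
    # Wrap a raw log line in the minimal JSON shape shown above;
    # the whole line goes into a single "message" field.
    return json.dumps({"message": line.rstrip("\n")})

# Each document produced this way can then be sent to Elasticsearch,
# e.g. with: curl -XPOST http://localhost:9200/logs/log -d '<doc>'
doc = log_line_to_doc("This is my log message\n")
```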

How to create new Kibana visualization through REST?

I want to automate the creation of a set of visualizations for new Kibana/Elasticsearch installations.
So I need to know whether I can automate this, independent of the programming language.
There are no APIs in Kibana yet for managing saved searches, visualizations and dashboards. Some feature requests have been filed (here and here), but they are still being discussed.
However, since Kibana visualizations are stored in the .kibana index with the visualization mapping type, you can definitely GET them, learn how they are built, modify them and PUT them again.
For a visualization named "Top consumers by country", you can get the visualization spec using
curl -XGET http://localhost:9200/.kibana/visualization/Top-consumers-by-country
You'll get a document containing the title of your visualization, another field called visState containing the specification of your visualization (obviously different for each visualization), and finally a field named kibanaSavedObjectMeta which contains the Elasticsearch query and index details.
You can also view/edit/export the same data in Settings > Objects > Visualizations
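As a rough sketch of that GET / modify / PUT loop, the retargeting step might look like the following in Python. The document below is a simplified stand-in for what the GET returns (real visState and kibanaSavedObjectMeta contents vary per visualization), and the index pattern names are made up:

```python
import json

# Simplified stand-in for the document returned by
# GET /.kibana/visualization/Top-consumers-by-country
vis = {
    "title": "Top consumers by country",
    "visState": '{"type": "pie", "aggs": []}',
    "kibanaSavedObjectMeta": {
        "searchSourceJSON":
            '{"index": "consumers-*", "query": {"query_string": {"query": "*"}}}'
    },
}

# searchSourceJSON is itself a JSON-encoded string, so decode it,
# swap the index pattern, and re-encode it before PUTting the
# document back to the same .kibana endpoint.
meta = json.loads(vis["kibanaSavedObjectMeta"]["searchSourceJSON"])
meta["index"] = "new-consumers-*"
vis["kibanaSavedObjectMeta"]["searchSourceJSON"] = json.dumps(meta)
```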

How to create visualisation on the fly using a script in Kibana 4

I have a requirement where I need to create different visualizations for different users, differing only slightly in the query parameters. So I am considering creating a script that will enable me to do this. Has anyone done this in Kibana 4? Some pointers on how to create a visualization using a query would be of great help.
I would also like to create dashboards on the fly, but that can wait till I get this one sorted out.
If you want to go ahead with Java plugin (as mentioned in comments), here are the steps:
Create different visualizations with different X-axis parameters. Visualizations are basically JSON strings, so you can write Java code that changes the value of the X aggregation based on the mapping that you have. Each chart will then have a different id.
While you are creating a custom dashboard based on the user, check the mapping between the user and the visualization, and use the following command to add the visualization:
// index name, document id and JSON source shown here are placeholders
client.prepareIndex(".kibana", "visualization", visualizationId)
      .setSource(visualizationJson)
      .execute();
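The rewriting step in point 1 can also be sketched outside Java. Below is an illustrative Python version; the visState is a simplified stand-in for a real Kibana 4 visState, and the field names are hypothetical:

```python
import json

# Simplified stand-in for a Kibana 4 visState; real ones carry more params.
vis_state = {
    "type": "histogram",
    "aggs": [
        {"id": "1", "type": "count", "schema": "metric"},
        {"id": "2", "type": "terms", "schema": "segment",
         "params": {"field": "country"}},
    ],
}

def set_x_axis_field(state, field):
    # The "segment" schema aggregation drives the X axis, so pointing
    # its field at a per-user value yields a per-user visualization.
    for agg in state["aggs"]:
        if agg.get("schema") == "segment":
            agg["params"]["field"] = field
    return state

per_user_state = set_x_axis_field(vis_state, "city")
per_user_json = json.dumps(per_user_state)  # ready to index under a new id
```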

Export/Import Kibana 4 saved Searches, Visualization & Dashboards

I'm looking for a list of commands required to export and then import all Kibana 4 saved Searches, Visualizations and Dashboards.
I'd also like to have the default Kibana 4 index pattern created automatically for logstash.
I've tried using elasticdump as outlined here http://air.ghost.io/kibana-4-export-and-import-visualizations-and-dashboards/ but the default Kibana index pattern isn't created and the saved searches don't seem to get exported.
You can export saved visualizations, dashboards and searches from Settings >> Objects.
You also have to export the visualizations and searches associated with a dashboard; clicking the dashboard export will not include the dependent objects.
All information pertaining to saved objects like saved searches, index patterns, dashboards and visualizations is saved in the .kibana index in Elasticsearch.
The GitHub project elastic/beats-dashboards contains a Python script for dumping Kibana definitions (to JSON, one file per definition), and a shell script for loading those exported definitions into an Elasticsearch instance.
The Python script dumps all Kibana definitions, which, in my case, is more than I want.
I want to distribute only some definitions: specifically, the definitions for a few dashboards (and their visualizations and searches), rather than all of the dashboards on my Elasticsearch instance.
I considered various options, including writing scripts to get a specific dashboard definition, parse it, and then get the cited visualization and search definitions, but for now I've gone with the following solution (inelegant but pragmatic).
In Kibana, I edited each definition, and inserted a string into the Description field that identifies the definition as being one that I want to export. For example, "#exportme".
In the Python script (from beats-dashboards) that dumps the definitions, I introduced a query parameter into the search function call, limiting it to definitions with that identifying string. For example:
res = es.search(
    index='.kibana',
    doc_type=doc_type,
    size=1000,
    q='description:"#exportme"')
(In practice, rather than hardcoding the "hashtag", it is better to specify it via a command-line argument.)
One aspect of the dump'n'load scripts provided with elastic/beats-dashboards that I particularly like is their granularity: one JSON file per definition. I find this useful for version control.
You can get searches using elasticdump like so:
elasticdump --input=http://localhost:9200/.kibana --output=$ --type=data --searchBody='{"filter": {"type": {"value": "search"}} }'
