I indexed a dataset of geo-data records in ElasticSearch for analysis in Kibana. My issue is that the 'Discover' tab doesn't pick up the data but instead displays the error message
Discover: An error occurred with your request. Reset your inputs and try again.
In 'Settings', I could configure my data index just fine, and Kibana is picking up all the mapping fields with correct type/analysis/indexing metadata. 'Visualize' works fine, too. I can create my charts, add them to the dashboard, drill down - everything. Just the 'Discover' tab is broken for me.
I'm running Elasticsearch 1.5.2, and have now tried Kibana 4.0.1, 4.0.2, and a 4.1 snapshot (on Ubuntu 14.04), all with the same result.
Another effect I'm noticing: the sidebar is not showing any 'Available Fields'. Only if I unfold the field settings and untick 'Hide Missing Fields' do I get my list of schema fields. (These are greyed out, as they are considered 'missing' by Kibana. Interestingly, though, clicking 'Visualize' on one of them to chart its distribution works, again, perfectly fine.)
My only suspicion is: my data doesn't have a timestamp field, so maybe that's what's messing things up. Although, judging from the docs, I'd assume that non-time-series data is supported.
Any hints appreciated!
In my case, the cause was that I had indexed malformed JSON into Elasticsearch. It was valid JavaScript, but not valid JSON. In particular, I had neglected to quote the keys in my objects.
I had inserted my (test) data using curl, e.g.
curl -X PUT http://localhost:9200/foo/doc/1 -d '{ts: "2015-06-24T01:07:00.000Z", employeeId: 105, action: "PICK", quantity: 8}'
Note that ts: should have been "ts":, etc.
Elasticsearch seems to tolerate such things, but Kibana does not. Once I fixed that, Discover worked fine.
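The difference is easy to reproduce with any strict JSON parser. A minimal sketch in Python (purely illustrative; the sample document mirrors my test data above):

```python
import json

# Unquoted keys: valid as a JavaScript object literal, invalid as JSON.
js_style = '{ts: "2015-06-24T01:07:00.000Z", employeeId: 105}'
strict_json = '{"ts": "2015-06-24T01:07:00.000Z", "employeeId": 105}'

def is_valid_json(text):
    """Return True if text parses under strict JSON rules."""
    try:
        json.loads(text)
        return True
    except json.JSONDecodeError:
        return False

print(is_valid_json(js_style))     # False: unquoted keys are rejected
print(is_valid_json(strict_json))  # True
```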
Note that the error you are seeing is generated client-side when an error arises. If you open your browser's developer tools (e.g. in Firefox), you will see the error in the console log. In my case, the error message was
Error: Unable to parse/serialize body
If your error is different, it will be a different cause.
It was my fault for entering bad JSON to begin with. Odd that elasticsearch is more tolerant than Kibana.
It happened to me as well. I tried everything:
Deleting all the indices (.kibana, my own, etc) didn't work
Restarting the ES, Kibana and LS services didn't help.
I didn't have the Request Timeout problem in kibana.yml either.
My problem was that the timestamp field was using an incorrect time format. I changed it to this format and it worked: "date": "2015-05-13T00:00:00"
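For anyone wanting a quick sanity check on their timestamps, here is a small Python sketch (my own helper, not part of the ELK stack) that tests a value against the format that worked for me:

```python
from datetime import datetime

# The format that worked for me: seconds precision, no timezone suffix.
WORKING_FORMAT = "%Y-%m-%dT%H:%M:%S"

def matches_working_format(value):
    """Check a timestamp string against the format Kibana accepted."""
    try:
        datetime.strptime(value, WORKING_FORMAT)
        return True
    except ValueError:
        return False

print(matches_working_format("2015-05-13T00:00:00"))  # True
print(matches_working_format("13/05/2015 00:00"))     # False: not ISO 8601
```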
I had the same problem. None of the suggested solutions helped. I finally found the problem while comparing a working version with a non-working version in Wireshark.
Don't emit a UTF8 byte order mark in front of your JSON. Somehow, my serializer was set up to do that... ElasticSearch is fine with it, but Kibana cannot handle it on the Discover page.
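If you suspect the same issue, it is easy to check from Python: a strict parser rejects the leading BOM character, while decoding with utf-8-sig strips it first (a minimal sketch, not specific to Elasticsearch or Kibana):

```python
import codecs
import json

# Simulate a serializer that prepends a UTF-8 byte order mark.
payload = codecs.BOM_UTF8 + b'{"name": "test"}'

def parse_without_bom(raw):
    """Decode with utf-8-sig so a leading BOM is stripped before parsing."""
    return json.loads(raw.decode("utf-8-sig"))

# Plain utf-8 decoding keeps the BOM as U+FEFF, which strict JSON rejects:
try:
    json.loads(payload.decode("utf-8"))
    bom_accepted = True
except json.JSONDecodeError:
    bom_accepted = False

print(bom_accepted)                        # False
print(parse_without_bom(payload)["name"])  # test
```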
When trying to create visualizations with Timelion, I keep getting this error message: [timelion_vis] > Error: in cell #1: [search_phase_execution_exception]
That is the full message; no further information is provided.
I recently did a project about Access Log Analytics using the ELK-Stack. I set up Logstash and fed our Apache Access Logs to Elasticsearch and in Kibana I created a handful of visualizations. Timelion was especially helpful, I have a bunch of queries that work perfectly - in Kibana 5.2, the version I was required to use for the project.
Now we have it all set up on a live server using the current versions (7.3) of Logstash, Elasticsearch and Kibana. I made adjustments to my visualizations to accommodate the current versions; however, I have problems setting up Timelion.
We can't find any more help in the logs either; there is just nothing there, no hint as to what might be wrong.
Even with .es(*) I get the error. My old queries produce the same error; I checked my syntax, and it is fine.
However if I type .es(index=logstash-*, timefield='#timestamp', metric='avg:size') I get a result.
The size field holds a numeric value; all my other queries refer to fields with string values. So there seems to be a problem with that, but I cannot get to the bottom of it.
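To illustrate the pattern: the query against the numeric field works, while the same query pointed at one of my string fields fails with the search_phase_execution_exception ('response' here just stands in for any of my string fields):

```
.es(index=logstash-*, timefield='#timestamp', metric='avg:size')
.es(index=logstash-*, timefield='#timestamp', metric='avg:response')
```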
Any ideas? Has anyone else encountered this error before?
This is what comes up in the Kibana Logs:
Error: in cell #1: [search_phase_execution_exception] : Error: in cell #1: [search_phase_execution_exception]
at throwWithCell (/usr/share/kibana/src/legacy/core_plugins/timelion/server/handlers/chain_runner.js:54:11)
at /usr/share/kibana/src/legacy/core_plugins/timelion/server/handlers/chain_runner.js:205:13
at arrayEach (/usr/share/kibana/node_modules/lodash/index.js:1315:13)
at Function. (/usr/share/kibana/node_modules/lodash/index.js:3393:13)
at /usr/share/kibana/src/legacy/core_plugins/timelion/server/handlers/chain_runner.js:197:23
at tryCatcher (/usr/share/kibana/node_modules/bluebird/js/release/util.js:16:23)
at Promise._settlePromiseFromHandler (/usr/share/kibana/node_modules/bluebird/js/release/promise.js:517:31)
at Promise._settlePromise (/usr/share/kibana/node_modules/bluebird/js/release/promise.js:574:18)
at Promise._settlePromise0 (/usr/share/kibana/node_modules/bluebird/js/release/promise.js:619:10)
at Promise._settlePromises (/usr/share/kibana/node_modules/bluebird/js/release/promise.js:699:18)
at Promise._fulfill (/usr/share/kibana/node_modules/bluebird/js/release/promise.js:643:18)
at SettledPromiseArray.PromiseArray._resolve (/usr/share/kibana/node_modules/bluebird/js/release/promise_array.js:126:19)
at SettledPromiseArray._promiseResolved (/usr/share/kibana/node_modules/bluebird/js/release/settle.js:16:14)
at SettledPromiseArray._promiseRejected (/usr/share/kibana/node_modules/bluebird/js/release/settle.js:32:17)
at Promise._settlePromise (/usr/share/kibana/node_modules/bluebird/js/release/promise.js:581:26)
at Promise._settlePromise0 (/usr/share/kibana/node_modules/bluebird/js/release/promise.js:619:10)
I am trying to set up the YouTube API, but I get a 403.
I have tried the setup several times, without success.
https://www.googleapis.com/youtube/v3/videos?part=snippet,contentDetails&id=-DIJvggBrg8&key=xyz
Maybe someone is able to help me, or even log in to the console to do the setup?
The 403 error by itself is not of immediate use. Its attached error code (as I already pointed out above) sheds light on what actually happened.
The API responds to any query you make with a text body structured as JSON. That response text contains the needed error code.
I suggest you proceed with the following steps:
delete the old API key (right now it is still accessible!);
create a new API key and
run your API query using the new key and then post here the API response text.
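For reference, a 403 body typically looks like the sketch below (the message and reason values here are placeholders, not your actual response), and the interesting part can be extracted with a few lines of Python:

```python
import json

# Hypothetical 403 response body; the "reason" in your own response
# is the detail worth posting.
response_text = '''
{
  "error": {
    "code": 403,
    "message": "The request is missing a valid API key.",
    "errors": [
      {"reason": "forbidden", "domain": "global"}
    ]
  }
}
'''

body = json.loads(response_text)
reason = body["error"]["errors"][0]["reason"]
print(body["error"]["code"], reason)  # 403 forbidden
```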
Note that I checked the video -DIJvggBrg8 to which your query refers with my own API key and got no error, but a JSON text describing the video.
I can't define a default index in Kibana 4.0.2, as you can see in the image below. It doesn't save if I enter it manually in the 'Advanced' tab; the 'Set as default index' button only glitches and doesn't make the needed changes.
I'm using Couchbase 3.0.3, transport-couchbase plugin 2.0, ElasticSearch 1.5.2.
I've tried reinstalling everything (except Couchbase), one by one; always the same.
I've googled it all day, nothing. Does anyone have any idea?
It seems you're not the only one having this issue with Couchbase => https://github.com/elastic/kibana/issues/3331#issuecomment-84942136
After changing couchbase_template.json, you need to re-upload the template (note the @, which tells curl to read the request body from the file):
curl -XPUT http://localhost:9200/_template/couchbase -d @/usr/share/elasticsearch/plugins/transport-couchbase/couchbase_template.json
and now it works.
Thanks juliendangers for the directions.
I have a logstash-elasticsearch-kibana local setup and I have a problem when it comes to save Kibana dashboards.
Selecting the "Save" option, I get the following error: "Save failed Dashboard could not be saved to Elasticsearch"
I'm using the logstash dashboard that comes with Kibana and after some modifications I tried to save it getting this error.
As far as I understand dashboards loaded from templates (json files located in kibana3/app/dashboards) cannot be saved to Elasticsearch (as stated in kibana templates). But I haven't been able to figure out how to create a new dashboard for logstash and save it to Elasticsearch, nor find instructions to do that. I would like to have different dashboards and be able to modify them and load them as needed.
I have exported the dashboard schema and successfully load it back, which works as far as saving a dashboard after all customization is done. But I would prefer to save them to elasticsearch rather than to template files.
Communication between ES and Kibana works fine (no errors show up in logs and information is retrieved and showed in Kibana).
Can someone tell me what I'm missing here?
Thanks!
I got the error when I had a '/' (slash) in the name of the dashboard. Changing this to '-' solved the problem. See the following issue on GitHub: https://github.com/elasticsearch/kibana/issues/837
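If you generate dashboard names programmatically, a tiny guard like this (my own sketch, not part of Kibana) avoids the problem:

```python
def safe_dashboard_name(name):
    """Replace '/' with '-' so the dashboard can be saved to Elasticsearch."""
    return name.replace("/", "-")

print(safe_dashboard_name("logstash/web"))  # logstash-web
```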
I tried to set up Elasticsearch on my Windows 7 PC. I installed Elasticsearch & Sense, and it's working: localhost:9200 responds fine.
Now I am struggling to index and search a file located at c:\user\rajesh\default.json.
Indexing data inline, i.e.
PUT test/te/2
{
---datas
}
works fine, but when I try to reference the file, i.e. POST test/te/2 -d @default.json, it gives the error Unexpected '<'.
I installed Kibana but am not able to do anything with its UI. When trying to search anything, it gives the error: No index found at http://localhost:9200/INDEX_MISSING/_mapping/field/
I have edited the config.js file with elasticsearch: "http://"+"localhost"+":9200", but it is not able to use the index which I created in ES using Sense.
Thanks in Advance
First try this:
http://localhost:9200/_search?pretty
If you get no data returned (no indices), then you have an error somewhere in Elasticsearch or Logstash.
Additionally, I recommend you try to access kopf from your browser.
http://lmenezes.com/elasticsearch-kopf/?location=http://localhost:9200
If there is data behind that port, it will show you the indices.
Regards
Ari