Error When Running Released Version of Kibana (ZeroClipboard) - elasticsearch

So I'm running Kibana on a server in the cloud and I'm getting the following error:
Error: Uncaught ReferenceError: ZeroClipboard is not defined (http://.../index.js?_b=6004:89886)
at window.onerror (http://.../index.js?_b=6004:45829:24)
I've googled this and I've only found people talking about this issue on pre-release versions of Kibana. I'm running version 4.0.2 and I get the same issue when I run 4.0.1. Anyone see this before?
Here's a public url to my kibana server: http://52.7.27.45:5601/
UPDATE:
kibana.yml
port: 5601
host: "0.0.0.0"
elasticsearch_url: "http://...:9200"
elasticsearch_preserve_host: true
kibana_index: ".kibana"
default_app_id: "discover"
ping_timeout: 300000
request_timeout: 300000
shard_timeout: 0
verify_ssl: true
Update 2:
I just tried running kibana on the same instance as one of my elasticsearch nodes and I didn't get this error. This seems to be related to running kibana on a host that isn't running an elasticsearch node. Seems crazy to me...

I had a similar problem in the past; I believe the plugins are not being loaded. In your kibana.yml you should have the following:
# Plugins that are included in the build, and no longer found in the plugins/ folder
bundled_plugin_ids:
- plugins/dashboard/index
- plugins/discover/index
- plugins/doc/index
- plugins/kibana/index
- plugins/markdown_vis/index
- plugins/metric_vis/index
- plugins/settings/index
- plugins/table_vis/index
- plugins/vis_types/index
- plugins/visualize/index
Related: https://github.com/elastic/kibana/issues/2617

Related

How can I solve Kibana bad request issue while running on localhost

I am trying to run Elasticsearch with Kibana on Windows 10.
I followed this tutorial step by step: https://www.youtube.com/watch?v=16NeBf_IAmU
When I go to:
localhost:5601
all I get is:
{"statusCode":400,"error":"Bad Request","message":"Invalid cookie header"}
So it seems like Elasticsearch is running, but for some reason Kibana cannot connect to it. Any idea how I can solve this issue?
Edit your kibana.yml.
Uncomment the lines below (it is also fine if server.host is bound to localhost):
server.host: 0.0.0.0
elasticsearch.url: "http://localhost:9200"
Restart Kibana and try to access it again.
If the problem is caused by cookies, you can go through this:
https://discuss.elastic.co/t/bug-cookies-are-invalidated-after-some-time/34252/5

Elasticsearch Error 56

I finally switched from Ubuntu to macOS High Sierra and I am running into a problem.
I have installed Elasticsearch locally and I am getting a response.
Unfortunately, I am getting the following error when executing bin/console fos:elastica:populate:
[2018-08-15 08:07:53] elastica.ERROR: Elastica Request Failure {"exception":"[object] (Elastica\Exception\Connection\HttpException(code: 0): Unknown error:56 at /srv/www/litedesk/vendor/ruflin/elastica/lib/Elastica/Transport/Http.php:180)","request":{"path":"_bulk","method":"POST","data":"{\"index\":{\"_index\":\"******\",\"_type\":\"******\",\"_id\":\"1\"}}\n{\"name\":\"*******\",\"shortcut\":\"\"}\n{\"index\":{\"_index\":\"*******\",\"_type\":\"*********\",\"_id\":\"2\"}}\n{\"name\":\"********\",\"shortcut\":\"****\"}\n{\"index\":{\"_index\":\"******\",\"_type\":\"*****\",\"_id\":\"3\"}}\n{\"name\":\"*****\",\"shortcut\":\"\"}\n{\"index\":{\"_index\":\"********\",\"_type\":\"*********\",\"_id\":\"4\"}}\n{\"name\":\"******\",\"shortcut\":\"****\"}\n","query":[],"connection":{"config":{"headers":[]},"host":"localhost","port":9200,"logger":"fos_elastica.logger","compression":false,"retryOnConflict":0,"enabled":false}},"retry":false}
[Elastica\Exception\Connection\HttpException]
Unknown error:56
The Elastica settings in my Symfony project are:
fos_elastica:
    clients:
        default: { host: localhost, port: 9200 }
Thank you in advance
Best regards, Andrea
It was simply the http.port which I had to set to 9200 in elasticsearch.yml. That's it! :-)
Configure the port you want Elasticsearch to run on in elasticsearch.yml.
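As a sketch, the relevant elasticsearch.yml fragment would look like this (9200 is the default HTTP port that clients such as fos_elastica expect):

```yaml
# elasticsearch.yml
# Pin the HTTP port Elasticsearch listens on; 9200 is the default
# assumed by the fos_elastica client config shown above.
http.port: 9200
```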

Setting up ELK stack

I'm completely new to ELK and trying to install the stack with some beats for our servers.
Elasticsearch, Kibana and Logstash are all installed (on server A). I followed this guide here https://www.elastic.co/guide/en/elastic-stack/current/installing-elastic-stack.html.
Filebeat template was installed as well.
I also installed filebeat on another server (server B), and was trying to test the connection
$ /usr/share/filebeat/bin/filebeat test output -c /etc/filebeat/filebeat.yml -path.home /usr/share/filebeat -path.config /etc/filebeat -path.data /var/lib/filebeat -path.logs /var/log/filebeat
logstash: my-own-domain:5044...
  connection...
    parse host... OK
    dns lookup... OK
    addresses: 163.172.167.147
    dial up... OK
  TLS...
    security: server's certificate chain verification is enabled
    handshake... OK
    TLS version: TLSv1.2
    dial up... OK
  talk to server... OK
Things seem to be OK, yet Filebeat on server B doesn't appear to be sending any data to Logstash.
Accessing Kibana keeps redirecting me back to Create Index pattern, with the message
Couldn't find any Elasticsearch data
Any direction pointing would be really appreciated.
Can you check your filebeat.yml file and see whether the log input configuration is activated:
filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /var/log/*.log
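It is also worth double-checking that Filebeat's output section actually points at Logstash rather than directly at Elasticsearch. A sketch, reusing the my-own-domain:5044 endpoint from the test output in the question (replace with your own Logstash host):

```yaml
# filebeat.yml -- ship events to Logstash
# (host/port taken from the question's test output; an assumption,
# substitute your own Logstash endpoint)
output.logstash:
  hosts: ["my-own-domain:5044"]
```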

Configuring Elasticsearch not to be localhost

After installing Elasticsearch 5.6.3 and setting the node name to the server name, I tried to browse to Elasticsearch using IP:9200, but it didn't work. If I browse to localhost:9200, it works. Where do I go to change the default behaviour of localhost? I want to open this up to other external servers, so the loopback address of localhost isn't any good.
After installing Kibana 5.6.3, the same is obviously true here as well. Starting the Kibana server with the IP fails, but with localhost it doesn't.
At this point I have no indexes, I just want to prove Elasticsearch can be reached beyond localhost.
Thanks
Bill
You can configure your IP with the "network.host" setting in 'elasticsearch.yml' and 'kibana.yml' in your config directory.
Here are some links to the Elasticsearch docs to configure yours :)
Configuring Elasticsearch
Important Settings
For a quick start development configuration the following settings can be placed into 'elasticsearch.yml':
network.host e.g.
network.host: 192.168.178.49
cluster.initial_master_nodes e.g.
cluster.initial_master_nodes: ["node_1"]
You can also define a cluster name:
cluster.name: my-application
Start it with the node name (example for Windows)
C:\InstallFolder\elasticsearch-7.10.0>C:\InstallFolder\elasticsearch-7.10.0\bin\elasticsearch.bat -Enode.name=node_1
Go to your browser and open http://192.168.178.49:9200 (replace with your IP). It shows a JSON result. The localhost:9200 will no longer work.
This config should not be used for production environments. See the official docs.
In general when starting from a command prompt it shows any errors when something fails. These are very helpful.
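Since the question also asks about Kibana 5.6.3, the counterpart settings live in kibana.yml. A minimal sketch (the IP is the example address from above; note that in 5.x the setting is elasticsearch.url, which later versions renamed to elasticsearch.hosts):

```yaml
# kibana.yml -- values are examples, adjust to your own addresses
server.host: "0.0.0.0"                            # listen on all interfaces
elasticsearch.url: "http://192.168.178.49:9200"   # example IP from above
```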

How to deploy Elasticsearch in Predix?

I am trying to deploy Elasticsearch in Predix. I tried to push the downloaded Elasticsearch folder to Predix with the following manifest:
---
applications:
- name: elastic-search-test3
  buildpack: java_buildpack
  # path: target/elstic-search-test-1.0.0.jar
  command: elasticsearch-5.2.2/bin/elasticsearch -f
  # timeout: 180
I am getting an error saying the port should not be hard coded and that I should use $PORT instead.
Then I tried to set the port and host in the Elasticsearch config as follows:
http.port: ${VCAP_APP_PORT}
network.host: ${VCAP_APP_HOST}
but no luck.
Can someone point to solution to deploy elasticsearch on predix?
You should use the CF_INSTANCE_* variables instead of the deprecated VCAP_APP_* DEA ones on newer CF versions.
More on https://docs.cloudfoundry.org/devguide/deploy-apps/environment-variable.html
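A minimal sketch of what the Elasticsearch config fragment might look like with those variables (this assumes the Cloud Foundry runtime exports CF_INSTANCE_PORT and CF_INSTANCE_IP into the container's environment):

```yaml
# elasticsearch.yml -- sketch, assuming the CF runtime sets
# CF_INSTANCE_PORT and CF_INSTANCE_IP in the environment
http.port: ${CF_INSTANCE_PORT}
network.host: ${CF_INSTANCE_IP}
```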