Kibana shows a blank dashboard - elasticsearch

I'm trying to make Elasticsearch + Kibana work. For some reason I get a blank Kibana dashboard.
My config.js is the default file with only one line changed:
elasticsearch: "http://127.0.0.1:9200",
Elasticsearch itself is working correctly; http://127.0.0.1:9200 returns this JSON:
{
"status" : 200,
"name" : "Ikthalon",
"version" : {
"number" : "1.1.1",
"build_hash" : "f1585f096d3f3985e73456debdc1a0745f512bbc",
"build_timestamp" : "2014-04-16T14:27:12Z",
"build_snapshot" : false,
"lucene_version" : "4.7"
},
"tagline" : "You Know, for Search"
}
But why is my Kibana dashboard blank? Maybe this is because I open it via the URL file:///home/sergey/Desktop/kibana-3.1.1/index.html#/dashboard/file/default.json? If so, how do I make it work?

You could open the same file in Firefox and Kibana would work.
Chrome blocks loading it from the local filesystem as a security feature.

You need to serve Kibana from a web server rather than opening index.html directly. If you have Python installed you can use
cd /path/to/kibana
python -m SimpleHTTPServer
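Note that SimpleHTTPServer is the Python 2 module name; on Python 3 the equivalent (a minimal sketch, assuming python3 is on your PATH) is:
cd /path/to/kibana
python3 -m http.server 8000
Then browse to http://localhost:8000 instead of the file:// URL.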
Alternatively, if you are using Apache, you can put the Kibana source code in one of the following directories:
LAMP: /var/www
WAMP: C:/wamp/www

If you're using Logstash, there is an option to run Kibana embedded in Logstash. See the -a and -p flags here: http://logstash.net/docs/1.4.2/flags

JavaScript errors may have occurred. In Firefox I had two errors: the fontawesome-webfont.woff and logstash.json files could not be found. I added IIS MIME types for .woff and .json, and then the problem was resolved.
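For reference, the MIME mappings can be added in the site's web.config; a minimal sketch (the exact MIME type strings are common choices, not taken from the original answer):
<configuration>
  <system.webServer>
    <staticContent>
      <mimeMap fileExtension=".woff" mimeType="application/font-woff" />
      <mimeMap fileExtension=".json" mimeType="application/json" />
    </staticContent>
  </system.webServer>
</configuration>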

Related

How do I know if my changes in the elasticsearch.yml config file are reflected or not? I have switched script.inline and script.indexed to true

How do I know if my changes in the elasticsearch.yml config file are reflected or not? I have added the following two lines and restarted Elasticsearch.
script.inline : true
script.indexed : true
I get the same error even after commenting out these two lines, when I run a query with an inline script.
I have attached a screenshot of the query and the error message.
Thanks
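One way to check which settings a node actually picked up is the nodes settings API; a minimal check, assuming a default local node on port 9200:
curl -XGET 'http://localhost:9200/_nodes/settings?pretty'
Settings read from elasticsearch.yml should appear under each node's settings object in the response.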

How to know if X-Pack is installed in Elasticsearch?

I installed Elasticsearch from the Debian package and installed X-Pack in it.
Now, I want to verify that X-Pack was successfully installed.
Is there a simple way to verify this?
You can call
GET _cat/plugins?v
X-Pack comes pre-installed from Elasticsearch 6.3 onwards. Refer to https://www.elastic.co/what-is/open-x-pack for more info on this.
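On versions where X-Pack is a separate plugin, the same check via curl (assuming a local node) would be:
curl -XGET 'http://localhost:9200/_cat/plugins?v'
If X-Pack is installed, its modules appear in the component column of the output.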
You can check if X-Pack is installed using: curl -XGET 'http://localhost:9200/_nodes'
The relevant output snippet looks like this:
"attributes": {
"ml.machine_memory": "67447586816",
"xpack.installed": "true",
"transform.node": "true",
"ml.max_open_jobs": "512",
"ml.max_jvm_size": "27917287424"
}
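To narrow the response down to just the node attributes, Elasticsearch's filter_path parameter can help; a sketch, again assuming a local node:
curl -XGET 'http://localhost:9200/_nodes?filter_path=nodes.*.attributes&pretty'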

elasticsearch bulk script does not work even after elasticsearch.yml change

When I try to run a curl command like:
curl -s -XPOST localhost:9200/_bulk --data-binary "@bulk_prova.elastic"; echo
Where bulk_prova.elastic is:
{ "update" : {"_id" : "1", "_type" : "type1", "_index" : "indexName"} }{ "script" : "ctx._source.topic = \"topicValue\""}
I got this error
{"took":19872,"errors":true,"items":[{"update":{"_index":"indexName","_type":"type1","_id":"1","status":400,"error":{"type":"illegal_argument_exception","reason":"failed to execute script","caused_by":{"type":"script_exception","reason":"scripts of type [inline], operation [update] and lang [groovy] are disabled"}}}}]}
I searched for a solution and edited the elasticsearch.yml file to enable dynamic scripting, but every time I change the file and stop Elasticsearch, the service does not start again when I restart it.
Due to this strange behavior I do not know how to solve the issue.
I have version 2.2.0 and my intention is to add a field to one index (for now) or to several indices (once the problem is solved).
In Elasticsearch 2.3 the setting has been changed from:
script.disable_dynamic: false
to:
script.file: true
script.indexed: true
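Note that the bulk request above uses an inline script, so on 2.x you would likely also need inline scripting enabled; a minimal elasticsearch.yml sketch (assuming you accept the security implications of dynamic Groovy scripting):
script.inline: true
script.indexed: true
script.file: true
If the service refuses to start after editing the file, check the Elasticsearch log for a YAML parse error; bad indentation or stray characters in elasticsearch.yml commonly produce exactly that symptom.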

Elasticsearch 1.3.8 transform with groovy script file

Due to a security vulnerability in ES 1.3.4 we upgraded to ES 1.3.9, which disabled dynamic Groovy scripting. As a result the mapping transformations are failing with the error message "dynamic scripting for [groovy] disabled". I tried the approach in https://www.elastic.co/guide/en/elasticsearch/reference/1.3/modules-scripting.html of externalizing the script to a script file, but the script file is not getting invoked or the transformation is not working. How can we achieve the transformation using a script file?
Mapping file transform is as follows:
"transform" : [{
"script" : "ctx._source['downloadCountInt'] = (ctx._source['downloadCount']==null)? ctx._source['downloadCount'] : ctx._source['downloadCount'].replaceAll(/\\D/, '');",
"lang" : "groovy"
}]
I tried putting the script ctx._source['downloadCountInt'] = (ctx._source['downloadCount']==null)? ctx._source['downloadCount'] : ctx._source['downloadCount'].replaceAll(/\\D/, ''); into a script file named "transform_download_count.groovy" in /etc/elasticsearch/scripts/, and the log messages show that it was compiled correctly, but the transformation is never invoked.
With the script file /etc/elasticsearch/scripts/transform_download_count.groovy in place try:
"transform" : {
"script_file" : "transform_download_count",
"lang" : "groovy"
}
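For context, a sketch of where this block sits in a full mapping; the index name, type name, and field types here are illustrative assumptions, not from the original question:
curl -XPUT 'http://localhost:9200/myindex' -d '
{
  "mappings": {
    "mytype": {
      "transform": {
        "script_file": "transform_download_count",
        "lang": "groovy"
      },
      "properties": {
        "downloadCount": { "type": "string" },
        "downloadCountInt": { "type": "long" }
      }
    }
  }
}'
Also note that mapping transforms run at index time only, so documents indexed before the transform was added will not be transformed until they are reindexed.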

MongoDB How to find out data directory using Java driver

I am using an instance of MongoDB with just one node. I would like to write a web service that fsyncs the data files and zips them into a backup folder.
Ideally, I would get the location of the data directory programmatically (rather than reading a config file) so I can easily port this from a development to a production machine, where the installation paths differ. Is there any way to do this using the Java driver?
Try running the following in the mongo shell, as outlined here, and then playing with the returned data:
use admin
db.runCommand({getCmdLineOpts: 1})
Example return data is:
{
"argv" : [
"mongod",
"--port",
"6669",
"--dbpath=c:\\data\\mongo2",
"--rest"
],
"parsed" : {
"dbpath" : "c:\\data\\mongo2",
"port" : 6669,
"rest" : true
},
"ok" : 1
}
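Since the question asks about the Java driver specifically, here is a minimal sketch of issuing the same command through it; the host, port, and output path are assumptions (on newer servers the path may live under parsed.storage.dbPath instead):
import com.mongodb.MongoClient;
import com.mongodb.client.MongoDatabase;
import org.bson.Document;

public class DataDirLookup {
    public static void main(String[] args) {
        // Connect to the single-node instance (host/port assumed)
        MongoClient client = new MongoClient("localhost", 27017);
        try {
            MongoDatabase admin = client.getDatabase("admin");
            // Same command as the shell example above
            Document opts = admin.runCommand(new Document("getCmdLineOpts", 1));
            Document parsed = (Document) opts.get("parsed");
            // With the example output above, dbpath sits directly under "parsed"
            System.out.println(parsed.getString("dbpath"));
        } finally {
            client.close();
        }
    }
}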
You could use mongoexport to get the data; run it from the production machine and specify the host/port/collection of the development machine. The data can be imported to the production machine using mongoimport.
