Running Kibana and Elasticsearch under Mongrel2 - https

I have a Vagrant VM running Elasticsearch and Kibana. Kibana is currently being served by Mongrel2 as per the following configuration:
kibana_directory = Dir(base='kibana/',
                       index_file='index.html',
                       default_ctype='text/plain')

myhost = Host(name='localhost', routes={
    '/logs/': kibana_directory
})
Elasticsearch is running on port 9200. When I try to view Kibana, however, it doesn't load properly, giving me the following error in the browser:
[blocked] The page at 'https://dev.demo.vm/logs/' was loaded over HTTPS, but
ran insecure content from 'http://dev.demo.vm:9200/_nodes': this content should
also be loaded over HTTPS.
The problem seems obvious enough: Mongrel2 is serving content over HTTPS but Elasticsearch is queried over HTTP, so Kibana (served over HTTPS) cannot communicate with it. I want to keep HTTPS for Mongrel2, but I don't know what to do to get Kibana working. Should I be using a proxy of some sort with Mongrel2? Is there a straightforward solution?
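A common fix (sketched below, not tested against this setup) is to serve Elasticsearch through the same HTTPS origin by adding a Mongrel2 Proxy route next to the Dir route; the '/es/' path is an assumption:

```
# Mongrel2 config sketch: proxy Elasticsearch through the same HTTPS host
# so the browser never makes a plain-http request
es_proxy = Proxy(addr='127.0.0.1', port=9200)

myhost = Host(name='localhost', routes={
    '/logs/': kibana_directory,
    '/es/': es_proxy
})
```

Kibana would then be configured to query https://dev.demo.vm/es instead of http://dev.demo.vm:9200 (in Kibana 3 this is the elasticsearch setting in config.js), so all traffic stays on HTTPS.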

Related

ElasticSearch Kibana version 7 not displaying dashboard because of browser security or other issue

I believe there is something wrong with Kibana 7.0.1-linux-x86_64. It will not display the dashboard in the browser, but it will respond with curl using the public IP address.
If you open this URL it times out: http://ec2-35-180-186-122.eu-west-3.compute.amazonaws.com:5601/app/kibana . But I know that Kibana is listening on that port because I can reach it with curl (of course that's just a test, as curl cannot load the graphics).
At first I thought this was a browser issue, as I got it to work with Chrome incognito and Safari, but I could not get it working again. So I started over with the default configuration and changed only one item in kibana.yml, server.host: "172.31.46.15", so that it could be reached from the internet.
When I run
curl http://ec2-35-180-186-122.eu-west-3.compute.amazonaws.com:5601/app/kibana
it responds with this message, showing that at least it is listening:
This Kibana installation has strict security requirements enabled that your current browser does not meet.
Kibana stdout says:
{"type":"log","@timestamp":"2019-05-20T13:21:28Z","tags":["listening","info"],"pid":15611,"message":"Server running at http://172.31.46.15:5601"}
{"type":"log","@timestamp":"2019-05-20T13:21:28Z","tags":["status","plugin:spaces@7.0.1","info"],"pid":15611,"state":"green","message":"Status changed from yellow to green - Ready","prevState":"yellow","prevMsg":"Waiting for Elasticsearch"}
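One thing worth noting: the "strict security requirements" text that curl returns is part of the static page Kibana serves to every client, shown only when a browser fails Kibana's Content Security Policy check, so seeing it in curl output confirms Kibana is up rather than indicating an error. If an old browser is actually being rejected by the CSP check, it can be relaxed in kibana.yml; a minimal sketch assuming the question's setup (disabling the check weakens security and is only advisable for testing):

```
# kibana.yml sketch — server.host as in the question
server.host: "172.31.46.15"
# csp.strict, when true, blocks browsers that fail the CSP capability check;
# setting it to false lets them load Kibana (a security trade-off)
csp.strict: false
```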

How to Access ElasticSearch From Server?

I am using Elasticsearch on my Ubuntu 16.04 server. When I try to access Elasticsearch from the browser using the URL ip:port/_cat/indices?v, I get "site can't be reached". I then changed the network.host value to network.host: 0.0.0.0, but after that change Elasticsearch would not start. I also tried changing the port. How can I access Elasticsearch in my browser?
Thank you.
There can be many reasons for ES not being reachable. I would start with the obvious and make sure that:
ES is listening on the port: on the ES instance, running 'curl ip:port' should get an answer; if not, the service didn't start properly.
There are no firewall rules/security groups that prevent access from the remote network.
network.publish_host is configured correctly: https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-network.html#advanced-network-settings
more info here: ElasticSearch instance not reachable from outside the server - Azure Windows 2012
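A likely reason the node stopped starting after the network.host change: binding Elasticsearch to a non-loopback address switches it into production mode, where bootstrap checks (discovery configuration, vm.max_map_count, and so on) become fatal instead of warnings. A minimal elasticsearch.yml sketch for a single test node, assuming a version recent enough to support discovery.type:

```
# elasticsearch.yml sketch: expose ES on all interfaces for testing
network.host: 0.0.0.0
http.port: 9200
# binding to a non-loopback address enforces bootstrap checks;
# declaring a single-node cluster satisfies the discovery check
discovery.type: single-node
```

After a change like this, the startup logs (journalctl -u elasticsearch or the ES log file) will name any remaining bootstrap check that fails.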

Cannot access data source of elasticsearch using grafana cloud

I'm using Grafana Cloud for creating visualizations, but when I try to add Elasticsearch as a data source I'm getting a 502 error.
502 usually means Bad Gateway (there is no connection), and that IP address looks like an internal IP address. GrafanaCloud is a cloud service, so it does not have access to internal IP addresses.
Your options are:
Install Grafana locally if you do not want to open up anything over the internet.
Use direct mode instead of proxy mode. This means that requests will go directly from your browser to the elasticsearch server and not go through the Grafana backend server. However, GrafanaCloud is on https so you will get a mixed content warning and you would need to solve that by having a proxy in front of your elasticsearch server (or by setting up https for your server).
Make your server accessible over the internet. Setup a static IP address for your elasticsearch server, setup firewall rules etc. so that GrafanaCloud can query your server.
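For the direct-mode option, the proxy in front of Elasticsearch could be sketched with nginx terminating TLS, which removes the mixed-content problem; the hostname and certificate paths below are placeholders:

```
# nginx sketch: HTTPS termination in front of a local Elasticsearch,
# so requests from an https page are no longer mixed content
server {
    listen 443 ssl;
    server_name es.example.com;                  # placeholder hostname

    ssl_certificate     /etc/nginx/ssl/es.crt;   # placeholder cert paths
    ssl_certificate_key /etc/nginx/ssl/es.key;

    location / {
        proxy_pass http://127.0.0.1:9200;        # local Elasticsearch
    }
}
```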
Add the following configurations in config/elasticsearch.yml:
transport.host: localhost
transport.tcp.port: 9300
http.port: 9200
network.host: 0.0.0.0

Kibana asking for credentials

I set up a Kibana server that is accessing an external Elasticsearch datasource. Nginx is on top, and I access Kibana through it.
On the initial config, I set up the credentials of Kibana using:
sudo htpasswd -c /etc/nginx/htpasswd.users kibanaadmin
Then I was able to access the Kibana web console and see it running. However, the external Elasticsearch server was not configured, so I edited the kibana.yml file to point to that external server.
elasticsearch.url: "https://bluemix-sandbox-dal-9-portal0.dblayer.com:18671/"
elasticsearch.username: "admin"
elasticsearch.password: "mypass"
When I restarted Kibana, it was able to connect to the Elasticsearch server, and in fact it seems that it wrote an entry to the index there.
However, now I am asked for credentials to connect to the Kibana web interface. They are not the kibanaadmin credentials I set up previously, nor the ones in the Elasticsearch database. Which credentials should I use?
Are you sure you're not pointing Kibana at the wrong ES instance, and that both Kibana and Nginx are running on the same server? I haven't tried it out personally, but the links below could be handy.
Enabling Kibana Authentication with Nginx
Securing Elasticsearch, Kibana with nginx
Git- Kibana with Nginx Reverse Proxy
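For reference, the linked setups generally amount to an nginx server block like the following sketch, reusing the htpasswd file created above (the hostname is a placeholder; 5601 is Kibana's default port):

```
# nginx sketch: basic auth in front of Kibana
server {
    listen 80;
    server_name kibana.example.com;              # placeholder

    auth_basic "Restricted Access";
    auth_basic_user_file /etc/nginx/htpasswd.users;

    location / {
        proxy_pass http://127.0.0.1:5601;        # Kibana's default port
        proxy_http_version 1.1;
        proxy_set_header Host $host;
    }
}
```

Note that elasticsearch.username/password in kibana.yml are used only for Kibana's own backend connection to Elasticsearch; a second browser prompt usually means another layer (for example, security on the Elasticsearch service itself) is challenging the request.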

How to allow requests to elasticsearch only from a list of ips/domains

I read the docs, but I couldn't make it work.
I have a server that holds Elasticsearch and external servers that query it. Right now Elasticsearch can be accessed from any IP.
Example:
the public ip:port of elasticsearchserver: 123.123.123.123:9200
I have the domains: anothersocialnetwork.com and anothersocialnetwork2.com
and I want only them and localhost to be able to query the elasticsearch server.
Thank you a lot.
There are multiple ways to achieve this. The one I would advise is as follows:
Run Elasticsearch on the localhost interface by setting network.host to localhost in the elasticsearch.yml file. Now only applications on localhost can access it.
Place a proxy like nginx or Apache in front; the proxy would be able to access Elasticsearch. Now whitelist in the proxy the IPs you want to allow to access Elasticsearch.
You can also take a look at the Elasticsearch jetty plugin. It has some security configurations, but I am not sure if it's actively developed.
Also, on securing Elasticsearch, I would recommend going through this blog.
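The whitelisting step can be sketched with nginx allow/deny rules. nginx filters by address, not by domain, so the two domains would first have to be resolved to their IPs; the addresses below are placeholders, and 123.123.123.123 is the public IP from the question:

```
# nginx sketch: ES is bound to localhost; nginx exposes it on the
# public IP and only whitelisted clients get through
server {
    listen 123.123.123.123:9200;

    location / {
        allow 127.0.0.1;        # localhost
        allow 203.0.113.10;     # resolved IP of anothersocialnetwork.com (placeholder)
        allow 203.0.113.11;     # resolved IP of anothersocialnetwork2.com (placeholder)
        deny  all;

        proxy_pass http://127.0.0.1:9200;   # Elasticsearch on localhost
    }
}
```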
