Connect Kibana to Elasticsearch: ELASTICSEARCH_URL vs ELASTICSEARCH_HOSTS

I don't know which environment variable to use:
version: '2'
services:
  kibana:
    image: docker.elastic.co/kibana/kibana:6.2.4
    environment:
      SERVER_NAME: kibana.example.org
      ELASTICSEARCH_HOSTS: http://ip-xxx-31-9-xxx.us-west-2.compute.internal:9200
      ELASTICSEARCH_URL: http://ip-xxx-31-9-xxx.us-west-2.compute.internal:9200
Should I be using ELASTICSEARCH_URL or ELASTICSEARCH_HOSTS?

Since you are using the Kibana 6.2.4 Docker image, it has to be ELASTICSEARCH_URL. In the official guide for configuring Kibana 6.2, the ELASTICSEARCH_HOSTS setting is not even listed; it was introduced in later versions.
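So a minimal sketch for 6.2.4 keeps only ELASTICSEARCH_URL (the internal hostname below is the placeholder from the question):
version: '2'
services:
  kibana:
    image: docker.elastic.co/kibana/kibana:6.2.4
    environment:
      SERVER_NAME: kibana.example.org
      # 6.x Docker images read ELASTICSEARCH_URL; ELASTICSEARCH_HOSTS is ignored here
      ELASTICSEARCH_URL: http://ip-xxx-31-9-xxx.us-west-2.compute.internal:9200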

Related

Could not communicate to Elasticsearch

I am trying to send my Node app logs through Fluentd to Elasticsearch and on to Kibana, but I am having a problem connecting Fluentd to Elasticsearch with Docker. I want to dockerize this EFK stack.
I have attached the folder structure (not reproduced here) and shared the relevant files below.
Error
Could not communicate to Elasticsearch, resetting connection and trying again. Connection refused - connect(2) for 172.20.0.2:9200 (Errno::ECONNREFUSED)
fluent.conf:
<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>

<match *.**>
  @type copy
  <store>
    @type elasticsearch
    host elasticsearch
    port 9200
    user elastic
    password pass
  </store>
</match>
Dockerfile:
FROM fluent/fluentd:v1.15-1
USER root
# elasticsearch client gem pinned to a 7.x release to match the ES 7.8.1 server
RUN gem install elasticsearch -v 7.6.0
# RUN gem install fluent-plugin-elasticsearch -v 7.6.0
RUN gem install fluent-plugin-elasticsearch -v 4.1.1
RUN gem install fluent-plugin-rewrite-tag-filter
RUN gem install fluent-plugin-multi-format-parser
USER fluent
docker-compose.yml:
version: '3'
services:
  fluentd:
    build: ./fluentd
    container_name: loggingFluent
    volumes:
      - ./fluentd/conf:/fluentd/etc
      # - ./fluentd/conf/fluent.conf:/fluentd/etc/fluent.conf
    ports:
      - "24224:24224"
      - "24224:24224/udp"
    links:
      - elasticsearch
    depends_on:
      - elasticsearch
      - kibana
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.8.1
    container_name: elasticsearch-Logging
    ports:
      - 9200:9200
    expose:
      - 9200
    environment:
      discovery.type: 'single-node'
      ES_JAVA_OPTS: '-Xms1024m -Xmx1024m'
      xpack.security.enabled: 'true'
      ELASTIC_PASSWORD: 'pass'
  kibana:
    image: docker.elastic.co/kibana/kibana:7.8.1
    container_name: kibana-Logging
    volumes:
      - ./kibana.yml:/usr/share/kibana/config/kibana.yml
    ports:
      - 5601:5601
    depends_on:
      - elasticsearch
    links:
      - elasticsearch
Maybe I am missing something with Docker networking, since I am using Docker for the first time. I have checked the ports exposed by the containers and they are fine. I have done this without Docker using the same settings, but I am having problems doing it with Docker. Looking forward to your responses. Thank you very much.
Adding a username to the Elasticsearch environment solved the issue:
elasticsearch-environment
environment:
  discovery.type: 'single-node'
  ES_JAVA_OPTS: '-Xms1024m -Xmx1024m'
  xpack.security.enabled: 'true'
  ELASTIC_PASSWORD: 'pass'
  ELASTIC_USERNAME: 'elastic'
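If the connection is refused because Fluentd starts before Elasticsearch is ready, note that depends_on only waits for the container to start, not for the service to accept connections. A hedged sketch of a readiness gate (the condition syntax assumes a Compose version that supports it, and it reuses the elastic/pass credentials from the example above):
services:
  elasticsearch:
    # ...as above...
    healthcheck:
      # the official ES image ships curl; probe with the same credentials
      test: ["CMD-SHELL", "curl -fs -u elastic:pass http://localhost:9200 || exit 1"]
      interval: 10s
      timeout: 5s
      retries: 12
  fluentd:
    depends_on:
      elasticsearch:
        condition: service_healthy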

Unable to configure Alerts and Actions in Kibana

I'm using a Docker Compose file for an ELK setup with a recent Kibana version (above 7). I set the xpack.encryptedSavedObjects.encryptionKey parameter in kibana.yml so that I can use the Alerts and Actions feature, but even after that I'm not able to create an alert. Can anyone help me, please?
I generated a 32-character encryption key using the Python uuid module.
According to https://github.com/elastic/kibana/issues/57773, support for the environment variable XPACK_ENCRYPTEDSAVEDOBJECTS_ENCRYPTIONKEY was missing from the Kibana Docker config. A fix was merged in February 2020, and it now works.
The encryption key XPACK_ENCRYPTEDSAVEDOBJECTS_ENCRYPTIONKEY has to be 32 characters or longer: https://www.elastic.co/guide/en/kibana/current/using-kibana-with-security.html
A working configuration could look like this:
...
  kibana:
    depends_on:
      - elasticsearch
    image: docker.elastic.co/kibana/kibana:8.0.0-rc2
    container_name: kibana
    environment:
      - ...
      - SERVER_PUBLICBASEURL=https://kibana.stackoverflow.com/
      - XPACK_ENCRYPTEDSAVEDOBJECTS_ENCRYPTIONKEY=a7a6311933d3503b89bc2dbc36572c33a6c10925682e591bffcab6911c06786d
      - ...
...
I have tried using the environment variable in my docker-compose.yml file as:
kib01:
  image: docker.elastic.co/kibana/kibana:${VERSION}
  container_name: kib01
  depends_on: {"es01": {"condition": "service_healthy"}}
  ports:
    - 5601:5601
  environment:
    SERVER_NAME: localhost
    ELASTICSEARCH_URL: https://es01:9200
    ELASTICSEARCH_HOSTS: https://es01:9200
    XPACK_ENCRYPTEDSAVEDOBJECTS_ENCRYPTIONKEY: "743787217A45432B462D4A614EF35266"
  volumes:
    - /var/elasticsearch/config/certs:$CERTS_DIR
  networks:
    - elastic
We converted the setting xpack.encryptedSavedObjects.encryptionKey into the environment-variable format XPACK_ENCRYPTEDSAVEDOBJECTS_ENCRYPTIONKEY by replacing each . with _ and uppercasing the rest.
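The same dot-to-underscore, all-caps convention applies to other kibana.yml settings as well, for example:
# kibana.yml setting                        -> environment variable
# server.name                               -> SERVER_NAME
# elasticsearch.hosts                       -> ELASTICSEARCH_HOSTS
# xpack.encryptedSavedObjects.encryptionKey -> XPACK_ENCRYPTEDSAVEDOBJECTS_ENCRYPTIONKEY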
Maybe there is a problem with mounting the file; I opted for environment variables in my docker-compose file instead.
services:
  kibana:
    ...
    environment:
      ...
      XPACK_ENCRYPTEDSAVEDOBJECTS_ENCRYPTIONKEY: abcd...

Why does Elasticsearch on Docker Swarm require a transport.host=localhost setting?

I'm trying to run Elasticsearch on a Docker swarm. It works as a single-node cluster for now, but only when the transport.host=localhost setting is included. Here is the main part of docker-compose.yml:
version: "3"
services:
elasticsearch:
image: "elasticsearch:7.4.1" #(base version)
hostname: elasticsearch
ports:
- "9200:9200"
environment:
- cluster.name=elasticsearch
- bootstrap.memory_lock=true
- ES_JAVA_OPTS=-Xms512m -Xmx512m
- transport.host=localhost
volumes:
- "./elasticsearch/volumes:/usr/share/elasticsearch/data"
networks:
- logger_net
volumes:
logging:
networks:
logger_net:
external: true
The above configuration results in a yellow cluster state (because some indices want an additional replica).
The Elasticsearch status page is unavailable when I use the IP of the Elasticsearch container in the transport.host setting, or when I omit transport.host=localhost entirely.
I think that using transport.host=localhost is wrong. Is there a proper way to configure Elasticsearch in Docker Swarm?
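For context: binding the transport layer to localhost keeps Elasticsearch in development mode, so the production bootstrap checks are not enforced. The usual alternative for a one-node setup is single-node discovery, which also skips enforcement while letting HTTP bind normally. A hedged sketch of the environment block (assuming ES 7.x and that vm.max_map_count is raised on the swarm hosts):
environment:
  - cluster.name=elasticsearch
  - bootstrap.memory_lock=true
  - ES_JAVA_OPTS=-Xms512m -Xmx512m
  # replaces transport.host=localhost; the node elects itself master
  - discovery.type=single-node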

Start ElasticSearch in Wercker

We have a Ruby project where we use Wercker for continuous integration.
We need to start an Elasticsearch service in order to run some integration tests.
Locally, we added the Elasticsearch configuration to the docker-compose file and everything runs smoothly:
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.5.1
    container_name: elasticsearch
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"
      - "9300:9300"
In the wercker.yml file, we tried several things, but we cannot reach the Elasticsearch service.
Our wercker.yml contains:
services:
  - id: elasticsearch:6.5.1
    env:
    ports:
      - "9200:9200"
      - "9300:9300"
We get this kind of error when trying to use Elasticsearch in our tests:
Errno::EADDRNOTAVAIL: Failed to open TCP connection to localhost:9200 (Cannot assign requested address - connect(2) for "localhost" port 9200)
Do you have any idea of what we are missing?
So, we found a solution:
In wercker.yml:
services:
  - id: elasticsearch:6.5.1
    cmd: "/elasticsearch/bin/elasticsearch -Ediscovery.type=single-node"
And we added a step to check the connection:
build:
  steps:
    - script:
        name: Test elasticsearch connection
        code: curl http://elasticsearch:9200
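If that check still races Elasticsearch's startup, a hedged variant that retries for up to a minute before failing:
build:
  steps:
    - script:
        name: Wait for elasticsearch
        code: |
          # poll until ES answers (or give up after ~60s)
          for i in $(seq 1 30); do
            curl -fs http://elasticsearch:9200 && exit 0
            sleep 2
          done
          exit 1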

Configure Elasticsearch and Kibana with Docker

I'm working with Docker for the first time.
I successfully installed Elasticsearch and Kibana on Docker, but when I try to connect Kibana to Elasticsearch I get a red status with the following errors:
ui settings Elasticsearch plugin is red
plugin:elasticsearch#5.1.1 Authentication Exception
I'm not sure, but I think the problem is that Kibana doesn't pass the Elasticsearch X-Pack authentication.
Now I'm trying to disable this authentication via the Elasticsearch yml file, according to the instructions here.
But I can't find the yml file anywhere (I searched /usr/share/elasticsearch but can't find either a config directory or an elasticsearch.yml file).
How do I configure Elasticsearch with Docker?
P.S.
I'm working with Ubuntu 16.04.
For Debian/Ubuntu/Mint package installs, you can find the config files under the /etc folder:
/etc/elasticsearch/elasticsearch.yml
Take a look at: https://www.elastic.co/guide/en/elasticsearch/reference/2.4/setup-dir-layout.html
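Inside the official Docker image, however, the configuration lives under /usr/share/elasticsearch/config. A minimal sketch for overriding it (the local ./elasticsearch.yml is a hypothetical file holding your settings, e.g. xpack.security.enabled: false):
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:5.1.1
    volumes:
      # bind-mount a local config file over the image's default
      - ./elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml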
I'm wondering why this is even happening. With the following docker-compose.yml it's working fine for me with security enabled:
---
version: '2'
services:
  kibana:
    image: docker.elastic.co/kibana/kibana:5.1.1
    links:
      - elasticsearch
    ports:
      - 5602:5601
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:5.1.1
    cap_add:
      - IPC_LOCK
    volumes:
      - esdata1:/usr/share/elasticsearch/data
    ports:
      - 9201:9200
volumes:
  esdata1:
    driver: local
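Presumably this works with security enabled because the 5.x images ship with matching defaults (user elastic, password changeme). If you change the password, pass the credentials to Kibana explicitly; a hedged sketch:
kibana:
  image: docker.elastic.co/kibana/kibana:5.1.1
  environment:
    # credentials Kibana uses to reach Elasticsearch (5.x defaults shown)
    ELASTICSEARCH_USERNAME: elastic
    ELASTICSEARCH_PASSWORD: changeme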
I successfully ran Elasticsearch and Kibana using the official Elastic Docker images. Somehow, the container version in the official Elastic documentation didn't work for me.
If you prefer to start the containers using docker run rather than a compose file (only use this for dev environments; it is not recommended for prod):
docker network create elastic
docker run --network=elastic --name=elasticsearch docker.elastic.co/elasticsearch/elasticsearch:5.2.2
docker run --network=elastic -p 5601:5601 docker.elastic.co/kibana/kibana:5.2.2
A brief description can be found here:
https://discuss.elastic.co/t/kibana-docker-image-doesnt-connect-to-elasticsearch-image/79511/4
