Proxy_url for kibana in metricbeat.yml - elasticsearch

My server requires a proxy to connect to Kibana. How can I specify that in the metricbeat.yml file? For output.elasticsearch there is an attribute called proxy_url, but I can't see one for Kibana.

From Metricbeat you can send beats data to Elasticsearch as well as set up dashboards in Kibana.
Do you need the proxy only for the dashboard setup? If so, the proxy environment variables will work for you.
On Linux/Unix, simply export the proxy variables as below:
export http_proxy=http://host:port/
export https_proxy=http://host:port/
and then run the setup, for example:
./metricbeat setup --dashboards
For sending only beats data, you already have proxy_url under the Elasticsearch output module.
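A minimal sketch of that output block, with placeholder host and proxy addresses:
output.elasticsearch:
  hosts: ["https://elasticsearch.example.com:9200"]
  proxy_url: http://proxy.example.com:3128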

Related

Elasticsearch Fleet Server Outputs > Specify where agents will send data

Is it possible to send data to an external Elasticsearch deployment with Fleet Server?
I have tried with the Kibana Fleet UI settings, but there are no username and password fields for the connection. If I specify those in the Advanced YAML configuration, I get this error: cannot set both api_key and username/password accessing 'elasticsearch'
Fleet > Settings > Outputs | Specify where agents will send data
I can see the Kibana Fleet setting xpack.fleet.outputs > config, described as "Extra config for that output", for setting this manually, but there is no example of how to set this config variable.
Kibana version: kibana:8.5.3
Elasticsearch version: elasticsearch:8.5.3
Install method: Elastic ECK 2.6
Agents don't support sending logs to a remote cluster, so you won't be able to send data to an external Elasticsearch per se.
However, you can either opt for Beats and provide the list of ES hosts where you want to send the logs, OR use Logstash to receive input from the Agent and configure its output with the list of ES hosts.
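A rough sketch of the Logstash route, with placeholder hosts and credentials:
input {
  elastic_agent {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => ["https://external-es.example.com:9200"]
    user => "logstash_writer"
    password => "changeme"
  }
}
If your Logstash version does not ship the elastic_agent input, the classic beats input works the same way on the same port; the agent policy then needs a Logstash output pointing at that port on the Logstash host.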

Using Logstash to pass airflow logs to Elasticsearch

When using Logstash to retrieve Airflow logs from a folder you have access to, would I still need to make any changes in the airflow.cfg file?
For instance, I have Airflow and ELK deployed on the same EC2 instance. The Logstash .conf file has access to the Airflow logs path since they are on the same instance. Do I need to turn on remote logging in the Airflow config?
In fact, you have two options to push Airflow logs to Elasticsearch:
Using a log collector (Logstash, Fluentd, ...) to collect the Airflow logs and send them to the Elasticsearch server. In this case you don't need to change any Airflow config; you can just read the logs from the files or stdout and send them to ES.
Using the Airflow remote logging feature. In this case Airflow will log directly to your remote logging server (ES in your case) and will keep a local copy of the log to show when the remote server is unavailable.
So the answer to your question is no: if you have Logstash, you don't need the Airflow remote logging config.
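A minimal Logstash pipeline sketch for the first option, assuming a placeholder log path, ES host, and index name:
input {
  file {
    # adjust to wherever Airflow writes its task logs
    path => "/opt/airflow/logs/**/*.log"
    start_position => "beginning"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "airflow-logs-%{+YYYY.MM.dd}"
  }
}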

Kibana asking for credentials

I set up a Kibana server that is accessing an external Elasticsearch datasource. Nginx is on top, and I access Kibana through it.
On the initial config, I set up the credentials of Kibana using:
sudo htpasswd -c /etc/nginx/htpasswd.users kibanaadmin
Then I was able to access the Kibana web console and see it running. However, the external Elasticsearch server was not configured, so I edited the kibana.yml file to point to that external Elasticsearch server:
elasticsearch.url: "https://bluemix-sandbox-dal-9-portal0.dblayer.com:18671/"
elasticsearch.username: "admin"
elasticsearch.password: "mypass"
When I restarted Kibana, it was able to connect to the Elasticsearch server, and in fact it seems that it wrote an entry to the index there.
However, now I am asked for credentials to connect to the Kibana web interface. They are not the kibanaadmin credentials I set up previously, nor the ones in the Elasticsearch database. Which credentials should I use?
Are you sure you're not pointing Kibana at the wrong ES instance, and that both Kibana and Nginx are running on the same server? I haven't tried it out personally, but the links below could be handy.
Enabling Kibana Authentication with Nginx
Securing Elasticsearch, Kibana with nginx
Git- Kibana with Nginx Reverse Proxy
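For reference, a typical Nginx front end along the lines of those guides looks roughly like this (server name and Kibana address are placeholders):
server {
    listen 80;
    server_name kibana.example.com;

    auth_basic "Restricted Access";
    auth_basic_user_file /etc/nginx/htpasswd.users;

    location / {
        proxy_pass http://localhost:5601;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}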

Packetbeat dashboard for Application logs

Can Packetbeat be used to monitor Tomcat server logs and Windows logs, or will it only monitor the database, i.e. do network monitoring?
Packetbeat only does network monitoring. But you can use it together with Logstash or Logstash Forwarder to also get visibility into your logs.
It will do only network monitoring. You can use ELK for Tomcat server logs.
@tsg is correct, but with the Beats 1.x release they are deprecating Logstash Forwarder in favor of another Beat called Filebeat. They also added Topbeat, which allows you to monitor server load and processes in your cluster.
See:
* https://www.elastic.co/blog/beats-1-0-0
You will likely want to install the package repo for your OS, then install each with:
{package manager cmd} install packetbeat
{package manager cmd} install topbeat
{package manager cmd} install filebeat
They are each installed in common directories. For example, on Ubuntu (Linux) the config files are in /etc/<beat name>/<beat name>.yml, where beat name is one of the 3 above. Each file is similar: you can disable the direct ES export and instead export to Logstash (comment out ES and uncomment Logstash), and then add a beats input in your Logstash config. From then on, Logstash listens for any Beats on that port and can redistribute (or queue) the events, using the [@metadata][beat] field to tell where they came from; see the sketch below.
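A rough sketch of that wiring, assuming placeholder hosts and the default Beats port (the exact YAML layout varies between Beats versions):

Beat config excerpt (e.g. /etc/filebeat/filebeat.yml):
output:
  #elasticsearch:
  #  hosts: ["localhost:9200"]
  logstash:
    hosts: ["localhost:5044"]

Logstash pipeline:
input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
}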
Libbeat also provides a framework to build your own Beat, so you can send any data you want to Logstash, and it can queue and/or index it. ;-)
Packetbeat is used mainly for network analysis. It currently supports the following protocols:
ICMP (v4 and v6)
DNS
HTTP
MySQL
PostgreSQL
Redis
Thrift-RPC
MongoDB
Memcache
However, for visualizing Tomcat logs you can configure Tomcat to use log4j, configure Logstash to take its input from log4j, and then use Elasticsearch and Kibana to visualize the logs.
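A minimal sketch of that log4j route, assuming the default SocketAppender port (the log4j input plugin was later deprecated in favor of Filebeat, so treat this as era-appropriate):
input {
  log4j {
    # Tomcat's log4j SocketAppender should point at this host/port
    port => 4560
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "tomcat-logs-%{+YYYY.MM.dd}"
  }
}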
To monitor Windows logs you can use another Beat, Winlogbeat.

Getting Logstash to talk to Elastic Search with HTTPS + Basic auth

I have Elasticsearch as part of the ELMA appliance. This appliance presents ES via HTTPS protected by basic auth. I have Logstash running on a separate machine. This Logstash needs to send log data to ES. What is the right output configuration to use?
Thanks for any pointers.
-Raj
There is an option in the newer versions:
http://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html#plugins-outputs-elasticsearch-password
ssl
Value type is boolean
Default value is false
Enable SSL (SSL configurations apply to the HTTP protocol only)
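Putting the relevant options together, a Logstash output along these lines should work; the host, credentials, and CA path below are placeholders for your ELMA appliance:
output {
  elasticsearch {
    hosts => ["https://elma.example.com:9200"]
    ssl => true
    user => "logstash"
    password => "secret"
    # point at the appliance's CA if it uses a self-signed certificate
    cacert => "/etc/logstash/elma-ca.pem"
  }
}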
It looks like Logstash's elasticsearch_http module does not support SSL, or does not handle self-signed certs. My solution was to disable SSL on the Elasticsearch httpd conf entry in the ELMA appliance.
