Elasticsearch Filebeat

I'm new to Elasticsearch and I'm trying to integrate ES into our infrastructure. I installed one central ES server (6.0) with Elasticsearch, Kibana ....
The first task I want to accomplish is sending Apache log files from other servers to this ES server.
From the description of Filebeat, it seems this module does exactly what I want (lightweight shipping of log files to the ES server):
https://www.elastic.co/downloads/beats/filebeat
I installed Filebeat from the RPM on our server, but it does not seem to run because of missing plugins (GeoIP, UA). I tried to install these, but there is no "elasticsearch-plugin" executable available.
Do I have to install the whole ES package on every server whose log files I want to send to our ES server?
Or is there another way to send log files to the ES server and process fields like IP and UA on the server side?

It's not the only approach, but this is generally the best way to get started.
You're nearly there: the elasticsearch-plugin executable is located in /usr/share/elasticsearch/bin/ on the Elasticsearch server itself, not on the machines shipping logs. You need to install the GeoIP and UA (user agent) ingest plugins on every Elasticsearch node; the IP and UA fields are then processed server-side by an ingest pipeline. Once that's done, you should be able to use the Apache module in Filebeat without installing Elasticsearch on the other servers.
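For example, on a default RPM layout the plugin installation would look something like this (run on each Elasticsearch node, then restart it):

    sudo /usr/share/elasticsearch/bin/elasticsearch-plugin install ingest-geoip
    sudo /usr/share/elasticsearch/bin/elasticsearch-plugin install ingest-user-agent
    sudo systemctl restart elasticsearch

On the Apache servers you then only need Filebeat itself; enabling the module is roughly:

    filebeat modules enable apache2
    filebeat setup
    sudo systemctl start filebeat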

Related

Elastic Uptime Monitors using Heartbeat - Few monitors are missing in Kibana

I have the ELK stack set up on an EC2 server, with Beats like Metricbeat, Filebeat, and Heartbeat.
I have set up Elastic APM for some applications like Jenkins and SonarQube.
In Uptime I can now see only a few monitors, like SonarQube and Jenkins;
the other applications are missing.
When I look at yesterday's data, it is not available in Elasticsearch for particular applications.
The best way to troubleshoot what is going on is to check whether the events from Heartbeat are being collected. The Uptime application only displays events from Heartbeat, so that is the Beat you need to check.
First, check the connectivity of Heartbeat and the configured output:
heartbeat test output
Secondly, check whether events are being generated. You can do this by commenting out your existing output (likely Elasticsearch/Elastic Cloud) and enabling either the Console output or the File output. Then start Heartbeat and check whether events are generated. If they are, the problem may be on the backend side; maybe Elasticsearch is rejecting the documents sent and refusing to index them.
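For instance, the swap to the Console output in heartbeat.yml could look like this (the Elasticsearch host below is just a placeholder):

    # Temporarily comment out the real output:
    #output.elasticsearch:
    #  hosts: ["https://my-deployment.es.example.com:9243"]

    # ...and print events to stdout instead:
    output.console:
      pretty: true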
Incidentally, Elastic is implementing a native Jenkins plugin that lets you observe your CI pipeline using OpenTelemetry-compatible backends such as Elastic APM. You can learn more about this plugin here.

How to monitor an Elasticsearch cluster on Elastic Cloud with Datadog?

We have an Elasticsearch cluster deployed to Elastic Cloud and would like to send monitoring/health metrics to Datadog. What is the best way to do that?
It seems like our options are:
Installing the Datadog agent binary via the plugins upload
Using Metricbeat -> Logstash -> the datadog_metrics output
You can deploy the Datadog agent in a container/instance that you manage and then configure it according to these instructions to gather metrics from the remote Elasticsearch cluster hosted on Elastic Cloud. You need to create a conf.yaml file in the elastic.d/ directory and provide the required information (Elasticsearch endpoint/URL, username, password, port, etc.) for the agent to be able to connect to the cluster. You may find a sample configuration file here.
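As a rough sketch, the check configuration could look like this (the endpoint and credentials below are placeholders for your Elastic Cloud deployment):

    # conf.d/elastic.d/conf.yaml
    init_config:

    instances:
      - url: "https://my-cluster.es.us-east-1.aws.found.io:9243"
        username: "datadog"
        password: "<your-password>"
        cluster_stats: true
        pshard_stats: true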
As George Tseres mentioned above, the way I got this working was to set up collection on a separate instance (through Docker) and then configure it to read the specific Elastic Cloud instances.
I ended up making this: https://github.com/crwang/datadog-elasticsearch, building that Docker image, and then pushing it up to AWS ECR.
Then I spun up a Fargate service/task to run the container.
I also set it up to run locally with docker-compose as a test.

Sending log files/data from one EC2 instance to another

So I have one EC2 instance with Logstash, Elasticsearch, and Kibana installed on it, and another EC2 instance that runs a dummy Apache server. I know that I should install Filebeat on the Apache server instance to send the log files to the Logstash instance, but I'm not sure how to configure the files.
My main goal is to send the log files from one instance to the other for processing and viewing, i.e. in ES and Kibana. Any help or advice is greatly appreciated.
Thanks in advance!
Cheers!
As you have already stated, the easiest way to send log events from one machine to an Elastic instance is to install the Filebeat agent on the machine where Apache is running.
Filebeat has its own Apache module that makes the configuration even easier! In the module configuration you specify the paths of the desired log files, as in the sketch below.
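For illustration, the module configuration in modules.d/apache2.yml might look like this (the log paths are placeholders for wherever your Apache writes its logs):

    - module: apache2
      access:
        enabled: true
        var.paths: ["/var/log/apache2/access.log*"]
      error:
        enabled: true
        var.paths: ["/var/log/apache2/error.log*"]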
Then you also need to configure Filebeat itself. In filebeat.yml you define the Logstash destination under
output.logstash
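A minimal sketch, assuming your Logstash instance listens on the conventional Beats port 5044 (replace the host with your Logstash instance's address):

    # filebeat.yml
    output.logstash:
      hosts: ["10.0.1.23:5044"]

On the Logstash side, the matching Beats input would be:

    input {
      beats {
        port => 5044
      }
    }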
This configuration guide goes into more detail.
Take a look at the filebeat.yml reference for all configuration settings.
If you are familiar with Docker, there is also a guide on how to run Filebeat on Docker.
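For reference, running the official image with your own config mounted in could look something like this (the paths and version tag are placeholders):

    docker run -d \
      -v /path/to/filebeat.yml:/usr/share/filebeat/filebeat.yml:ro \
      -v /var/log/apache2:/var/log/apache2:ro \
      docker.elastic.co/beats/filebeat:6.4.2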
Have fun! :-)

Can log data exposed as a web service be input to Elasticsearch?

I have a number of applications that are running in different data centers, developed and maintained by different vendors. Each application has a web service that exposes relevant log data (audit data, security data, data related to cost calculations, performance data, ...) consolidated for the application.
My task is to get data from each system into a setup of Elasticsearch, Kibana and Logstash so I can create business reports or just view data the way I want to.
Assuming I have a JBoss application server for integration with these "expose log" services, what is the best way to feed Elasticsearch? Some Logstash plugin that calls each service? Some Logstash plugin used from JBoss? Or some other way?
The best way is to set up the logstash shipper on the server where the logs are created.
This will then ship them to a Redis server.
Another Logstash instance will then pull the data from Redis, index it, and ship it to Elasticsearch.
Kibana will then provide an interface to Elasticsearch, which is where the goodness happens.
I wrote a post on how to install Logstash a little while ago. Versions may have been updated since, but it's still valid:
http://www.nightbluefruit.com/blog/2013/09/how-to-install-and-setup-logstash/
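A minimal sketch of the two Logstash configs in this setup (all hosts and paths below are placeholders):

    # shipper.conf - runs where the logs are created
    input {
      file {
        path => "/var/log/app/*.log"
      }
    }
    output {
      redis {
        host => "redis.internal"
        data_type => "list"
        key => "logstash"
      }
    }

    # indexer.conf - pulls from Redis and indexes into Elasticsearch
    input {
      redis {
        host => "redis.internal"
        data_type => "list"
        key => "logstash"
      }
    }
    output {
      elasticsearch {
        hosts => ["localhost:9200"]
      }
    }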
Does your JBoss application server write its logs to a file?
In my experience, JBoss applications (on multiple servers) write their logs to files. I then use Logstash to read the log files and ship them all to a central server. You can refer to here.
So what you can do is set up a Logstash shipper in each data center.
If you do not have permission to do this, you may want to write a program that fetches the logs from the different web services and saves them to a file, then set up Logstash to read that file. So far, Logstash does not have a plugin that can call web services.
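For example, the fetch-and-save approach could be as simple as a cron job (the URL and path here are made up):

    # crontab entry: poll the application's log endpoint every 5 minutes
    */5 * * * * curl -s https://app1.example.com/api/logs >> /var/log/app1/service.log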

Logstash output to server with elasticsearch

I intend to run Logstash on multiple clients, which in turn would submit their Logstash reports to Elasticsearch on a server (an Ubuntu machine, say).
Thus there are several clients running Logstash, outputting their logs to Elasticsearch on a COMMON server.
Is this output redirection to a server possible with Logstash on the various clients?
If yes, what would the configuration file be?
You need a "broker" to collect the outputs from each of the servers.
Here's a good tutorial:
http://logstash.net/docs/1.1.11/tutorials/getting-started-centralized
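If you point the clients straight at the common server without a broker, a minimal client-side config could look like this (the host is a placeholder for your Ubuntu server):

    output {
      elasticsearch {
        hosts => ["http://my-common-server:9200"]
      }
    }

With a broker such as Redis in between, each client would instead use a redis output like the shipper example earlier, and a single indexer on the server would write into Elasticsearch.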
