In the Cloud Foundry documentation, the Elastic SaaS service is not mentioned:
https://docs.cloudfoundry.org/devguide/services/log-management-thirdparty-svc.html
So I was wondering if anyone has done this, and how?
I know one way is to run a Logstash instance on CF, feed the syslog drain to it, and then ship the logs to Elastic. But is there a direct option that skips deploying Logstash on CF?
P.S. We also log using the ECS format.
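For reference, the Logstash route I mention would be a user-provided syslog drain bound to the app, roughly like this (service name, host, and port are placeholders for a Logstash instance listening for syslog):

    # create a user-provided service whose URL is the log drain
    cf create-user-provided-service logstash-drain -l syslog-tls://logstash.example.com:6514
    # bind it to the app and restage so the drain takes effect
    cf bind-service my-app logstash-drain
    cf restage my-app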
We use ELK for log management and Nagios for centralized monitoring.
I want to set up pattern-match alerts in Kibana and notify via Nagios.
Can you please let me know if this is feasible?
We have an Elasticsearch cluster deployed on Elastic Cloud and would like to send monitoring/health metrics to Datadog. What is the best way to do that?
It seems like our options are:
Installing the Datadog agent binary via the plugins upload
Using Metricbeat -> Logstash -> the datadog_metrics output
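For the second option, a minimal Logstash pipeline sketch (assuming Metricbeat ships to Logstash on the Beats port 5044; the API-key variable and the metric field mappings are placeholders to adapt):

    input {
      beats {
        port => 5044                                  # Metricbeat -> Logstash
      }
    }
    output {
      datadog_metrics {
        api_key      => "${DD_API_KEY}"               # assumption: key supplied via env var
        metric_name  => "elasticsearch.%{[metricset][name]}"   # illustrative naming scheme
        metric_type  => "gauge"
        metric_value => "%{[metric][value]}"          # placeholder: map a real Metricbeat field here
      }
    }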
You can deploy the Datadog agent in a container / instance that you manage and then configure it according to these instructions to gather metrics from the remote Elasticsearch cluster hosted on Elastic Cloud. You need to create a conf.yaml file in the elastic.d/ directory and provide the required information (Elasticsearch endpoint/URL, username, password, port, etc.) for the agent to be able to connect to the cluster. You may find a sample configuration file here.
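A minimal sketch of such a conf.yaml (the endpoint and credentials are placeholders):

    init_config:

    instances:
        # Elastic Cloud endpoint of the cluster to monitor (placeholder URL)
      - url: https://my-cluster.es.us-east-1.aws.found.io:9243
        username: datadog-monitor   # a read-only monitoring user (placeholder)
        password: changeme
        cluster_stats: true         # collect cluster-wide stats from this one endpoint
        pshard_stats: true          # include primary-shard metrics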
As George Tseres mentioned above, the way I got this working was to set up collection on a separate instance (through Docker) and then configure it to read the specific Elastic Cloud instances.
I ended up making this: https://github.com/crwang/datadog-elasticsearch, building that Docker image, and then pushing it up to AWS ECR.
Then, I spun up a Fargate service / task to run the container.
I also set it to run locally with docker-compose as a test.
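For anyone doing the same, the local docker-compose test can be as small as this (image tag and mount path follow Datadog's agent packaging; the API key comes from the environment):

    version: "3"
    services:
      datadog-agent:
        image: gcr.io/datadoghq/agent:7
        environment:
          - DD_API_KEY=${DD_API_KEY}
        volumes:
          # mount the elastic check config from the repo into the agent
          - ./conf.d/elastic.d/conf.yaml:/etc/datadog-agent/conf.d/elastic.d/conf.yaml:ro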
As far as I'm aware, there is no managed Elasticsearch solution provided by Google Cloud Platform the way there is Amazon Elasticsearch Service on AWS.
I've opened a feature request for this on the issue tracker here, but I was wondering: is there a service somewhere on GCP that I'm missing? If not, are there plans to build an ES service on top of GCP? And if so, is there a general timeline for when that will be GA?
When configuring your cluster on Elastic Cloud (the hosted service operated by Elastic, the company), you have the choice between hosting it on AWS or on GCP. If you pick GCP, the cluster is fully managed by Elastic on GCP.
This is a commercial offering (but so is Amazon Elasticsearch Service), and there is a 14-day free trial to see what it looks like.
Also worth reading:
https://www.elastic.co/blog/hosted-elasticsearch-services-roundup-elastic-cloud-and-amazon-elasticsearch-service
https://www.elastic.co/aws-elasticsearch-service
Thank you for creating a feature request!
Regarding Elasticsearch on GCP, I am not 100% sure it applies to your case, but there is a solution on the Google Cloud Marketplace: Elasticsearch Service on Elastic Cloud, offered on GCP. Check it out and see if you can use it.
Can Kafka be used as a messaging service between Oracle and Elasticsearch? Are there any downsides to this approach?
Kafka Connect provides a JDBC source connector and an Elasticsearch sink connector.
No downsides that I am aware of, other than service maintenance.
Feel free to use Logstash instead, but Kafka provides better resiliency and scalability.
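As a sketch, the two Kafka Connect configs could look roughly like this (Confluent connector classes; connection details, topic names, and the timestamp column are placeholders):

    # jdbc-source.properties: poll Oracle and publish rows to Kafka
    name=oracle-source
    connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
    connection.url=jdbc:oracle:thin:@//db-host:1521/ORCL
    connection.user=connect
    connection.password=secret
    mode=timestamp
    timestamp.column.name=UPDATED_AT
    topic.prefix=oracle-

    # es-sink.properties: index those topics into Elasticsearch
    name=es-sink
    connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
    connection.url=http://es-node:9200
    topics=oracle-EVENTS
    key.ignore=true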
I have tried this in the past with SQL Server instead of Oracle, and it works great. I am sure you could take the same approach with Oracle, since the Logstash JDBC plugin I describe below supports Oracle DB.
So basically you need the Logstash JDBC input plugin (https://www.elastic.co/guide/en/logstash/current/plugins-inputs-jdbc.html) pointing at your Oracle DB instance, pushing the rows over to Kafka via the Kafka output plugin (https://www.elastic.co/guide/en/logstash/current/plugins-outputs-kafka.html).
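A minimal shipper configuration along those lines (driver path, credentials, query, and broker address are placeholders):

    input {
      jdbc {
        jdbc_driver_library    => "/path/to/ojdbc8.jar"       # Oracle JDBC driver jar
        jdbc_driver_class      => "Java::oracle.jdbc.driver.OracleDriver"
        jdbc_connection_string => "jdbc:oracle:thin:@//db-host:1521/ORCL"
        jdbc_user              => "logstash"
        jdbc_password          => "secret"
        schedule               => "* * * * *"                 # poll once a minute
        statement              => "SELECT * FROM events WHERE updated_at > :sql_last_value"
      }
    }
    output {
      kafka {
        bootstrap_servers => "kafka1:9092"
        topic_id          => "oracle-events"
        codec             => json
      }
    }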
Then, to read the contents back from Kafka, you need another Logstash instance (this is the indexer) with the Kafka input plugin (https://www.elastic.co/guide/en/logstash/current/plugins-inputs-kafka.html). Finally, the Elasticsearch output plugin in the Logstash indexer's configuration file pushes the events to Elasticsearch.
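And a matching indexer configuration (broker, Elasticsearch hosts, and index name are placeholders):

    input {
      kafka {
        bootstrap_servers => "kafka1:9092"
        topics            => ["oracle-events"]
        codec             => json
      }
    }
    output {
      elasticsearch {
        hosts => ["http://es-node:9200"]
        index => "oracle-events-%{+YYYY.MM.dd}"   # daily indices
      }
    }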
So the pipeline would look like this:
Oracle -> Logstash shipper -> Kafka -> Logstash indexer -> Elasticsearch
Overall, I think this is a pretty scalable way to push events from your DB to Elasticsearch. As for downsides: at times it can feel like there is one component too many in the pipeline, which gets frustrating when you have failures. So you need to put appropriate controls and monitoring in place at every level to make sure the data aggregation pipeline described above keeps functioning. Give it a try, and good luck!
I have a number of applications that are running in different data centers, developed and maintained by different vendors. Each application has a web service that exposes relevant log data (audit data, security data, data related to cost calculations, performance data, ...) consolidated for the application.
My task is to get data from each system into a setup of Elasticsearch, Kibana and Logstash so I can create business reports or just view data the way I want to.
Assume I have a JBoss application server for integrating with these "expose log" services: what is the best way to feed Elasticsearch? Some Logstash plugin that calls each service? Some Logstash plugin used from JBoss? Or some other way?
The best way is to set up a Logstash shipper on the server where the logs are created.
The shipper then sends them to a Redis server.
Another Logstash instance then pulls the data from Redis, indexes it, and ships it to Elasticsearch.
Kibana then provides an interface to Elasticsearch, which is where the goodness happens.
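A minimal sketch of the two configurations (log path, Redis host, and list key are placeholders; syntax as in current Logstash versions):

    # shipper.conf, on the server that writes the logs
    input {
      file { path => "/var/log/app/*.log" }
    }
    output {
      redis { host => "redis.example.com" data_type => "list" key => "logstash" }
    }

    # indexer.conf, pulls from Redis and indexes into Elasticsearch
    input {
      redis { host => "redis.example.com" data_type => "list" key => "logstash" }
    }
    output {
      elasticsearch { hosts => ["http://es-node:9200"] }
    }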
I wrote a post on how to install Logstash a little while ago. Versions may have been updated since, but it's still valid:
http://www.nightbluefruit.com/blog/2013/09/how-to-install-and-setup-logstash/
Does your JBoss application server write logs to a file?
In my experience, my JBoss applications (on multiple servers) write their logs to files. I then use Logstash to read the log files and ship all the logs to a central server. You can refer to the example here.
So what you can do is set up a Logstash shipper in each data center.
If you do not have permission to do this, you may want to write a program that fetches the logs from the different web services and saves them to a file, then set up Logstash to read that file. At the time of writing, Logstash did not have a plugin that could call web services (newer versions ship an http_poller input that can poll HTTP endpoints).
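For completeness, on a newer Logstash the web services could be polled directly with the http_poller input, roughly like this (URL and interval are placeholders):

    input {
      http_poller {
        urls => {
          # the application's "expose log" endpoint (placeholder URL)
          app_logs => "https://app.example.com/api/logs"
        }
        schedule => { every => "60s" }   # poll once a minute
        codec    => "json"
      }
    }
    output {
      elasticsearch { hosts => ["http://es-node:9200"] }
    }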