ServiceNow integration with Elasticsearch

We are using ELK to monitor system performance and our application logs. If there is an error in the logs, we want to create an issue in ServiceNow from ELK. Is there a way to do this? Any pointers would help.

I don't know about ELK specifically, but perhaps you could make a SOAP/REST call to do it?
Just make sure your ELK service account has sufficient permissions, and get the WSDL by going to http://yourinstance.service-now.com/tablename.do?WSDL
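For the REST route, here is a minimal sketch of creating an incident through ServiceNow's Table API from whatever process watches your logs; the instance URL, credentials, and field values below are placeholders to replace with your own:

```python
import requests

# Placeholder instance and credentials -- replace with your own.
INSTANCE = "https://yourinstance.service-now.com"
AUTH = ("elk_service_account", "password")

def create_incident(short_description, description):
    """Create an incident record via the ServiceNow Table API."""
    resp = requests.post(
        f"{INSTANCE}/api/now/table/incident",
        auth=AUTH,
        headers={"Content-Type": "application/json", "Accept": "application/json"},
        json={"short_description": short_description, "description": description},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["result"]["sys_id"]

# Example: call this from whatever reacts to error events in your logs
# (e.g. a webhook target for an alert).
if __name__ == "__main__":
    sys_id = create_incident(
        "ELK alert: application error",
        "Error details pulled from the log event.",
    )
    print("Created incident", sys_id)
```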

Related

Cloud Foundry logs to Elastic SaaS

The Cloud Foundry documentation does not mention the Elastic SaaS service:
https://docs.cloudfoundry.org/devguide/services/log-management-thirdparty-svc.html
So I was wondering if anyone has done this, and how?
I know one way is to run a Logstash instance on CF, feed the syslog to it, and then ship it to Elastic. But is there a direct way to skip the Logstash deployment on CF?
PS. We also log using the ECS format.
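Not an answer from the thread, but one way to skip Logstash entirely is to have the application (or a small worker reading the syslog drain) write its ECS-formatted JSON straight into an index over Elasticsearch's REST API. A rough sketch, where the endpoint, credentials, index, and service name are all placeholders:

```python
import json
from datetime import datetime, timezone

import requests

# Assumed Elastic SaaS endpoint and credentials -- replace with your deployment's values.
ES_URL = "https://my-deployment.es.example.com:9243"
AUTH = ("elastic", "password")

def ship_ecs_event(message, level="error"):
    """Index a single ECS-shaped log event directly, bypassing Logstash."""
    doc = {
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "log": {"level": level},
        "message": message,
        "service": {"name": "my-cf-app"},  # hypothetical service name
    }
    resp = requests.post(
        f"{ES_URL}/logs-cf-app/_doc",  # assumed index name
        auth=AUTH,
        headers={"Content-Type": "application/json"},
        data=json.dumps(doc),
        timeout=10,
    )
    resp.raise_for_status()

ship_ecs_event("payment worker failed to reach the upstream API")
```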

How can we get the nginx access log in Laravel

As the title says, I need to get data from the nginx access log, process it, and store it in the DB. Does anyone have any ideas about this? Thank you for reading this post.
You should not store nginx logs in the DB and try to read them through Laravel; it will very quickly cause performance and storage issues, especially in production. Another issue: if you have multiple servers, how would you aggregate all the logs?
Common practice is to use a NoSQL store for such tasks, so you can set up a dedicated server where you collect and analyze all your logs. You install an exporter on each of your servers, point it at your log file, and it ships the logs to a central log server. You can set this up yourself using something like the ELK stack; with ELK you can use Filebeat and Logstash for this.
Even better would be to use one of the managed services out there such as GCP Logging, Splunk, etc. You have to pay for them, but they offer a lot of benefits. Splunk provides you with an exporter; with GCP you could use Fluentd. If you are using containers, you can also set up a Fluentd container and shared volumes to export the logs.
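To make the "exporter pointed at a log file" idea concrete, here is a rough sketch that parses nginx's default combined log format and bulk-indexes the entries into a local Elasticsearch node. The paths, index name, and endpoint are assumptions, and in a real setup Filebeat's nginx module does this work for you:

```python
import json
import re

import requests

# Placeholders -- adjust to your environment.
ES_URL = "http://localhost:9200"
ACCESS_LOG = "/var/log/nginx/access.log"
INDEX = "nginx-access"

# nginx "combined" log format.
LINE_RE = re.compile(
    r'(?P<remote_addr>\S+) \S+ \S+ \[(?P<time_local>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<body_bytes_sent>\d+) '
    r'"(?P<http_referer>[^"]*)" "(?P<http_user_agent>[^"]*)"'
)

def parse_lines(path):
    """Yield one dict per parseable access-log line."""
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            match = LINE_RE.match(line)
            if match:
                yield match.groupdict()

def bulk_index(docs):
    """Send parsed entries to Elasticsearch using the _bulk API."""
    payload = ""
    for doc in docs:
        payload += json.dumps({"index": {"_index": INDEX}}) + "\n"
        payload += json.dumps(doc) + "\n"
    if payload:
        resp = requests.post(
            f"{ES_URL}/_bulk",
            headers={"Content-Type": "application/x-ndjson"},
            data=payload,
            timeout=30,
        )
        resp.raise_for_status()

if __name__ == "__main__":
    bulk_index(parse_lines(ACCESS_LOG))
```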

Is there an application client for Elasticsearch 6.4.3 (similar to DBeaver)?

I tried to view my node data from an application client (like DBeaver), but I couldn't find information about that. Has anyone found a way to connect DBeaver to this version, or to see the data with a similar application?
I believe what you are looking for is a GUI for Elasticsearch.
The industry typically calls the Elasticsearch stack the ELK stack, and I believe what you are looking for is the K part of it, which is Kibana.
I'm not sure if you are asking about SQL support, but if you want to use the SQL feature you can check the Elasticsearch SQL plugin.
Another widely used client application for Elasticsearch is Grafana. There are others available too (e.g. Splunk, Graylog, Loggly), but I believe Kibana and Grafana are the best bet.
Hope this helps!
Actually no, I'm using Elasticsearch as a database in different deployments and I don't want to maintain a Kibana instance (I prefer to see all the data in a tool like DBeaver).
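If the goal is to query Elasticsearch like a database without running Kibana, one option is the X-Pack SQL endpoint introduced in 6.3, which can be called directly over REST (DBeaver can also be pointed at Elastic's SQL JDBC driver, which builds on the same feature). A minimal sketch, with the host and index name assumed:

```python
import requests

# Assumed local node; adjust host and credentials for your deployment.
ES_URL = "http://localhost:9200"

def sql_query(query):
    """Run an X-Pack SQL query (6.x endpoint; on 7.x+ use /_sql instead)."""
    resp = requests.post(
        f"{ES_URL}/_xpack/sql",
        params={"format": "txt"},  # plain-text table output
        json={"query": query},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.text

# Hypothetical index name used for illustration.
print(sql_query("SELECT * FROM my_index LIMIT 10"))
```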

Showing crashed/terminated pod logs on Kibana

I am currently working on the ELK setup for my Kubernetes clusters. I set up logging for all the pods and fortunately, it's working fine.
Now I also want to push the logs of terminated/crashed pods (the details we get by describing the pod, not from docker logs) to my Kibana instance.
I checked my server for those logs, but they don't seem to be stored anywhere on my machine (inside /var/log/).
Maybe it's not enabled, or I may simply not know where to find them.
If these logs were available in a log file, similar to the system log, then I think it would be very easy to get them into Kibana.
It would be a great help if anyone can help me achieve this.
You can use kube-state-metrics, which gives you all pod-related metrics. You can configure kube-state-metrics to feed Elasticsearch; it will create an index for the different kinds of metrics, and then you can easily use that index to display your charts/graphs in the Kibana UI.
https://github.com/kubernetes/kube-state-metrics
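As an alternative sketch (not part of the answer above), the termination details that `kubectl describe` surfaces can also be pulled from the Kubernetes API with the official Python client and indexed into Elasticsearch as documents; the Elasticsearch endpoint and index name below are placeholders:

```python
import json

import requests
from kubernetes import client, config  # pip install kubernetes

ES_URL = "http://localhost:9200"  # placeholder Elasticsearch endpoint
INDEX = "pod-terminations"        # assumed index name

def collect_terminations():
    """Yield termination details for containers that have crashed or exited."""
    config.load_kube_config()  # or config.load_incluster_config() inside the cluster
    v1 = client.CoreV1Api()
    for pod in v1.list_pod_for_all_namespaces().items:
        for cs in pod.status.container_statuses or []:
            term = (cs.last_state.terminated if cs.last_state else None) or (
                cs.state.terminated if cs.state else None
            )
            if term:
                yield {
                    "namespace": pod.metadata.namespace,
                    "pod": pod.metadata.name,
                    "container": cs.name,
                    "reason": term.reason,
                    "exit_code": term.exit_code,
                    "finished_at": term.finished_at.isoformat() if term.finished_at else None,
                }

def index_doc(doc):
    """Index one termination record so it shows up in Kibana."""
    requests.post(
        f"{ES_URL}/{INDEX}/_doc",
        headers={"Content-Type": "application/json"},
        data=json.dumps(doc),
        timeout=10,
    ).raise_for_status()

if __name__ == "__main__":
    for doc in collect_terminations():
        index_doc(doc)
```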

How to set an Elastic alert on Amazon Elasticsearch

I've been looking for a tutorial on how to get alerts from Amazon Elasticsearch.
I'm using Metricbeat on my server instance to collect logs, and everything is fine, but now I have to find a way to send alerts for memory and CPU. I read something about Elastic alerting that can send alerts to e-mail or Slack, but I don't know how to use it on Amazon Elasticsearch.
If anybody has a tutorial, that would help me.
Thanks in advance.
You need X-Pack to be able to configure watches that send email or Slack alerts, but AWS Elasticsearch does not offer X-Pack features. For this exact reason we moved away from AWS Elasticsearch to Elastic Cloud, and we couldn't be happier.
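For reference, on a cluster that does include X-Pack (self-managed or Elastic Cloud), a watch is just a JSON document you PUT to the Watcher API. A rough sketch, in which the index pattern, field name, threshold, and Slack webhook path are all assumptions to adapt (and note the 6.x URL; on 7.x+ the path is /_watcher/watch/...):

```python
import json

import requests

ES_URL = "https://my-cluster.example.com:9200"  # placeholder; needs a license with Watcher
AUTH = ("elastic", "password")

# 6.x-style watch: query a Metricbeat index every 5 minutes and hit a Slack
# incoming webhook when memory usage looks high. Index pattern, field name,
# threshold, and webhook path are all assumptions.
watch = {
    "trigger": {"schedule": {"interval": "5m"}},
    "input": {
        "search": {
            "request": {
                "indices": ["metricbeat-*"],
                "body": {
                    "query": {"range": {"system.memory.actual.used.pct": {"gte": 0.9}}}
                },
            }
        }
    },
    # In 6.x hits.total is a plain number, so a simple compare works.
    "condition": {"compare": {"ctx.payload.hits.total": {"gt": 0}}},
    "actions": {
        "notify_slack": {
            "webhook": {
                "scheme": "https",
                "host": "hooks.slack.com",
                "port": 443,
                "method": "POST",
                "path": "/services/T000/B000/XXXX",  # placeholder incoming-webhook path
                "headers": {"Content-Type": "application/json"},
                "body": json.dumps({"text": "Memory usage above 90% on monitored hosts"}),
            }
        }
    },
}

resp = requests.put(
    f"{ES_URL}/_xpack/watcher/watch/high_memory",  # 7.x+: /_watcher/watch/high_memory
    auth=AUTH,
    json=watch,
    timeout=30,
)
resp.raise_for_status()
```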
