Observability in Laravel Applications

There are three main pillars of observability in applications: metrics, traces, and logs. I would want my Laravel applications to be "observable" with respect to these.
Tools like Elasticsearch, Logstash, and Kibana seem to be the industry standard, but I can't seem to find good tutorials on how to integrate them with Laravel, and my understanding of them is generally hazy.
So, my questions are:
What observability tools do Laravel developers generally use?
If the answer is the ELK stack, are there any great tutorials or guides on how to do this?
Kibana guides are a bit too complex for a feeble mind like mine. But I am willing to get a few nosebleeds while at it, if that's the only way.

The first and easiest thing to do, since you're running Laravel, is to install and configure the Elastic APM agent for PHP, which supports Laravel out of the box. This takes care of the "tracing" pillar.
Regarding metrics, you can install Metricbeat with the system module and the php_fpm module. This takes care of the "metrics" pillar.
Finally, for the "logs" pillar, you can install Filebeat with the nginx module to index your Nginx logs (and you can point a Filebeat log input at Laravel's own log files as well).
Those three will allow you to observe your Laravel applications very easily; a quick way to check that everything is flowing is sketched below.
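
As a smoke test that all three pillars are actually landing in Elasticsearch, you could list the indices each shipper writes to. This is a minimal sketch, assuming an unauthenticated cluster on localhost:9200 and the default apm-*/metricbeat-*/filebeat-* index names (adjust both to your setup):

    # Hypothetical smoke test: confirm APM, Metricbeat, and Filebeat data
    # are landing in Elasticsearch under their default index names.
    import requests

    ES_URL = "http://localhost:9200"  # adjust to your cluster

    for pattern in ("apm-*", "metricbeat-*", "filebeat-*"):
        resp = requests.get(f"{ES_URL}/_cat/indices/{pattern}",
                            params={"format": "json"})
        resp.raise_for_status()
        indices = resp.json()
        status = f"{len(indices)} index(es)" if indices else "no data yet"
        print(f"{pattern}: {status}")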

Related

What would be the advantages of using ELK for log management over a simple Python logging + existing database log table combo?

Assuming I have many Python processes running on an automation server such as Jenkins, let's say I want to use Python's native logging module and, other than writing to the Jenkins console or to a log file, I want to store & centralize the logs somewhere.
I thought of using ELK for that, but then I realized that I can just as well create a dedicated log table in an existing database (I'm using Redshift), use something like Grafana for log dashboards/visualization, and save myself the trouble of deploying a new system (most of the people on my team are familiar with Redshift but not with Elasticsearch).
Although it sounds straightforward, I feel like I'm not looking at the big picture and that I would be missing some powerful capabilities that components like Logstash were written for in the first place. What would these capabilities be, and how would it be advantageous to use ELK instead of my solution?
Thank you!
I have implemented a full ELK stack in my company in the past year.
The project was huge and took a lot of time to implement properly. The advantages of using ELK over implementing our own centralized logging solution are:
Not needing to reinvent the wheel: there is already a product that does just that (and the installation part is extremely easy).
It is battle-tested and can withstand a huge amount of logs in a short time.
As your business and product grow and shift, you will need to parse more logs with different structures, which would mean schema changes in a self-built system. Logstash gives you endless possibilities for filtering and parsing those newly formatted logs.
It has clustering and HA capabilities, and you can scale your logging system vertically and horizontally.
It is very easy to maintain and change over time.
It can send the needed output to a variety of products, including Zabbix, Grafana, Elasticsearch, and many more.
Kibana gives you the ability to view the logs and build graphs, dashboards, alerts, and more.
The options with ELK are really endless, and the more I work with it, the more I find new ways it can help me: not just viewing logs from distributed remote systems, but also security alerts, SLA graphs, and many other insights.
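
That said, if you do keep Python's logging module on the producing side, emitting structured JSON lines makes ingestion by Filebeat/Logstash much easier, since nothing has to be grok-parsed later. A minimal stdlib-only sketch; the field names are illustrative, not a required schema:

    # Minimal sketch: emit JSON-lines logs from Python's stdlib logging so
    # Filebeat/Logstash can ingest them without custom grok parsing.
    import json
    import logging
    from datetime import datetime, timezone

    class JsonLineFormatter(logging.Formatter):
        def format(self, record: logging.LogRecord) -> str:
            doc = {
                "@timestamp": datetime.now(timezone.utc).isoformat(),
                "level": record.levelname,
                "logger": record.name,
                "message": record.getMessage(),
            }
            if record.exc_info:
                doc["exception"] = self.formatException(record.exc_info)
            return json.dumps(doc)

    handler = logging.FileHandler("app.log.json")  # point Filebeat at this file
    handler.setFormatter(JsonLineFormatter())
    logging.basicConfig(level=logging.INFO, handlers=[handler])

    logging.getLogger("jenkins.job").info("build finished")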

How to set up monitoring for Redmine tasks/issues in Grafana?

I plan to set up monitoring for Redmine, with the help of which I can see man-hours spent on tickets, time taken to complete a ticket, etc., to monitor the productivity of my team. I want to see all of this in Grafana. As of now I am thinking of using Prometheus and exposing the metrics, but I am not sure how (I might have to create an exporter, I think, but I am not sure that would work). So basically, how can this be done?
A Prometheus exporter is simply an HTTP server that sits next to your target (Redmine in your case, although I have no experience with it). Whenever it gets a /metrics request, it makes one or more API calls to the target (assuming Redmine provides an API to query the numbers you need) and returns said numbers as Prometheus metrics, with names, labels, etc.
Here are the Prometheus clients (that help expose metrics in the format accepted by Prometheus) for Go and Java (look for simpleclient_http or simpleclient_servlet). There is support for many other languages.
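
For illustration, here is a minimal exporter sketch in Python using the prometheus_client library. For simplicity it polls Redmine on a timer instead of on each /metrics request; the Redmine URL, the API key, and the /time_entries.json endpoint are assumptions you should check against your Redmine version's REST API docs:

    # Minimal Prometheus exporter sketch for Redmine time entries.
    # URL, API key, and endpoint are placeholders, not verified against Redmine.
    import time
    import requests
    from prometheus_client import Gauge, start_http_server

    REDMINE_URL = "https://redmine.example.com"  # hypothetical
    API_KEY = "your-api-key"                     # hypothetical

    hours_spent = Gauge("redmine_hours_spent_total",
                        "Hours logged against tickets", ["project"])

    def scrape():
        resp = requests.get(f"{REDMINE_URL}/time_entries.json",
                            headers={"X-Redmine-API-Key": API_KEY},
                            params={"limit": 100})
        resp.raise_for_status()
        totals = {}
        for entry in resp.json().get("time_entries", []):
            project = entry["project"]["name"]
            totals[project] = totals.get(project, 0.0) + entry["hours"]
        for project, hours in totals.items():
            hours_spent.labels(project=project).set(hours)

    if __name__ == "__main__":
        start_http_server(9345)  # Prometheus scrapes http://host:9345/metrics
        while True:
            scrape()
            time.sleep(60)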
Adding on to @Alin's answer: to expose Redmine metrics to Prometheus, you would need to install an exporter. Here is a Redmine plugin available for Prometheus: https://github.com/mbeloshitsky/redmine_prometheus.git
You can get the hours and all the data you need through the Redmine REST APIs. Write a little program to fetch the data and push it into Graphite or Prometheus (a sketch follows below). You can perform this task using Sensu by creating a metric script in Python, Ruby, or Perl. Then all you have to do is plot the graphs. Well, that's another race :P
Redmine guide: http://www.redmine.org/projects/redmine/wiki/Rest_api_with_python
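
A minimal sketch of the Graphite half of that idea, using only the standard library and Graphite's plaintext protocol on TCP port 2003; the host and metric name are placeholders:

    # Sketch: push a value into Graphite via its plaintext protocol (TCP 2003).
    # Host and metric path are hypothetical.
    import socket
    import time

    GRAPHITE_HOST = ("graphite.example.com", 2003)  # hypothetical

    def send_metric(path: str, value: float) -> None:
        # Graphite's plaintext format: "<metric.path> <value> <unix_timestamp>\n"
        line = f"{path} {value} {int(time.time())}\n"
        with socket.create_connection(GRAPHITE_HOST, timeout=5) as sock:
            sock.sendall(line.encode("ascii"))

    # e.g. a number previously fetched from the Redmine REST API
    send_metric("redmine.tickets.hours_spent", 42.5)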

How to monitor Elastic Stack without X-Pack?

Can we monitor the Elastic Stack 6.0 and above (Elasticsearch, for example) without using X-Pack? As we know, many of the features, like security, machine learning, and the graph APIs, are not supported under the BASIC (free) licence.
So I want to know: are there any APIs, without licence limitations, that can be used to implement the functionality mentioned above?
All the information should be in the cluster APIs; you'll just lack the visualizations.
Monitoring (of the local cluster) is actually included in X-Pack Basic, unlike the other features. Any reason you don't want to use it?
Alternatives include Kopf, Cerebro, and others, though you'll need to run them as a separate process and watch out for version compatibility.
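
As a sketch of what "the information is in the cluster APIs" looks like in practice, the licence-free health and stats endpoints can be polled with a few lines of Python; this assumes an unauthenticated cluster on localhost:9200:

    # Sketch: poll the licence-free cluster APIs that the monitoring UI
    # would otherwise visualize for you.
    import requests

    ES_URL = "http://localhost:9200"

    health = requests.get(f"{ES_URL}/_cluster/health").json()
    print("cluster status:", health["status"],
          "| nodes:", health["number_of_nodes"])

    stats = requests.get(f"{ES_URL}/_nodes/stats/jvm,os").json()
    for node_id, node in stats["nodes"].items():
        heap = node["jvm"]["mem"]["heap_used_percent"]
        print(f"{node['name']}: heap {heap}% used")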
We've had success with ElasticHQ for monitoring (requires Python):
https://github.com/ElasticHQ/elasticsearch-HQ
And Sentinl for setting up alerts/watchers (it is a plugin for Kibana):
https://github.com/sirensolutions/sentinl/wiki
We have set up a reverse proxy to enable SSL/TLS and use Ubuntu user management to create logins; however, we do not limit access within Kibana itself.
We have little need for graph/machine learning, so I am unaware of free alternatives.
The company I work for is heavily Open Source, so these projects suit us.

Heroku and Elasticsearch - which add-on to use?

I plan to use Elasticsearch on Heroku.
I was looking for the best Elasticsearch add-on option to use.
Found was my first choice, for the following reasons:
It is now part of Elastic.
Since Elasticsearch on Heroku is open to the world, a secure wrapper for the transport client was introduced: https://github.com/foundit/elasticsearch-transport-module/
But it looks like this repository is not actively maintained, and Elasticsearch 1.5 is the latest supported version.
What is the recommended add-on, then?
If I want to use the latest version of Elasticsearch, am I doomed to use an insecure connection?
Maybe use the official Java client?
Nick with Bonsai here. Based on your question, and my own obvious bias, I'll suggest Bonsai for the following reasons:
All of our clusters have SSL with basic auth to secure the connection. We feel pretty strongly that security comes as a standard feature.
We were the first hosted Elasticsearch provider, ever. (And one of the first add-on providers on Heroku, ever, with our first search add-on, Websolr.) So we've got plenty of experience hosting search, and thousands of other happy Heroku customers.
One definite tradeoff with using Bonsai is that we're generally always going to lag a bit behind the latest version of ES. As of this posting we're still running ES 1.7, but updates to ES 2.2 are just around the corner.
This is probably going to be true in the future as well. Part of the reason is that we're a small, bootstrapped company, and we have to be pragmatic about where we focus our engineering efforts. Plus, as an operations company serving thousands of businesses, we like to let major new upgrades spend a few months in the wild before we commit to supporting them.
We also work hard on providing managed upgrades, at least for versions that are sufficiently backwards compatible. Everyone has their tools for helping to manage upgrades, but I don't think any of the other providers do actual in-place upgrades.
Unless you have a hard requirement for a specific feature in 2.x (and if you do, please let me know), you may do fine on 1.7 until our 2.x support is fully baked. Drop us a line at info@bonsai.io to get whitelisted for the first release of that in the coming weeks.

How to secure Elasticsearch

I have Elasticsearch running on my server. By default it runs on port 9200, and the endpoint is public, meaning anyone can insert, update, or delete anything from anywhere. How do I make it secure, like phpMyAdmin, which can only be accessed by my code and not directly from a browser or Postman?
Elasticsearch does not perform authentication or authorization, leaving that as an exercise for the developer. Two popular approaches I have seen are:
Set up your own proxy (Nginx/HAProxy) fronting Elasticsearch; this way you exercise full control. You can also use the elasticsearch-jetty plugin to get Jetty-level auth.
Shield: if budget permits, use Shield, a paid offering from Elastic: https://www.elastic.co/products/shield
Even with these in place, depending on who you are exposing this to, you may want to disable certain things like dynamic scripting, add throttles against DoS, etc.
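
From the application side, the proxy approach looks like this: your code talks to the authenticated proxy endpoint, never to port 9200 directly. A sketch using Python's requests library; the URL and credentials are placeholders:

    # Client-side sketch: once an Nginx/HAProxy front end enforces TLS and
    # basic auth, application code goes through the proxy, not :9200.
    import requests

    PROXY_URL = "https://search.example.com"  # hypothetical proxy endpoint
    AUTH = ("es_app_user", "secret")          # hypothetical credentials

    # An authenticated request succeeds...
    resp = requests.get(f"{PROXY_URL}/myindex/_search",
                        auth=AUTH, params={"q": "user:kimchy"})
    print(resp.status_code, resp.json()["hits"]["total"])

    # ...while an unauthenticated one should be rejected by the proxy (401).
    print(requests.get(f"{PROXY_URL}/myindex/_search").status_code)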
You can use the Elasticsearch HTTP basic authentication plugin: https://github.com/Asquera/elasticsearch-http-basic
The README there gives a good idea on how to set it up.
If you are using Kibana 3 as a frontend to Elasticsearch, you can secure it using https://github.com/fangli/kibana-authentication-proxy
I have set up a relatively simple Nginx proxy that sits between my Elasticsearch and Kibana to provide authorized access to my dashboards and charts.
Look at my post here: https://udaysagars.wordpress.com/2016/04/04/how-i-configured-authorized-access-to-kibana-dashboards/
Also, you can view my application that uses this method here: http://udaysagar2177.github.io/ec2/twitter-analytics.html
