Elastic Beanstalk system events and my application logs

AWS has a really nice log management tool, and I can send my application log messages there very easily.
Amazon Elastic Beanstalk also has an "event management" tool.
The questions are:
Can I write my application's log messages into the Elastic Beanstalk events? Are those events the syslog of the EC2 instance?
If so, is this good practice? Are there any problems with it? I ask because, if there are none, I would not need any third-party log management service.

The events shown in Elastic Beanstalk are internal to it. You are not supposed to fudge around with them (although nobody is really preventing you from playing with them).
There is also a log snapshot feature that picks up logs related to the application. These logs mainly cover deployment plus whatever the application itself writes, so you can use this feature if your application code is logging messages. For example, if you are running Ruby on Rails with Passenger, you would get log messages under /var/app/support/logs/passenger.log. These are not syslog messages per se, and the problem with this approach is that it is not straightforward to get your custom monitoring in place. For example, how do you parse your errors and send them to, say, PagerDuty?
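To make that last point concrete, here is a minimal, hypothetical sketch of one way to do it: scan the Passenger log mentioned above for error lines and post each one to PagerDuty's Events API v2. The log path comes from the answer; the routing key, the error regex, and the Node/TypeScript runtime are all assumptions you would adapt.

    // Hypothetical sketch: scan a Passenger log and forward error lines to
    // PagerDuty's Events API v2. Routing key and regex are placeholders.
    import { readFileSync } from "fs";

    const LOG_PATH = "/var/app/support/logs/passenger.log";
    const PD_ROUTING_KEY = "YOUR-PAGERDUTY-ROUTING-KEY"; // placeholder

    async function forwardErrors(): Promise<void> {
      const lines = readFileSync(LOG_PATH, "utf8").split("\n");
      for (const line of lines.filter((l) => /error/i.test(l))) {
        // One "trigger" event per matching line.
        await fetch("https://events.pagerduty.com/v2/enqueue", {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({
            routing_key: PD_ROUTING_KEY,
            event_action: "trigger",
            payload: {
              summary: line.slice(0, 1024),
              source: "elastic-beanstalk-instance",
              severity: "error",
            },
          }),
        });
      }
    }

    forwardErrors().catch(console.error);

In practice you would run something like this on a schedule, or use a proper log shipper, which is exactly why the third-party services mentioned below exist.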
As you've probably figured out, if you want custom monitoring (sending logs to a syslog facility), you are better off using a third-party tool like Splunk Storm, Papertrail, or Loggly. Of course, you can set up your own syslog server(s), but that will require you to build all the infrastructure yourself.
Hope this helps.

Related

Enabling logging/debugging in an Azure worker role to Azure storage

I have a .NET project that I am trying to deploy as a worker role in Azure. I am able to publish the file directly from Visual Studio, but when the worker role runs I get an uncaught exception. I am attempting to enable logging to Azure storage from the worker role so I can get more information on the exception, but I am running into issues getting it configured. Can anyone advise on the best way to enable this logging?
I'm not a massive fan of the recommended Azure Worker Role logging process, namely the Trace.WriteLine() method, as I don't feel it provides sufficient flexibility for my logging needs, and I think it looks crap when my code is liberally scattered with Trace.WriteLine() statements (code is art and all that). I also don't like that Trace statements aren't always logged and can be 'lost' if the Worker Role hiccups or generally goes astray.
I therefore came up with an approach that writes log files to local storage via NLog, which are then flushed to Azure Storage on a schedule. Works like a charm.
I've got it documented over in the blog post at: https://modhul.wordpress.com/2014/10/28/capturing-custom-logs-from-azure-worker-roles-using-azure-diagnostics/
If I want to watch my log files in real time (rather than waiting for them to be flushed to Azure Storage), I RDP into the Worker Role instance and fire up a copy of BareTail (http://www.baremetalsoft.com/baretail/), which is a great way of watching log files in real time. It also lets you add colour-coding for errors, info, warnings, etc.
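For readers not on .NET, the same buffer-locally-then-flush pattern is easy to reproduce elsewhere; here is a rough Node/TypeScript sketch using the @azure/storage-blob package. The connection-string environment variable, container name, and five-minute interval are my assumptions, not details from the blog post.

    // Append log lines to a local file; flush the file to Azure Blob Storage
    // on a timer, mirroring the NLog-based approach described above.
    import { appendFileSync } from "fs";
    import { BlobServiceClient } from "@azure/storage-blob";

    const LOG_FILE = "./worker.log";

    export function log(message: string): void {
      appendFileSync(LOG_FILE, `${new Date().toISOString()} ${message}\n`);
    }

    async function flushToBlobStorage(): Promise<void> {
      const service = BlobServiceClient.fromConnectionString(
        process.env.AZURE_STORAGE_CONNECTION_STRING! // placeholder env var
      );
      const container = service.getContainerClient("worker-logs");
      await container.createIfNotExists();
      // One blob per flush keeps things simple; an append blob also works.
      const blob = container.getBlockBlobClient(`worker-${Date.now()}.log`);
      await blob.uploadFile(LOG_FILE);
    }

    // Flush every 5 minutes (the "schedule" from the answer above).
    setInterval(() => flushToBlobStorage().catch(console.error), 5 * 60 * 1000);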

Parse.com and Logentries

I'm using Parse.com with the iOS SDK. I really want to get all Parse logs into a third-party service (Logentries), but I have no idea whether Parse.com's logs can be exposed at all.
Obviously, having log data that periodically gets deleted in Parse is not ideal; plus I can't filter it, and an integration with Logentries could be amazing.
I'm referencing a previous unanswered question: How to stream parse logs to a service provider?
Logentries has a special endpoint for logging from AJAX and server-side JS applications. This means that for environments like Parse, where using third-party libraries is awkward, direct integration is still dead simple: see my fork of @Marco T's gist.
I was successful in pushing logs to Loggly using an edited version of: https://gist.github.com/rogernolan/95ea615164e343b3bc54
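For reference, the "direct integration" both answers rely on is just an HTTP POST with your account token. Below is a minimal sketch against Loggly's generic HTTP input endpoint; the token is a placeholder, and a Logentries-style token endpoint works the same way. Something like this could be called from Parse Cloud Code.

    // Ship a single log event to Loggly over HTTP. Token is a placeholder.
    const LOGGLY_TOKEN = "YOUR-LOGGLY-TOKEN";

    export async function shipLog(level: string, message: string): Promise<void> {
      await fetch(`https://logs-01.loggly.com/inputs/${LOGGLY_TOKEN}/tag/parse/`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          level,
          message,
          timestamp: new Date().toISOString(),
        }),
      });
    }

    // e.g. shipLog("error", "save failed for object Xyz123");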

Heroku logs cluttered

I just started using Heroku for one of my node apps.
When I run the heroku logs command, the output is so cluttered that I can't pick out the data I want from all the other information I don't need.
Is there a way to clean up that log output so it's more human friendly?
It's like it just dumps a wall of text at me.
Thanks!
I am using the Papertrail add-on on Heroku for viewing the logs.
It has a free plan which is enough for a small application. It gives you the flexibility to search your logs by text and time. Papertrail provides a browser URL to view the logs, which is also convenient to access from mobile. Adding this add-on to your application is quite simple; no app changes are required. The filters below are available out of the box on its dashboard:
All events
Deploys
Dyno state changes
Platform errors
Web app output
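Even without an add-on, the stock heroku logs command accepts filters that cut the clutter considerably; for example:

    # Only your app's own output, skipping router/system lines, streamed live
    heroku logs --tail --source app

    # Only web dynos, last 200 lines
    heroku logs --num 200 --dyno web

Papertrail then adds persistence and search on top of the same stream.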

Heroku: Testing IronMQ Messaging With Worker Locally Using Foreman

I am new to Heroku and I am trying to bootstrap a local development environment. Using Foreman, or another tool, can someone please point me to docs that illustrate sending and consuming a message with a worker? The key part is setting up the MQ and the worker consuming the message, all configured locally. Thanks!
IronMQ (and IronWorker) are both cloud services and currently cannot be installed locally. It's fairly easy to interact with the API from your local machine, though, including pushing messages, getting them, etc.
If you plan on using Push Queues, keep in mind that in order to "push" back to your localhost you'll need to set up something like localtunnel or ngrok. Here is some information on that: http://dev.iron.io/mq/reference/push_queues/#testing_on_localhost
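To make the localhost testing concrete, here is a hypothetical minimal receiver for push-queue testing: run something like `ngrok http 3000`, subscribe your queue to the public URL ngrok prints, and IronMQ will POST each message body to this endpoint. The Express framework and the port are my choices, not an Iron.io requirement.

    // Minimal local endpoint that receives IronMQ push-queue messages
    // forwarded through an ngrok (or localtunnel) tunnel.
    import express from "express";

    const app = express();
    app.use(express.text({ type: "*/*" })); // message bodies arrive raw

    app.post("/messages", (req, res) => {
      console.log("got pushed message:", req.body);
      res.sendStatus(200); // 200 acknowledges the message
    });

    app.listen(3000, () => console.log("listening on :3000"));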
Please feel free to hit us up at support@iron.io or live chat: get.iron.io/chat
Chad

Heroku web UI log viewer

I wonder if there is a friendlier way (using some GUI) to view a Heroku application's logs than through a console?
I think Papertrail (a Heroku add-on) is the best option. It is similar to console logs, with search, and you can archive the logs daily to Amazon S3.
Not at the moment.
You can, however, use Heroku's log drains feature to pipe your application's logs to a syslog server running on a separate machine, as outlined in the Heroku docs.
After that, you could use a tool like Splunk to analyze your logs in a nice Web UI.
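Adding a drain is a one-liner once your syslog server is reachable; the host, port, and app name here are placeholders:

    heroku drains:add syslog://logs.example.com:514 -a your-app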
Hope this helps.
The Loggly Heroku add-on might be a good option.