What is the best way to monitor a Ruby process (not Rails)? I tried New Relic, but it seems to be designed for Rails.
I'm running distributed Ruby processes with Foreman; the processes communicate through a RabbitMQ queuing system, and I'm looking for a tool that can trace my queries and other parameters, especially since I'm using ActiveRecord for most of my models.
New Relic is still viable for non-Rails Ruby apps. The downside is that you have to instrument your code manually with trace calls. New Relic's documentation (scroll down to Adding Method Tracers (Ruby)) shows how to set up custom tracing for your application. Then you can use the New Relic website as usual to inspect the runtime of your app.
I have done this for Java apps, but not a Ruby app. One thing to look out for is that custom traces will show up in the Background Issues area, not under Web Transactions (for obvious reasons). Database calls should be in the correct area.
The New Relic support is very good; I have worked with them several times to debug an issue. If you have a problem with the custom tracing, they will help sort it out.
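To illustrate the custom tracing in a standalone (non-Rails) process, here is a minimal sketch assuming the newrelic_rpm gem and a valid newrelic.yml; the worker class and metric name are made up for the example:

require 'newrelic_rpm'

# Start the agent manually, since there is no Rails to do it for us.
NewRelic::Agent.manual_start

class QueueWorker
  include ::NewRelic::Agent::MethodTracer

  def process_message(payload)
    # ... handle a RabbitMQ message ...
  end

  # Report timings for this method under a custom metric name.
  add_method_tracer :process_message, 'Custom/QueueWorker/process_message'
end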
Try Monit.
I've built a distributed system consisting of several web services and some web applications that consume them.
They are all hosted on Heroku.
Is there some way for requests between these applications to be made "inside Heroku" without going through the public web?
Something analogous to using localhost.
You may be in luck: such a feature is currently in the experimental phase.
Let me take a moment to underscore that: this feature may disappear or change at any time. It's not supported, but bug reports are appreciated. Don't build a bank with it. Don't get yourself in a position to be incredibly sad if severe problems are found that render it unshippable and it's aborted.
However, it is still cool, and here it is: containerized-network
You can use, for example, the pub/sub interface of any of the hosted Redis solutions, or one of the message brokers (IronMQ, RabbitMQ), to pass messages between the applications.
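As a rough sketch of the Redis pub/sub route (assuming the redis gem and a REDIS_URL pointing at a hosted instance; the channel name is made up):

require 'redis'

# Subscriber process (e.g. one of the consuming web apps):
Redis.new(url: ENV['REDIS_URL']).subscribe('events') do |on|
  on.message do |channel, message|
    puts "Got #{message} on #{channel}"
  end
end

# Publisher process (run separately, since subscribe blocks):
Redis.new(url: ENV['REDIS_URL']).publish('events', '{"order_id":42}')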
I am trying to see if I can create a simple website, like a blog, using only Ruby. No Rails or a database or outside web servers. I plan to store the data in a file for persistence.
I wanted to use TCPServer, CGI, and Net::HTTP.
Is there an easier way I can use?
There are a lot of moving parts when designing a website.
Depending on the purpose of the exercise, you might want to consider using a very simple web framework like Camping, Sinatra, or Ramaze. This is probably the best solution if you're trying to get a top level understanding of web programming because it only has exactly what you need (Camping is less than 4k!) and handles stuff like routing.
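For instance, a minimal Sinatra take on the blog idea could look like this (a sketch only; the routes are made up, and posts are appended to a plain text file as the question suggests):

require 'sinatra'

POSTS_FILE = 'posts.txt'

get '/' do
  posts = File.exist?(POSTS_FILE) ? File.readlines(POSTS_FILE) : []
  posts.map { |p| "<p>#{Rack::Utils.escape_html(p)}</p>" }.join
end

post '/posts' do
  File.open(POSTS_FILE, 'a') { |f| f.puts params['body'].to_s }
  redirect '/'
end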
Building a web server is more an exercise in HTTP parsing. You might want to omit the framework and try to build something on top of Rack (the interface used by most popular Ruby web servers) and a simple web server like WEBrick or Thin.
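And if you want to stay with the standard library alone, a bare-bones WEBrick server is only a few lines (WEBrick ships with Ruby; on Ruby 3+ it is a separate gem):

require 'webrick'

server = WEBrick::HTTPServer.new(Port: 8000)
server.mount_proc '/' do |req, res|
  res.content_type = 'text/html'
  res.body = '<h1>Hello from plain Ruby</h1>'
end
trap('INT') { server.shutdown }
server.start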
You could also try Espresso.
It is easy to learn and fast to run, and it gives you plenty of freedom during development. It also has no hidden "fees"; everything is transparent.
I am looking for a way to collect data remotely from various cloud instances (EC2, Rackspace). The Rackspace API provides no way to collect server performance metrics (i.e. load average, CPU usage, memory); otherwise this would never have been asked.
I started looking at solutions like Capistrano and MCollective (I have also considered collectd), but I am unsure which one would best suit my application. I am trying to avoid using SSH keys for trending purposes (I don't want to have to keep logging in to collect these metrics). The script I am writing is a Ruby script that reboots a cloud server if its load average is over a certain number. Because these providers don't expose these metrics via their APIs, I am looking at a way to gather them myself. I am new to the Ruby community, and after reading through the documentation for all of these tools, I still haven't been able to get a sense of which framework would work best, or whether there are other alternatives.
It sounds like Capistrano is more suited to being a deployment tool; although it can perform remote tasks, after reading its documentation it was pretty much ruled out for the purposes of my script.
MCollective looks really attractive for what I am trying to do but it seems I would have to write my own RPC style plugin for this purpose.
I've also considered plugging into some larger monitoring system such as Nagios, Munin, Zenoss, Hyperic, etc., but I'd rather not install a big monitoring system when all I want to collect is a few simple metrics.
If your intention is to trigger certain actions based on system performance (like restarting when CPU usage is too high), you should check out god.
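god is configured in plain Ruby, so the restart-on-high-usage case is only a short watch definition; this is a rough sketch with a made-up process name, command, and thresholds:

God.watch do |w|
  w.name  = 'worker'
  w.start = 'ruby /apps/worker.rb'
  w.keepalive

  # Restart the process if CPU or memory stays high for five consecutive checks.
  w.restart_if do |restart|
    restart.condition(:cpu_usage) do |c|
      c.above = 80.percent
      c.times = 5
    end
    restart.condition(:memory_usage) do |c|
      c.above = 300.megabytes
      c.times = 5
    end
  end
end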
I'm not sure if this is also useful when you want to generate some performance statistics over a longer time period. Personally, I'm using Munin for this, but if you don't like it maybe you can find something on Ruby Toolbox | Server Monitoring.
I have a Rails 3 application and want to check its health every few hours (via cron):
verify the database connection, free disk space, etc.
What else should I check?
Also, is there a special gem for creating some kind of test controller with everything I need, or should I create it myself?
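To give an idea of what I mean, here is a rough sketch of the kind of check I have in mind, as a rake task run from cron (the task name, threshold, and df parsing are just placeholders; it assumes ActiveRecord and a Unix-like host):

namespace :health do
  task check: :environment do
    # Raises if the database is unreachable.
    ActiveRecord::Base.connection.execute('SELECT 1')

    # Free space (in KB) on the root partition; fail below roughly 1 GB.
    free_kb = `df -k /`.lines.last.split[3].to_i
    raise "Low disk space: #{free_kb} KB free" if free_kb < 1_048_576

    puts 'OK'
  end
end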
Rather than trying to build your own analysis tools, you should try to use what Rails already provides:
http://rails-analyzer.rubyforge.org/pl_analyze/
http://rails-analyzer.rubyforge.org/tools/
Although New Relic is the best web-based monitoring tool for Rails, you could try some alternatives:
http://drewblas.com/2008/05/29/comparison-of-rails-monitoring-apps/
We are using Watir, integrated with VS 2008 via Ruby In Steel, and we have automated our web application; it is awesome.
Is there a way to use the same scripts to do performance testing, or is there a better tool?
It's hard to tell if you want something that analyzes the performance of your website (ie: profiler) or a load/stress testing tool. I'm going to assume you want a load testing tool and not a profiler, given that you're talking about script reuse.
All load testing tools, except for one (disclaimer: my company is that one), work by recording HTTP traffic and then replaying it. The script is very different from a functional testing script like one you'd have for Watir.
You can either record the HTTP traffic generated by your Watir script or try to run your functional tests directly.
If you're also using FireWatir, you can use Firebug, which is an excellent web developer tool and shows you the recorded traffic for each page. If you're using IE primarily, check out HttpWatch. It's commercial, but provides great network timings for IE and can export to various data formats. Alternatively, many load testing tools provide a proxy that can record traffic and generate a load script for you.
Once you've got the network data, you can likely turn it into a script that Pylot, Grinder, JMeter, etc. can understand fairly quickly. The problem with this method is that you need to re-record your script whenever any part of the site or the test changes. And if your app is anything more than basic HTML (ie: Ajax, .NET viewstate, etc) then you may have to use some advanced parts of your load testing tool. See my article on ajax load testing for more info.
Shameless plug: if you were using Selenium (or were willing to convert a couple Watir scripts to Selenium scripts), which is another open source functional testing tool, you could use BrowserMob, which provides a load testing service that uses real browsers to play back the load and functional test scripts (Selenium) to drive them. It uses a lot more resources, but thanks to cloud computing the price point is still very low.
There's rawk that you could run over the log files. This gives a pretty comprehensive summary of what's taking so long.
Alternatively there's NewRelic which provides monitoring for your rails app and gives you a detailed breakdown of what every request is doing.
And finally there's FiveRuns which does things very similar to NewRelic.
Have a look at LoadWise; you can reuse existing functional test scripts for performance testing.
With the same test scripts, unchanged, you can either preview them in Firefox (via FireWatir) or hit your web site with X virtual users (via Celerity).
http://testwisely.com/en/loadwise
If you are truly talking just about 'performance', you could alter the scripts to start capturing and recording the page load times. Any time you do some action that causes a page to load (like navigating to a link), Watir returns the time it took to load the page.
You just need to have the scripts implement some kind of simple logging method to record the time to load each page, and then alter the scripts so that the return value is captured, e.g.
loadtime = browser.goto(someurl)   # Watir returns the page load time in seconds
perflogger(someurl, loadtime)      # your own logging method (one possible version is sketched below)
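One hypothetical way to implement that logging method is to append each timing to a CSV file:

require 'csv'
require 'time'

# Append a timestamp, the URL, and its load time to a CSV log.
def perflogger(url, seconds)
  CSV.open('page_load_times.csv', 'a') do |csv|
    csv << [Time.now.iso8601, url, seconds]
  end
end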