Are there any open source (or I guess commercial) packages that you can plug into your site for monitoring purposes? I'd like something that we can hook up to our ASP.NET site and use to provide reporting on things like:
performance over time
current load
page traffic
SQL performance
CPU time monitoring
Ideally in c# :)
With some sexy graphs.
Edit: I'd also be happy with a package that I can feed statistics and views of data to, and which would analyse trends, spot abnormal behaviour (e.g. "no one has logged in for the last hour; is this OK?", "high traffic levels detected", "low number of API calls detected") and generally be very useful indeed. Does such a thing exist?
At my last office we had a big screen which showed us loads and loads of performance counters over a couple of time ranges, and we could spot weird stuff happening, but the data was not stored and there was no way to report on it. It's a package for doing this that I'm after.
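To illustrate the kind of data I mean, here's a minimal sketch (not what I want to build, just the shape of it) of sampling a couple of Windows performance counters from C# and handing them to some store; the Store method is hypothetical:

    using System;
    using System.Diagnostics;
    using System.Threading;

    class CounterSampler
    {
        static void Main()
        {
            // Built-in Windows counters; the category/counter names must exist on the box.
            var cpu = new PerformanceCounter("Processor", "% Processor Time", "_Total");
            var requests = new PerformanceCounter("ASP.NET Applications", "Requests/Sec", "__Total__");

            // Rate counters need two reads before NextValue() returns a meaningful number.
            cpu.NextValue();
            requests.NextValue();

            while (true)
            {
                Thread.Sleep(TimeSpan.FromSeconds(15));

                // Store(...) is hypothetical -- this is the bit I want a package to do for me.
                Store(DateTime.UtcNow, cpu.NextValue(), requests.NextValue());
            }
        }

        static void Store(DateTime ts, float cpuPct, float reqPerSec)
        {
            Console.WriteLine("{0:o}  cpu={1:F1}%  req/s={2:F1}", ts, cpuPct, reqPerSec);
        }
    }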
It should be noted that Google Analytics is not an accurate representation of web site usage. This is because the web beacon (web bug) used on the page does not always load, for these reasons:
Google Analytics' servers are called by millions of pages every second and cannot always process the requests in a timely fashion.
Users often browse away from a page before it has fully loaded, so there is not enough time for Google's web beacon to load and record a hit.
Google Analytics requires JavaScript, which can be disabled.
Quite a few people (though not a substantial proportion) block google-analytics.com from their browsers, myself included.
The physical log files are the best 'real' representation of site usage as they record every request. Alternatively there are far better 'professional' packages, of which Omniture is my favourite, which have much better response times, alternative methods for recording actions and more functionality.
If you're after things like server data, would RRDTool be something you're after?
It's not really a webserver-type stats program, though, and I have no idea how it would scale.
Edit:
I've also just found Splunk Swarm, if you're interested in something that looks "cool".
Google Analytics is free (up to 50,000 hits per month, I think), is easy to set up with just a little JavaScript snippet to insert into your header or footer, and has great detailed reports with some very nice graphs.
Google Analytics is quick to set up and provides more sexy graphs than you can shake a stick at.
http://www.google.com/analytics/
Not invented here, but it's on my to-do list to set up.
http://awstats.sourceforge.net/
@Ian
Looks like they've raised the limit. Not very surprising; it is Google, after all ;)
This free version is limited to 5 million pageviews a month - however, users with an active Google AdWords account are given unlimited pageview tracking.
http://www.google.com/support/googleanalytics/bin/answer.py?hl=en&answer=55543
http://www.serverdensity.com/
One option is to use external monitoring tools, which monitor web performance from outside the firewall by simulating end-user activity.
Catchpoint Systems has an interesting approach that requires very little coding and gives you performance stats from outside the datacenter as well as from inside ASP.NET (processing time, etc.).
http://www.catchpoint.com/products.html
I am not able to find anywhere how we can do performance testing manually.
Please help me out with this query.
Thanks!
Maybe you are looking for JMeter or a similar tool.
What browser? Most current browsers support the W3C Navigation Timing spec and expose performance data directly on the DOM. You can access it from the console, from JavaScript on your pages, or from browser extensions that display the information.
If you want more detail like a resource load waterfall then you can usually access that directly from the dev tools provided by the various browsers.
One thing you will want to be really careful of is to make sure you do your testing in a configuration that is similar to your users'. If you are running a server locally and testing from a browser on the same machine, or even the same network, then your performance data will be pretty worthless (unless it's an intranet app).
You can perform manual performance testing for any web page by optimizing your CSS, JavaScript, and image sizes.
I think JMeter is the best tool for web page testing; if you want to add some scripting, you can do that too.
You can also check the YSlow add-on for Firefox. This add-on gives you filtered data to help optimize your page's performance.
There are also some online tools available.
You can simply use the GTmetrix tool, which will report on your site's overall performance in detail.
The best way to do performance testing without any tool is to define a standard loading time for each page based on experience, or else ask the client to provide an ideal time for each page, and then verify the actual loading time against it. But for multiple simultaneous users, JMeter is the best hands-on approach available. It's open source, easy to understand, and you get reports too.
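If you just want a rough number to check against that standard time, timing the request yourself is only a few lines of C# (a sketch; the URL is a placeholder, and note this measures the server response only, not images/CSS/JS or rendering):

    using System;
    using System.Diagnostics;
    using System.Net;

    class PageTimer
    {
        static void Main()
        {
            var sw = Stopwatch.StartNew();
            using (var client = new WebClient())
            {
                client.DownloadString("http://example.com/page-under-test"); // placeholder URL
            }
            sw.Stop();
            Console.WriteLine("Page returned in {0} ms", sw.ElapsedMilliseconds);
        }
    }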
But of course there are multiple factors that can hinder performance:
Your network speed
The speed of the server your application is hosted on
The number of simultaneous users
Heavy images on pages
Last but not least, unnecessary links and code; in short, memory consumption in code, such as loops that aren't required. All gifts from the development team!!
There are already plenty of hosted services for tracking server response times, but we're looking for something to track page render/load times (i.e. in the browser renderer).
The problem with PageSpeed, YSlow, etc. is that they're on-demand, whereas we want something that runs constantly and takes a reading every 15 minutes, for example.
Recent browsers have a window.performance.timing property which contains the timestamps at which certain events occurred (such as domainLookupStart, domLoading, domInteractive, ...).
You may want to send a sample of those numbers to your servers.
See https://developer.mozilla.org/en/API/navigationTiming
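If you do post the numbers back, the receiving end can be trivial. A minimal sketch of an ASP.NET MVC action (the route and parameter names here are my own invention; post whatever fields from window.performance.timing you care about):

    using System.Web.Mvc;

    public class TimingController : Controller
    {
        [HttpPost]
        public ActionResult Beacon(long navigationStart, long domInteractive, long loadEventEnd)
        {
            long timeToInteractive = domInteractive - navigationStart;
            long totalLoad = loadEventEnd - navigationStart;

            // Hypothetical storage -- log it, write it to a DB, etc., then graph it later.
            System.Diagnostics.Trace.TraceInformation(
                "tti={0}ms total={1}ms", timeToInteractive, totalLoad);

            return new HttpStatusCodeResult(204); // no content
        }
    }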
Google Analytics has a page speed metric; you may want to look at that too.
Atatus monitors page load time, thereby doing Real User Monitoring (RUM); it also monitors JavaScript errors and captures the user actions that led to each error.
It also gives you various views of how your performance varies across geographical locations and browsers.
https://www.atatus.com/
https://www.atatus.com/blog/announcing-real-user-monitoring/
Disclaimer: I’m a web developer at Atatus.
We use GTmetrix to get daily Page Speed and YSlow scores. If you need a higher frequency, you can automate YSlow or Page Speed using Selenium or ShowSlow.
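If you go the Selenium route, you don't strictly need YSlow in the loop: you can pull the Navigation Timing numbers straight out of the browser on whatever schedule you like. A rough sketch with the C# bindings (the URL is a placeholder):

    using System;
    using OpenQA.Selenium;
    using OpenQA.Selenium.Firefox;

    class LoadTimeProbe
    {
        static void Main()
        {
            using (IWebDriver driver = new FirefoxDriver())
            {
                driver.Navigate().GoToUrl("http://example.com/"); // placeholder

                // GoToUrl waits for the page load, so the timing fields should be populated.
                var js = (IJavaScriptExecutor)driver;
                long loadMs = (long)js.ExecuteScript(
                    "var t = window.performance.timing;" +
                    "return t.loadEventEnd - t.navigationStart;");

                Console.WriteLine("Full page load: {0} ms", loadMs);
            }
        }
    }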
Try a free tool like dynaTrace's AJAX Edition 3. From the website: "Speed your page load times, optimize rendering, tune DOM execution, and compare to competition. Even integrate with Selenium, Watir or QTP to begin automating your performance tests. It's free, it's easy and it's now for both Firefox and IE."
I do work for dynaTrace, in full disclosure. But this simple, free tool should be very useful to you.
Best
GP
I'm not sure how to categorize this question, so let me just explain what I would like and hopefully it will make sense.
I'm after a product (with an API) to which I can send different numbers with tags, and it will take care of all the monitoring/logging stuff.
So for example, say I have a program that downloads a file from a website every 10 seconds. I would like to monitor how long each of these downloads is taking. It is quite easy in my application to time how long it takes. I would now like to send this number and tag (e.g., tag='download time', value = '1.234') to a 3rd party product. The 3rd party product will now store this value/tag for me. The product will have a website I can go to, and configure a bunch of things. So in this example, I could setup an alert like "if 'download time' > 5 send me an email". I could also visit a website, and view a graph of the logged values and maybe some random statistics (e.g., how often the value has been in the warning/error zone).
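To make it concrete, the client side of what I'm imagining is only a few lines of C#; the metrics endpoint and payload shape here are entirely made up, which is exactly the part I want a 3rd party to own:

    using System;
    using System.Collections.Specialized;
    using System.Diagnostics;
    using System.Net;

    class DownloadMonitor
    {
        static void Main()
        {
            var sw = Stopwatch.StartNew();
            using (var web = new WebClient())
            {
                web.DownloadData("http://example.com/file.zip"); // the file I poll every 10 seconds
            }
            sw.Stop();

            // Hypothetical 3rd-party API -- this is the shape of service I'm after.
            using (var metrics = new WebClient())
            {
                metrics.UploadValues("https://some-metrics-service.example/api/report",
                    new NameValueCollection
                    {
                        { "tag", "download time" },
                        { "value", (sw.ElapsedMilliseconds / 1000.0).ToString() }
                    });
            }
        }
    }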
I think that's about it. Sure it wouldn't be too hard to do this myself, but I'm no web designer and it'd end up looking pretty ugly. The more user friendly this kind of product is the more willing users will be to look at the data and actually monitor stuff.
Does such a service exist?
EDIT: Products similar to this: http://dashboard.kpilibrary.com/. This is pretty much exactly what I was after, but am still searching around.
There are many monitoring tools out there. Nagios or RHQ (http://rhq-project.org/) come to mind. Most of the tools work a little differently: rather than you throwing stuff at them, they have plugins that actively go out and do something to take the measurement. In your example, the plugin would download the file and then report the measurement data to the central server, which can then show you graphs or run alerts on it.
On Windows, you can use this:
http://technet.microsoft.com/en-us/library/cc771692%28WS.10%29.aspx
(Windows Performance Monitor)
It pretty much does what you are looking for:
Passively collects performance data (e.g. CPU usage)
Can be fed app-specific performance metrics (e.g. download time; see the sketch after this list)
Can alert you on various thresholds
Has a reporting interface for analyzing metrics
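Feeding it your own numbers means creating a custom counter category once, then writing to it. A minimal C# sketch (the category and counter names are made up; creating the category needs admin rights):

    using System.Diagnostics;

    class DownloadTimeCounter
    {
        const string Category = "MyApp";              // assumed name
        const string Counter = "Download Time (ms)";  // assumed name

        static void Main()
        {
            // One-time setup; afterwards PerfMon can graph and alert on the counter.
            if (!PerformanceCounterCategory.Exists(Category))
            {
                PerformanceCounterCategory.Create(
                    Category, "Custom app metrics",
                    PerformanceCounterCategoryType.SingleInstance,
                    Counter, "Time taken by the last download, in milliseconds");
            }

            using (var counter = new PerformanceCounter(Category, Counter, readOnly: false))
            {
                counter.RawValue = 1234; // feed your measured value here
            }
        }
    }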
EDIT: more documentation on this: http://technet.microsoft.com/en-us/library/cc749249.aspx
This answer is specific to Windows.
If you are looking to analyze events from various systems, and you also want the opportunity to create your own events, you should consider ETW.
The ETW system allows you to consume data events from any number of sub-systems. You can look at an exhaustive list of built in providers by running the following command:
logman query providers
The beauty of ETW is that you also have the opportunity to create your own providers and push your own data into the resulting report. This is a high-performance logging mechanism and is used by Windows itself for many performance investigations.
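On .NET 4.5+, the easiest way to become one of those providers is System.Diagnostics.Tracing.EventSource. A minimal sketch (the provider and event names are whatever you choose; the events are picked up by ETW sessions while the process runs):

    using System.Diagnostics.Tracing;

    [EventSource(Name = "MyCompany-MyApp")] // assumed provider name
    sealed class AppEventSource : EventSource
    {
        public static readonly AppEventSource Log = new AppEventSource();

        [Event(1, Level = EventLevel.Informational)]
        public void RequestStarted(string url) { WriteEvent(1, url); }

        [Event(2, Level = EventLevel.Informational)]
        public void RequestCompleted(string url, long milliseconds) { WriteEvent(2, url, milliseconds); }
    }

    // At a measurement point:
    //   AppEventSource.Log.RequestStarted("/checkout");
    //   ...do the work...
    //   AppEventSource.Log.RequestCompleted("/checkout", 42);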
The resulting report will be an ETL file. This is a standard file that can be viewed using xPerf, which ships with the Windows SDK, or the built-in ETL analyzer, tracerpt.exe.
I want to be able to monitor the performance (load time of the entire page, load times of individually downloaded JS/CSS files, amount of memory used by the browser for the page, etc.) of my web application from the perspective of the user (i.e. the browser client).
Is there any tool/plugin that can help me monitor all of these?
This list should serve as a good starting point for you. It is a complete list of end user monitoring tools.
In order to count as an end-user experience monitoring tool, it must be able to track the response times that real users experience when visiting the site, not a robot synthetically pinging the site. Specifically, I am referring to tools that enable IT operations to ensure that the real end users of an application or website are experiencing good performance. As I alluded to in a previous post, "speed solves a lot of problems": even if your usability is not perfect, if the site runs fast, people are less likely to notice.
We built a tool to solve this problem, take a look at Bucky.
Try Fiddler or Charles.
Try http://www.yottaa.com (disclaimer: I work here) - it runs real browsers in different locations to monitor your site and records asset-level performance data. You can also try Gomez or Keynote Systems.
You can try out Atatus - https://www.atatus.com/ which offers performance monitoring and error tracking in one place for all your apps.
Disclaimer: I work at Atatus.
My agile team will be adding new features to an existing realty website. As we add the features, we want to have a better handle on the site's overall performance as well as the performance of particular pages.
I would like to automate the gathering of performance metrics on a request/response basis for each page (e.g. what sub-requests are sent out by the browser, how many there are, how much data is transferred, and how long each request takes to fulfill).
Firebug currently captures this information in its net panel, however, I haven't found any way to programmatically pull this information out.
Does anyone know of a way to pull this information out after a page has loaded?
We are currently running our user acceptance tests with Selenium and I have considered adding this feature to the selenium interface so that our tests could run and collect the data without starting any other service.
All suggestions are welcome, including ones that leverage other tools/methods to gather the performance metrics.
Thank you.
Jan Odvarko has written a Tutorial on how to use the new listener functionality within Firebug to log net panel results:
"Since Firebug 1.4a13 the Net panel introduces, among other things, several new events that allow to easily collect all network requests and also related info gathered and computed by Firebug.
This functionality should be useful also in cases where Firebug extensions want to store network activity info into a local database or send it back to the server for further analysis (I am thinking about performance statistics here)."
Take a look at the NetExport extension for FireBug.
Steps:
Enable auto-export in the preferences (you can automate this one as well)
Choose the folder where the data is to be written
Read the file
While it isn't directly a Firebug solution, perhaps something like Jiffy would help?
Jiffy pretty much works like a server-based version of Firebug's measurement tools. I haven't used it in anger yet, but it may do what you're looking for?
http://code.google.com/p/jiffy-web/
Jiffy allows developers to:
measure individual pieces of page rendering (script load, AJAX execution, page load, etc.) on every client
report those measurements and other metadata to a web server
aggregate web server logs into a database
generate reports
There is a way to use YSlow to beacon performance data out to a URL of your choice. It's not well documented; the only info I found was here:
http://tech.groups.yahoo.com/group/exceptional-performance/messages/490?threaded=1&m=e&var=1&tidx=1
Aside from that, I would look into writing a Firebug plugin; I think you can access most Firebug properties. Here's a tutorial: http://www.firephp.org/Reference/Developers/ExtendingFirebug.htm
Ben,
I've done this by extending Selenium RC's ProxyHandler to queue up the URLs seen and then allow you to pull them down via some other API. It requires that you proxy everything, which isn't the default behavior of Selenium. The nice thing is that Selenium ends up being both the place to drive the automation and to collect the results.
This is probably a feature we'll soon add to Selenium RC right after we get 1.0 out the door (we're very close!).
Okay, I admit this is not a direct answer, but how about going right to the source? Cut out Firebug and go to the web server. Can the server log events with sufficient granularity to allow calculation of the information you require? Parsing the log file into useful data should not be particularly difficult, and it has the advantage of being user-platform independent and the potential to log a greater set of data than Firebug offers (awesome tool, btw).
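To illustrate, pulling per-request timings out of an IIS log in W3C extended format takes very little code. A C# sketch (the path is a placeholder, and it assumes the time-taken field is enabled in IIS logging; the #Fields header tells you the column layout):

    using System;
    using System.IO;

    class IisLogParser
    {
        static void Main()
        {
            string[] fields = null;

            foreach (var line in File.ReadLines(@"C:\inetpub\logs\example\u_ex110101.log")) // placeholder path
            {
                if (line.StartsWith("#Fields:"))
                {
                    // IIS can re-emit this header mid-file, so keep it current.
                    fields = line.Substring("#Fields:".Length).Trim().Split(' ');
                    continue;
                }
                if (fields == null || line.StartsWith("#")) continue;

                var cols = line.Split(' ');
                int uriIdx = Array.IndexOf(fields, "cs-uri-stem");
                int timeIdx = Array.IndexOf(fields, "time-taken"); // milliseconds

                if (uriIdx >= 0 && timeIdx >= 0 && timeIdx < cols.Length)
                    Console.WriteLine("{0}  {1} ms", cols[uriIdx], cols[timeIdx]);
            }
        }
    }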