Is there any time-tracking web application? - performance

Is there any time-tracking web application?
I want to use it as a tool to monitor my programming productivity (I mean, how many hours I spend on a project).
Edit: I once noticed there was one (like a web Twitter with time tracking), but I forgot its name.

If personal time tracking is what you're after, check out Kimai.

On the other hand if your question is about measuring the response time of your web application, there are other proxy recorders than Firebug: SST Trace Plus and Fiddler come to mind. Fiddler is open source and you can write plug-ins for it.
If you are serious about measuring response time, though, you want to do it at very high loads to find the breaking points of your system/application. There are no good free tools, and most people use HP's LoadRunner (for large functional tests) or Spirent's Avalanche (for very high loads).

Do you mean a tool to monitor your time as a developer, or a tool to monitor how quickly your program responds?
If you want to track your own time, try Basecamp (http://www.basecamphq.com/)
If you want to monitor the performance of an app then #RichieHindle's suggestion (Firebug; answer now deleted) is a good one.

http://www.rescuetime.com/
RescueTime Solo?

Related

How to slow down load time for a website

I'm looking for a script/code that slows down the load time of a website. It might sound stupid, but it's needed. I've tried JavaScript, HTML, the .htaccess file, and yeah, I'm out of ideas. I'm using a web host that runs nginx. Anyone have an idea? I feel very stupid for asking. :P
So my goal is that when the client hits the Enter key to visit example.com, the website that shows up simulates a slow internet connection or something in that style. English is not my native language, so I have a hard time finding the correct words to use; sorry for that.
Your question is so broad I can only give a broad answer.
JavaScript would not be the best choice IMHO, as it executes client-side. You IMPLY you want to slow things down on the server side.
You don't say which web server you're running, and therefore which server-side scripting languages are available to you (e.g. PHP, C# via ASP.NET, etc.), but one thing you might try is figuring that out, then scripting a web page with built-in delays via functions such as PHP's sleep() or C#'s Thread.Sleep(). This will cause the web server to pause as it generates the page.
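To make that concrete, here is a minimal sketch of the same delay idea as a Ruby Rack app (Ruby/Rack is my assumption for illustration, since we don't know your stack; the three-second delay is arbitrary):
# config.ru (a sketch; run with: rackup)
run lambda { |env|
  sleep 3  # hold every request for three seconds before responding
  [200, { 'Content-Type' => 'text/html' }, ['<h1>Deliberately slow page</h1>']]
}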
Again, because your question is so broad, I can't give a more specific answer. My goal here is to point you toward something you can specifically do on the server side that we KNOW will slow things down.
The understanding here is that you are going to have a client hit the page that has the slowdown calls built in. If you're trying to do something more "nefarious", like simulating a stress test of the server by running it low on memory or CPU, those types of situations can slow down a server too, but are beyond the scope of this answer. You can Google stress testing tools; you'll want to have at least a basic grasp of system monitoring tools for your platform (Windows, Linux, etc.) as well if this is the path you're heading down.

Web site performance tools?

YSlow, dynaTrace, HTTPWatch, Fiddler, ...
All of these are really good for measuring a website's performance and getting statistics about it. YSlow is really cool and offers good guidelines as well.
However, I am very confused by how many options are out there (though it's good that people have already invested the time to make nice guidelines to follow, and I thank them for the great work done).
Following are my questions:
How accurate are these tools in terms of the numbers they show?
Which tool is BEST to use (one for all needs)? Or am I missing the name of some tool that is better than all of the above?
I'm surprised that you haven't mentioned JMeter. It is free, quite easy to use, has lots of features, and is great for load testing your website.
As for question one, I'm not sure I can answer that. I'm sure that in general, the numbers these tools show are pretty accurate, but there are some catches. Take JMeter for example:
JMeter itself uses a lot of memory, and also some substantial CPU time if you do heavy load testing. That means that if you run the tool on the same machine as your website, some resources are lost, i.e. not available to the website.
Testing on the same machine also does not, out of the box, take into account that the data has to be sent over the internet connection, so response times are lower than in reality.
All in all, I think you should never blindly trust the results these tools give you, but they can give you good insight into possible bottlenecks or problems.
YSlow is good for measuring performance for a single user. Try to keep it at grade A and it will be OK. But it doesn't actually measure performance with multiple concurrent users. For that you can use, among others, Apache JMeter. It's a good webserver/webapplication stress-test tool. So I would say: just use both YSlow (for client performance) and JMeter (for server performance).
I haven't used dynaTrace before, so I'll skip that part. The HTTP request trackers mentioned don't really measure performance; they are more debuggers.
As far as I am concerned, I find YSlow to be really good (I have tried Fiddler too); it helps me when I need it, and I believe it provides the correct figures, so I'll keep using it going forward unless something unanimously accepted (which is difficult, because everyone has different preferences and requirements) or even better comes along. Oh, and they are right, I forgot JMeter; that's something you should definitely give a mention.
There is also the Speed Tracer extension for Chrome. It should be usable with any JavaScript-heavy website.
http://code.google.com/webtoolkit/speedtracer/
http://gtmetrix.com is a good tool, and it is free. It analyzes your page's speed performance using Page Speed and YSlow.

Performance tuning in Cocoa application

I am developing a Cocoa application which communicates constantly with a web service to get up-to-date data. This considerably reduces the performance of the application. The calls are made to the web service asynchronously, but the number of calls is huge.
In what ways can I improve the performance of the application? Is there a good document/write-up available that gives the best practices to follow when a Cocoa application communicates with a web service?
Thanks
You should try out Shark, which comes with the Mac OS X dev tools. It's really great for digging into your call stack and should allow you to limit the view to network libraries and friends.
Yes! Apple actually has some very concise guides on performance that cover a lot of tricks and techniques, I'm sure you'll find something relevant to your own application. There may be some additional guides specific to 10.5 I haven't seen yet, but here are three I've found useful in the past.
Performance Overview
Cocoa Performance Guidelines
Cocoa Drawing Performance Guidelines
The most important thing to take away though, is that you need to use performance tools to see exactly where the bottleneck is occurring. Sometimes it may be in the place you least expect it.
I think if you use Shark you'll just find your app is blocked waiting for answers back from the server. Code split across machines is far harder to benchmark, as the standard tools can only benchmark part of the picture.
Sounds like you need to look into bundling calls up into fewer transactions... Your bottleneck is almost certainly the network. What about supporting sending multiple calls as an array of calls, and the same for answers? Maybe you could buffer calls locally and only send them a few times a second as a single transaction? (A sketch of that buffering idea follows below.)
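For illustration, here is a minimal sketch of that buffer-and-flush pattern. It is written in Ruby purely to keep the example compact (in a Cocoa app you would reach for an NSTimer and an NSMutableArray instead), and send_batched_request is a hypothetical stand-in for the actual web service call:
# Hypothetical sketch: queue calls locally, flush them as one batch
class CallBuffer
  def initialize(interval, &flush)
    @calls = []
    @lock = Mutex.new
    # Background thread wakes up every `interval` seconds and flushes
    Thread.new do
      loop do
        sleep interval
        batch = @lock.synchronize { @calls.slice!(0..-1) }
        flush.call(batch) unless batch.empty?
      end
    end
  end

  def enqueue(call)
    @lock.synchronize { @calls << call }
  end
end

# Usage: everything queued within each half-second window goes out together
buffer = CallBuffer.new(0.5) { |batch| send_batched_request(batch) }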
Tony

How do you measure page load speed? [closed]

I am trying to quantify "site slowness". In the olden days you just made sure that your HTML was lightweight, images optimized and servers not overloaded. In high end sites built on top of modern content management systems there are a lot more variables: third party advertising, trackers and various other callouts, the performance of CDN (interestingly enough sometimes content delivery networks make things worse), javascript execution, css overload, as well as all kinds of server side issues like long queries.
The obvious answer is for every developer to clear the cache and continuously look at the "net" section of the Firebug plugin. What other ways to measure "site dragging ass" have you used?
YSlow is a tool (browser extension) that should help you.
YSlow analyzes web pages and why they're slow based on Yahoo!'s rules for high performance web sites.
Firebug, the must-have Firefox extension for web developers, can measure the loading time of individual elements on your webpage. At least you can rule out CSS, JavaScript, and other elements taking too much time to load.
If you do need to shrink JavaScript and CSS loading times, there are various JavaScript and CSS compressors out there on the web that simply strip unnecessary text, like newline characters and comments. Of course, keep an ordinary version on the side for development's sake.
If you use PNGs, I recently came across a PNG optimizer called OptiPNG that can shrink PNG file sizes.
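For example (command-line sketches; these assume the UglifyJS and OptiPNG tools are installed, and the filenames are placeholders):
uglifyjs app.js -c -m -o app.min.js
optipng -o5 image.png
The first compresses and mangles the JavaScript; the second recompresses the PNG at a fairly aggressive optimization level.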
"Page Load time" is really not easy to define in general.
It depends on the browser you use, because different browsers may do more requests in parallel, because JavaScript has different speeds in different browsers, and because rendering time differs.
Therefore you can only really measure your true page load time using the browser you are interested in.
The end of the page load can also be difficult to define, because there might be an Ajax request after everything is visible on the page. Does that count towards the page load or not?
And last but not least, the real page load time might not matter that much, because the "perceived performance" is what matters. For the user, what matters is when they have enough information to proceed.
Markus
I'm not aware of any way (at least none I could tell you :]) that would automatically measure your page's perceived load time.
Use AOL Pagetest for IE and YSlow for Firefox (link above) to get a "feeling" for your load time.
Get yourself a proper debugging proxy installed (I thoroughly recommend Charles)
Not only will you be able to see a full breakdown of response times / sizes, you can save the data for later analysis / comparison, as well as fiddle with the requests / responses etc.
(Edit: Charles' support for debugging SOAP requests is worth the pittance of its shareware fee - it's saved me a good half a day of hair-loss this week alone!)
I routinely use webpagetest.org, which you can use to run performance tests from different locations, on different browsers (although only MSIE 7-9), with different settings (number of iterations, connection speed, first run vs. second visit, excluding specific requests if you want, credentials if needed, ...).
The result is a very detailed report of page loading time, which also provides advice on how to optimize.
It really is a great (free) tool!
Last time I worked on a high-volume website, we did several things, including:
We used Yslow to get an analysis of the individual factors affecting page load: https://addons.mozilla.org/en-US/firefox/addon/5369
We did performance monitoring using an external, commercial tool called Gomez - http://www.gomez.com/instant-test-pro/
We stress tested using a continuous integration build, using Apache JMeter. http://jmeter.apache.org/
If you want a quick look, say a first approximation, I'd go with YSlow and see what the major factors affecting page load time in your app are.
Well, call me old-fashioned, but:
time curl -L http://www.example.com/path
on Linux. :) Other than that, I'm a big fan of YSlow, as previously mentioned.
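If you want a breakdown rather than just the wall-clock time, curl's --write-out option can report the individual timing phases (these format variables are standard curl):
curl -o /dev/null -s -w 'dns: %{time_namelookup}s connect: %{time_connect}s first byte: %{time_starttransfer}s total: %{time_total}s\n' http://www.example.com/path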
PageSpeed is an online checking tool by Google, which is very accurate and reliable:
https://developers.google.com/pagespeed/
If it's ASP.NET, you can use Trace.axd.
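Tracing is switched on in web.config; a minimal sketch (the attribute values are illustrative):
<configuration>
  <system.web>
    <trace enabled="true" pageOutput="false" requestLimit="40" localOnly="true" />
  </system.web>
</configuration>
With that in place, browse to /Trace.axd to see per-request timing details.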
Yahoo provides YSlow, which can be great for checking JavaScript.
YSlow as mentioned above.
And combine this with Fiddler. It is good if you want to see which page objects are taking the most bandwidth, which are being compressed at the server, unexpected round-trips, and what is being cached. And it can give you a general idea of processing time in the client web browser as compared to the time taken between server and client.
Apache Benchmark. Use
ab -c <number of CPUs on server> -n 1000 url
to get a good approximation of how fast your page is.
In Safari, the Network Timeline (available under the Develop menu, which you have to specifically enable) gives useful information about loading time of individual page components, as well as showing when each component started loading.
YSlow is good, and HttpWatch for IE is great as well. However, both miss the metric most important to a user: "When is the page, above the fold, ready for the user to use?" I don't think that one has been solved yet...
There are obviously several ways to identify the response time, but the challenge has always been how to measure the rendering time that is spent in the browser.
We have a controlled test phase in which we use several automated tools for testing the application. One of the outputs we generate from this test is a Fiddler trace for each transaction (a click). We can then analyse the Fiddler trace to understand the time to last byte and subtract it from the overall time the page took.
Something like this:
1. A = total response time as measured by an automated tool (in our case we use QTPro)
2. B = time to last byte (server + network time, from the Fiddler trace)
3. C = A - B (approximate rendering time, or the time spent in the browser)
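For example (hypothetical numbers): if QTPro measures A = 5.0 s for a click and the Fiddler trace shows B = 3.8 s to the last byte, then C = 5.0 - 3.8 = 1.2 s is roughly the rendering time.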
Everything explained above can be made a standard test process, and at the end of the test we can generate a break-up of the time spent at each layer, e.g. rendering time, network time, database calls, etc.

How do you do performance testing in Ruby webapps?

I've been looking at the ways people test their apps in order to decide where to add caching or apply some extra engineering effort, and so far httperf and a simple sesslog have been quite helpful.
What tools and tricks did you apply on your projects?
I use httperf for a high-level view of performance.
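A typical invocation looks something like this (the host, URI, and numbers are placeholders):
httperf --server www.example.com --port 80 --uri /index.html --num-conns 1000 --rate 50
That opens 1000 connections at a rate of 50 per second and reports connection, request, and reply-time statistics.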
Rails has a performance script built in that uses the ruby-prof gem to analyse calls deep within the Rails stack. There is an awesome Railscast on Request Profiling using this technique.
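Outside of Rails, the ruby-prof gem can also be used directly; here is a minimal sketch (slow_operation is a placeholder for whatever code path you want to analyse):
require 'ruby-prof'  # gem install ruby-prof

result = RubyProf.profile do
  slow_operation  # placeholder: the code under investigation
end

# Print a flat report of where the time was spent
RubyProf::FlatPrinter.new(result).print($stdout)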
NewRelic has some seriously cool analysis tools that give near real-time data.
They just made a "Lite" version available for free.
I use JMeter for session-based testing - it allows very fine-grained control over the pages you want to hit, the parameters to inject, the loops to go through, etc. It's great for simulating how many real users your site can handle, rather than just performance-testing a set of static URLs. You can distribute tests over multiple machines quite easily by loading up the jmeter-server on computers with publicly accessible IPs. I have found some limitations in the number of users/threads any one machine can throw at a server at once (it depends on the test), but JMeter has helped my team improve our app's capacity for users by 6x.
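For reference, a distributed non-GUI run looks something like this (the plan name and host IPs are placeholders):
jmeter -n -t plan.jmx -R 10.0.0.2,10.0.0.3 -l results.jtl
Here -n runs without the GUI, -t names the test plan, -R lists the remote jmeter-server machines, and -l writes the results log.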
It doesn't have any fancy graphing - I actually use my own in-house graphing with Gruff, which can do performance analysis on request time for certain pages and actions.
I'm evaluating a new open-source web page instrumentation and measurement suite called Jiffy. It's not specifically for Ruby; it works for all kinds of webapps.
There's also a Jiffy Firebug Extension for rendering the metrics inside the browser.
I also suggest you look at Browser Mob for load testing.
A colleague of mine has also posted some interesting thoughts on this.
