Logging local page load times in browser - performance

I am trying to generate a log of page load times (time from first byte of HTML to the onload event) from my actual browsing habits. An ideal solution would produce, after performing a Google search for example, a log something like this:
http://google.com/ 523
https://www.google.com/search?q=asdf 1003
Where the pages took 523 ms and 1003 ms to load.
A Firefox or Chrome extension that works on Linux or Mac would be ideal, as I'm trying to track this in the context of normal everyday browsing.
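For what it's worth, the interval described here (first byte of HTML to the onload event) maps directly onto the Navigation Timing API that both Firefox and Chrome expose, so an extension can compute it per page. A minimal console sketch of the number such a log would contain:

    // run in the page's console once the page has finished loading;
    // responseStart is the first byte of HTML, loadEventEnd is after onload
    var t = performance.timing;
    console.log(location.href, t.loadEventEnd - t.responseStart);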

If you install the NetExport Firebug extension, it will let you export all collected and computed data from the Firebug Net panel as an HTTP Archive (HAR) file, a JSON-based format.
You can then view this file, which includes the load times, with HAR Viewer or other tools.
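Once you have the HAR file, producing exactly the log format asked for above takes only a few lines of scripting. A node.js sketch (the file name is hypothetical, and it assumes the export includes per-page onLoad timings):

    // print one "url loadtime" line per page recorded in the HAR file
    var fs = require("fs");
    var har = JSON.parse(fs.readFileSync("browsing.har", "utf8"));
    har.log.pages.forEach(function (page) {
        // the title field usually holds the page URL or title;
        // pageTimings.onLoad is the time to the onload event, in ms
        console.log(page.title, page.pageTimings.onLoad);
    });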

Gavin,
Try using the Firebug plugin in Firefox. http://getfirebug.com/
It shows you when each file started and stopped loading relative to all the other files.
Worth checking out.
Regards,
Ninad

You need Fiddler. http://www.fiddlertool.com
Another great tool that does what you requested in the comments is dynaTrace AJAX. http://ajax.dynatrace.com/ajax/en/ It hooks into the browser and keeps metrics on actual render and JavaScript times. http://ejohn.org/blog/deep-tracing-of-internet-explorer/

Related

JMeter takes a long time to download resources

I am testing the static pages of a website, and I am talking only about the homepage here. The homepage takes around 20 seconds to download fully, and after around 6 seconds you can already see the page (so the user's perspective is good).
Now, that page has eleven mp4 files, and in JMeter they take a long time to download. Why is that? Google's developer tools show the download completing in around 20 seconds, but my recorded JMeter script reports 60 seconds to download all resources.
I also used a Parallel Controller with 6 connections, but it still reports around 1 minute of response time for the whole page, while it takes around 20 seconds in Chrome to download all the content.
If I use the main (first) URL of the homepage and check "Retrieve all embedded resources", all resources are downloaded except the mp4 files. Why is that?
I also have all the recorded requests for the homepage. If I run those requests, they are all fine, but the mp4 files take longer to download than they do in Chrome, and the response time rises to around 60 seconds.
I tried my best to explain the scenario. Please help; it would be deeply appreciated.
Are you sure these .mp4 files are downloaded when a user opens the page?
If yes, are you sure they're downloaded fully? The most common scenario is that the server answers with 206 Partial Content and a Content-Range header, so the browser only fetches the part of the .mp4 file(s) it needs (see the sketch after this answer).
If no, you shouldn't be downloading them in your JMeter test at all. Compare the traffic from the real browser and from JMeter using a third-party sniffer tool like Wireshark or Fiddler, and make sure JMeter sends the same requests the browser does.
Also try opening the page in a browser with a cleared cache. It might be that the browser has cached the heavy content, so no actual requests are made and the multimedia resources are served from the memory or disk cache. If you then see similar timings, add an HTTP Cache Manager to your test plan.
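On the Content-Range point, a quick way to check whether the server serves the .mp4 files as partial content is to request a byte range and inspect the response status, as in this browser-console sketch (the URL is a hypothetical placeholder):

    // a 206 status plus a Content-Range header means the browser only
    // fetches slices of the file, never the whole .mp4 in one request
    fetch("https://example.com/video/intro.mp4", {
        headers: { "Range": "bytes=0-1023" }
    }).then(function (res) {
        console.log(res.status, res.headers.get("Content-Range"));
    });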

Does JMeter 2.11 support a PDF editor?

Does JMeter 2.11 support capturing data from a PDF editor? Can anyone advise how to record PDF data using JMeter 2.11? Our application is built on Java and is a web application.
The issue is:
Log in to the application
Click on the user information link
The request opens in a PDF editor format, where we update the user information.
Here is the problem: I cannot record the PDF editor information. While recording, the PDF editor page does not even open. I tried disabling my antivirus protection too. :(
JMeter is designed to functionally test server-side components, whereas opening a PDF happens on the client side, after the PDF has been transmitted over the wire to the end user's browser.
As such, JMeter will only show you:
How long it took the server to process the request and begin serving the response (the PDF) to the browser/JMeter
How long it took to transmit the response (the PDF) over the wire back to the browser/JMeter
If you're interested in physically manipulating items on the client side, which also has the effect of exercising the server, check out http://docs.seleniumhq.org/
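For illustration, a minimal sketch of that Selenium route using the selenium-webdriver package for node.js (the URL and link text are hypothetical placeholders for the application under test):

    // drives a real browser, so client-side steps such as opening the
    // PDF editor actually happen while the server is being exercised
    const { Builder, By, until } = require("selenium-webdriver");

    (async function () {
        const driver = await new Builder().forBrowser("chrome").build();
        try {
            await driver.get("https://example.com/login");
            // ... perform the login steps here ...
            await driver.findElement(By.linkText("User information")).click();
            await driver.wait(until.titleContains("User"), 10000);
        } finally {
            await driver.quit();
        }
    })();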

?_escaped_fragment_= - headless browser

What do I have to do to add ?_escaped_fragment_= support to my server? I want Google to be able to crawl my AJAX site. My hashes are already in #! form.
But I have no idea how to tell my server that a request for mywebsite.com/?_escaped_fragment_=section should serve the same content as mywebsite.com/#!section.
Thanks
Simple answer: my method (soon to be used for a site with ca. 50,000 AJAX-generated URLs) is to have a node.js server use a headless environment (try zombie, phantomjs, or any other) to load the site, making sure it can execute JavaScript and read the DOM. Then, at runtime, if it's Google requesting the fragment, fire a request to the node.js server, which loads the site, executes the JavaScript, waits for the response, and delivers the HTML back, which is output to the browser.
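A minimal sketch of that snapshot server, assuming zombie as the headless environment (the port, query parameter name, and site URL are placeholders):

    // on a ?_escaped_fragment_= request, the site forwards the fragment
    // here; we load the #! URL, let the JavaScript run, and return the DOM
    var http = require("http");
    var url = require("url");
    var Browser = require("zombie");

    http.createServer(function (req, res) {
        var fragment = url.parse(req.url, true).query.fragment || "";
        var browser = new Browser();
        browser.visit("http://mywebsite.com/#!" + fragment, function () {
            res.writeHead(200, { "Content-Type": "text/html" });
            res.end(browser.html()); // the serialized post-JavaScript DOM
        });
    }).listen(8888);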
If that sounds like a lot of work: I'm about 90% finished on code that does it all for you, where you'd simply drop one line of (PHP) code at the top of your site/app and it does the rest, using a remote node.js server.
The code will be open source, so if you want to set it up yourself on a node server, you can. Or, if setting it up yourself is a pain, I'll probably have a live server up and running that your app/website can fire ?_escaped_fragment_= requests at and get the HTML snapshot back. It also implements caching, so snapshots are only regenerated once every X days.
Watch this space. I've just got a few kinks to work out, and then it'll be on my site (josscrowcroft.com), and I'll put it in a GitHub repo too.

Kohana execution time is fast, but overall response time is slow, why?

I use Kohana 3's Profiler class and its profiler/stats template to time my website. On a very clean page (no AJAX, no jQuery, etc.; it only loads a template and shows a short text message, with no database access), it reports a request time of 0.070682 s (the "Requests" item in the profiler/stats template). I then used two microtime() calls to time the duration from the first line of index.php to the last, and got a similarly fast result: 0.12622809410095 s. Very nice.
But if I time the request from the browser's point of view, it's totally different. Using Firefox with the Tamper Data add-on, the request takes 3.345 s! And I noticed that from the moment I click the link to enter the website (Firefox starts the animated loading icon) to when the browser finishes (the icon stops), it really takes 3-4 seconds!
On another website of mine, built with WikkaWiki, the time measured by Tamper Data is only 2190-2432 ms, including several accesses to a MySQL database.
I tried a clean installation of Kohana, and its default plain hello-world page also takes 3025 ms to load.
All the websites mentioned here were tested on the same localhost PC with the same settings; they are just hosted in different directories on the same machine. Only the Database module is enabled in bootstrap.php for the Kohana website.
I'm wondering why the Kohana website's overall response is so slow when the PHP execution time is just 0.126 seconds. Is there anything I should look into?
== Edit: additional information ==
The test result for a standard phpinfo() page is 1100-1200 ms (Tamper Data).
The Profiler shows you the execution time from Kohana initialization to the Profiler render call, so it's not the full Kohana time. Some actions (Kohana::shutdown_handler(), Session::_destroy(), etc.) may take a long time.
Since your post confirms Kohana is finishing in a tenth of a second or less, it's probably something else:
Have you tested anything other than Kohana? It sounds like the server is at fault, but you can't be sure unless you compare the response times against something else. Try a plain HTML page and a pure PHP page.
The Firefox measurement could be taking external media into account. If you have a slow connection and the page loads Google Analytics, that could be another problem.
Maybe it's related to this issue: Firefox and Chrome slow on localhost; known fix doesn't work on Windows 7.
Although that issue occurs on Windows 7, it may still help...
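One way to see where the extra seconds actually go is the browser's Navigation Timing data, which splits the total request into phases. A console sketch to paste in after the page has loaded:

    // all values are in ms; a large "dns" or "connect" value on localhost
    // points at name resolution (e.g. the IPv6 localhost issue above)
    // rather than at Kohana itself
    var t = performance.timing;
    console.log("dns:     ", t.domainLookupEnd - t.domainLookupStart);
    console.log("connect: ", t.connectEnd - t.connectStart);
    console.log("ttfb:    ", t.responseStart - t.requestStart);
    console.log("download:", t.responseEnd - t.responseStart);
    console.log("dom/load:", t.loadEventEnd - t.responseEnd);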

How to easily create a diagram using Firebug loading and/or execution times?

I need to create a diagram showing the most time-consuming tasks when a specific page loads.
Firebug has a nice feature that shows the loading times of all files in the Net panel; alternatively, I can use the profiler (Console panel).
Now I'm looking for the easiest way to get a diagram (pie chart) from those results without typing all the file names and time values into an Excel table.
Any suggestions?
You can export a HAR file (the NetExport extension enables Firebug HAR log export); this HAR Viewer looks promising:
https://github.com/janodvarko/harviewer
HAR Viewer is a web application (PHP + JavaScript) that visualizes HTTP tracing logs based on the HTTP Archive (HAR) format. These files contain recorded information about the HTTP traffic performed by web pages.
---Also---
http://www.imagossoftware.com/harlog/
HarLog takes HAR-format files, or an HTTP stream in HAR format, and creates a tab-delimited output file. The output file can then be imported into Excel or similar to create graphical reports.
You can generate a chart directly in the browser using the Google Chart API: http://code.google.com/intl/de/apis/chart/ (no Excel needed ;-))
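To skip the spreadsheet step entirely, you can also script the jump from HAR to chart. A node.js sketch that turns the slowest requests in a HAR file into a Google Chart pie-chart URL (the file name is hypothetical):

    var fs = require("fs");
    var har = JSON.parse(fs.readFileSync("page.har", "utf8"));

    // take the eight slowest requests recorded in the HAR log
    var slowest = har.log.entries.slice()
        .sort(function (a, b) { return b.time - a.time; })
        .slice(0, 8);

    var labels = slowest.map(function (e) {
        return e.request.url.split("/").pop() || e.request.url;
    }).join("|");
    var times = slowest.map(function (e) {
        return Math.round(e.time);
    }).join(",");

    // paste the printed URL into a browser to see the pie chart
    console.log("http://chart.apis.google.com/chart?cht=p&chs=600x300&chds=a" +
        "&chd=t:" + times + "&chl=" + encodeURIComponent(labels));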
