I'm optimizing a Flash website to load as quickly as possible, but in certain edge cases the SWF takes up to 10 seconds to load. There are no external assets except for the SWF file, and the only JS is embedded in the HTML page.
As this trace shows, the SWF request is sent by the browser only after the DOMContentLoaded event fires, which takes 8 seconds. In most cases the SWF starts loading in about 2 seconds.
Is there a way to reduce the request latency? Currently the SWF is inserted with JavaScript, using the SWFObject library, so only after the browser renders the HTML page does the JS execute and add the SWF tag to the page. What if I added the same SWF tag to the <head> section? Would this preload the SWF, so that the JS could get it immediately when it executes later in the <body>?
Usually the rule of thumb in load-time optimization is to use as few requests (files) as possible. But the SWF file must be separate from the HTML file. Or is there a way to embed it as base64 in the HTML, and have JS convert this into a file and load it as a Flash <object> tag instantly? I'm willing to try any tricks as long as they're compatible and reliable across all browsers.
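One way the base64 idea could look in practice: inline the SWF bytes as a data: URI and hand that to the embed call instead of a URL, which avoids the second HTTP request entirely. This is only a sketch under the assumption that the target browsers accept data: URIs for plugin content (older IE in particular may not), and SWF_BASE64 below is a hypothetical placeholder for the inlined movie:

```javascript
// Sketch: build a data: URI from base64-encoded SWF bytes.
// The registered MIME type for Flash movies is application/x-shockwave-flash.
function swfDataUri(base64) {
  return "data:application/x-shockwave-flash;base64," + base64;
}

// In the page, the URI would replace the usual .swf URL, e.g. (hypothetical):
// swfobject.embedSWF(swfDataUri(SWF_BASE64), "flashDiv", "550", "400", "9.0.0");
```

Whether this is a net win depends on the browser: the HTML grows by ~33% from base64 overhead, and the inlined bytes can't be cached separately from the page.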
Specs:
Internet : I have a 5 Mbit/s internet connection, and I can ping the server in 280 ms.
Location : The webserver is in the US, and I'm currently in Mumbai, India.
Filesizes : 20 KB for the SWF, 5 KB for the HTML.
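For perspective on those specs, the wire time alone puts a floor under the load: roughly one round trip for the TCP handshake, one for the HTTP request and first response byte, plus transfer time for 20 KB at 5 Mbit/s. A back-of-the-envelope estimate (my own arithmetic, not from the question, and ignoring DNS and server time):

```javascript
// Rough lower bound on fetch time for a small file over a fresh connection.
// rttMs: round-trip time in ms; sizeKB: payload size; mbps: line rate.
function minFetchMs(rttMs, sizeKB, mbps) {
  const handshake = rttMs;                               // TCP 3-way handshake (~1 RTT)
  const request = rttMs;                                 // HTTP request + first byte (~1 RTT)
  const transfer = ((sizeKB * 8) / (mbps * 1000)) * 1000; // ms to move the payload bytes
  return handshake + request + transfer;
}

// 280 ms RTT, 20 KB SWF, 5 Mbit/s link: ~592 ms minimum
```

So with a 280 ms RTT to the US, anything under ~0.6 s is physically impossible for a cold request; the 2 s typical case is mostly latency, and the 8-10 s outliers point at the server or the DOMContentLoaded dependency rather than the file size.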
I blame the server; I gather it's a shared hosting server, so resources could be limited. Even though you can ping the server quickly, the server then has to locate your website on its hard disks. Other bandwidth factors matter too: take into account not only your hosting provider but your own ISP, and general congestion between you and the US. Try a closer server...
I had this problem, as I am based in Australia and I too had a US server: good at times but more or less unreliable. So I got a premium server in Australia in my capital city (works perfectly). A lot of web hosting providers offer free trials to test their speeds.
See if this is an issue; it might be a hassle, but it could help with your problem.
TEST:
Open the Run dialog and enter CMD,
then type "ping yourwebsite.com".
The result for my website is 20 ms for www.parele.com.au (Sydney-based server, Australia).
Related
I am testing the static pages of a website; I am talking only about the homepage here. This homepage takes around 20 seconds for a full download, but you see the page in around 6 seconds (so the user's perspective is good).
Now that page has eleven mp4 files, and in JMeter they take a long time to download. Why is that? In the Google developer tools the page shows the download completing in around 20 seconds, but my recorded JMeter script reports 60 seconds to download all the resources.
I also used a Parallel Controller with 6 connections, but with that too it reports around 1 minute of response time for the whole page. And once again, it takes around 20 seconds in Chrome to download all the content.
If I use the main (first) URL of the homepage and check "Retrieve all embedded resources", then all the resources are downloaded except the mp4 files. Why is that?
I also have all the recorded requests of the homepage. If I run those requests, they all succeed, but the mp4 files take longer to download than they do in Chrome, and the response time goes up to around 60 seconds.
I have tried my best to explain the scenario. Any help would be deeply appreciated.
Are you sure that these .mp4 files are being downloaded when the user opens the page?
If yes, are you sure that they're being downloaded fully? For example, the most common scenario is that the server sends a Content-Range header, telling the browser that only part of each .mp4 file was fetched.
If no, you shouldn't be downloading them in your JMeter test at all. Compare the traffic from the real browser and from JMeter using a third-party sniffer tool like Wireshark or Fiddler, and make sure that JMeter sends the same requests the browser does.
Try opening the page in a browser with a cleared cache. It might be that the browser has cached the heavy content, so no actual requests are made and the multimedia resources are returned from the memory or disk cache. If you see similar timings this way, add an HTTP Cache Manager to your test plan.
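On the Content-Range point above, it can help to look at what a partial (206) response actually reports. A small sketch that parses the common "bytes start-end/total" form of the header (other forms, such as an unknown total, are deliberately not handled):

```javascript
// Parse a Content-Range value such as "bytes 0-1023/146515" into numbers,
// so you can see how much of an .mp4 the browser actually fetched.
function parseContentRange(value) {
  const m = /^bytes (\d+)-(\d+)\/(\d+)$/.exec(value);
  if (!m) return null;
  const [start, end, total] = m.slice(1).map(Number);
  return { start, end, total, fetched: end - start + 1 };
}
```

If the browser only ever pulls the first slice of each video, a JMeter test that downloads the full files will naturally report much longer times for the same URLs.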
Upon loading my site (https://ameersavage.com), certain images do not load in Chrome. Upon reloading the page, the results vary. Rarely does everything load properly in Chrome. This only started happening once I uploaded the site to the server.
When I uploaded the site using the server's online cPanel, none of the assets appeared until I started fiddling with each individual file's permissions. I then tried uploading the site over FTP (Transmit), which produced far better results, but certain assets still do not always show up once the page loads.
Do you think this is a reflection of poor server performance? I wanted to get a second opinion before I subscribe to a new server provider (any recommendations?).
I have the same site deployed on 2 different servers. The source files on the servers are identical, so I have to assume that what the webservers send differs: one serves JavaScript that works in IE11, and the other sends something that breaks the JavaScript on the very same page and input (posted data).
How do I either save all the resources sent in response to a request, or use the IE11 F12 tools to save all the received resources?
Are there any tools for emulating browsers and saving all the received responses to disk? (I tried saving the network traffic in IE11 to disk, but it includes details such as response times that make a comparison extremely difficult.)
You can save all the files delivered by the server to IE11 by clicking the Page button, then choosing Save As. Select "Web Page, complete" to save all the files.
The first thing to check would be to use the IE developer tools to watch for errors as the page loads.
Corrected due to comment - thank you
I am trying to generate a log of page load times (time from first byte of HTML to the onload event) from my actual browsing habits. An ideal solution would produce, after performing a google search for example, a log something like this:
http://google.com/ 523
https://www.google.com/search?q=asdf 1003
Where the pages took 523 ms and 1003 ms to load.
A Firefox or Chrome extension that works on Linux or Mac would be ideal, as I'm trying to track this in the context of normal everyday browsing.
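For reference, the number being asked for here maps directly onto the browser's Navigation Timing data. A minimal sketch, assuming a browser that exposes performance.timing (both Firefox and Chrome do); the mock object in the comment is illustrative, not real data:

```javascript
// Page load time as defined in the question: first byte of HTML
// (responseStart) to the onload event (loadEventEnd), in milliseconds.
// In a real page or extension, pass window.performance.timing.
function pageLoadMs(timing) {
  return timing.loadEventEnd - timing.responseStart;
}

// Example with a mock timing record:
// pageLoadMs({ responseStart: 1000, loadEventEnd: 1523 }) -> 523
```

An extension could call this from an onload handler and append `location.href + " " + pageLoadMs(performance.timing)` to its log.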
If you install the NetExport Firebug extension, it will allow you to export all collected and computed data from the Firebug Net panel as an HTTP Archive (HAR) file (a JSON-based format).
You can then view this file that includes load times using HAR Viewer or other tools.
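Since HAR is just JSON, pulling the per-page load times out of the export and printing them in the format the question asks for is straightforward. A sketch against the HAR 1.2 structure (log.pages[].pageTimings.onLoad; note that tools usually put the page URL or title in the `title` field):

```javascript
// Extract "URL  load-time-ms" lines from a parsed HAR object (HAR 1.2).
// Each page entry carries pageTimings.onLoad: the onload time in ms.
function harLoadTimes(har) {
  return har.log.pages.map(function (page) {
    return page.title + " " + page.pageTimings.onLoad;
  });
}
```

Running this over the NetExport output would produce exactly the kind of log sketched in the question.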
Gavin,
Try using the Firebug plugin for Firefox. http://getfirebug.com/
It shows you when each file started and stopped loading relative to all the other files.
Worth checking out.
Regards,
Ninad
You need Fiddler. http://www.fiddlertool.com
Another great tool that does what you asked for in the comments is Dynatrace AJAX. http://ajax.dynatrace.com/ajax/en/ It hooks into the browser and keeps metrics on actual render and JS times. http://ejohn.org/blog/deep-tracing-of-internet-explorer/
I use Kohana 3's Profiler class and its profiler/stats template to time my website. On a very clean page (no AJAX, no jQuery, etc.; it only loads a template and shows a text message, with no database access), it shows a request time of 0.070682 s (the "Requests" item in the profiler/stats template). I then used two microtime() calls to time the duration from the first line of index.php to the last line, which also gives a very fast result (0.12622809410095 s). Very nice.
But if I time the request from the browser's point of view, it's totally different. Using Firefox with the Tamper Data add-on, the duration of the request is 3.345 s! And I noticed that from the time I click the link to enter the website (Firefox starts the animated loading icon) to when the browser finishes its work (the icon animation stops), it really takes 3-4 seconds!
On my other website, which is built with WikkaWiki, the time measured by Tamper Data is only 2190-2432 ms, including several accesses to the MySQL database.
I tried a clean installation of Kohana, and the default plain hello-world page also takes 3025 ms to load.
All the websites mentioned here are tested on the same localhost PC with the same settings; they are just hosted in different directories on the same machine. Only the Database module is enabled in bootstrap.php for the Kohana website.
I'm wondering why the Kohana website's overall response is so slow while the PHP execution time is just 0.126 seconds. Is there anything I should look into?
== Edit: additional information ==
The test result for a standard phpinfo() page is 1100-1200 ms (Tamper Data).
The Profiler shows you the execution time from Kohana initialization to the Profiler render call, so it's not the full Kohana time. Some actions (Kohana::shutdown_handler(), Session::_destroy(), etc.) may take a long time.
Since your post confirms Kohana is finishing in a tenth of a second or less, it's probably something else:
Have you tested anything other than Kohana? It sounds like the server is at fault, but you can't be sure unless you compare the response times with something else. Try a plain HTML page and a pure PHP page.
The Firefox profiler could be taking external media into account. So if you have a slow connection and you load Google Analytics, that could be another factor.
Maybe this issue is related: Firefox and Chrome slow on localhost; known fix doesn't work on Windows 7.
Although that issue occurs on Windows 7, it may still help...