I'm looking for a script or some code that slows down the load time of a website. It might sound stupid, but it's needed. I've tried JavaScript, HTML tweaks, the .htaccess file, and yeah... I'm out of ideas. I'm using a web host that runs nginx. Anyone have an idea? I feel very stupid for asking. :P
So my goal is that when a visitor hits the Enter key to visit example.com, the page that shows up simulates a slow internet connection, or something in that style. English is not my native language, so I have a hard time finding the correct words to use; sorry for that.
Your question is so broad I can only give a broad answer.
JavaScript would not be the best choice IMHO, as it executes client-side. You imply you want to slow things down on the server side.
You mention your host runs nginx, but not which server-side scripting languages are available to you (e.g. PHP, C# via ASP.NET, etc.). One thing you might try is figuring that out, then scripting a web page with built-in delays via functions such as PHP's sleep() or C#'s Thread.Sleep(). This will cause the web server to pause as it generates the page.
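Just to illustrate the idea (assuming PHP does turn out to be available on your host; the delay value and markup here are only placeholders), a minimal sketch:

    <?php
    // Deliberately hold the response back on the server for a few seconds
    // before any HTML is sent. Adjust the delay to taste.
    sleep(5);
    ?>
    <!DOCTYPE html>
    <html>
      <head><title>Deliberately slow page</title></head>
      <body><p>This page was held back on the server before being sent.</p></body>
    </html>

In ASP.NET the equivalent would be a Thread.Sleep() call while the page is being generated.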
Again, because your question is so broad, I can't give a more specific answer. My goal here is to point you toward something you can specifically do on the server side that we KNOW will slow things down.
The understanding here is that you are going to have a client hit the page that has the slowdown calls built in. If you're trying to do something more "nefarious", like simulating a stress test of the server by running it low on memory or CPU, those types of situations can slow down a server too, but are beyond the scope of this answer. You can Google stress testing tools; you'll want to have at least a basic grasp of system monitoring tools for your platform (Windows, Linux, etc.) as well if this is the path you're heading down.
As a developer and constant user of MiniProfiler, I use Stack Overflow as the benchmark for my .NET sites. That is because the entire Stack network is just blazingly fast.
I know MiniProfiler is used on Stack Exchange. There is a whole developer setup used on the Stack sites, but can we enable the stats to see how fast it really is?
I might be a bit over-obsessive here, but I am looking to improve performance by milliseconds, and the only viable benchmark is a large and complex site like Stack Exchange.
I know it might be a security issue to see live data but I just really want a benchmark (screenshot / guidelines) to see how far I can optimize my .NET MVC web application.
My actual IIS and MVC performance is fantastic and I think I am more concerned about server replies and client side stuff. So can I (and should I) put more effort into smashing down this response time?
This site is hosted in an Azure cloud app and uses Azure DB - I know about 60~180 ms goes to connection times that are out of my control.
How can I improve times between Paint, Load and Complete?
I find that I answer my own questions on Stack Exchange more often nowadays. Not sure what that means. But here is something interesting I found while dealing with other Q&As (and it answered this question):
Yes, you should avoid the obvious beginner mistakes of string concatenation, the stuff every programmer learns their first year on the job. But after that, you should be more worried about the maintainability and readability of your code than its performance. And that is perhaps the most tragic thing about letting yourself get sucked into micro-optimization theater -- it distracts you from your real goal: writing better code.
Posted by Jeff Atwood
There is no real performance problem or serious delay. It's just an obsession that won't lead to much satisfaction.
The dude's got a point. As long as my code is readable and it runs fast, what the heck more do I want?
PERFECTION! - Waste of time, lol#me!
My title pretty much says it all. I have been looking at mod_pagespeed, and it strikes me as little more than a way to offload the work of optimization onto the server instead of the developer.
There may be some advantages to doing this such as reducing developer time etc so I'm not suggesting that it is all bad. But it also does strike me as a bit of a script kiddie way to do things. Rather than learn about all those performance techniques, hey! just let the server do it!
Is mod_pagespeed something that would be good to implement on my production web application or would I be better off doing the optimization "by hand"?
Here is the original announcement.
It seems to me that it could empower the server admin to centrally optimize content created by a large set of developers. Also, it could be a good way of baking in some well-tested (by Google) best practices that might be costly to develop on your own.
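For context, turning it on is typically just a few directives in the Apache configuration; roughly something like this (the module path varies by install, and the filters shown are just examples of the optional ones you can opt into):

    LoadModule pagespeed_module /usr/lib/apache2/modules/mod_pagespeed.so
    ModPagespeed on
    # Opt in to a couple of the optional rewriting filters
    ModPagespeedEnableFilters collapse_whitespace,remove_comments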
I am trying to quantify "site slowness". In the olden days you just made sure that your HTML was lightweight, images were optimized, and servers were not overloaded. On high-end sites built on top of modern content management systems there are a lot more variables: third-party advertising, trackers and various other callouts, CDN performance (interestingly, content delivery networks sometimes make things worse), JavaScript execution, CSS overload, as well as all kinds of server-side issues like long-running queries.
The obvious answer is for every developer to clear the cache and continuously look at the "net" section of the Firebug plugin. What other ways to measure "site dragging ass" have you used?
YSlow is a browser extension that should help you.
YSlow analyzes web pages and why they're slow based on Yahoo!'s rules for high performance web sites.
Firebug, the must-have Firefox extension for web developers, can measure the loading time of different elements on your webpage. At the very least you can rule out CSS, JavaScript, and other elements taking too much time to load.
If you do need to shrink JavaScript and CSS loading times, there are various JavaScript and CSS compressors out there on the web that simply strip unnecessary text such as newline characters and comments. Of course, keep an ordinary version on the side for development's sake.
If you use PNGs, I recently came across a PNG optimizer called OptiPNG that can shrink PNG file sizes.
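Basic usage is just pointing it at a file; an optimization level can be passed as well (the exact level is a matter of taste, check the docs):

    optipng -o2 image.png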
"Page Load time" is really not easy to define in general.
It depends on the browser you use, because different browsers may make more requests in parallel, JavaScript runs at different speeds in different browsers, and rendering time differs as well.
Therefore you can only really measure your true page load time using the browser you are interested in.
The end of the page load can also be difficult to define, because there might be an Ajax request after everything is visible on the page. Does that count as part of the page load or not?
And last but not least, the real page load time might not matter that much, because "perceived performance" is what counts. For the user, what matters is when they have enough information to proceed.
Markus
I'm not aware of any way (at least none I could tell you :] ) that would automatically measure your page's perceived load time.
Use AOL Pagetest for IE and YSlow for Firefox (see link above) to get a "feeling" for your load time.
Get yourself a proper debugging proxy installed (I thoroughly recommend Charles)
Not only will you be able to see a full breakdown of response times / sizes, you can save the data for later analysis / comparison, as well as fiddle with the requests / responses etc.
(Edit: Charles' support for debugging SOAP requests is worth the pittance of its shareware fee - it's saved me a good half a day of hair-loss this week alone!)
I routinely use webpagetest.org, which you can use to run performance tests from different locations, on different browsers (although only MSIE 7-9), with different settings (number of iterations, connection speed, first run vs. second visit, excluding specific requests if you want, credentials if needed, ...).
The result is a very detailed report of page loading time that also provides advice on how to optimize.
It really is a great (free) tool!
Last time I worked on a high-volume website, we did several things, including:
We used YSlow to get an analysis of the individual factors affecting page load: https://addons.mozilla.org/en-US/firefox/addon/5369
We monitored performance using an external, commercial tool called Gomez - http://www.gomez.com/instant-test-pro/
We stress tested from a continuous integration build, using Apache JMeter: http://jmeter.apache.org/
If you want a quick look, say a first approximation, I'd go with YSlow and see what the major factors affecting page load time in your app are.
Well, call me old-fashioned, but..
time curl -L http://www.example.com/path
on Linux :) Other than that, I'm a big fan of YSlow, as previously mentioned.
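If you want a per-phase breakdown rather than just wall-clock time, curl's -w option can print its built-in timing variables (same idea, just more detail):

    curl -L -s -o /dev/null -w "dns: %{time_namelookup}s  connect: %{time_connect}s  ttfb: %{time_starttransfer}s  total: %{time_total}s\n" http://www.example.com/path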
PageSpeed is an online checking tool by Google, which is very accurate and reliable:
https://developers.google.com/pagespeed/
If it's ASP.NET, you can use Trace.axd.
Yahoo provides YSlow, which can be great for checking JavaScript.
YSlow as mentioned above.
And combine this with Fiddler. It is good if you want to see which page objects are taking the most bandwidth, which are being compressed at the server, where there are unexpected round-trips, and what is being cached. It can also give you a general idea of processing time in the client web browser as compared to the time taken between server and client.
ApacheBench (ab). Use
ab -c <number of CPUs on server> -n 1000 url
to get a good approximation of how fast your page is.
In Safari, the Network Timeline (available under the Develop menu, which you have to specifically enable) gives useful information about loading time of individual page components, as well as showing when each component started loading.
YSlow is good, and HttpWatch for IE is great as well. However, both miss the most important metric to a user: "When is the above-the-fold part of the page ready for the user to use?" I don't think that one has been solved yet...
There are obviously several ways to identify the response time, but the challenge has always been how to measure the rendering time that is spent in the browser.
We have a controlled test phase in which we use several automated tools for testing the application. One of the outputs we generate from this test is a Fiddler trace for each transaction (a click). We can then analyse the Fiddler trace to find the time to last byte and subtract it from the overall time the page took.
Something like this:
1. A = total response time as measured by an automated tool (in our case we use QTPro)
2. B = time to last byte (server + network time, from the Fiddler trace)
3. C = A - B (approximate rendering time, i.e. the time spent in the browser)
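For example (hypothetical numbers): if the automated tool reports A = 4.0 s for a click and the Fiddler trace shows B = 2.7 s to the last byte, then C = A - B ≈ 1.3 s was spent rendering in the browser.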
All of the above can be made into a standard test process, and at the end of the test we can generate a breakdown of the time spent at each layer, e.g. rendering time, network time, database calls, etc.
I've been looking at ways people test their apps in order to decide where to add caching or apply some extra engineering effort, and so far httperf and a simple sesslog have been quite helpful.
What tools and tricks did you apply on your projects?
I use httperf for a high level view of performance.
Rails has a performance script built in, that uses the ruby-prof gem to analyse calls deep within the Rails stack. There is an awesome Railscast on Request Profiling using this technique.
New Relic has some seriously cool analysis tools that give near real-time data.
They just made a "Lite" version available for free.
I use JMeter for session-based testing - it allows very fine-grained control over the pages you want to hit, parameters to inject, loops to go through, etc. It's great for simulating how many real users your site can handle, rather than just performance testing a set of static URLs. You can distribute tests over multiple machines quite easily by loading up the jmeter-server on computers with publicly accessible IPs. I have found some limitations in the number of users/threads any one machine can throw at a server at once (it depends on the test), but JMeter has helped my team improve our app's user capacity by 6x.
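For reference, a distributed run like that is kicked off from the controller in non-GUI mode, along these lines (host addresses and file names are placeholders):

    jmeter -n -t testplan.jmx -R 192.168.1.10,192.168.1.11 -l results.jtl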
It doesn't have any fancy graphing -- I actually use my own in-house graphing built with Gruff that can do performance analysis on request times for certain pages and actions.
I'm evaluating a new open-source web page instrumentation and measurement suite called Jiffy. It's not specifically for Ruby; it works for all kinds of web apps.
There's also a Jiffy Firebug Extension for rendering the metrics inside the browser.
I also suggest you look at Browser Mob for load testing.
A colleague of mine has also posted some interesting thoughts on this.