Web page with upload/download speed for a user - performance

I want to make a URL that can be sent to a user. When they follow that URL, they'll get a page displaying the upload/download speed between them and the server the page is hosted on. Basically, I want to host my own speed test. It'll be used for troubleshooting, so a fast-but-dirty implementation is better than a neat and proper solution.
On the server I've got PHP, Perl, Python, Apache, and nginx and can use any of them. In what direction should I look?

I've never liked those speed tests, because they don't represent your bandwidth accurately at all. I used to live in a university dorm with a 100 Mbit connection, and the pipe would indeed give me in excess of 90 Mbit if properly utilized, but the speed tests never reported more than 15 Mbit. Part of the problem is that web tests (a) don't do multi-connection tests, and (b) run into various bottlenecks such as their own servers, their ISPs, or their TCP stacks.

I ended up using Speedtest. It lets you embed the speed tracker into your own page.
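For the quick-and-dirty route the question asks about, here's a minimal self-hosted sketch using only the Python 3 standard library (the port, blob size, and paths are arbitrary choices, not anything from the question): any GET streams a fixed-size blob so the client can time the download, and any POST times how long the request body takes to arrive and reports the upload speed back.

    # speedtest.py - crude self-hosted speed test (Python 3 stdlib only).
    # Any GET streams a BLOB_MB-sized payload; any POST is timed as an upload.
    # The handler ignores the request path to keep the sketch short.
    import time
    from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

    BLOB_MB = 50            # download payload size; tune to the link under test
    CHUNK = 64 * 1024

    class SpeedTestHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            total = BLOB_MB * 1024 * 1024
            self.send_response(200)
            self.send_header("Content-Type", "application/octet-stream")
            self.send_header("Content-Length", str(total))
            self.end_headers()
            chunk = b"\0" * CHUNK
            for _ in range(total // CHUNK):
                self.wfile.write(chunk)

        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            start = time.monotonic()
            remaining = length
            while remaining > 0:
                remaining -= len(self.rfile.read(min(CHUNK, remaining)))
            elapsed = time.monotonic() - start
            mbit = length * 8 / elapsed / 1_000_000 if elapsed else 0.0
            body = f"{length} bytes in {elapsed:.2f}s = {mbit:.1f} Mbit/s\n".encode()
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    ThreadingHTTPServer(("", 8000), SpeedTestHandler).serve_forever()

From the client side, something like curl -o /dev/null http://server:8000/ shows the download rate, and curl --data-binary @somefile http://server:8000/ reports the upload rate; both could be linked from the page you send the user.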

Related

Sending 100s of page requests at the same time

I want to test the performance of my website. I have hosted it on GoDaddy and I want to see how it performs when hundreds of users are trying to access it.
Is there a way to do the above? Is there a script that can be developed to send multiple page requests at once?
Thanks
Consider trying JMeter or Siege.
Apache Bench is commonly used for doing load testing (which is pretty much what you are describing). There are also a bunch of services that will do it for you (some free, most with varying costs).
You could simply script curl or wget to hammer it in parallel, but just throwing load at the site isn't terribly useful if you don't also track how it performs under that load (which is where the other tools come in).
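In that scripted spirit, here's a rough stdlib-only Python sketch (the URL and request counts are placeholders): it fires a batch of concurrent GETs and prints the latency spread and error count.

    # parallel_get.py - naive concurrent load sketch (Python 3 stdlib only).
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    URL = "http://example.com/"      # page under test (placeholder)
    REQUESTS = 200
    CONCURRENCY = 100

    def fetch(_):
        start = time.monotonic()
        try:
            with urllib.request.urlopen(URL, timeout=30) as resp:
                resp.read()
            return time.monotonic() - start, None
        except Exception as exc:
            return time.monotonic() - start, exc

    with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
        results = list(pool.map(fetch, range(REQUESTS)))

    times = sorted(t for t, err in results if err is None)
    errors = sum(1 for _, err in results if err is not None)
    print(f"ok={len(times)} errors={errors}")
    if times:
        print(f"median={times[len(times)//2]:.3f}s slowest={times[-1]:.3f}s")

Note this only fetches the base HTML, which leads straight into the next point.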
One thing to watch out for is whether you test just the base page/application or use a real browser engine to fetch the full page (including images and other static resources).

Do web servers optimized for single-page applications exist?

With a single-page application, the web server basically does only one thing: it returns data when the client asks for it (in JSON format, for example). So any server-side language (PHP, RoR) or server (Apache, nginx) can do it.
But is there a language/tool that works better for this sort of single-page application, which generates lots of small requests needing low latency and sometimes a permanent connection (for real-time and push features)?
SocketStream seems like it matches your requirements quite well: "A phenomenally fast real-time web framework for Node.js ... dedicated to creating single-page real time websites."
SocketStream uses WebSockets to get the lowest latency for the real-time portion. There are several examples on the site to build from.
If you want lots of small requests in real time with pushed data, you should take a look at socket-style connections.
Check out Node.js with Socket.io.
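The answers above point at Node.js; for comparison, the same push-over-a-persistent-connection idea can be sketched with Python's stdlib asyncio (a raw TCP stream rather than real WebSocket framing, so this illustrates the model, not a drop-in Socket.io replacement):

    # push_server.py - persistent connections with server push (Python 3 stdlib).
    import asyncio, json, time

    async def handle(reader, writer):
        # Keep the connection open and push a small JSON message every second.
        try:
            while True:
                msg = json.dumps({"server_time": time.time()}) + "\n"
                writer.write(msg.encode())
                await writer.drain()
                await asyncio.sleep(1)
        except (ConnectionResetError, BrokenPipeError):
            pass
        finally:
            writer.close()

    async def main():
        server = await asyncio.start_server(handle, "127.0.0.1", 8888)
        async with server:
            await server.serve_forever()

    asyncio.run(main())

Connecting with nc 127.0.0.1 8888 shows the pushed messages; in a browser-facing SPA you'd put a WebSocket library in front of the same event loop.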
If you really want to optimize for speed, you could try implementing a custom HTTP server that just fits your needs, for example with the help of Netty.
It's blazingly fast and has examples for HTTP and WebSocket servers included.
Also, taking a look at G-WAN may be worthwhile (though I have not tried that one yet).
nginx (http://en.wikipedia.org/wiki/Nginx) could be appropriate.

Please Help Me Troubleshoot Why My Site Is Loading So Slowly

My website is http://secretpassagesbooks.com/. It runs on the latest version of WordPress and is hosted via GoDaddy on a shared web server.
My website takes anywhere from ten seconds to one minute to load, and I don't understand why. I have tested in IE, Firefox, and Chrome, and the page speed is the same. I performed several speed tests at various online speed-test sites and got an average load time of 5-6 seconds. Yet when I click a link to my URL or enter it directly, it takes in excess of 30 seconds (sometimes more than a minute) to load the index page.
Here is what I have done so far to troubleshoot the issue:
I have the YSlow and Page Speed extensions installed in Firebug
The YSlow test gives me a "Grade A - Overall performance score 90"
My Page Speed score is 94/100
I have the W3 Total Cache WordPress plugin installed and am using page, browser, and database object caching
I've tried minifying as much CSS and JavaScript as possible
The site is using HTTP compression
Is there anything more I can do with this design, or is it a case of my shared web server being overloaded? Thanks in advance for all your help.
YSlow etc. detect problems in the HTML, JavaScript, and CSS parts, and those are probably OK. It looks like your hosting is to blame.
If those plug-in results are correct (and I've no reason to doubt they are), then it's most likely a case of your virtual server simply being overloaded.
I presume you have no such issues running an identical site in a "local" production environment, although you might want to try that to confirm it if you haven't already.
Incidentally, a tell-tale sign of an overloaded VPS/shared hosting solution is that the first page load is incredibly slow but subsequent loads are "normal" - a common reason being that your "dedicated" sandbox is being woken from a sleep/low-resource state. (This also seems to be the case as far as your site is concerned.) As such, it's possible (I don't know the details of this server, such as whether you have a "guaranteed" resource level for CPU, memory, etc.) that other sites on this particular server are using more than their fair share of bandwidth until your site kicks in.
Based on some tests from a tool that I built (The Performance Grader at JoomlaPerformance.com), wow is it bad...
Notice that the HTML took approximately 21.83 seconds to download (from the initial request to the last object being downloaded). Not to mention that the page is nearly 300 KB (which is fairly large for a page with only 7 images)...
This is where the issue is. The connection and DNS phases are fine, but the generation phase is really, REALLY slow. That's where your problem is: it's server-side, so you need to debug why it's slow. Some areas to look at are the SQL queries being executed (and whether they are slow), any slow plugins, etc. Try disabling things one at a time to see whether each makes a measurable difference.
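To confirm that split yourself, you can time the phases separately. A minimal sketch with the Python 3 stdlib (the host is the one from the question; a long time-to-first-byte alongside a quick connect and transfer points at server-side generation):

    # ttfb.py - rough phase timing: connect vs. generation vs. transfer.
    import time, http.client

    HOST = "secretpassagesbooks.com"   # the site from the question

    t0 = time.monotonic()
    conn = http.client.HTTPConnection(HOST, 80, timeout=120)
    conn.connect()
    t_connect = time.monotonic()

    conn.request("GET", "/")
    resp = conn.getresponse()          # returns once the headers arrive
    t_headers = time.monotonic()
    resp.read()                        # drain the body
    t_done = time.monotonic()
    conn.close()

    print(f"connect:           {t_connect - t0:6.2f}s")
    print(f"generation (TTFB): {t_headers - t_connect:6.2f}s")
    print(f"body transfer:     {t_done - t_headers:6.2f}s")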
My "hunch" is that your database is either overloaded, or your queries are very expensive. So in short, you can try another host to see if that helps (which is the solution more than you'd think)...
As most of you pointed out, the issue seemed to be with the server. I contacted GoDaddy and explained the situation. It turns out that my site was hosted on one of their legacy servers, which was most likely overloaded. They switched me over to one of their grid servers (at no cost) and now everything is loading quickly. Thanks for all the responses. I spent a lot of time tweaking the design, removing plugins one by one, reducing as many HTTP requests as possible, and generally went crazy trying to figure out how best to optimize my site. After a few days and a lot of tests, I could not accept that the problem was client-side, especially after all the optimization tests I ran showed my site was OK. So good to have it settled... for now, at least.
GoDaddy's web hosting is the bottleneck for your website; you should probably go for a VPS if you've got an advanced website with loads of lookups!

How to measure how much traffic a server could potentially handle?

Are there any tools that would enable me to load-test my server and tell me how much traffic it could roughly handle?
By traffic I mean how many requests per second it can consistently serve without timing out.
I realize that every server is different, and so is every application that runs on it. That's why I thought this route might be the way to go.
Thanks a bunch!
For a very simple benchmark on web servers (if your request is the same every time), you could use ab (ApacheBench). It's a very simple tool, but it gives some interesting statistics nonetheless.
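If you'd rather script a rough ramp yourself, here's a stdlib-only Python sketch (the URL and step sizes are placeholders) that steps the concurrency up and reports the sustained request rate and failures at each step; the level where timeouts start is a crude ceiling. ab or JMeter will give far better statistics:

    # ramp.py - step the concurrency up until errors/timeouts appear.
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    URL = "http://example.com/"        # server under test (placeholder)

    def fetch(_):
        try:
            with urllib.request.urlopen(URL, timeout=10) as resp:
                resp.read()
            return True
        except Exception:
            return False

    for workers in (5, 10, 20, 50, 100):
        n = workers * 10               # requests per step
        start = time.monotonic()
        with ThreadPoolExecutor(max_workers=workers) as pool:
            ok = sum(pool.map(fetch, range(n)))
        elapsed = time.monotonic() - start
        print(f"concurrency={workers:3d}  {n / elapsed:6.1f} req/s  failures={n - ok}")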
If you don't mind going the paid-software route, then LoadRunner is a very good choice, IMO.
I have used LoadRunner in the past for doing this kind of measurement for multiple web and non web applications.
And I like The Grinder. The only downside (though I don't know whether other tools handle this any better) is that it doesn't replay the hugely long URLs that ASP.NET generates very well.

Is Perl the fastest way to write a high-performance page?

I was inspired by Slashdot; I've heard that it uses very limited servers to support a lot of users with fast responses. There is also a project named Slashcode, though I'm not sure whether Slashdot uses its source code.
I am wondering whether Perl is the best language for writing a high-performance web page. Am I right that using Apache or IIS adds a lot of overhead?
Any ideas, books, papers, tutorials?
I'm going to assume that by "high performance" you mean both the real time taken to produce a page and the number of pages that can be served concurrently.
The programming language isn't as important as your servers and algorithms. You may want to look into the C10K problem, a series of new technologies and refinements of technique aimed at allowing a single web server to handle more than 10,000 concurrent connections. Things like the nginx and lighttpd web servers and the Varnish cache came out of this work.
Big wins come from using a very light, very fast, very modular web server (Apache and IIS ain't it) with a very light, very fast cache in front of it to avoid processing the same thing twice. For a high-concurrency server, even caching for a few seconds can save you hundreds or thousands of processes. By chopping a page up into a series of AJAX requests you can cache the more static bits and pieces independently of the bits that change frequently.
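To make the "even a few seconds" point concrete, here's a toy Python micro-cache (pure illustration; real deployments would use Varnish, memcached, or similar, and this version isn't thread-safe):

    # micro_cache.py - cache a function's result for a few seconds.
    import time, functools

    def micro_cache(ttl_seconds):
        def wrap(fn):
            cache = {}
            @functools.wraps(fn)
            def inner(*args):
                now = time.monotonic()
                hit = cache.get(args)
                if hit is not None and now - hit[0] < ttl_seconds:
                    return hit[1]            # fresh enough: skip the work
                value = fn(*args)
                cache[args] = (now, value)
                return value
            return inner
        return wrap

    @micro_cache(ttl_seconds=5)
    def render_front_page():
        time.sleep(0.5)                      # stand-in for expensive generation
        return "<html>...</html>"

    render_front_page()   # does the work
    render_front_page()   # served from the 5-second cache

Under heavy concurrency, every request inside that 5-second window is one avoided page generation.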
Instead of using mod_blah to embed your program into a web server, use FastCGI or similar to put your programs into their own little application servers. This allows them to run independently of the web server, possibly on remote machines and with load balancing. This lets you easily scale your processing power.
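FastCGI itself needs a third-party module in Python, but the decoupling idea - your program as its own long-lived application server behind the front-end web server - can be sketched with the stdlib WSGI reference server (the port and response are placeholders; nginx or similar would proxy to it):

    # app_server.py - a standalone long-lived application server (Python 3 stdlib).
    from wsgiref.simple_server import make_server

    def app(environ, start_response):
        # Long-lived process: no per-request interpreter startup, unlike vanilla CGI.
        body = b"hello from a standalone app server\n"
        start_response("200 OK", [("Content-Type", "text/plain"),
                                  ("Content-Length", str(len(body)))])
        return [body]

    make_server("127.0.0.1", 8081, app).serve_forever()

Because the process is separate from the web server, you can run several of them, on several machines, behind a load balancer.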
Eventually you're going to micro-optimize really important bits of your application code to the point where the language matters, but you can focus on the really important bits rather than having to do the whole project solely according to raw performance.
Regardless of how fast your code is, at some point the bottleneck will stop being your code, and start being the web server itself.
As long as you're not using the CGI interface[1] to talk to the web server, the language isn't going to have a noticeable impact on performance in 99% of cases. The exceptions are those in which you're doing heavy back-end processing rather than simply grabbing something out of a database, lightly massaging it, and sending it off to the user - and, if you are doing that kind of thing, you're likely better off doing it asynchronously if possible and stuffing the results into a database to be lightly massaged and viewed later.
The reason is, quite simply, that network connection and data transfer times will be so much longer than your program's execution time that it's not even funny. If it's taking 2 seconds to establish a network connection to the server and do the data transmission in each direction, nobody is going to care whether the processing on the server adds 0.1s or 0.2s on top of that 2s of network activity.
[1] Note that I am talking here about the vanilla CGI "start up a new process to service each incoming request" model, not the Perl CGI module (CGI.pm/use CGI). There are ways to use CGI while also making use of a long-lived process which handles multiple requests over its lifetime.
Architecture and system design are more important than language choice for a high traffic app.
But selecting a language is not the first thing you should do, unless you are planning to write everything from the ground up.
You should be selecting a toolset.
If you want something working soonest, look at existing web applications. What meets your needs? How customizable is it? Does it meet your performance/scalability requirements? If so, the language you use will be the language your app uses.
If you can't find a good match in existing apps, look at different frameworks: Catalyst, Rails, Squatting, Camping, Jifty, Django. There's a nice list of them on Wikipedia.
You should be able to find a framework that will do the job; many of them will. Pick some contenders and choose one. The language you use will be the language your framework uses.
There's really no such thing as a "high performance page". That's like asking what the fastest car is (and if you watch enough Top Gear, you know that's not a simple answer). You have to think about what you actually want to do (i.e. the particular task), what you have to do to make that happen, and which tools would work best for that.
Are you going to have a lot of people doing a lot of small things, or fewer people doing really big things? Is it all going to happen at once (i.e. spikes), or is it going to be constant demand? Are you sending back small chunks of data or serving up really large files?
Suppose that every portion were as fast as possible. It's a fantasy for sure, but consider it anyway. Now that everything is fast as possible, rank every part according to how relatively fast they are. What's the slowest part? Is it disk access? Network IO? Socket availability?
If you aren't at the point where you're already thinking about this, the language probably isn't that important beyond your skill with it.
There are a lot of books on web performance out there. :)
This post on Server Fault suggests that you could write an extension module for nginx to serve dynamic content.
Such modules need to be compiled to native machine code, so they are most likely faster than running Perl.
I don't believe Perl would be faster than other common choices such as PHP, Python, Ruby, Java, or C#.
