Drupal vs WordPress performance comparison

In the beginning I built my site, bemcapaz.net, on WordPress. But after having to hack the core and build a lot of functionality through direct programming, I decided to move on to Drupal.
Drupal, despite being a CMS focused more on community websites, is great for doing anything you can imagine in a really simple way, even a blog, which is what I created.
My question now is: which one offers the best performance? Drupal looks much heavier than WordPress to me, but since I'm not an advanced programmer I have no idea how to evaluate which one offers the faster MySQL queries and page loading times.
Thanks.

Drupal is definitely heavier in the sense that it runs more queries per page once you've customized it. Using modules like Views, you can also build your own dynamic queries to drive widgets and pages. Those can be as speedy or as slow as the underlying combination of joins allows.
On the other hand, Drupal has much more robust caching controls. Full-page output caching for anonymous users, granular caching of widget output, and granular caching of any data retrieved by a Views query can all combine to help quite a bit. There are also plugin modules like Boost and Memcache that let you augment the underlying cache system with materialized HTML files in the filesystem (bypassing Drupal entirely in favor of Apache), or with a memcached server that stores all the cached information in memory rather than the database.
If you're looking to discover hot spots in a Drupal site you should also install the Devel module; it allows you to get query counts and detailed query times for each page on the site, and track them down to the module that's running them.
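For reference, here is a minimal sketch of what turning those caches on can look like in a Drupal 7 settings.php; the lifetimes and the Memcache module path are assumptions, not recommendations:
<?php
// settings.php -- enable Drupal 7's built-in caches (values are assumptions).
$conf['cache'] = 1;                     // full-page cache for anonymous users
$conf['block_cache'] = 1;               // cache rendered block (widget) output
$conf['page_cache_maximum_age'] = 300;  // page cache lifetime in seconds

// With the contrib Memcache module installed at its usual path,
// route the cache system to memcached instead of the database:
$conf['cache_backends'][] = 'sites/all/modules/memcache/memcache.inc';
$conf['cache_default_class'] = 'MemCacheDrupal';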

Don't know about Drupal, but in WP you can estimate query count and time with the following code.
Just add it to your footer, after any queries:
<?php echo get_num_queries(); ?> queries. <?php timer_stop(1); ?> seconds.
I suppose performance for both CMSes depends on the number and complexity of queries and on the caching mechanism. If you use both wisely, your performance is going to be OK. I mean: ask your database only for the info you really need ;)
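If you need more detail than a total, WordPress can also log each individual query. A minimal sketch, assuming PHP 7+ and a development install (the five-query limit is arbitrary):
<?php
// wp-config.php: make $wpdb record every query with its execution time.
define('SAVEQUERIES', true);

// Later, e.g. in footer.php: print the five slowest queries.
global $wpdb;
$queries = $wpdb->queries;              // each entry: [sql, seconds, caller]
usort($queries, function ($a, $b) {
    return $b[1] <=> $a[1];             // slowest first
});
foreach (array_slice($queries, 0, 5) as $q) {
    printf("%.4fs  %s<br>\n", $q[1], $q[0]);
}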

If someone (like me) wants a raw and simple time comparison: I wrote the exact same app three times (with three frameworks) and will share the results below.
Note that I didn't do any heavy queries, or anything else that would affect the results in favor of one framework over the other.
On my local machine, with a Core i7 CPU and SSD storage:
Drupal took 5 seconds to show a simple page, and that's with caching. If I clear the cache manually, it takes 25 seconds (and recreates the cache, so the next run takes 5 seconds again). But don't worry, a server's hosting machine is far stronger than my localhost (>ლ)
WordPress took 17 seconds for the same page (and that's every time). Again, don't worry, a server's host is far stronger than my local machine.
Rewritten with Laravel, the same app takes 850 milliseconds ;-)
So, if you don't have the time, money, or knowledge to create a basic CMS with Laravel, then Drupal is the obvious winner (but harder to learn compared to WordPress).

Related

ColdFusion Caching Solutions for Fusebox 4

I have an application that was built using Fusebox 4 with ColdFusion. Can anyone recommend a good caching solution, i.e. a plugin, that works directly with this older version of the framework?
Another idea I've been tinkering with is to take the most commonly used queries in the system and apply cachedWithin, with the value being a variable stored in the application scope. Basically, anytime we update any of the most commonly accessed tables in the db, we update the application.cachedwithin variable as well, so whenever those tables are updated the data is refreshed. Anything else that isn't used frequently will simply query the DB to get the content.
An addition to this very simple caching methodology would be to store strings, or other frequently used content, directly in the application scope.
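A minimal CFML sketch of that idea; the query name, datasource, and table here are hypothetical:
<!--- e.g. in onApplicationStart(): cache queries for up to an hour --->
<cfset application.cachedWithin = createTimeSpan(0, 1, 0, 0)>

<!--- a commonly used query, served from ColdFusion's query cache while fresh --->
<cfquery name="getProducts" datasource="myDsn"
         cachedwithin="#application.cachedWithin#">
    SELECT id, name, price FROM products
</cfquery>

<!--- after updating the products table: a zero timespan forces the
      next request to hit the database again --->
<cfset application.cachedWithin = createTimeSpan(0, 0, 0, 0)>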
The bulk of this application is around 30 pages, comprising approximately 200 products, so it's quite a small website.
Can anyone recommend a good Fusebox 4 cache plugin or confirm if this simple caching methodology is a good idea? If not, could you recommend a simple alternative?
thanks in advance
I would suggest using cfcache to store all page output in static HTML files.
Then on any update, you can clear the cache of the updated pages, or flush the whole cache:
<cfcache action="flush" />
<cfobjectcache action="clear" />
Also make sure not to use urlSessionFormat() in your URLs.
I'm not sure that you even need to be caching given the size of the site, unless you are getting a huge amount of traffic. If you are currently having performance problems, the first thing to do is make sure that Fusebox is in production mode, so that it isn't recreating the parsed files on each request.
Caching the queries should certainly aid performance - how long are the queries currently taking to execute? With Fusebox 4, it can be problematic to have "Report execution times" turned on in CF when debugging, as it can significantly affect the time the request takes to execute.

Very high page load times?

I have a Drupal site on a shared host and I'm finding that the site is very slow to respond. I suspect it's the host and not my Drupal/database configuration, but I don't know how to decipher the results from Pingdom.
I have also read Explanation of Pingdom Results but am unsure of how to resolve my problems.
Pingdom results show a Load Time of 60 seconds.
Performance Grade tab shows results of all items at or near 100.
According to the Page Analysis tab, most of the time is spent in the Wait state.
Does the above indicate a problem with my hosting or perhaps domain name provider or is there something that I can do to improve performance of my website?
I should also mention that I've used other tools like Google's Page Speed Chrome plugin and Firefox's Yslow plugin and both give an above average rating to my webpages which leads me to believe it's an issue with my host.
Drupal has this issue of abusing database queries, especially if you use a lot of modules on one page and don't cache anything. That can slow down your site considerably. I use the Pressflow profile of Drupal to reduce some load times, I add Varnish to the server (you can look at Memcache too), and I add the Boost module to the site itself. But the most important thing is to get the number of queries per page load right. If you have written some custom code, optimize it. Look for ways to get the same data without sending queries to the server; maybe some data was already loaded onto the page and you don't need some of those queries.
In your particular case, I suspect some loose loop which does not end but has a safety trigger which kills it after a certain amount of time. I can bet that the reason is somewhere in your custom code or some underdeveloped module. Try enabling the display of all errors.
P.S. An example of such a page would be the best way to determine what is wrong.
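To surface those errors, a minimal sketch, assuming a Drupal 7 site (the settings.php override is specific to D7):
<?php
// Temporarily, at the top of index.php: show every PHP error and warning.
error_reporting(E_ALL);
ini_set('display_errors', '1');

// Drupal 7 only: force verbose error output from settings.php.
$conf['error_level'] = 2;  // ERROR_REPORTING_DISPLAY_ALL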

How to Increase page loading speed in Zend Framework Application

I have developed an application using ZF. The app is a little big, with lots of features.
I use Zend_Application (already using the autoloader in the constructor), Zend_Layout, Zend_View, Zend_Form, etc. My current issue is that page loading is very slow, and that's on localhost with XAMPP.
I have enabled Xdebug to investigate the issue, got a cachegrind file in the "tmp" folder, and tried to view it with the WinCacheGrind software. There I can see a lot of processes and functions being run for each and every request or page load.
Also, I have installed the YSlow add-on for Firefox and observed the speed of page loads in seconds. I have compared the speed of ZF and non-ZF applications, and from the comparison, the pages of the non-ZF app take less than 1 second to load, while the ZF app takes at least 6-7 seconds. What a huge difference.
The main things happening in the app are:
1) A database connection happens for each request.
2) I'm not adding the view to the layout explicitly; ZF just appends it automatically to layout.phtml, based on the action name.
3) Some windows have forms with a few drop-down boxes which fetch data from the database.
4) Menus have ACL implemented. Before, the privileges were loaded from the DB on each and every request, but now I have optimized it so that this happens only during login, and the rest of the time they are taken from Zend_Registry.
I would like to attach the cachegrind file so that someone can see what's happening in the background, but I can't see an option here for attaching.
Someone please help me find a solution for this. Any kind of help is really appreciated. Thanks a lot.
Let's try to give some hints.
First, the database connection should happen only once (unless you use several privilege levels on the database, or several databases), so check that you use a Singleton pattern with your Zend_Db_Table objects.
Second, you are not using Zend_Cache. You should really start to use Zend_Cache and build several cache objects. Let's say, for example, a file cache with long-term storage, and a Memcached or APC cache for storing objects. Then use these caches in several layers (a sketch follows this list):
Give the file cache to Zend_Db_Table (defaultMetadataCache); this way you will avoid a lot of metadata queries, the queries that ask for a description of each column of the tables you use.
Store one or more Acl objects (depending on how you use ACL: whether you have one big Acl with all the rules, or several with subsets), and put them in mid-duration caches once they are built.
Think of other usages: detect heavy loops and semi-static content (like your select lists; how long can they be considered static?).
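Here is a minimal sketch of those first two layers in ZF1 bootstrap code; the cache directory, lifetime, and the 'app_acl' key are assumptions (APPLICATION_PATH is the constant from the standard ZF1 project layout):
<?php
// Bootstrap: one file-based cache shared across requests.
$cache = Zend_Cache::factory(
    'Core',
    'File',
    array('lifetime' => 3600, 'automatic_serialization' => true),
    array('cache_dir' => APPLICATION_PATH . '/../data/cache')
);

// Layer 1: spare Zend_Db_Table the DESCRIBE queries on every request.
Zend_Db_Table_Abstract::setDefaultMetadataCache($cache);

// Layer 2: build the Acl once, then reload it from the cache.
if (($acl = $cache->load('app_acl')) === false) {
    $acl = new Zend_Acl();
    // ... add roles, resources and rules here ...
    $cache->save($acl, 'app_acl');
}
Zend_Registry::set('acl', $acl);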
Finally, get a whole mental image of how your application engine works and how your data will grow and be used. You will need that step to use application-level caches in the very best way (for example, should some elements be cached for groups of users? Should Acl objects be built for groups, for each user, or for everybody? Are there some blocks in the layout that should be rendered the same for everybody?).

Magento Performance Tuning

I have a server which is running more than 15 Magento stores, but they are not performing well, even though I have a giant server hosting them. My server configuration is 8 quad-core CPUs, 24 GB RAM and a 2 TB HDD.
My current page load time is 1.6 seconds; I want it under 600 ms. I have already installed APC and eAccelerator and tuned Apache's parameters. I am using the latest Magento version.
Please suggest.
-Ramesh
You might look at enabling block caching as explained here. It should work pretty well on the category and product pages but you have to be extra careful to apply proper cache tags and identifiers to make sure the content you are displaying is always up to date...
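As a rough sketch of what that looks like on a Magento 1 block (the class name, lifetime, and cache key here are hypothetical):
<?php
// app/code/local/My/Module/Block/Product/List.php (hypothetical block)
class My_Module_Block_Product_List extends Mage_Catalog_Block_Product_List
{
    protected function _construct()
    {
        parent::_construct();
        $this->addData(array(
            'cache_lifetime' => 3600,  // seconds; tune per block
            'cache_tags'     => array(Mage_Catalog_Model_Product::CACHE_TAG),
            'cache_key'      => 'my_product_list_'
                              . Mage::app()->getStore()->getId(),
        ));
    }
}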
First things first: what is actually the bottleneck? Optimization is always about tradeoffs, and you may just make things worse if you're looking in the wrong places. Make use of top (assuming you're on Linux here) and see what your processor/memory usage looks like.
I'm going to take a stab in the dark here and say that, since you have already added an opcode cache, you may be waiting on other HTTP requests for page load. Use YSlow in Firefox and see if you are trying to load excessive amounts of data. Optimizing image sizes and setting proper caching parameters for images may solve the problem there.
If not, silvo's suggestion is a very good one. Using either block-level or page-level caching can really speed up a site. This topic has been covered previously, so see those posts, too.
Hope that helps!
Thanks,
Joe
I'm not sure that you will see a benefit from using APC and eAccelerator on the same server. They pretty much do the same thing.
A page load of 1.6 seconds is fairly typical for a Magento install. The easiest way to lower your page load time (after basic Apache and MySQL tuning and APC) is to use a full page cache. There are a few out on the market right now. We have written a full page cache that has gotten page loads down to the 0.1-0.3 second range for most users: http://ecommerce.brimllc.com/performance/full-page-cache-magento.html

Scalability and Performance of Web Applications, Approaches?

What various methods and technologies have you used to successfully address the scalability and performance concerns of a website? I am an ASP.NET web developer exploring .NET remoting with WCF and SQL clustering, and am curious what other approaches exist (such as the 'cloud'). In which cases would you apply the various approaches (for example, method A for roughly X many 'active' users)?
An example of what I mean, a myspace case study: http://highscalability.com/myspace-architecture
This is a very broad question making it difficult to answer, but I'll try and provide a few general suggestions.
1 - Unless you are doing something seriously wrong, you'll likely not need to worry about perf or scale until you hit a significant amount of traffic (over 1 million page views a month).
2 - Your biggest performance problems initially are likely to be the page load times from other countries. Try the Gomez Instant Site Test to see the page load times from around the world, and use YSlow as a guide for optimizing.
3 - When you do start hitting performance problems, it will most likely first be due to the database work. Use the SQL Server Profiler to examine your SQL traffic, looking for long-running queries to try optimizing, and also use sys.dm_db_missing_index_details to look for indexes you should add.
4 - If your web servers start becoming the performance bottleneck, use a profiler (such as the ANTS Profiler) to look for ways to optimize your web page code.
5 - If your web servers are well optimized and still running too hot, look for more caching opportunities, but you're probably going to need to simply add more web servers.
6 - If your database is well optimized and still running too hot, then look at adding a distributed caching system. This probably won't happen until you're over 10 million page views a month.
7 - If your database is starting to get overwhelmed even with distributed caching, then look at a sharding architecture. This probably won't happen until you're over 100 million page views a month.
I've worked on a few sites that get millions of hits per month. Here are some basics:
Cache, cache, cache. Caching is one of the simplest and most effective ways to reduce load on your web server and database. Cache page content, queries, expensive computations, anything that is I/O bound. Memcached is dead simple and effective (see the sketch after this list).
Use multiple servers once you are maxed out. You can have multiple web servers and multiple database servers (with replication).
Reduce the overall number of requests to your web servers. This entails caching JS, CSS and images using Expires headers. You can also move your static content to a CDN, which will speed up your users' experience.
Measure & benchmark. Run Nagios on your production machines and load test on your dev/QA server. You need to know when your server will catch on fire so you can prevent it.
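As an illustration of the first point, a minimal sketch using PHP's Memcached extension; the server address, key, TTL, and the fetch_articles_from_db() helper are hypothetical:
<?php
$memcached = new Memcached();
$memcached->addServer('127.0.0.1', 11211);

$key = 'homepage_articles';
$articles = $memcached->get($key);
if ($articles === false) {                 // cache miss
    $articles = fetch_articles_from_db();  // the expensive, I/O-bound part
    $memcached->set($key, $articles, 300); // keep for five minutes
}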
I'd recommend reading Building Scalable Websites, it was written by one of the Flickr engineers and is a great reference.
Check out my blog post about scalability too, it has a lot of links to presentations about scaling with multiple languages and platforms:
http://www.ryandoherty.net/2008/07/13/unicorns-and-scalability/
There is Velocity from MS; Memcached also has a .NET port now, and there is also indeXus.Net.
