OpenX performance - CDN?

I was wondering whether there are any best practices or good tips for improving the performance of a self-hosted OpenX instance, apart from the usual suspects (tuning Apache, PHP and MySQL, using memcached for caching, etc.).
Is there any room for a CDN or another proxy cache in front of OpenX? My guess is no, due to the dynamic nature of the delivery scripts' results.

I suppose you could cache the actual creative (image files, etc.) but you've guessed correctly that caching the scripts isn't going to be a good thing.
Based on some recent questions, it looks like there's not a whole lot to be done to improve OpenX's performance, unfortunately.

You can refer to the steps given on the OpenX website:
performance tuning
OpenX has some inherent problems: for example, it fetches all banners matching a zone ID from the system and then applies every delivery limitation to them in order to select the right banner to serve.
Still, trying the steps in that URL should improve performance.

Yes, you can use a CDN.
I'm using a CDN (a pull zone with a CNAME), and I only had to set this parameter:
Configuration > Global Settings > Banner Delivery Settings > OpenX Server Access Paths
Image Store URL: http://cdn.yourdomain.com/www/images


My WordPress website and dashboard are both too slow; the server responds in 11 seconds

The domain of my blog is codesaviour.
Since last month my blog and wp-admin dashboard have slowed down to a frustrating level. I have already removed post revisions after reading about speeding up WordPress.
Here is the Google PageSpeed Insights report for my blog. According to it, the server response time is 11s.
I even read the following threads on Stack Overflow: link. I tried to implement the steps, but the blog is still slow; no change.
My host is Hostgator.in. Their online assistance asked me to enable gzip compression as instructed at link, so I followed the instructions: since there was no .htaccess file on the server, I created one and pasted in the code mentioned in the previous link, but nothing helped. It is as slow as before, and online reports don't even show that gzip is working.
Here is a report from GTmetrix that includes the PageSpeed and YSlow reports. The third tab, Timeline, shows that it took 11.46s in receiving.
The main problem is the server response time of 11s (Google PageSpeed report) or 11.46s (GTmetrix report).
Google suggests reducing it to under 200ms. How can I do that?
@Constantine responded in this link that many WordPress websites are going through the same slow phase.
I am using following plugins:
Akismet
Google Analyticator
Google XML Sitemaps
Jetpack by WordPress.com
Revision Control
SyntaxHighlighter Evolved
WordPress Gzip Compression
WordPress SEO
WP Edit
Every time I select Add New Plugin, the following error is reported:
An unexpected error occurred. Something may be wrong with
WordPress.org or this server’s configuration.
Also, whenever I install a plugin using the upload option, it gives me this error:
Can't load versions file.
http_request_failed
Please help me increase the speed of my blog and dashboard, and also suggest fixes for the errors I am receiving.
Edit
Automatically, without any changes, the 11.46s has been reduced to 1.26s.
I will focus on the speed issue. Generally, when things start to be slow, it is a good idea to test by gradually switching features off until it is fast again; the last thing you switched off before it became fast is the slow one. Then look at that thing in detail. Try to split the given task into subtasks and repeat, until you find the exact cause of the problem. I would do that with the plugins as well. After the testing is finished, I would put the features back.
Use an effective caching plugin like WP Super Cache; it drastically improves your pages' load time. Optimizing your images is also essential for your site's speed, and WP-SmushIt performs great for this. The last plugin I highly recommend is WP-Optimize, which basically cleans up your WordPress database and optimizes it without you having to run manual queries. It can throw errors when the same plugin has been installed more than once; in that case, delete the plugin via your FTP program instead of through the WordPress admin (otherwise it won't work properly because of the errors), and then install it again.
If you're going to maintain a site about programming then you really have to fix the performance. It really is awful.
The advice you get from automated tools isn't always that good.
Looking at the link you provided, the biggest problem is the HTML content generation from GET http://codesaviour.com/, which is taking 11.46 seconds (there are problems elsewhere, but that is by far the worst). 99% of the time the browser is just waiting; it only takes a fraction of a second to transfer the content across the network. WordPress is notorious for poor performance, often due to overloading pages with plugins. Your landing page should be fast and cacheable, and it fails on both counts.
even online reports doesn't show that gzip is even working
The HAR file you linked to says it is working. But compression isn't going to make much impact - it's only 8.4Kb uncompressed. The problem is with the content generation.
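If you want to double-check gzip yourself instead of relying on the online report tools, you can inspect the response headers from the browser console. A minimal sketch, assuming you run it on a page served from your own domain (so the headers are readable):
// The browser sends Accept-Encoding automatically; if the server is
// compressing responses, Content-Encoding should come back as "gzip".
fetch('/').then(function (response) {
  console.log('Content-Encoding:', response.headers.get('content-encoding'));
  console.log('Content-Type:', response.headers.get('content-type'));
});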
You should certainly use a WordPress server-side cache module (here's a good comparison).
DO NOT USE the WordPress Gzip plugin - do the compression on the webserver instead; it's much faster and more flexible.
In an ideal world you should be using ESI - but you really need control over the infrastructure to implement that properly.
Diagnosing performance problems is hard - fixing them is harder, and that is when you have full access to the system they're running on. I would recommend you set up a local installation of your stack and see how it performs there - hopefully you can reproduce the behaviour and will be able to isolate the cause - start by running a profiler such as XHProf and checking the MySQL query log (I'm guessing these aren't available from your hosting company). You will, however, be able to check the state of your opcode cache - there are free tools for both APC and Zend OPcache (ZOP+). Also check the health of your MySQL query cache.
Other things to try are to disable each of the plugins in turn and measure the impact (you can get waterfalls in Firefox using the Firebug extension, and in Chrome using the bundled developer tools).
You might also want to read up a bit on performance optimization - note that most books tend to focus on client-side problems, but your problems are on your server. You might even consider switching to a provider who specializes in WordPress, or using a different CMS.
symcbean's answer is good, but I would add a few things:
This is a server-side issue
This has been said by others, but I want to further emphasize that this is a server-side issue, so all those client-side speed-testing tools are going to be of very limited value.
HostGator isn't high-performance hosting
I don't know about India, but HostGator in the US is generally very slow for dynamic, database-driven sites (like yours). It absolutely shouldn't take 11 seconds to load the page, especially since your site doesn't look particularly complex, but unless you're serving a totally static site, HostGator probably won't ever give you really stellar performance.
Shared hosting leaves you at the mercy of badly-behaved "neighbors"
If you're using one of HostGator's standard shared hosting packages (I'm assuming you are), you could have another site on the same machine using too many resources and crippling the performance of your site. See if you can get HostGator to look into that.
Why not use a service built for this?
This looks like a totally standard blog, so a service like Tumblr or Wordpress.com (not .org) might be a better choice for your needs. Performance will be excellent and the cost should be very low, even with a custom domain name. If you aren't experienced in managing WordPress and don't have any interest in learning how (don't blame you), why not leave all that to the experts?
You need to make some adjustments to speed up WordPress.
The first step is: remove any unwanted plugins you have in WordPress.
The second step is: delete the themes you are not using.
The third step is: compress all images with lossless quality.
The fourth step is: clean up the database.
If you have done all these steps, your WordPress should be fixed. If you want more details, check out this link: How to fix WordPress dashboard slow.
Other than the usual suggestions: if you are hosting your MySQL DB on a different host from the web server, check the latency between the two. WordPress is unbelievably chatty with its DB (50+ DB calls to load each dashboard page, for example). By moving the DB onto the same host as the web server, I got excellent performance.

Sitecore with DMS vs caching server - how do you handle it?

We're planning to introduce DMS to our customer's Sitecore installation. It's a rather popular site in our country, and we have to use a caching proxy server (Nginx in this case) to make it high-traffic-proof.
However, as far as we know, it's not possible to use all the DMS features with a caching proxy enabled - for example, personalization of content: if it gets cached, it won't be personalized.
Is there a way to make use of all the DMS features with proxy cache turned on? If not, how do you handle this problem for high-traffic sites - is it buying more Content Delivery servers to carry the load, or extending current server with better hardware (RAM, CPU, bandwidth)?
You might try moving away from your proxy caching for some pages, or even all of them:
- There's no reason not to use a CDN for static assets and media library assets, so stick with that.
- Leverage Sitecore's built-in HTML cache for sublayouts/renderings - there are quite a few options for caching.
- Use Sitecore's Debug feature to track down the slowest components on your site.
- Consider using indexes instead of doing "fast" or Sitecore queries.
- Don't do descendant queries ("//*"). I often see this when calculating the selected state for navigation - hint: go the other way and calculate the ancestors of the current page instead.
@jammykam wrote an excellent answer on this over here.
John West wrote a great blog post on this also, though a bit older.
Good luck!
I've been wondering about this myself.
I have been thinking of implementing an AJAX web service (sketched roughly below) that:
- talks to the DMS and returns JSON
- allows you to render the personalised components client side
- allows you to trigger analytics events
I have been Googling around and I haven't found anyone who has done it and published the information yet. The only place I have found something similar is actually in the mobile SDK, but I haven't had a chance to delve into it yet.
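Just to illustrate the idea, here is a rough client-side sketch. The /api/personalized and /api/analytics/impression endpoints and the JSON shape are assumptions made up for the example, not real Sitecore or DMS APIs:
// Hypothetical endpoint names and payload shape - assumptions for the sketch.
(function () {
  var placeholder = document.getElementById('personalized-promo');
  if (!placeholder) { return; }

  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/api/personalized?placeholder=promo&page=' +
                  encodeURIComponent(window.location.pathname));
  xhr.onload = function () {
    if (xhr.status !== 200) { return; } // keep the cached default markup
    var data = JSON.parse(xhr.responseText);
    // Render the personalised component client side.
    placeholder.innerHTML = data.html;
    // Trigger an analytics event so the personalisation is still tracked.
    var beacon = new XMLHttpRequest();
    beacon.open('POST', '/api/analytics/impression');
    beacon.setRequestHeader('Content-Type', 'application/json');
    beacon.send(JSON.stringify({ variation: data.variationId }));
  };
  xhr.send();
})();
The cached page would then contain only the static markup plus this script, and the personalised fragment would be filled in after the fact.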
I have also not been able to use proxy server caching and DMS together successfully. For extremely high loads, I have recommended that clients follow the standard optimization and scaling guidelines, especially architecting for proper Sitecore sublayout and layout caching for as much of the site as possible. With that caching done, follow it up by distributing across multiple Content Delivery nodes with load balancing to help support high volume with personalization at the same time.
I've heard that other CMSes with personalization use a JavaScript approach to load the personalized content on the client side, but I would be worried about losing track of the analytics data that is gathered when personalized content is loaded and interacted with.

Is it possible to implement a Content Delivery Network (CDN) if I'm with a cluster web host?

I'm with a web host - a web farm or cluster, I guess you could say. I have a 47-page company website, and all speed tests suggest I use a CDN.
I've Googled and searched SE about this to no end, but I still don't understand how to implement a content delivery network. Are they suggesting I order a subdomain and put all my .css, .js, and image files in that subdomain? Or are they suggesting that instead of downloading jQuery 1.7, I just link to malsup's jQuery? But then what would I do for images and .css?
Just kinda confused here; any help in this regard would be truly appreciated!
Yes - you can implement a CDN with a cluster web host. In the vast majority of cases, if you can change your DNS settings you can implement a CDN. Another suggestion is to use a cookie-less domain. A content delivery network will optimize the delivery of all the files you mentioned. While I'm not sure of all the particulars of your specific setup and situation, it sounds like you could use front-end optimization and overall faster site delivery. Take a look at the following, which highlights EdgeCast's integration of Google's PageSpeed into our content delivery network and how the two help out sites like yours in tandem: http://www.edgecast.com/docs/ec-edgeopt-datasheet.pdf

Is it better to use Cache or CDN?

I was studying browser performance when loading static files, and this doubt came up.
Some people say that using CDN-hosted static files (i.e. Google Code, jQuery latest, AJAX CDN, ...) is better for performance, because the requests go to a different domain than the web page itself.
Another way to improve performance is to set the Expires header to some months in the future, forcing the browser to cache the static files and cutting down on requests.
I'm wondering which approach is best for performance, and whether I can combine both.
Ultimately it is better to employ both techniques if you are doing web performance optimization (WPO) of a site, also known as front-end optimization (FEO). They can work amazingly well hand in hand. Although if I had to pick one over the other, I'd definitely pick caching any day. In fact, I'd say it's imperative that you set up proper resource caching for all web projects, even if you are going to use a CDN.
Caching
Setting Expires headers and caching resources is a must and should be done 100% of the time. There really is no excuse for not doing caching. On Apache this is super easy to configure after enabling mod_expires.c and mod_headers.c. The HTML5 Boilerplate project has a good implementation example in its .htaccess file, and if your server is something else like nginx, lighttpd or IIS, check out these other server configs.
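Once that's configured, it's worth confirming the headers are actually being sent. A quick sketch you can run from the browser console, assuming you swap /css/main.css for a real static asset path on your own site:
// Inspect the caching headers on one of your static assets.
fetch('/css/main.css').then(function (response) {
  console.log('Cache-Control:', response.headers.get('cache-control'));
  console.log('Expires:', response.headers.get('expires'));
  console.log('ETag:', response.headers.get('etag'));
});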
Here's a good read if anyone is interested in learning about caching: Mark Nottingham's Caching Tutorial
Content Delivery Network
You mentioned Google Code, jQuery latest and AJAX CDN, and I want to touch on CDNs in general, including the ones you pay for and host your own resources on; the same applies if you are simply using the jQuery hosted-files CDN or loading something from http://cdnjs.com/, for example.
I would say a CDN is less important than setting server-side caching headers, but a CDN can still provide significant performance gains; your content delivery network performance will vary depending on the provider.
This is especially true if your traffic is a worldwide audience and the CDN provider has many worldwide edge/peer locations. It will also reduce your web hosting bandwidth significantly and CPU usage (a bit), since you're offloading some of the work of delivering resources to the CDN.
A CDN can, in some rarer cases, have a negative impact on performance if the latency of the CDN ends up being higher than that of your server. Also, if you over-optimize and employ too much parallelization of resources (using multiple subdomains like cdn1, cdn2, cdn3, etc.), it is possible to end up slowing down the user experience and causing overhead with extra DNS lookups. A good balance is needed here.
One other negative impact is when the CDN is down. It has happened, and will happen again, and it is more likely with free CDNs. If the CDN goes down for whatever reason, so does your site; it is yet another potential single point of failure (SPOF). For JavaScript resources you can get clever: load the resource from the CDN and, should it fail for whatever reason, detect that and load a local copy. Here's an example of loading jQuery from ajax.googleapis.com with a fallback (taken from the HTML5 Boilerplate):
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.8.2/jquery.min.js"></script>
<script>window.jQuery || document.write('<script src="js/vendor/jquery-1.8.2.min.js"><\/script>')</script>
Besides the obvious free resources out there (jQuery, Google APIs, etc.), if you're using a CDN you may have to pay a usage fee, so it is going to add to your hosting costs. For some CDNs you even have to pay extra to get access to certain locations; for example, Asian nodes might cost more than North American ones.
For public applications, go for CDN.
Caching helps for repeated requests, but not for the first request.
To ensure a fast load on the first page visit, use a CDN; chances are pretty good that the file has already been cached by the browser from another site.
As others have already mentioned, CDN results are of course heavily cached too.
However if you have an intranet website you might want to host the files yourself as they typically load faster from an internal source than from a CDN.
You then also have the option to combine several files into one to reduce the number of requests.
A CDN has the benefit of providing multiple servers and automatically routing your traffic to the closest location to your client. This can result in faster delivery, optimized by location.
Also, static content doesn't require special application servers (like dynamic content does), so being able to offload it to a CDN means you remove that traffic entirely. A streaming video clip may be too big to cache, or should not be cached, but you don't necessarily want to support that bandwidth yourself; a CDN will take on that traffic for you.
It is not always about the cache. A small application web server may just want to serve the dynamic content but needs a solution for the heavy-hitting media that rarely changes. CDNs handle the scaling issue for you.
Agree with @Anthony_Hatzopoulos (+1).
CDN complements Caching; also in some cases, it will help optimize Caching directives.
For example, a company I work for has integrated behavior-learning algorithms into its CDN, to identify and dynamically cache generated objects.
Ordinarily, these objects would be uncacheable (i.e. served with a Cache-Control: max-age=0 HTTP header). But in this case, the system is able to identify caching possibilities and override the original HTTP header directives (for example: a dynamically generated popular product page that should be cached, or a popular search-result page that, while generated dynamically, is still presented time after time in the same form to thousands of users).
And yes, before you ask, the system can also identify personalized data and verify freshness, to prevent false positives... :)
Implementing such an algorithm was only possible due to a reverse-proxy CDN technology. This is an example of how CDN and Caching can complement each other, to create better and smarter acceleration solutions.
The expert explanations above are perfect for understanding CDN technology and caching.
I would just add my personal experience. I worked on a Joomla VirtueMart site that unfortunately could not be updated to newer Joomla and VirtueMart versions because there were too many customised fields in the product pages. Once visitors reached around 900/day, lots of users could not put items in their basket, because each order action made lots of JS and AJAX calls that took too much time.
After optimising the site we decided to use a CDN, and performance got really good. According to the GTmetrix records, the initial YSlow score was 50%, and after optimisation + CDN it went up to 74%:
https://gtmetrix.com/reports/www.florihana.com/jWlY35im
From the CDN dashboard you can also see which data center costs the most and where the most data is transferred, which helps you target your marketing improvements.
That said, when configuring a CDN you have to be careful with purge times and with balancing the number of CDN resources, because if something goes down you need to figure out which CDN resource caused the problem.
Hope this helps.

How can we run performance testing manually for any webpage?

I am not able to find out anywhere how we can do performance testing manually.
Please help me out with this query.
Thanks!
Maybe you are looking for JMeter or a similar tool.
What browser? Most current browsers support the W3C Navigation Timing spec and expose performance data directly on the DOM. You can access it from the console, from JavaScript on your pages, or from browser extensions that display the information.
If you want more detail, like a resource load waterfall, you can usually access that directly from the dev tools provided by the various browsers.
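For example, run from the console after the page has finished loading (a minimal sketch using the Navigation Timing and Resource Timing APIs; exact support varies by browser):
// Navigation Timing: rough server response time (time to first byte)
// and total page load time, in milliseconds.
var t = performance.timing;
console.log('TTFB:', t.responseStart - t.requestStart, 'ms');
console.log('Page load:', t.loadEventEnd - t.navigationStart, 'ms');

// Resource Timing: a crude waterfall of every resource the page loaded.
performance.getEntriesByType('resource').forEach(function (r) {
  console.log(Math.round(r.startTime) + 'ms', '+' + Math.round(r.duration) + 'ms', r.name);
});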
One thing you will want to be really careful of is to make sure you do your testing in a configuration that is similar to the users. If you are running a server locally and testing from a browser on the same machine or even the same network then your performance data will be pretty worthless (unless it's an intranet app).
You can perform manual (performance) testing for any webpage by optimizing your CSS, JavaScript and image sizes.
I think JMeter is the best tool for this kind of webpage testing; if you want to add some scripting, you can do that too.
You can also check the YSlow add-on for Firefox. This add-on gives you filtered data to help optimize your page performance.
There are also some online tools available.
How can we run performance testing manually for any webpage?
You can simply use the GTmetrix tool, which will report on your site's overall performance in detail.
The best way to do performance testing without any tool is to define a standard loading time for each page based on experience, or to ask the client to provide an ideal time for each page, against which the actual loading time can be verified. But in the case of multiple simultaneous users, JMeter is the best hands-on approach available. It's open source, easy to understand, and you get reports too.
Of course, there are multiple factors that can hinder performance:
- Your network speed
- The speed of the server on which your application is hosted
- The number of simultaneous users
- Heavy images in pages
- Last but not least, unnecessary links and code - in short, memory consumption in the code, such as loops that aren't required. All gifts from the developer teams!
