Serve static assets with an efficient cache policy - error in PHP website - caching

How do I serve static assets with an efficient cache policy and minimize main-thread work?

Serve static assets with an efficient cache policy - this is a suggestion that assets should have a cache lifetime of at least one month set on them (usually done via .htaccess).
It looks like you have already done this for everything you can control as the items listed are external assets you cannot set the cache policy on.
If you have done it for all your own resources (from your domain) then do not worry.
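For reference, a minimal sketch of such a policy, assuming Apache with mod_expires enabled (adjust the types and lifetimes to your site):
<IfModule mod_expires.c>
  # One-month browser cache for common static asset types
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>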
minimise main-thread work - this means that your site is using a LOT of JavaScript or performing a lot of calculations on page load.
The only way to improve this is to remove unnecessary JS and optimise anything that is remaining.
My guess is you are using lots of plugins / libraries to do simple things on the site that could more easily be achieved through other methods.
Post the URL of your site and I will improve this answer with more relevant suggestions that may help you and others.
Workaround for efficient caching
One way you could fix this issue (but you need to know what you are doing) is to download the script in question to your server every night via a cron job and serve it from your server instead.
That way you can set the cache time. However, you need to make sure you do cache busting on the script each time you download a different version (by comparing the previous file and the new file and checking for changes) so you don't break functionality.
As you can imagine, this technique is only used in extreme circumstances where you can justify the need to control the cache policy due to the massively increased complexity and potential for problems.
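As a rough sketch of the idea (the URL, paths and naming scheme here are hypothetical, and allow_url_fopen is assumed to be enabled):
<?php
// fetch-widget.php - run nightly from cron, e.g.
//   0 3 * * * php /var/www/tools/fetch-widget.php
$remoteUrl = 'https://thirdparty.example.com/widget.js'; // hypothetical
$localDir  = '/var/www/html/assets/';
$pointer   = $localDir . 'widget.current.txt'; // pages read this to build the script URL

$new = file_get_contents($remoteUrl);
if ($new === false) {
    exit(1); // download failed: keep serving the previous copy
}

$oldName = is_file($pointer) ? trim(file_get_contents($pointer)) : null;
$old = ($oldName && is_file($localDir . $oldName)) ? file_get_contents($localDir . $oldName) : null;

// Publish under a new cache-busted filename only when the content changed,
// so long-lived cached copies of the old file can't break functionality.
if ($old === null || md5($old) !== md5($new)) {
    $name = 'widget.' . substr(md5($new), 0, 8) . '.js';
    file_put_contents($localDir . $name, $new);
    file_put_contents($pointer, $name);
}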

Related

Do websites share cached files?

We're currently doing optimizations to our web project when our lead told us to push the use of CDNs for external libraries as opposed to including them into a compile+compress process and shipping them off a cache-enabled nginx setup.
His assumption is that if a user visits example.com, which uses a CDN'ed version of jQuery, the jQuery is cached at that point. If the user then happens to visit example2.com, which happens to use the same CDN'ed jQuery, the jQuery will be loaded from cache instead of over the network.
So my question is: Do domains actually share their cache?
I argued that even if it is possible the browser does share cache, the problem is that we are running on the assumption that the previous sites use the same exact CDN'ed file from the same exact CDN. What are the chances of running into a user browsing through a site using the same CDN'ed file? He said to use the largest CDN to increase chances.
So the follow-up question would be: If the browser does share cache, is it worth the hassle to optimize based on his assumption?
I have looked up topics about CDNs and I have found nothing about this "shared domain cache" or CDNs being used this way.
Well, your lead is right: this is basic HTTP.
All you are doing is indicating to the client where it can find the file.
The client then handles sending a request to the CDN in compliance with their caching rules.
But you shouldn't over-use CDNs for libraries either; keep in mind that if you need a specific version of a library, especially an older one, you're unlikely to get many cache hits, because of version fragmentation.
For widely used, heavy libraries like jQuery, using the latest version is recommended.
If you can take them all from the same CDN, all the better (i.e. Google's), especially as HTTP/2 is coming.
Additionally they save you bandwidth, which can amount to a lot when you have high loads of traffic, and can reduce the load time for users far from your server (Google's is great for this).

My WordPress website and dashboard are both too slow, server responded in 11 sec

The domain of my blog is codesaviour.
Since last month my blog and wp-admin dashboard have slowed down to a frustrating level. I have already removed post revisions after reading up on speeding up WordPress.
Here is the Google PageSpeed Insights report for my blog. According to it, the server response time is 11s.
I even read the following threads on Stack Overflow:
link. I tried to implement the steps, but the blog is still slow, no change.
My host is Hostgator.in. Their online assistance asked me to enable gzip compression as instructed at link, so I followed the instructions; as I did not have a .htaccess file on the server, I created one and pasted in the code from that link, but nothing helped. It is slow like before, and even online reports don't show that gzip is working.
Here is a report from GTmetrix that includes the PageSpeed and YSlow reports. The third tab, Timeline, shows that it took 11.46s in receiving.
The main problem is the server response time: 11s (Google PageSpeed report) or 11.46s (GTmetrix report).
Google suggests reducing it to under 200ms. How can I do that?
Constantine responded in this link that many WordPress websites are going through the same slow phase.
I am using following plugins:
Akismet
Google Analyticator
Google XML Sitemaps
Jetpack by WordPress.com
Revision Control
SyntaxHighlighter Evolved
WordPress Gzip Compression
WordPress SEO
WP Edit
Every time I select Add New plugin, the following error is reported:
An unexpected error occurred. Something may be wrong with
WordPress.org or this server’s configuration.
Also, whenever I install a plugin using the upload option, it gives me the error:
Can't load versions file.
http_request_failed
Please help me increase the speed of my blog and dashboard, and also suggest fixes for the errors I am receiving.
Edit
Without any changes on my part, the 11.46s has been reduced to 1.26s.
I will focus on the speed issue. Generally, when things start to be slow, it is a good idea to test by gradually switching off features until the site is fast. The last thing you switched off before it became fast is the slow one; look at that thing in detail. Try to split the given task into subtasks and repeat, until you find the exact cause of the problem. I would do that with the plugins as well. After the testing is finished, put the features back.
Use an effective caching plugin like WP Super Cache; it drastically improves your page's load time. Optimizing your images is also essential for your site's speed; WP-SmushIt performs great for this. The last plugin I highly recommend is WP-Optimize, which basically cleans up your WordPress database and optimizes it without manual queries. It sometimes gives errors when you install the same plugin more than once: delete the plugin with your FTP program instead of through the WordPress admin, otherwise it won't work properly due to errors, then install the same plugin again.
If you're going to maintain a site about programming then you really have to fix the performance. It really is awful.
The advice you get from automated tools isn't always that good.
Looking at the link you provided, the biggest problem is the HTML content generation for GET http://codesaviour.com/, which is taking 11.46 seconds (there are problems elsewhere, but that is by far the worst). 99% of that time the browser is just waiting; it only takes a fraction of a second to transfer the content across the network. WordPress is notorious for poor performance, often due to overloading pages with plugins. Your landing page should be fast and cacheable, and it currently fails on both counts.
even online reports don't show that gzip is working
The HAR file you linked to says it is working. But compression isn't going to make much impact - it's only 8.4Kb uncompressed. The problem is with the content generation.
You should certainly use a Wordpress serverside cache module (here's a good comparison).
DO NOT USE the Wordpress Gzip plugin - do the compression on the webserver - it's much faster and more flexible.
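For example, a minimal server-side sketch, assuming Apache with mod_deflate enabled:
<IfModule mod_deflate.c>
  # Compress text output on the webserver instead of in PHP
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>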
In an ideal world you should be using ESI (Edge Side Includes) - but you really need control over the infrastructure to implement that properly.
Diagnosing performance problems is hard; fixing them is harder, and that is when you have full access to the system the site is running on. I would recommend you set up a local installation of your stack and see how it performs there. Hopefully you can reproduce the behaviour and isolate the cause: start by running a profiler such as XHProf and checking the MySQL query log (I'm guessing these aren't available from your hosting company). You will, however, be able to check the state of your opcode cache; there are free tools for both APC and ZOP+ (Zend OPcache). Also check the health of your MySQL query cache.
Other things to try: disable each of the plugins in turn and measure the impact (you can get waterfalls in Firefox using the Firebug extension, and in Chrome using the bundled developer tools).
You might also want to read up a bit on performance optimization - note that most books tend to focus on the problems client-side but your problems are on your server. You might even consider switching to a provider who specializes in Wordpress or use a different CMS.
symcbean's answer is good, but I would add a few things:
This is a server-side issue
This has been said by others, but I want to further emphasize that this is a server-side issue, so all those client-side speed testing tools are going to be of very limited value.
HostGator isn't high-performance hosting
I don't know about India, but HostGator in the US is generally very slow for dynamic, database-driven sites (like yours). It absolutely shouldn't take 11 seconds to load the page, especially since your site doesn't look particularly complex, but unless you're serving a totally static site, HostGator probably won't ever give you really stellar performance.
Shared hosting leaves you at the mercy of badly-behaved "neighbors"
If you're using one of HostGator's standard shared hosting packages (I'm assuming you are), you could have another site on the same machine using too many resources and crippling the performance of your site. See if you can get HostGator to look into that.
Why not use a service built for this?
This looks like a totally standard blog, so a service like Tumblr or Wordpress.com (not .org) might be a better choice for your needs. Performance will be excellent and the cost should be very low, even with a custom domain name. If you aren't experienced in managing WordPress and don't have any interest in learning how (don't blame you), why not leave all that to the experts?
You need to make some adjustments to speed up WordPress.
The first step is: clean out unwanted plugins from WordPress.
The second step is: delete the themes you are not using.
The third step is: compress all images with lossless quality.
The fourth step is: clean up the database.
If you have done all these steps, your WordPress should be fixed. For more details, check out this link: How to fix WordPress dashboard slow.
Other than the usual suggestions: if you are hosting your MySQL db on a different host from the web server, check the latency between the two. WordPress is unbelievably chatty with its db (50+ db calls to load each dashboard page, for example). By moving the db onto the same host as the web server, I got excellent performance.

Product list page of Magento incredibly slow

I am new to Magento and have developed a website using CE 1.7.0.2. Now it's ready to go live, but I have issues with slow page loads.
My website's home, product list, and product detail pages initially take 10-13 seconds to load, but after that first load they take only 1-2 seconds.
I have also installed APC, Memcache and a CDN on the server, and a full page cache extension for the website, but it is still slow. I am so frustrated; why is this happening to my website?
If anybody knows how to resolve these speed issues, that would be helpful for me.
Thanks!
You can go through the below steps for Magento optimization:
Use a high-performance dedicated server, e.g. the Amazon EC2 cloud
Swap Apache for NGINX
Minimize JavaScript use
Minify and compress CSS files
‘Combine CSS’ to reduce the number of HTTP requests made by the browser
Optimize images
Use lazyload for images
Specify image dimensions
Combine images into CSS sprites
Use a Content Delivery Network (CDN) to deliver static files like JS, CSS and images and offload your server
Disable modules/extensions which are not required
Enable all Magento caches
Use a full page cache / Varnish Cache / Memcache / Redis cache
Don’t use layered navigation if you don’t really need it; it needs a lot of resources
Enable compilation
Limit the number of products on a product overview page
Set only those attribute frontend properties to ‘Yes’ that you’re actually going to use; set all others to ‘No’
Disable ‘Use in Quick Search’, ‘Use in Advanced Search’, ‘Comparable’, etc. where not needed: Catalog -> Attributes -> Manage Attributes -> Frontend Properties
Install the Google PageSpeed module
Minimize redirects – minimizing HTTP redirects from one URL to another cuts out wait time for users
Prefer asynchronous resources – fetching resources asynchronously prevents them from blocking the page load
This list may help you improve Magento performance:
Enable Magento caching
This is of course the first step in optimization: enable all the available caches in the Magento Admin Panel.
Compress images
Many people forget that images (PNG, JPG) can be compressed, which lowers the bandwidth between the browser and the webserver. Not only the images used by the Magento skin need optimizing, but catalog images as well. Various tools allow you to compress batches of images, for instance the online tool Smush.It.
Disable unneeded Magento modules
By disabling Magento modules that you do not need, less resources are needed – as simple as that. Modules could be disabled through the configuration in the Magento Admin Panel, or by editing XML-files in app/etc/modules. For instance, disable Mage_Log which performs queries on every request, but is not needed if you gather site statistics using external programs.
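As a sketch, one way to do the latter is a small override file (Magento 1 merges all XML files in app/etc/modules, so this wins over the core declaration; the filename is just a convention):
<!-- app/etc/modules/Disable_Mage_Log.xml -->
<config>
    <modules>
        <Mage_Log>
            <active>false</active>
            <codePool>core</codePool>
        </Mage_Log>
    </modules>
</config>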
Enable flat catalogs for smaller webshops
For smaller webshops switching from the complex EAV-structure to a flat catalog could save time. This optimization is dubious and depends on many parameters, so do not take this step lightly.
W3C compliance
While it could be argued that this is less important with the coming of HTML5, it is still a fact that if your webpages are filled with markup errors, the browser has a harder time interpreting them. If you stick to W3C compliance, the browser engine has an easy job parsing your HTML code.
Compress output in general
By enabling the PHP setting zlib.output_compression, the output generated by PHP is compressed when sent to the browser. This saves bandwidth. Instead of this, you could also use the Apache mod_deflate module, which additionally allows compression of non-PHP output (CSS, JavaScript, other plain-text files).
Configure PHP options
Most PHP settings actually do not influence the performance of Magento; they just set certain limits. For instance, settings like memory_limit and max_execution_time do not speed up page loads, but just make sure that certain actions do not time out or run into memory problems.
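Combining the two items above, a php.ini sketch might look like this (the values are illustrative, not recommendations):
; Compress PHP output before sending it to the browser
zlib.output_compression = On
; Limits: these prevent timeouts and memory errors, they do not add speed
memory_limit = 256M
max_execution_time = 60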
Session storage
With Magento, sessions can be stored in files or in the database (by configuring app/etc/local.xml). Which option performs best really depends on how the hosting environment is set up. If MySQL databases perform well, session storage in the database could benefit your site. But if MySQL is not set up correctly, the best choice might be files.
Use a PHP accelerator
Opcode caching speeds up PHP execution. There are various PHP accelerators doing this job (APC, ZendOptimizer+, eAccelerator, XCache). Both APC and ZendOptimizer+ work flawlessly with Magento.
Tune PHP realpath_cache
By tuning the PHP realpath_cache_size to, for instance, 128K (the default is 16K) and the realpath_cache_ttl to 86400, things might speed up. Make sure you don't run out of memory, because every Apache child will consume the configured cache size.
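In php.ini terms, that is (same illustrative values as above):
; Cached per Apache child - keep an eye on total memory usage
realpath_cache_size = 128K
realpath_cache_ttl = 86400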
Use Apache mod_expires
By telling the browser which files to keep in cache for how long, you can optimize the browser cache. For instance, JavaScript files tend to change much less often than CSS files (at least in the first stages of a site), but perhaps after the site has been running smoothly for some months you can maximize the expiration date.
Beware 404 errors
Every time a file (like a stylesheet or image) is not found, a 404 error is generated. Because Magento's default 404 page is served by the Magento application itself, every 404 encountered causes the full Magento application to start. Check your Apache logs to make sure all 404 errors are solved.
Disable Magento logging
Within the Magento configuration, you can enable logging under the Developer tab. Depending on the Magento modules, this could lead to more logs needing to be written to the filesystem, slowing down your shop. Because logging is only needed for debugging, it's best to disable it altogether in daily usage.
MySQL table optimization
Through phpMyAdmin, you can perform the command OPTIMIZE TABLE on specific Magento database tables. When a table has become cluttered, this could improve performance. This applies not only to the complex EAV tables, but also to regular MySQL tables that are used frequently (for instance, core_config_data).
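For example, from phpMyAdmin's SQL tab (the table name is just one candidate):
-- Defragment a frequently used table; repeat for other cluttered tables
OPTIMIZE TABLE core_config_data;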
Merge CSS and JavaScript
By merging all CSS and JavaScript files into one big file, only a single HTTP request is needed by the browser to fetch that content. This saves bandwidth. For this merging, the FooMan Speedster module can be used. Magento 1.4 contains an option to merge CSS, while JavaScript files are merged by default.
Besides merging, crunching is also an option offered by FooMan Speedster: it removes whitespace from the output, but when compression is already applied to CSS this option is less needed.
Use Magento Compiler module
The Magento Compiler module limits the number of directories that PHP has to search through when looking for PHP-files. This decreases the PHP execution-time, which speeds up the Magento application in general.
Be warned that you need to be careful when making changes to your Magento system while the Magento Compiler is enabled. Upgrades should only be undertaken when the compiler is (temporarily) disabled.
One very neat trick that speeds things up tremendously is to create a tmpfs mount specifically for the includes/src folder. Note that this tmpfs mount needs to be at least 100Mb, preferably 200Mb.
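A sketch of such a mount as an /etc/fstab entry (the path and size follow the note above; tmpfs contents are lost on reboot, so re-run the compiler afterwards):
tmpfs /var/www/magento/includes/src tmpfs size=200M,mode=0775 0 0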
MySQL server tuning
The default MySQL setup is often sufficient for a general hosting environment, but it is not at all optimized for Magento. Tuning settings like query_cache_size could dramatically increase performance, but is also dangerous because the right values depend hugely on other variables (number of databases, number of tables per database, number of queries, peak usage).
Serve static content through a CDN
Static content like images, CSS-stylesheets or JavaScript-files, could be served through other servers that are more optimized for static content. For instance, a CDN could be used so that the static content is always served from a location that is closest to the visitor. This is vital for webshops serving customers worldwide.
Disable local Magento modules
If your site does not need local Magento modules, you could choose to skip the search for local modules altogether. Within the app/etc/local.xml file, you will find an XML tag allowing you to do so.
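That tag is disable_local_modules; a sketch of the relevant fragment of app/etc/local.xml:
<config>
    <global>
        <disable_local_modules>true</disable_local_modules>
    </global>
</config>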
Be careful with HTTPS
Every time you use SSL between the webserver and the browser, the work of encrypting and decrypting is added on both sides, and there is a slight overhead in bandwidth. The Magento site runs slightly faster if you disable SSL for all or just a few pages. However, this “win” is so small compared to the other wins on this page that it should be handled with caution. The gained bandwidth is non-vital, and almost all computers nowadays have enough CPU power to do the encryption/decryption in microseconds.
Magento in the cloud
While CDNs could be used to optimize the bandwidth for static content, the Magento application could also be optimized in the same way by using cloud computing.
Memory-based filesystem for dynamic data
By storing dynamic data (var/cache, var/session) on a memory-based filesystem like RAMdisk or tmpfs, the disk I/O is decreased.
Disable Apache htaccess-files
When the usage of htaccess files is allowed, Apache needs to inspect every directory in its path to see if an htaccess file is present. By moving the Apache configuration directives from the htaccess file to the VirtualHost configuration file, and disabling htaccess files altogether, the Apache execution time is decreased. This tip probably applies mostly to dedicated servers.
Use Nginx or Litespeed
While the Apache webserver is very flexible in its configurations, there are other webservers that are better optimized regarding memory usage: By replacing Apache with either Nginx or Litespeed, you could speed up the Magento scripts even more. Both webservers require manual configuration to allow for SEF URLs.
Use lazyload for images
When a page is loading, a visitor is often waiting for images on that page to load. Depending on the number and size of these images, this can take some time. Instead of loading images at once when the page is loaded, you can also add the LazyLoad JavaScript effect that makes sure only visible images (within the browser screen) are loaded, while remaining images are only loaded once the visitor scrolls down.
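A minimal sketch using the classic jQuery Lazy Load plugin (the script path and image attributes are examples; the plugin reads the real URL from data-original):
<img class="lazy" data-original="images/product-1.jpg" width="300" height="300">
<script src="js/jquery.lazyload.min.js"></script>
<script>
// Swap data-original into src as each image scrolls into view
$("img.lazy").lazyload();
</script>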
Minimize Apache logging
If Apache logging is minimized, fewer file operations are needed for every incoming request. Of course, less logging also means less insight when something goes wrong. An alternative is to optimize the filesystem on which the Apache logs are stored. By default, Apache logs to the /var filesystem, but there's no need to enable things like journalling for that filesystem.
ref: http://magentotutorialbeginners.blogspot.in/2014/05/magento-performance-improvement.html
If you have done all kinds of server and caching optimization, go down to the code level.
1) Check whether you are loading a collection within a foreach loop.
2) Try to optimize the code.
3) If you are loading a collection, filter the collection for the required attributes only (see the sketch after this list).
4) Check your product images. Use PNG images and try to keep image sizes under 500 KB.
5) Try commenting out the custom functionality you provide on that page, then check with a tool like GTmetrix how much page load time you gain; this helps you find the code that takes a long time to run.
6) Keep only the attributes needed for filtering in layered navigation.
7) Try disabling unnecessary modules.
8) Try again after enabling compilation.
Hope these suggestions work for you.
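To illustrate points 1 and 3, here is a sketch of the pattern to avoid and its fix (attribute names are examples):
// Slow: one load() - and one set of queries - per product inside the loop
foreach ($productIds as $id) {
    $product = Mage::getModel('catalog/product')->load($id);
    echo $product->getName();
}

// Better: a single collection query, selecting only the needed attributes
$collection = Mage::getModel('catalog/product')->getCollection()
    ->addAttributeToSelect(array('name', 'price'))
    ->addAttributeToFilter('entity_id', array('in' => $productIds));
foreach ($collection as $product) {
    echo $product->getName();
}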

Is it better to use Cache or CDN?

I was studying browser performance when loading static files, and this question came up.
Some people say that using CDN-hosted static files (i.e. Google Code, jQuery latest, AJAX CDN, ...) is better for performance, because those requests go to a different domain than the web page itself.
Another way to improve performance is to set the Expires header to some months in the future, forcing the browser to cache the static files and cutting down on requests.
I'm wondering which approach is best for performance, and whether I can combine both.
Ultimately it is better to employ both techniques if you are doing web performance optimization (WPO) of a site, also known as front-end optimization (FEO). They can work amazingly hand in hand. Although if I had to pick one over the other I'd definitely pick caching any day. In fact I'd say it's imperative that you setup proper resource caching for all web projects even if you are going to use a CDN.
Caching
Setting Expires headers and caching of resources is a must and should be done 100% of the time for your resources. There really is no excuse for not doing caching. On Apache this is super easy to configure after enabling mod_expires.c and mod_headers.c. The HTML5 Boilerplate project has a good implementation example in its .htaccess file, and if your server is something else like nginx, lighttpd or IIS, check out these other server configs.
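A condensed sketch of that kind of .htaccess configuration (assuming mod_expires and mod_headers are enabled; the lifetimes are examples):
<IfModule mod_expires.c>
  ExpiresActive On
  # HTML should revalidate; long-lived static assets can cache for longer
  ExpiresByType text/html "access plus 0 seconds"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 year"
</IfModule>
<IfModule mod_headers.c>
  <FilesMatch "\.(css|js|png|jpg|gif)$">
    Header set Cache-Control "public"
  </FilesMatch>
</IfModule>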
Here's a good read if anyone is interested in learning about caching: Mark Nottingham's Caching Tutorial
Content Delivery Network
You mentioned Google Code, jQuery latest and AJAX CDN, and I want to touch on CDNs in general, including those you pay for and host your own resources on. The same applies if you are simply using the jQuery hosted-files CDN or loading something from http://cdnjs.com/, for example.
I would say a CDN is less important than setting server-side caching headers, but a CDN can provide significant performance gains. Your results will vary depending on the provider.
This is especially true if your traffic is a worldwide audience and the CDN provider has many worldwide edge/peer locations. It will also reduce your web hosting bandwidth significantly, and CPU usage a bit, since you're offloading some of the work of delivering resources to the CDN.
A CDN can, in some rarer cases, have a negative impact on performance, if the latency of the CDN ends up being slower than your server. Also, if you over-optimize and employ too much parallelization of resources (using multiple subdomains like cdn1, cdn2, cdn3, etc.), it is possible to slow down the user experience and cause overhead with extra DNS lookups. A good balance is needed here.
One other negative impact is when the CDN is down. It has happened, and will happen again; this is especially true of free CDNs. If the CDN goes down for whatever reason, so does your site. It is yet another potential single point of failure (SPOF). For JavaScript resources you can get clever: load the resource from the CDN and, should it fail for whatever reason, detect that and load a local copy. Here's an example of loading jQuery from ajax.googleapis.com with a fallback (taken from the HTML5 Boilerplate):
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.8.2/jquery.min.js"></script>
<script>window.jQuery || document.write('<script src="js/vendor/jquery-1.8.2.min.js"><\/script>')</script>
Besides the obvious free API resources out there (jQuery, Google APIs, etc.), if you're using a CDN you may have to pay a usage fee, which adds to hosting costs. And with some CDNs you even pay extra for access to certain locations; for example, Asian nodes might cost more than North American ones.
For public applications, go for CDN.
Caching helps for repeated requests, but not for the first request.
To ensure a fast load on the first page visit, use a CDN; chances are pretty good that the file has already been cached by another site.
As others have mentioned, CDN results are of course heavily cached too.
However if you have an intranet website you might want to host the files yourself as they typically load faster from an internal source than from a CDN.
You then also have the option to combine several files into one to reduce the number of requests.
A CDN has the benefit of providing multiple servers and automatically routing your traffic to the closest location to your client. This can result in faster delivery, optimized by location.
Also, static content doesn't require special application servers (like dynamic content does), so offloading it to a CDN removes that traffic from your servers entirely. A streaming video clip may be too big to cache, or should not be cached, but you don't necessarily want to carry that bandwidth yourself. A CDN will take on that traffic for you.
It is not always about the cache. A small application web server may just want to provide the dynamic content but needs a solution for the heavy hitting media that rarely changes. CDNs handle the scaling issue for you.
Agree with #Anthony_Hatzopoulos (+1)
CDN complements Caching; also in some cases, it will help optimize Caching directives.
For example, a company I work for has integrated behavior-learning algorithms into its CDN, to identify and dynamically cache generated objects.
Ordinarily, these objects would be uncacheable (i.e. a Cache-Control: max-age=0 HTTP header). But in this case, the system is able to identify caching opportunities and override the original HTTP header directives. (For example: a dynamically generated popular product that should be cached, or a popular search result page that, while generated dynamically, is still presented time after time in the same form to thousands of users.)
And yes, before you ask, the system can also identify personalized data and verify freshness, to prevent false positives... :)
Implementing such an algorithm was only possible due to a reverse-proxy CDN technology. This is an example of how CDN and Caching can complement each other, to create better and smarter acceleration solutions.
The expert answers above explain both CDN technology and caching perfectly.
I'll just add my personal experience. I worked on a Joomla VirtueMart site that unfortunately could not be updated to the newest Joomla and VirtueMart versions, because the product pages had too many customised fields. Once traffic reached 900 visitors/day, many users could not put items in their baskets, because ordering items triggered lots of JS and AJAX calls and took too much time.
After optimising the site we decided to use a CDN, and performance really improved. Going by the GTmetrix record, the first YSlow score was 50%; after optimisation + CDN it went to 74%:
https://gtmetrix.com/reports/www.florihana.com/jWlY35im
From the CDN dashboard you can also see which datacenter costs the most and serves the most data, which helps you target your marketing.
But yes, a CDN has to be configured carefully: watch the purge time and balance the number of CDN resources, because if the CDN has a problem you need to be able to figure out which CDN resource caused it.
Hope this helps.

How do I update an expensive in-memory cache across a SharePoint farm?

We have 3 front-end servers, each running multiple web applications. Each web application has an in-memory cache.
Recreating the cache is very expensive (>1 min). Therefore we repopulate it using a web service call to each web application on each front-end server every 5 minutes.
The main problem with this setup is maintaining the target list for updating and the cost of creating the cache several times every few minutes.
We are considering using AppFabric or something similar but I am unsure how time consuming it is to get up and running. Also we really need the easiest solution.
How would you update an expensive in-memory cache across multiple front-end servers?
The problem with memory caching is that it's unique to the server. I'm going with the idea that this is why you want to use AppFabric. I'm also assuming that you're re-creating the cache every few minutes to keep the in memory caches in sync across all servers. With all this work, I can well appreciate that caching is expensive for you.
It sounds like you're doing a lot of work that probably isn't necessary. This article has some detail about the caching mechanisms available within SharePoint. You may be interested in the output cache discussed near the top of the article. You may also want to read the linked TechNet article and the linked article called "Custom Caching Overview".
The only SharePoint-native way to do that is to use the Service Application infrastructure. The only problem is that it takes some time to understand how it works, and it's too complicated to build from scratch. You might consider downloading one of the existing sample applications and renaming the classes/GUIDs to match your naming conventions. I used this one: http://www.parago.de/2011/09/paragoservices-a-sharepoint-2010-service-application-sample/. In this case you can have a single cache shared by N front-end servers.
