I have set up a WordPress WooCommerce storefront. I want to set up downloadable products which will be downloaded via the XSendFile module.
However, my download files are quite big (approx. 50 MB each), so I am planning to set up Cloudflare to cache the download files so I don't exceed the bandwidth limit from my hosting service.
My question is, will Cloudflare cache files that are linked through Apache's XSendFile module?
Sorry if this is a basic question. I'm just trying to figure out whether this setup will work or whether I will need to find an alternative solution.
NOTE: I forgot to add that the download files are PDF files.
It really depends on whether we are proxying the DNS record the files are served from (www, for example). It is also important to note that we wouldn't cache third-party resources at all, if the download is structured in some way that it isn't served directly from your domain.
I would also recommend reviewing what CloudFlare caches by default.
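For context, mod_xsendfile is handled entirely at the origin: the application emits an X-Sendfile header, Apache strips it and streams the file itself, and Cloudflare only ever sees an ordinary HTTP response for the proxied URL. Here is a minimal sketch (plain Python WSGI, not WooCommerce code; the file path and max-age are assumptions) of what that origin response looks like:

```python
# Minimal WSGI sketch of a protected-download response served via Apache's
# mod_xsendfile (illustrative only; file path and max-age are assumptions).
def application(environ, start_response):
    # In WooCommerce, the download-permission check would happen here.
    start_response("200 OK", [
        ("Content-Type", "application/pdf"),
        ("Content-Disposition", 'attachment; filename="product.pdf"'),
        # What a proxy/CDN such as Cloudflare looks at when deciding to cache:
        ("Cache-Control", "public, max-age=2592000"),
        # Consumed by mod_xsendfile on the origin; Cloudflare never sees this:
        ("X-Sendfile", "/srv/protected-downloads/product.pdf"),
    ])
    # mod_xsendfile discards this body and streams the file in its place.
    return [b""]
```

In other words, whether the PDFs get cached comes down to the Cache-Control headers on the response and Cloudflare's caching rules for the download URL, not to the fact that X-Sendfile produced the bytes on the origin.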
Related
How to serve static assets with an efficient cache policy and minimize main-thread work.
Serve static assets with an efficient cache policy - this is a suggestion that assets should have a cache lifetime of at least one month set on them (usually done via .htaccess).
It looks like you have already done this for everything you can control as the items listed are external assets you cannot set the cache policy on.
If you have done it for all your own resources (from your domain) then do not worry.
minimise main-thread work - this means that your site is using a LOT of JavaScript or performing a lot of calculations on page load.
The only way to improve this is to remove unnecessary JS and optimise anything that is remaining.
My guess is you are using lots of plugins / libraries to do simple things on the site that could more easily be achieved through other methods.
Post the URL of your site and I will update this answer with more relevant suggestions that may help you and others.
Workaround for efficient caching
One way you could fix this issue (but you need to know what you are doing) is to download the script in question to your own server every night via a cron job and serve it from there instead.
That way you can set the cache time yourself - however, you need to make sure you cache-bust the script each time you download a different version (by comparing the previous file with the new one and checking for changes) so you don't break functionality.
As you can imagine, this technique is only used in extreme circumstances where you can justify the need to control the cache policy due to the massively increased complexity and potential for problems.
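For illustration, here is a minimal sketch of such a nightly job in Python (the URL, directory and filenames are assumptions, and a real job would also need error handling plus something in your templates that reads the manifest):

```python
#!/usr/bin/env python3
"""Nightly cron sketch: mirror a third-party script locally so you control its
cache policy, and cache-bust by putting a content hash in the filename whenever
the upstream file changes. The URL and paths below are assumptions."""
import hashlib
import json
import pathlib
import urllib.request

SCRIPT_URL = "https://example-cdn.com/library.min.js"      # the script you depend on
STATIC_DIR = pathlib.Path("/var/www/html/static/vendor")   # served with a long cache time
MANIFEST = STATIC_DIR / "manifest.json"                    # templates read the filename here

def main() -> None:
    new_bytes = urllib.request.urlopen(SCRIPT_URL, timeout=30).read()
    new_hash = hashlib.sha256(new_bytes).hexdigest()[:12]

    manifest = json.loads(MANIFEST.read_text()) if MANIFEST.exists() else {}
    if manifest.get("hash") == new_hash:
        return  # upstream unchanged: keep serving the current long-cached copy

    # Upstream changed: write a new hashed filename so browsers cannot hold a stale copy.
    filename = f"library.{new_hash}.min.js"
    STATIC_DIR.mkdir(parents=True, exist_ok=True)
    (STATIC_DIR / filename).write_bytes(new_bytes)
    MANIFEST.write_text(json.dumps({"hash": new_hash, "file": filename}))

if __name__ == "__main__":
    main()
```

A crontab entry along the lines of `0 3 * * * python3 /opt/scripts/mirror_library.py` would run it each night.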
We're currently doing optimizations to our web project, and our lead told us to push the use of CDNs for external libraries as opposed to including them in a compile+compress process and shipping them off a cache-enabled nginx setup.
His assumption is that if the user visits example.com, which uses a CDN'ed version of jQuery, the jQuery is cached at that point. If the user then visits example2.com, which happens to use the same CDN'ed jQuery, it will be loaded from cache instead of over the network.
So my question is: Do domains actually share their cache?
I argued that even if the browser does share its cache across domains, the problem is that we are running on the assumption that the previous sites use the exact same CDN'ed file from the exact same CDN. What are the chances of a user having browsed a site that uses the same CDN'ed file? He said to use the largest CDN to increase the chances.
So the follow-up question would be: If the browser does share cache, is it worth the hassle to optimize based on his assumption?
I have looked up topics about CDNs and I have found nothing about this "shared domain cache" or CDNs being used this way.
Well, your lead is right: this is basic HTTP.
All you are doing is indicating to the client where it can find the file.
The client then handles sending a request to the CDN in compliance with their caching rules.
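If you want to see those caching rules for yourself, a quick sketch like the one below prints the cache headers a public CDN returns (the exact URL/version is only an example and the headers may change over time):

```python
# Print the caching headers a public CDN announces for a hosted library.
# The URL/version below is only an example.
import urllib.request

URL = "https://ajax.googleapis.com/ajax/libs/jquery/3.7.1/jquery.min.js"

with urllib.request.urlopen(URL) as resp:
    for name in ("Cache-Control", "Expires", "Age"):
        print(f"{name}: {resp.headers.get(name)}")
```

The browser caches the file according to exactly those headers.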
But you shouldn't overuse CDNs for libraries either: keep in mind that if you need a specific version of a library, especially an older one, you aren't likely to get many cache hits because of version fragmentation.
For widely used and heavy libraries like jQuery, using the latest version is recommended.
If you can take them all from the same CDN (e.g. Google's), all the better, especially as HTTP/2 is coming.
Additionally, they save you bandwidth, which can amount to a lot when you have high traffic, and they can reduce the load time for users far from your server (Google's is great for this).
I have this small technical question about caching...
I am planning to use caching for my website and I was wondering whether the cached files are saved on visitors' personal computers.
I asked somebody, and they told me that the cached content is saved as HTML files, and that these are not on the visitor's personal PC.
Regards
That depends on what you mean by Cache. Most sites use caching to save bandwidth by reducing the hits to the database or other server resources by not having to re-generate dynamic content on every request. On the other hand, browsers will cache JavaScript and CSS files from websites on the local computer as a part of their normal process. Cookies are 'caching' important information specific to that computer / user and are also stored by the browser locally.
I am assuming that you're talking about caching pages on the server and reusing them for multiple requests. Those can be stored as tmp files or as entries in a database on the server (CakePHP and CSP come to mind here). It really depends on your configuration and what you decide you want to do.
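To make the server-side case concrete, here is a minimal, framework-agnostic sketch in Python (the cache directory and TTL are assumptions); note that nothing here ever touches the visitor's computer:

```python
# Minimal server-side page cache: render a page once, store the HTML in a temp
# file on the server, and reuse it until it expires. Paths/TTL are assumptions.
import hashlib
import pathlib
import time

CACHE_DIR = pathlib.Path("/tmp/page-cache")   # lives on the server, not the visitor's PC
TTL_SECONDS = 300                             # regenerate each page every 5 minutes

def cached_page(url_path: str, render) -> str:
    CACHE_DIR.mkdir(exist_ok=True)
    cache_file = CACHE_DIR / (hashlib.md5(url_path.encode()).hexdigest() + ".html")

    if cache_file.exists() and time.time() - cache_file.stat().st_mtime < TTL_SECONDS:
        return cache_file.read_text()          # reuse the stored copy, skip the heavy work

    html = render(url_path)                    # expensive part: database queries, templating
    cache_file.write_text(html)
    return html
```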
I am new to Magento and I have just started setting up my first sites. One of the requirements I am after is to store all image files on a separate server from the one the site is hosted on. I have briefly looked into Amazon CloudFront and the following plugin:
http://www.magentocommerce.com/magento-connect/cloudfront-cdn.html
This works alongside my CloudFront distribution setup, so the images are being accessed from the CDN alongside the JS, CSS, etc. when I check the source. My issue is that they still reside on my own server too.
Is there a way to have everything just on the CDN, so that my server's disk usage can be kept as low as possible with only the template files on there and no images?
Based on experience, I would really recommend that you do not try to completely remove media files from your actual server disk. The main role of the CDN should just be to mirror these files whenever they are new or updated.
However, if you really want to do this, I would also sternly warn you not to attempt it with JS and CSS files. The trouble is just not worth it. You'll see why later.
So we're left with media, or mostly image, files. These files are usually very large, which is the reasoning behind moving them off the server disk.
My strategy, and what I did before, was to use an S3 bucket behind the CloudFront CDN. I moved everything from the media directory to S3, configured CloudFront to pull from S3, and the CloudFront distribution was then CNAME'd as media.mydomain.com. Obviously, I then set my media base URLs (System > Configuration > General > Web) to http://media.mydomain.com/media/ and https://media.mydomain.com/media/.
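As a rough illustration of that initial move, a one-off sync could look like the sketch below (boto3; the bucket name and local path are assumptions, and in practice you would also want to set content types):

```python
# One-off sketch: copy Magento's media directory into the S3 bucket that
# CloudFront pulls from. Bucket name and local path are assumptions.
import pathlib
import boto3

MEDIA_DIR = pathlib.Path("/var/www/magento/media")
BUCKET = "my-magento-media"

s3 = boto3.client("s3")
for path in MEDIA_DIR.rglob("*"):
    if path.is_file():
        key = "media/" + path.relative_to(MEDIA_DIR).as_posix()
        # Public-read so the CloudFront distribution (media.mydomain.com) can serve it.
        s3.upload_file(str(path), BUCKET, key, ExtraArgs={"ACL": "public-read"})
```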
It worked perfectly. There were no CORS issues at all because I did not touch the CSS/JS base URL paths; for those files I just relied on the free Cloudflare CDN (yeah, yeah, I know).
The next thing I knew, I saw the defect with this setup: uploads do not work at all. WYSIWYG uploads do not go to the S3 bucket immediately. There was a workaround using s3fuse, though that immediately degraded into a problem of its own as it had bad memory leaks.
What ultimately worked is that we just paid for the additional disk space (we were on Amazon AWS), wrapped the whole domain in the Cloudflare CDN, and when we needed SSL we upgraded to Pro.
Simple, it works, and it's headache-free.
NB: I'm not connected with Cloudflare whatsoever, I'm just really really happy with their service.
The W3C has started throttling requests to XSD/DTD files, adding as much as a minute of latency to the request:
http://www.w3.org/Help/Webmaster.html#slowdtd
I want to be able to mirror the standards specifications locally so that users don't have to wait for the server to respond. However I'm struggling to find a file list for the W3C standards.
Does anyone know of such a list, have some way to produce one, or know of a W3C mirror site?
Thanks
The best answers to your questions are the ones in the blog post linked from the URI you cite.
Options available to users of XML software (as opposed to developers) include:
running a local caching proxy
storing a local copy of the DTD files, making an OASIS XML Catalog for them, and using XML processors that support catalogs (see the sketch after this list)
complaining vocally to the vendors of the commercial XML software you use, if they do not support catalogs
helping modify any open source XML software you use to make it support catalogs
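As a concrete example of the local-copy option, here is a small sketch using Python's lxml (the directory and flat file layout are assumptions); lxml/libxml2 will also honour a proper OASIS catalog via the XML_CATALOG_FILES environment variable:

```python
# Sketch: resolve W3C DTDs from a local mirror instead of fetching them from
# www.w3.org. LOCAL_DTD_DIR and its flat file layout are assumptions.
import pathlib
from lxml import etree

LOCAL_DTD_DIR = pathlib.Path("/usr/share/local-dtds")

class LocalDTDResolver(etree.Resolver):
    def resolve(self, url, public_id, context):
        if url and url.startswith("http://www.w3.org/"):
            local_copy = LOCAL_DTD_DIR / url.rsplit("/", 1)[-1]
            if local_copy.exists():
                return self.resolve_filename(str(local_copy), context)
        return None  # fall back to the parser's default behaviour

parser = etree.XMLParser(load_dtd=True, no_network=True)  # never hit w3.org
parser.resolvers.add(LocalDTDResolver())
doc = etree.parse("document.xhtml", parser)  # DTD is now read from disk
```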
Use the OpenKomodo Github repo to grab the most common DTDs, then reference the local copy rather than depending on a third party.