I'm running Cloudflare in front of a web application, and often need to manually purge the Cloudflare cache after pushing in order to test changes I've made. Since I'm currently using GitHub Actions for my CI/CD pipeline, I could stop doing this manually altogether and use one of the many third-party Actions to purge the cache after every push, but this seems like it would reduce the efficacy of the cache and cause users to feel the impact.
Is there a (presumably programmatic) way to purge the Cloudflare cache for modified files only, either using GitHub Actions or otherwise? It seems that this should be technically feasible since Git already knows which files have been modified and Cloudflare does allow cache invalidation via URLs on the free tier (although not with wildcards).
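For illustration, here is roughly the shape of what I have in mind as a shell step in a workflow. This is an untested sketch: example.com, CF_ZONE_ID, and CF_API_TOKEN are placeholders, and a real workflow would take the commit range from the push event payload rather than HEAD~1.

```bash
#!/usr/bin/env bash
# Sketch: purge only the URLs corresponding to files changed by the
# latest commit. example.com and the CF_* variables are placeholders.
set -euo pipefail

# Files touched by the last commit (a real workflow would use the
# before/after range from the push event instead of HEAD~1..HEAD).
changed=$(git diff --name-only HEAD~1 HEAD)

# Map repository paths onto public URLs and build a JSON array.
urls=$(printf '%s\n' "$changed" | sed 's|^|https://example.com/|' | jq -R . | jq -s .)

# Cloudflare's purge-by-URL endpoint accepts a limited number of URLs
# per call (30 at a time), so large changesets would need batching.
curl -sS -X POST \
  "https://api.cloudflare.com/client/v4/zones/${CF_ZONE_ID}/purge_cache" \
  -H "Authorization: Bearer ${CF_API_TOKEN}" \
  -H "Content-Type: application/json" \
  --data "{\"files\": ${urls}}"
```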
How do I address "Serve static assets with an efficient cache policy" and "minimize main-thread work"?
"Serve static assets with an efficient cache policy" - this is a suggestion that assets should have a cache lifetime of at least one month set on them (usually done via .htaccess).
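For example, on Apache you could append something like the following to your .htaccess. This is only a sketch: it assumes mod_expires is enabled, and the types and lifetimes are examples to adjust for your site.

```bash
# Sketch: set a one-month cache lifetime for static assets via .htaccess.
# Assumes Apache with mod_expires enabled; the types below are examples.
cat >> .htaccess <<'EOF'
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
</IfModule>
EOF
```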
It looks like you have already done this for everything you can control as the items listed are external assets you cannot set the cache policy on.
If you have done it for all your own resources (from your domain) then do not worry.
"minimise main-thread work" - this means that your site is using a LOT of JavaScript or performing a lot of calculations on page load.
The only way to improve this is to remove unnecessary JS and optimise anything that is remaining.
My guess is you are using lots of plugins / libraries to do simple things on the site that could more easily be achieved through other methods.
Post the URL of your site and I will update this answer with more specific advice that may help you and others.
Workaround for efficient caching
One way you could fix this issue (but you need to know what you are doing) is to download the script in question to your server every night via a cron job and serve it from your server instead.
That way you can set the cache time yourself - however, you need to cache-bust the script each time you download a different version (by comparing the previous file and the new file for changes) so you don't break functionality.
As you can imagine, this technique is only used in extreme circumstances where you can justify the need to control the cache policy due to the massively increased complexity and potential for problems.
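A rough sketch of such a nightly job follows. The URL and paths are placeholders, and you would still need to update your templates to reference the new hashed filename when it changes.

```bash
#!/usr/bin/env bash
# Sketch: mirror a third-party script locally so we control its cache
# policy. SRC and DOCROOT are placeholders.
set -euo pipefail

SRC="https://third-party.example/widget.js"
DOCROOT="/var/www/html/assets"

# Download to a temp file so a failed fetch never clobbers the copy
# currently being served.
tmp=$(mktemp)
curl -fsS "$SRC" -o "$tmp"

# Name the file by content hash: a new upstream release produces a new
# filename, which busts any long-lived cache automatically.
hash=$(sha256sum "$tmp" | cut -c1-12)
target="$DOCROOT/widget.$hash.js"

if [ ! -f "$target" ]; then
  mv "$tmp" "$target"
  echo "New version deployed: $target (update page references accordingly)"
else
  rm "$tmp"
fi
```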
I have a web server cluster with many running web server instances. Each instance caches some configuration in its local memory; the original configuration is stored in a database.
This configuration is used for every request, so the cache is necessary for performance reasons.
I want to provide an admin page on which the administrator can change the configuration. How do I update the cache in every server instance?
I currently have two solutions for this:
1. Set an expiry time on the cache.
2. When the administrator updates the configuration, notify each instance via some pub/sub mechanism (e.g. Redis).
The drawback of solution 1 is that changes do not take effect immediately.
For solution 2, I'm wondering whether the pub/sub mechanism will have an impact on the performance of the web server.
Which one is better? Or is there a common solution to this problem?
Another drawback of option 1 is that you'll periodically hit your database unnecessarily.
If you're already using Redis then option 2 is a good solution. I've used it successfully and can't imagine how there could be a performance impact just because you're using pubsub.
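As a rough illustration with redis-cli - the channel name, payload, and reload endpoint below are placeholders, and in a real application you would subscribe from within the web server process itself rather than shelling out:

```bash
# On each web server instance (e.g. run under a process supervisor).
# When piped, redis-cli prints the message type, channel, and payload
# on separate lines; we react only to the payload we expect.
redis-cli subscribe config:changed | while read -r line; do
  if [ "$line" = "reload" ]; then
    curl -fsS http://localhost/internal/reload-config > /dev/null
  fi
done

# In the admin tool, after saving the new configuration:
redis-cli publish config:changed reload
```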
Another option is to create a cache invalidation URL on each website, e.g. /admin/cache-reset/, and have your administration tool call the cache-reset URL on each individual server. The drawback of this solution is that you need to maintain a list of servers. If you're not already using Redis it could just be the simple/practical/low-tech solution that you're looking for.
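That could be as simple as a loop in the administration tool (hostnames and the path here are illustrative):

```bash
# Sketch: call the cache-reset URL on each server in the cluster.
# The server list is the part you have to maintain by hand.
for host in web1.internal web2.internal web3.internal; do
  curl -fsS "http://${host}/admin/cache-reset/" \
    || echo "cache reset failed on ${host}" >&2
done
```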
There have been cases where assets were cached so aggressively that users had to manually clear their browser cache in order to get the latest version of the app. What's the best strategy for dealing with this?
There are a whole bunch of HTTP headers which deal with caching behaviour.
Configuring this correctly is possibly more an art than a science - without any caching, your site will feel very slow to end users, and your servers will have to handle a LOT of traffic.
However, with aggressive caching, you may end up serving stale content, or old versions of your JavaScript and CSS.
A common solution to this is to use ETags and If-Modified-Since requests - this means the browser checks whether a cached resource is out of date by asking the server if it has a newer version, and only downloads the new version if it needs to.
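You can see this mechanism in action with curl (the URL is a placeholder, and the server must actually send an ETag for this to work):

```bash
# First request: capture the ETag the server returns for the resource.
URL="https://example.com/app.js"
etag=$(curl -fsSI "$URL" | awk -F': ' 'tolower($1) == "etag" {print $2}' | tr -d '\r')

# Revalidation: the server replies 304 Not Modified (no body) if the
# cached copy is still current, or 200 with fresh content otherwise.
curl -s -o /dev/null -w '%{http_code}\n' -H "If-None-Match: ${etag}" "$URL"
```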
I am using Joomla 3.3, hosted on GoDaddy, and GoDaddy's account file limit is 2,500,000. My Joomla site is creating a large number of cache files, which goes over the limit. The website is calendar-driven and has lots of events, queries based on categories, etc. I have been clearing the cache manually. First, I would like to know why so many cache files are created, and second, I would appreciate any suggestion/solution that will resolve this issue permanently. Is there a plugin I should consider that would automatically clear the cache, say, every two days?
The problem you have is expired cache files that should be deleted but are not. You should purge the expired cache automatically by adding the following to your cron:
wget http://yourwebsite.com/cli/garbagecron.php
(If your Joomla version blocks web access to the cli/ scripts, invoke the script directly instead: php /path/to/your/site/cli/garbagecron.php.) The cron job should preferably run on a daily basis. If you don't know how to add it, then you can always ask your host (GoDaddy) to do it for you.
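For reference, the crontab entry could look something like this (the path is a placeholder for your actual site root on GoDaddy):

```bash
# Run Joomla's cache garbage collector daily at 03:00.
0 3 * * * php /home/youruser/public_html/cli/garbagecron.php > /dev/null 2>&1
```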
We have 3 front-end servers, each running multiple web applications. Each web application has an in-memory cache.
Recreating the cache is very expensive (>1 min), so we repopulate it using a web service call to each web application on each front-end server every 5 minutes.
The main problems with this setup are maintaining the list of targets to update and the cost of recreating the cache several times every few minutes.
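For context, our refresh step is essentially equivalent to the following (hostnames and the endpoint path are illustrative):

```bash
# Cron job, every 5 minutes: warm the cache on every application on
# every front-end server, in parallel.
for host in fe1.internal fe2.internal fe3.internal; do
  for app in app1 app2 app3; do
    curl -fsS "http://${host}/${app}/cache/refresh" > /dev/null &
  done
done
wait
```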
We are considering using AppFabric or something similar but I am unsure how time consuming it is to get up and running. Also we really need the easiest solution.
How would you update an expensive in memory cache across multiple front-end servers?
The problem with in-memory caching is that each server has its own copy. I'm going with the idea that this is why you want to use AppFabric, and that you're re-creating the cache every few minutes to keep the in-memory caches in sync across all servers. With all this work, I can well appreciate that caching is expensive for you.
It sounds like you're doing a lot of work that probably isn't necessary. This article has some detail about the caching mechanisms available within SharePoint. You may be interested in the output cache discussed near the top of the article. You may also want to read the linked TechNet article and the linked article called "Custom Caching Overview".
The only SharePoint way to do this is to use the Service Application infrastructure. The problem is that it takes some time to understand how it works, and it's quite complicated to build from scratch. You might consider downloading an existing sample application and renaming the classes/GUIDs to match your naming conventions. I used this one: http://www.parago.de/2011/09/paragoservices-a-sharepoint-2010-service-application-sample/. In this case you can have a single cache per N front-end servers.