Cloudflare caching everything

I'm using Cloudflare on a server hosting a CRM, but Cloudflare is caching whole pages: when I add new data, the updates don't appear until I press Ctrl+F5, and pressing F5 returns the old page from before the update.
I think Cloudflare is caching everything. What should I do?

Purge the cache in Cloudflare.
[screenshots of the Purge Cache steps]
Edit: You can create a Page Rule so the specific page is not cached at all. Example (this will let the browser cache for 30 minutes):
[screenshot of the Page Rule]
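For what it's worth, a rule of that sort might look something like this (the URL pattern is a placeholder for your CRM's address; Cache Level and Browser Cache TTL are the relevant Page Rule settings):

If the URL matches: example.com/crm/*
Setting: Cache Level: Bypass
(or, for the 30-minute variant: Browser Cache TTL: 30 minutes)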

In Cloudflare's Page Rule settings, bypass the cache for all URLs, then write another rule that caches only static content (CSS, JS, images, etc.) and place that rule at the top of the order, since rules are matched top-down.
Also check the Cache-Control settings in your response headers; your problem sounds like a Cache-Control header issue.
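As a rough sketch of what those response headers could look like, here is a toy server (Node.js purely for illustration; the CRM could be on any stack, and the paths are made up) that marks dynamic pages as uncacheable while letting static assets be cached for a long time:

import { createServer } from "node:http";

// Sketch: no-store for dynamic CRM pages, long max-age for static assets.
// Cloudflare's default caching respects these Cache-Control directives.
createServer((req, res) => {
  if (req.url?.startsWith("/static/")) {
    // Static assets: safe to cache aggressively.
    res.writeHead(200, {
      "Content-Type": "text/css",
      "Cache-Control": "public, max-age=31536000",
    });
    res.end("/* static stylesheet */");
  } else {
    // Dynamic pages: tell every cache along the way not to store them.
    res.writeHead(200, {
      "Content-Type": "text/html",
      "Cache-Control": "private, no-store",
    });
    res.end("<html><body>fresh CRM data</body></html>");
  }
}).listen(8080);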

Related

How to prevent JavaScript files from being cached in IIS

I have a weird problem: JavaScript files are cached in IIS.
What I have done so far:
Disabled caching on my website in IIS
Disabled the cache in Chrome dev tools
Added a timestamp to the URL in the page to prevent any caching
Double-checked that the files on disk were updated, and they were
But my scripts and CSS do not update to the latest version on disk until I do an iisreset.
Your attempt to break the cache is short-circuited by the fact that you are getting the cached version of the content page, with the old timestamp, meaning you never request the JavaScript file with the new timestamp until something else forces the content page to reload.
You can check this by using the same cache-breaking scheme on the content page itself in addition to the JavaScript file.
This is not a recommended permanent solution, as you will bypass the cache altogether for the content page.
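A hedged sketch of that cache-breaking scheme (all names here are made up for illustration): give script URLs a version token that changes on every deployment, so the old cached URLs are simply never requested again.

// Hypothetical build version, bumped on every deployment.
const BUILD_VERSION = "20240115.1";

// Load a script through a versioned URL; a new deployment produces a new
// URL, so the browser's cached copy of the old URL is never reused.
function loadVersionedScript(path: string): void {
  const script = document.createElement("script");
  script.src = `${path}?v=${encodeURIComponent(BUILD_VERSION)}`;
  document.head.appendChild(script);
}

loadVersionedScript("/scripts/app.js");

The same token has to appear in the content page's own URL (or the content page must be served with a no-cache header); otherwise a stale copy of the page keeps referencing the old script URL, which is exactly the short circuit described above.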
Every answer I've seen for this problem has worked sometimes and other times not. If I get a solution, I'll post it.

HTTP GET request caching vs ASP.NET output caching

I have a very basic question about caching. I sent a request to an .aspx page from the browser. Since it is an HTTP GET request, by default it will be cached; I can see that in the browser's about:cache. But if that page is cached, how are my modifications (say, to the CSS or JS of that particular .aspx page) reflected on the next request? Does that mean it is not being taken from the cache?
At that time, the cache expiry for that .aspx request shows something like "1970-01-01 05:30:00" in about:cache. For all other static resources (external JS) it shows a future expiry date.
Does a past expiry date simply imply that the item should not be fetched again from the cache?
If output caching is enabled, I know new modifications will not be seen as long as the cache has not expired. But how do ASP.NET output caching and the default HTTP GET caching mechanism differ? I know output caching can cache on the server or a proxy, so it can serve multiple users, but how does it differ at the browser level?
Btw I got the answer :)
http://devproconnections.com/aspnet/aspnet-pages-and-browser-caching
Thanks again.

Can you force a browser to always use the cached files and not do a round trip for a 304?

As I understand it, this is how browser caching works. Assume a far-future expiry header has been set to, say, a year, and foo.js is set to be cached. Here are some scenarios:
First visit to the page: the server returns a 200 and foo.js is cached for a year.
Next visit: the browser finds foo.js in the cache but has to ask the server whether it has been modified. If not, the server returns a 304 Not Modified.
The user is already on the page (and foo.js is in the cache) and clicks a link to another page: the browser serves the cached foo.js without a round trip to the server, returning a 200 (Cached).
The user is already on the page (and foo.js is in the cache) and for some reason hits F5/Reload: the browser checks the cache but has to make a round trip to the server to ask whether foo.js has been modified. If not, the server returns a 304.
As you can see, whenever a page is refreshed, the browser always makes a trip to the server to check whether the file has been modified. I know this is not much, and the server only returns header info, but the round-trip time is in some cases extremely important.
The question is: is there a way to avoid this, since I'm already setting an expiry for the files? I just want the browser to always fetch from the cache until the expiry has passed, or until I replace the file with something else (by versioning it).
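For concreteness, here is a sketch of the setup those scenarios assume (Node.js purely for illustration; the timestamp and port are placeholders): the server advertises a one-year lifetime but still has to answer the conditional request the browser sends on refresh.

import { createServer } from "node:http";

const LAST_MODIFIED = "Mon, 01 Jan 2024 00:00:00 GMT"; // placeholder timestamp

createServer((req, res) => {
  // On refresh the browser revalidates even though max-age has not expired.
  // (A real server would parse and compare the dates, not test equality.)
  if (req.headers["if-modified-since"] === LAST_MODIFIED) {
    res.writeHead(304); // headers only: the round trip the question wants to avoid
    res.end();
    return;
  }
  res.writeHead(200, {
    "Content-Type": "application/javascript",
    "Cache-Control": "public, max-age=31536000", // one year
    "Last-Modified": LAST_MODIFIED,
  });
  res.end("console.log('foo.js');");
}).listen(8080);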
From what I understand, pressing F5/Ctrl-R is a browser-specific action, so the control stays with the browser.
And what if the user clears the cache before clicking another link? Even if the HTTP specification let you force the cache to be used on F5, there would be no guarantee that you would achieve what you need.
Simply configure and code for caching wherever possible and leave the rest to the user.
It looks like when you navigate to a page (that is, enter an address in the URL bar or click a link), resources are fetched from the cache without a revalidation request to the server. But when you refresh the page, the browser makes that conditional request, hence the round trip.
This is clearer in the Network tab of IE's Developer Tools: the Initiator column says "navigate" for the first case and "refresh" for Ctrl+R or F5.
You can override the F5 and Ctrl+R behavior by adding an event listener for them, doing window.location = window.location, and suppressing the default behavior with event.preventDefault(); a sketch follows this answer. This causes a page navigation instead of a refresh.
Also, I didn't test the case where the cached resource has actually changed on the server. If that turns out to be a problem, you can solve it with version numbering of resources and generating HTML with URLs that point to the latest version of each resource (akin to the cache-manifest approach in HTML5 offline applications).
EDIT: This doesn't help if the user clicks the browser's refresh button, however; the onbeforeunload event may be useful in that case.
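A minimal sketch of that keyboard override (with the same caveats the answer gives: browsers do not guarantee these shortcuts can be intercepted, and the refresh button is not covered):

// Turn F5 / Ctrl+R (Cmd+R on macOS) into a plain navigation so cached
// resources are reused without conditional requests to the server.
window.addEventListener("keydown", (event: KeyboardEvent) => {
  const isRefreshShortcut =
    event.key === "F5" ||
    ((event.ctrlKey || event.metaKey) && event.key.toLowerCase() === "r");
  if (isRefreshShortcut) {
    event.preventDefault(); // suppress the browser's refresh
    window.location.href = window.location.href; // navigate to the same URL instead
  }
});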

Why does Firefox always load the next page in the menu as well as the page I have requested?

I am working on a new website. While testing some of the functionality I had a number of debug statements and was watching the logs. It seems that Firefox (at least) loads the "next" page in the menu as well as the page I have clicked on. If I have menu items A B C D E and click on B then I see a request for mysite.com/B and then a request for mysite.com/C in the logs, and so on.
Is this some kind of look-ahead performance feature? Is there any way to avoid it (setting an attribute on the link, maybe)? The problem is that the second page in my menu is somewhat heavier, as it loads a lot of data from a web service. I'm happy for people to do that if they want to use the functionality, but I would rather not have every visitor to the front page load it unnecessarily. Is this behaviour consistent across browsers?
Yes, Firefox will prefetch links to improve perceived performance for the user. You can read more about the functionality in Firefox here: https://developer.mozilla.org/en-US/docs/Link_prefetching_FAQ
It isn't possible for the server to disable this in the client's browser; however, the request should include the header X-moz: prefetch, which you can use to determine whether it is in fact a prefetch request, and potentially return a blank page for prefetch requests. You can then use Cache-Control: must-revalidate to make sure the page loads properly when actually requested by the user.
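A sketch of that server-side check (Node.js purely for illustration; the site in the question could be on any stack): answer prefetch requests with a cheap blank page so the expensive web-service call only runs for real visits.

import { createServer } from "node:http";

createServer((req, res) => {
  // Firefox marks its prefetch requests with an "X-moz: prefetch" header.
  if (req.headers["x-moz"] === "prefetch") {
    res.writeHead(200, {
      "Content-Type": "text/html",
      "Cache-Control": "must-revalidate", // force a real fetch on an actual visit
    });
    res.end(""); // blank page for the prefetch
    return;
  }
  res.writeHead(200, { "Content-Type": "text/html" });
  res.end("<html><body>full page, including the web-service data</body></html>");
}).listen(8080);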
If you happen to be using WordPress for your site, you can disable the tags carrying the prefetch information by using:
WordPress 3.0+
//remove auto loading rel=next post link in header
remove_action('wp_head', 'adjacent_posts_rel_link_wp_head');
Older versions:
//remove auto loading rel=next post link in header
remove_action('wp_head', 'adjacent_posts_rel_link');
Yes, it's called prefetching. It can be turned off in the client; see the FAQ:
https://developer.mozilla.org/en-US/docs/Link_prefetching_FAQ
I'm not aware of a way to turn it off via the server.

Can a corporate proxy cache whole pages?

We are seeing some odd errors when our customers test our ASP.NET web apps. There is a cart counter at the top of every page that tells you how many items are in the shopping cart. One customer reports that this number changes as she moves from one page to the next. We cannot reproduce this.
Is it possible that her corporate proxy server is caching the whole page and never actually contacting our server? This is a staging site over HTTP; her production site is over HTTPS.
Revision: The page gets cached over HTTPS as well. She sees a completely cached version of our shopping-cart page. If the user clicks the refresh button, she gets a current version of the page, but that new version then becomes the cached version.
It's certainly possible that an intermediate proxy (corporate or otherwise) is caching your pages, although I don't understand how that would explain the cart number on the page changing. If you don't want any caching to take place, send the appropriate HTTP headers with each response you don't want cached:
Cache-Control: private, no-store, max-age=0
Expires: <some date in the past>
Pragma: no-cache
The first line above is for HTTP 1.1 clients, and the other two are for HTTP 1.0 clients. Check out section 14.9 of the HTTP 1.1 protocol spec for all the gory details.
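For illustration, here is one way those three headers could be emitted together (a Node.js sketch, not the ASP.NET app from the question, where the framework's response-cache settings would be the natural place for this):

import { createServer } from "node:http";

createServer((req, res) => {
  res.writeHead(200, {
    "Content-Type": "text/html",
    "Cache-Control": "private, no-store, max-age=0", // HTTP 1.1 caches
    "Expires": "Thu, 01 Jan 1970 00:00:00 GMT", // any date in the past works, for HTTP 1.0
    "Pragma": "no-cache", // HTTP 1.0
  });
  res.end("<html><body>cart count rendered fresh on every request</body></html>");
}).listen(8080);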
There's also a setting in IE that could cause this behavior. Go to "Tools" > "Internet Options". On the "General" tab, click "Settings" under "Browsing History". Make sure "Check for newer versions of stored pages" is set to "Automatically". This is the default value.
I had a user who changed this to "Never" and was wondering why he always saw old content. :)
