Caching and HTTPS

I've noticed something interesting while monitoring the network communications between my browser and server. It has something to do with caching.
Say I have a CSS file http://domain.com/main.css (used in unsecured pages), which can also be accessed via https://domain.com/main.css (used in secured pages).
When I first load an unsecured page, the CSS file gets a 200 OK. When I reload the page (or go to another unsecured page), I get a 304 Not Modified.
When I go to a secured page for the first time, the CSS file from the https source gets a 200 OK. And when I reload the page (or go to another secured page), I get a 304 Not Modified.
When I return to the unsecured page, the CSS file still gets a 304 Not Modified.
When I return to the secured page, the CSS file gets a 200 OK. What happened to the cached copy? How can I get it cached?

This might answer your question. It might be that your website defines this resource as non-cacheable by sending:
Cache-Control: private, must-revalidate, max-age=0
for example (when accessing https://www.google.com/ncr), causing your browser not to cache it. Do you have Firebug/Fiddler or anything similar to view the response headers?
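To see why that header defeats caching, here is a minimal sketch (a hypothetical helper, not any browser's actual code) of the freshness check a cache applies to a stored response:

```python
def is_fresh(cache_control, age_seconds):
    """Simplified freshness check (hypothetical helper, not real browser
    code): parse the Cache-Control value and compare the response's age
    against max-age. Ignores Expires, s-maxage and other directives."""
    directives = {}
    for part in cache_control.split(","):
        name, _, value = part.strip().partition("=")
        directives[name.lower()] = value
    if "no-store" in directives or "no-cache" in directives:
        return False
    max_age = int(directives.get("max-age", 0))
    return age_seconds < max_age

# The header above is stale immediately, so every use triggers revalidation:
print(is_fresh("private, must-revalidate, max-age=0", 1))   # False
# A long max-age keeps the stored copy fresh:
print(is_fresh("public, max-age=10368000", 3600))           # True
```

With max-age=0 the stored copy is stale the instant it arrives, so the browser must go back to the server every time.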


Why does Chrome make a request for already cached image with max-age header?

My goal is to cache images in the browser's cache.
Current setup
A Cache-Control: public, max-age=10368000 header is attached to the requested image to cache it.
Furthermore, an ETag: f5f9f1fb39790a06c5b93735ac6e2397 header is attached to check whether the image has changed once max-age is reached.
When the same image is requested again, the If-None-Match header's value is checked to see if the image has changed. If it has not, a 304 is returned.
I am using Fiddler to inspect HTTP traffic (note: the behaviour described below was already happening before Fiddler, so Fiddler is not the problem).
Chrome version - 77.0.3865.90 (Official Build) (64-bit)
Expected behaviour
I expect the image to be cached for 10368000 seconds. Unless 10368000 seconds have passed, the image is served from the browser's cache on each request. Once 10368000 seconds have passed, a request is made to the server, which checks the ETag. If the image has not changed on the server, it returns 304 to the client and the caching period is extended for another 10368000 seconds.
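The expected server-side flow above can be sketched roughly like this (a hypothetical handler, not the poster's actual setup; deriving the ETag as an MD5 of the body is just one common choice):

```python
import hashlib

def make_response(image_bytes, if_none_match=None):
    """Hypothetical server handler sketching the flow above: hash the body
    into an ETag, answer 304 to a matching If-None-Match, otherwise send
    200 with the full body and the caching headers from the question."""
    etag = hashlib.md5(image_bytes).hexdigest()
    headers = {
        "Cache-Control": "public, max-age=10368000",
        "ETag": etag,
    }
    if if_none_match == etag:
        return 304, headers, b""         # revalidated: empty body, cache reused
    return 200, headers, image_bytes     # fresh copy with full body

# First request: 200 with the body; later revalidation with the ETag: 304.
status, headers, body = make_response(b"fake-image-bytes")
print(status)                                             # 200
status, _, body = make_response(b"fake-image-bytes", headers["ETag"])
print(status, body)                                       # 304 b''
```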
Behaviour with Google Chrome
Image gets requested for the first time. Below are the request headers
Image is returned successfully with 200 status. Below are the response headers
Fiddler tells that an image was requested and served
Furthermore, the fact that the image is cached is seen in ~/Library/Caches/Google/Chrome as it appears there.
I click on the link bar in browser and press Enter to request the image again. Below are the request headers
Image is returned successfully with 304 status. Below are the response headers
Fiddler tells that image was served from the cache
Why did Chrome make an additional request for the image and serve it from the cache only after it received a 304 response?
I see that a Cache-Control: max-age=0 request header is set when the image is requested the second time. As I understand it, this means Chrome wants to revalidate the cached image before serving it. But why?
Some people online have said that the ETag header is the reason Chrome makes sure images are valid. But even if I do not include the ETag header and just have Cache-Control: public, max-age=10368000 in the first response, Cache-Control: max-age=0 still appears in the second request's headers.
I have also tried excluding public, making it private, etc. I also added an Expires and Last-Modified pair, with and without max-age=10368000, and got the same behaviour.
Furthermore, in dev tools I have NOT checked Disable cache. So the cache is enabled for sure which also makes sense because the image was served after returning 304.
This exact behaviour also happens when, instead of clicking the link bar and pressing Enter, I press the Refresh arrow, aka CMD + R. If I do a hard refresh, CMD + SHIFT + R, then the image is requested as if it were the first request, which makes sense.
Behaviour with Firefox
Firefox works exactly as expected.
Image gets requested for the first time. Below are the request headers
Image is returned successfully with 200 status. Below are the response headers
Fiddler tells that an image was requested and served
Furthermore, if I hover over response status, then it says OK
I click on the link bar in browser and press Enter to request the image again. Below are the request headers
Image is returned successfully with 200 status. Below are the response headers
BUT Firefox shows that it was cached.
FURTHERMORE, Fiddler detected no activity on the second request, so the image was served from Firefox's cache. Unlike Chrome, which made another request to the server just to receive 304 from it.
Thank you very much for your time and help. I highly appreciate it and am open to provide any other information. Really looking forward to what you think about this.
Have a nice day :)
Why did Chrome make an additional request for the image and serve it from the cache only after it received a 304 response?
Because you told it to - whether you realise it or not! When you browse forwards and backwards, the cached image is used. When you click refresh or F5 (Cmd + R on macOS) on a page, you are explicitly asking Chrome to double-check that the page is still the correct one. Clicking on the URL bar and pressing Enter is the same thing: Chrome assumes the only reason you'd do that on a URL you are already on is that you want it to check with the server.
I see that a Cache-Control: max-age=0 request header is set when the image is requested the second time. As I understand it, this means Chrome wants to revalidate the cached image before serving it. But why?
Correct. See the answer to this question. As to why - because, as per the above, you gave Chrome a signal that you wanted to refresh.
There is also the Cache-Control immutable directive, which is designed to tell the browser never to re-fetch the resource while it is in the cache - even on refresh.
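As a sketch, a response-header builder that adds the immutable directive might look like this (a hypothetical helper; note that immutable support varies by browser):

```python
def cache_header(max_age, immutable=False):
    """Hypothetical helper building a Cache-Control response value; with
    immutable=True the browser is told the body will never change while
    fresh, so it may skip conditional requests even on refresh."""
    value = "public, max-age=%d" % max_age
    if immutable:
        value += ", immutable"
    return value

print(cache_header(10368000, immutable=True))
# public, max-age=10368000, immutable
```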
Firefox works exactly as expected.
Interesting. I actually like the way Chrome does this, certainly for reload (F5/Cmd + R). I appreciate that it confused you (though hopefully it will be less confusing now that you understand it), but testing like this is not the norm, and most people don't click the URL bar and then hit Enter without changing the URL unless they want to recheck the current page with the server.

Chrome use ajax cached version when back is pressed

I use an Ajax request, alongside pushing a history state, to load new content and update the entire page. On the server, the X-Requested-With header is used to decide whether to send the full page or just the content. But it seems Chrome tends to use the cache no matter whether the resource was loaded with Ajax or a normal request (it doesn't respect request headers when checking the cache).
The problem happens when I open a page on the site, click a link to navigate to a new page using Ajax, then navigate to another page by entering the URL in the address bar. When I hit Back, the cached Ajax version (whether HTML or JSON) is shown instead of the full page. When the cache is disabled, everything works fine.
Is there any way to force Chrome to respect the request headers when checking the cache?
After some research I found out that browsers cache responses based on the request method and URL, so by default they won't consider any request headers when checking the cache. But it is possible to force the browser to take certain headers into account by using the Vary header.
So by adding the header Vary: X-Requested-With to each response that changes based on the X-Requested-With request header, the server tells the browser that this response may vary when the X-Requested-With header changes, and that it must request a fresh response in that case.
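The effect of Vary on the cache key can be sketched like this (a simplified model of a browser cache, not Chrome's actual implementation):

```python
def cache_key(method, url, request_headers, vary):
    """Simplified model of a browser cache key: the request method and URL,
    plus the values of any request headers named in the response's Vary."""
    varied = tuple((h, request_headers.get(h, "")) for h in sorted(vary))
    return (method, url, varied)

ajax = {"X-Requested-With": "XMLHttpRequest"}
plain = {}

# Without Vary, both requests map to the same cache entry:
assert cache_key("GET", "/page", ajax, []) == cache_key("GET", "/page", plain, [])

# With Vary: X-Requested-With, the Ajax and full-page responses get separate entries:
assert cache_key("GET", "/page", ajax, ["X-Requested-With"]) != \
       cache_key("GET", "/page", plain, ["X-Requested-With"])
```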

HTTP GET request caching vs asp.net Output caching

I have a very basic question about caching. I sent a request to an .aspx page from the browser. Since it is an HTTP GET request, it is cached by default; I can see that in the browser's about:cache. But if that page is cached, how is my modification (say, to the CSS or JS of that particular .aspx page) reflected on the next request? Does that mean it is not being taken from the cache?
At the same time, the cache expiry shows something like "1970-01-01 05:30:00" in about:cache for that .aspx request. All the other static resources (external JS) show a future expiry date.
Does a "past expiry" date simply imply that the item should not be "fetched again" from the cache?
If output caching is enabled, I know the new modification will not be seen as long as the cache has not expired. But then how do ASP.NET output caching and this default HTTP GET caching mechanism differ? I know output caching can cache on the server or a proxy, so it can serve multiple users. But how do they differ at the browser level?
Btw I got the answer :)
http://devproconnections.com/aspnet/aspnet-pages-and-browser-caching
Thanks again.

Client-Side caching on IIS7 doesn't seem to work

I have set content caching on a specific folder by following the local web.config method. I don't think it works, and I would like to fix this.
I activate the cache using the IIS / HTTP Headers / Common headers feature. I set them to 1 day of expiration.
I opened a page with Google Chrome in private navigation, and then open the Network tab in the console.
The first time I load the page, everything loads from the site, obviously.
If I refresh the page, I see 2 types of loading in the Network console:
the files from Google and Facebook and such have a status of 200, and a size of (from cache).
the files from the folder for which I set the caching have a status of 304 and their size is displayed.
So, I guess the caching setting doesn't work? Or does the 304 response mean that it's loaded from the cache? If it isn't, how can I make it work?
Thanks !
A 304 is a form of caching. The server says that the content of the resource hasn't changed, so it doesn't have to send the response content (the response body is empty).
This is quite convenient. The browser still makes the request for the resource (so that when it changes, the new content is delivered to the client), but the server saves bandwidth by sending a 304 with an empty body instead of a 200 with the full body.
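That revalidation round trip can be sketched like this (a toy client and server, not real network code; all the names here are hypothetical):

```python
def revalidate(cached_body, cached_etag, fetch):
    """Sketch of the round trip described above: the client sends its stored
    ETag as If-None-Match; on 304 it reuses the cached body (the 304 body is
    empty), on 200 it stores the new body. `fetch` is a hypothetical callable
    standing in for the network."""
    status, etag, body = fetch(if_none_match=cached_etag)
    if status == 304:
        return cached_body   # server sent no body; cached copy is still valid
    return body              # resource changed; replace the cached copy

# Toy server: answers 304 whenever the client's ETag still matches.
def server(if_none_match=None):
    current_etag, content = "abc", b"body { color: red }"
    if if_none_match == current_etag:
        return 304, current_etag, b""
    return 200, current_etag, content

print(revalidate(b"body { color: red }", "abc", server))  # served from cache
```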

Confirming HTTP caching with Fiddler

How can I use Fiddler to confirm that HTTP caching is working? Is there another better way?
You can confirm caching by having a page fetch a resource and noting that no request for the resource appears in Fiddler. I can't think of a better way to do it. Works for me.
Right-click the URL in Fiddler and click Properties; you can check the cache info in that popup under "WININET CACHE INFO".
Browse the site through Fiddler as a proxy. In each response's details there is a "Caching" tab. This shows useful info about the response headers, e.g. what the different Cache-Control and Expires values mean.
I think the best way is to use the method demonstrated within most caching tutorials - Have a label on the page that displays the current server time. If the value is cached, you will not see it update with subsequent page refreshes until the cache is regenerated.
If your requirement is more complex (you need to use Fiddler), Anthony's suggestion is the one I have used successfully in the past.
Fiddler will definitely help with this. You'll either see the server respond with an HTTP 304 response (Not Modified - which tells the client that the cached item is still valid) or, for content that has its web expiry set correctly, you won't see a request at all.
In fact, you'll find that Firefox plus Firebug will do this for you too.
