Has anybody tried?
Here is the use case. In a first request-response cycle, this would happen:
Request 1:
GET / HTTP/1.1
...
Response 1
HTTP/1.1 200 OK
ETag: version1
Cache-Control: max-age=1
... angly html here
....<link href="mycss.css" >
...
Request 2:
GET /mycss.css HTTP/1.1
...
Response 2 (probably pushed):
ETag: version1
Cache-Control: max-age=<duration-of-the-universe>
...
... brackety css ...
...
and then, when the browser goes to the same page a second time, it will of course fetch the "/" resource again because of the very short max-age:
GET / HTTP/1.1
...
If-None-Match: version1
But it won't fetch mycss.css if it has it in cache. However, the server can use the validator present in the If-None-Match header of the request for "/" to get an idea of the client's cache age, and may conclude that the browser's version of mycss.css is too old. In that case, before even answering the previous request, the server can "promise" (push) a new version of mycss.css.
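Concretely, the exchange I am hoping for on the second visit would look something like this (an informal sketch in the same style as above; in real HTTP/2 the push starts with a PUSH_PROMISE frame, not a plain response):
Request 3:
GET / HTTP/1.1
If-None-Match: version1
...
PUSH_PROMISE for /mycss.css, then the pushed response:
ETag: version2
Cache-Control: max-age=<duration-of-the-universe>
...
... new brackety css ...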
By the specs, should the browser accept and use it?
Overview:
I still don't know what the answer to my question is from a purely theoretical side, but at least today it doesn't seem possible in practice to do cache-busting this way :-(, with either Google Chrome or Firefox. Both reject or ignore the pushed stream if they believe that the resource they have in cache is fresh.
I also got this from somebody who prefers to remain anonymous:
Browsers will typically put resources received through push in a
"demilitarized zone" and only once the client asks for that resource
it will be moved into the actual cache. So just pushing random
things will not make them end up in the browser cache even if the
browser accepts them at the push moment.
Update
As of early 2016, it is still not possible, mainly due to a lack of consensus on how this should be handled, and on whether it should be allowed at all.
As this page shows, even with HTTP/2, the way to solve the stale assets issue is to create a unique URL for each asset version, and then ensure that the user receives that new URL when they re-visit the page.
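For example (hypothetical file names), instead of re-serving mycss.css under the same URL, each asset revision gets its own URL, and the page references the new one:
<link href="mycss.v1.css">   <!-- first deployment -->
<link href="mycss.v2.css">   <!-- after the stylesheet changes -->
Each versioned URL can then be served with a practically infinite max-age, since its content never changes; updating the short-lived HTML page is what brings in the new asset.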
Related
I followed this tutorial (https://learn.microsoft.com/en-us/aspnet/core/performance/caching/response?view=aspnetcore-2.1) to implement ResponseCache on my controller-action.
In short, I added
services.AddResponseCaching(); and app.UseResponseCaching(); in Startup, and the attribute [ResponseCache(Duration = 30)] on my controller.
Then I added an <h2>@DateTime.Now</h2> in my view, and what I expected was to see the same DateTime.Now for 30 seconds.
But it doesn't: it just shows the new time on every reload (F5).
I made sure the DevTools in Chrome do not have 'Disable cache' checked.
It happens both with and without the Chrome DevTools open, on my local machine, and also on a brand-new .NET Core MVC project.
One thing I noticed (with DevTools open) is that the request has this header: Cache-Control: max-age=0. Does this influence the behaviour?
I thought it might mean something, because it looks like the request is saying 'no cache', but that strikes me as weird: I didn't add the header myself, and I wouldn't expect Chrome's default behaviour to be to ignore caches.
A response header of Cache-Control: max-age=0 effectively disables all caching: resources are expired as soon as they come off the wire, so they are always fetched again. That header originates from the server. Note, though, that browsers also send Cache-Control: max-age=0 as a request header on a normal reload (F5), asking caches along the way to revalidate, and that is most likely what you are seeing in DevTools.
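To keep the two directions apart, a sketch:
Sent by Chrome on F5 (request):
GET / HTTP/1.1
Cache-Control: max-age=0
Sent by a server that forbids caching (response):
HTTP/1.1 200 OK
Cache-Control: max-age=0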
Assuming you haven't accidentally disabled response caching manually in some way, the most likely situation is that you're doing something that prevents the Response Caching Middleware from ever caching. The documentation lists the following conditions that must all be satisfied before responses will be cached, regardless of what you do:
The request must result in a server response with a 200 (OK) status code.
The request method must be GET or HEAD.
Terminal middleware, such as Static File Middleware, must not process the response prior to the Response Caching Middleware.
The Authorization header must not be present.
Cache-Control header parameters must be valid, and the response must be marked public and not marked private.
The Pragma: no-cache header must not be present if the Cache-Control header isn't present, as the Cache-Control header overrides the Pragma header when present.
The Set-Cookie header must not be present.
Vary header parameters must be valid and not equal to *.
The Content-Length header value (if set) must match the size of the response body.
The IHttpSendFileFeature isn't used.
The response must not be stale as specified by the Expires header and the max-age and s-maxage cache directives.
Response buffering must be successful, and the size of the response must be smaller than the configured or default SizeLimit.
The response must be cacheable according to the RFC 7234 specifications. For example, the no-store directive must not exist in request or response header fields. See Section 3: Storing Responses in Caches of RFC 7234 for details.
However, in such situations the server should be sending Cache-Control: no-cache, not max-age=0. As a result, I'm leaning towards some misconfiguration somewhere: you have set this max-age value and either forgot about it or overlooked it.
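For reference, a minimal setup that does produce a cached response looks something like this (a sketch in ASP.NET Core 2.1 style, matching the linked tutorial; HomeController and the 30-second duration are the values from the question):
// Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    services.AddResponseCaching();
    services.AddMvc();
}

public void Configure(IApplicationBuilder app)
{
    app.UseResponseCaching(); // must run before MVC so it can store and serve responses
    app.UseMvc();
}

// HomeController.cs
[ResponseCache(Duration = 30, Location = ResponseCacheLocation.Any)]
public IActionResult Index() => View(); // emits Cache-Control: public,max-age=30
With this in place the middleware serves the stored response (same DateTime.Now) for 30 seconds, but only for requests that satisfy the conditions listed above.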
This works for me in a .NET Core 3.1 app to stop F5/Ctrl+F5 or the Developer Tools in Firefox or Chrome from bypassing the server cache for a full response.
In Startup, add this little middleware before UseResponseCaching():
// Middleware that fixes server-side caching on F5/reload: browsers send
// Cache-Control: max-age=0 (and sometimes Pragma: no-cache) on refresh,
// which makes the Response Caching Middleware skip its cache. Stripping
// those request headers lets the cached response be served anyway.
app.Use(async (context, next) =>
{
    context.Request.Headers.Remove("Cache-Control");
    context.Request.Headers.Remove("Pragma");
    await next();
});
app.UseResponseCaching();
Haven't noticed any problems...
I want to improve my experience of the internet by caching static content like images (jpg, png, gif) and fonts, because it always happens that when I'm viewing a webpage with a lot of images and refresh with F5, the same content is downloaded again.
I know it's because the response headers may contain no-cache or max-age=0, and it even happens sometimes when there is no no-cache or max-age in the response at all.
But for images or fonts that never change, getting max-age=0 is useless. So I wanted to know if there is a way to override the response headers and set them to a max-age of one year. Maybe with a Chrome extension?
Yes, you can do this using a Chrome extension. See the Change HTTP Headers extension, which already does something like it.
For your specific case, you just need to do this:
Add an event listener which should be called whenever headers are received.
Read the details of the headers
Check if response content type is image
Add/Update the desired header to the headers
To accomplish this you can use the webRequest onHeadersReceived event.
Documentation of onHeadersReceived
onHeadersReceived (optionally synchronous): Fires each time that an HTTP(S) response header is received. Due to redirects and authentication requests this can happen multiple times per request. This event is intended to allow extensions to add, modify, and delete response headers, such as incoming Set-Cookie headers.
Your code will look something like this
chrome.webRequest.onHeadersReceived.addListener(function (details) {
    // The filter below already restricts this listener to image requests,
    // so just replace any existing Cache-Control header with a long max-age.
    var headers = details.responseHeaders.filter(function (h) {
        return h.name.toLowerCase() !== 'cache-control';
    });
    headers.push({ name: 'Cache-Control', value: 'max-age=31536000' }); // one year
    return { responseHeaders: headers };
},
{ urls: ['https://*/*'], types: ['image'] },
['blocking', 'responseHeaders']);
PS: I have not run and tested the code so please excuse the typos.
EDIT (after @RobW's comment)
No, this is not possible as of now (22 March 2014). Adding Cache-Control this way has no influence on the caching behaviour. Check out this answer for more details.
Although this is an old question, I stumbled upon it recently. Later I found the Chrome extension "Speed-Up Browsing", which seems to do exactly what the OP asked for.
For me it worked.
https://chrome.google.com/webstore/detail/speed-up-browsing/hkhnldpdljhiooeallkmlajnogjghdfb?hl=en
You could use Fiddler (https://www.telerik.com/fiddler, free) as a proxy, adding caching for selected URLs or patterns.
I'm developing an application which is supposed to serve different content for "normal" browser requests and for AJAX requests to the same URL (in fact, encapsulating the response HTML in a JSON object if the request is AJAX).
For this purpose, I'm detecting an AJAX request on the server side, and processing the response appropriately, see the pseudocode below:
def process_response(request, response):
    # For AJAX requests, wrap the HTML in a JSON envelope and disable caching
    if request.is_ajax:
        response.headers['Content-Type'] = 'application/json'
        response.headers['Cache-Control'] = 'no-cache'
        response.content = JSON(some_data)  # pseudocode: serialize the data to JSON
The problem is that strange things happen in Google Chrome when the first AJAX request to the currently viewed URL is made. If, right after the response arrives and is processed via JavaScript, the user clicks some (static) link leading to another page and then clicks the back button, he sees the returned JSON code instead of the rendered website (from the server logs I can say that no request is made). It seems to me that Chrome stores the latest response for the specific URL and doesn't take into account that it has a different content type, etc.
Is that a bug in Chrome, or am I misusing the HTTP protocol?
--- update 12 11 2012, 12:38 UTC
Following PatrikAkerstrand's answer, I've found the following Chrome bug: http://code.google.com/p/chromium/issues/detail?id=94369
Any ideas how to avoid this behaviour?
You should also include a Vary header, naming the request header you use to detect AJAX so that caches keep the two variants apart (for example X-Requested-With, which most JavaScript libraries send):
response.headers['Vary'] = 'X-Requested-With'
Vary is a standard way to control caching context in content negotiation. Unfortunately it also has buggy implementations in some browsers; see Browser cache vary broken.
I would suggest using unique URLs.
Depending on your framework's capabilities, you can redirect (302) the browser to URL + .html to force the response format and make the cache key unique within the browser session, while still keeping the suffix-less URL for AJAX requests. Alternatively, you may suffix the AJAX URL with .json instead.
Other options are: prefixing AJAX requests with /api, or adding a cache-busting query parameter such as ?rand=1234.
Setting Cache-Control to no-store did it in my case, while no-cache didn't. This may have unwanted side effects, though.
no-store: The response may not be stored in any cache. Although other directives may be set, this alone is the only directive you need in preventing cached responses on modern browsers.
Source: Mozilla Developer Network - HTTP Cache-Control
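In the pseudocode from the question, that is the one-line change:
response.headers['Cache-Control'] = 'no-store'  # never store this response in any cache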
Right now I'm using this:
location ~* \.(js|css)$ { # |png|jpg|jpeg|gif|ico
expires max;
#log_not_found off; # what's this for?
}
And this is what I see in Firebug:
Did it work? If I'm not mistaken, my browser is asking for the file again, nginx answers 'not modified', and so my browser uses the cache. But I thought the browser shouldn't even ask for the file, since it already knows it will never expire.
Any thoughts?
Do not use F5 to reload the page. Instead, click in the URL bar and press Enter, or follow a link. That's how I got only one request.
Clearly, your file is not stale, as its max-age and expiry date are still valid, and hence the browser will not contact the server. The browser doesn't ask for the file unless it is stale, i.e. its Cache-Control max-age is over or its Expires date has passed. In that case it asks the server whether its copy is still valid: if yes, the server answers 304 Not Modified and the browser serves the same copy; otherwise it downloads a new one.
Update:
See, here is the thing: F5/refresh will always make the browser ask the server whether anything has been modified; it will include If-Modified-Since in the request headers. That is different from just navigating the site, coming back to pages, and clicking links, where the browser will not ask the server and loads from cache silently (no server call). Also, if you are testing with Live HTTP Headers on Firefox, it will show you exactly what is requested, while Firebug will always show you If-Modified-Since. Safari's developer menu should show a load time of 0. Hope it helps.
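For clarity, the conditional exchange triggered by F5 looks roughly like this (a sketch; the file name and date are made up):
GET /style.css HTTP/1.1
If-Modified-Since: Mon, 01 Jan 2018 00:00:00 GMT

HTTP/1.1 304 Not Modified
A 304 response carries no body, so only headers cross the wire and the browser re-uses its cached copy.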
I want to turn off the cache used when a URL call to a server is made from VBScript running within an application on a Windows machine. What function/method/object do I use to do this?
When the call is made for the first time, my Linux-based Apache server returns a response from the CGI Perl script that it runs. However, subsequent runs of the script seem to reuse the same response as the first time, so the data is being cached somewhere. My server logs confirm that the server is not being called on those subsequent runs, only the first time.
This is what I am doing. I am using the following code from within a commercial application (I don't wish to mention which one; it's probably not relevant to my problem):
With CreateObject("MSXML2.XMLHTTP")
.open "GET", "http://myserver/cgi-bin/nsr/nsr.cgi?aparam=1", False
.send
nsrresponse = .responseText
End With
Is there a function/method on the above object to turn off caching, or should I set something on the request before making the call?
I looked here for a solution: http://msdn.microsoft.com/en-us/library/ms535874(VS.85).aspx - not quite helpful enough. And here: http://www.w3.org/TR/XMLHttpRequest/ - very unfriendly and hard to read.
I am also trying to force the cache not to be used, via HTTP header settings and HTML document head metadata:
Snippet of the server-side Perl CGI script that returns the response to the calling client, setting expiry to 0 seconds:
print $httpGetCGIRequest->header(
-type => 'text/html',
-expires => '+0s',
);
Cache-related settings in the HTML document sent back to the client:
<html><head><meta http-equiv="CACHE-CONTROL" content="NO-CACHE"></head>
<body>
response message generated from server
</body>
</html>
The above http header and html document head settings haven't worked, hence my question.
I don't think that the XMLHTTP object itself even implements caching.
You send a fresh request as soon as you call .send() on it. The whole point of caching is to avoid sending requests, but that does not happen here (as far as your code sample goes).
But if the object is used inside a browser of some sort, then the browser may implement caching. In this case the common approach is to include a cache-buster: a random URL parameter that you change every time you make a new request (for example, appending the current time to the URL).
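Applied to the code from the question, that would look something like this (a sketch; nocache is just an arbitrary parameter name the server ignores):
With CreateObject("MSXML2.XMLHTTP")
    ' Append the current time so every request has a unique URL
    ' and can never be satisfied from a cache
    .open "GET", "http://myserver/cgi-bin/nsr/nsr.cgi?aparam=1&nocache=" & Timer, False
    .send
    nsrresponse = .responseText
End With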
Alternatively, you can make your server send a Cache-Control: no-cache, no-store HTTP header and see if that helps.
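In the Perl CGI script from the question, that could look something like this (a sketch; CGI.pm turns extra named parameters into response headers, with underscores becoming dashes):
print $httpGetCGIRequest->header(
    -type          => 'text/html',
    -expires       => '+0s',
    -Cache_Control => 'no-cache, no-store',  # forbid serving and storing from caches
    -Pragma        => 'no-cache',            # for legacy HTTP/1.0 caches
);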
The <meta http-equiv="CACHE-CONTROL" content="NO-CACHE"> element is probably useless and you can drop it entirely.
You could use WinHTTP, which does not cache HTTP responses. You should still add the cache control directive (Cache-control: no-cache) using the SetRequestHeader method, because it instructs intermediate proxies and servers not to return a previously cached response.
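A sketch of the same request over WinHTTP (the URL is the one from the question):
With CreateObject("WinHttp.WinHttpRequest.5.1")
    .Open "GET", "http://myserver/cgi-bin/nsr/nsr.cgi?aparam=1", False
    ' Tell intermediate proxies and servers not to answer from their caches
    .SetRequestHeader "Cache-Control", "no-cache"
    .Send
    nsrresponse = .ResponseText
End With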
If you have control over the application targeted by the XMLHTTP Request (which is true in your case), you could let it send no-cache headers in the Response. This solved the issue in my case.
Response.AppendHeader("pragma", "no-cache");
Response.AppendHeader("Cache-Control", "no-cache, no-store");
As an alternative, you could also append a query string containing a random number to each requested URL.