Force Chrome to cache static content like images

I want to improve my browsing experience by caching static content like images (jpg, png, gif) and fonts. It keeps happening that when I'm viewing a webpage with a lot of images and refresh with F5, the same content is downloaded all over again.
I know this is because the response headers may contain no-cache or max-age=0, and sometimes it even happens when there is no no-cache or max-age in the response at all.
But for images or fonts that never change, max-age=0 is pointless. So I wanted to know: is there a way to override the response headers and set them to a max-age of 1 year? Maybe with a Chrome extension?

Yes, you can do this using a Chrome extension; the Change HTTP Headers extension, for instance, already does something similar.
For your specific case, you just need to:
Add an event listener which is called whenever response headers are received.
Read the details of the headers.
Check whether the response content type is an image.
Add or update the desired header.
To accomplish this you can use the webRequest onHeadersReceived event.
Documentation of onHeadersReceived
onHeadersReceived (optionally synchronous): Fires each time that an HTTP(S) response header is received. Due to redirects and authentication requests this can happen multiple times per request. This event is intended to allow extensions to add, modify, and delete response headers, such as incoming Set-Cookie headers.
Your code will look something like this:
chrome.webRequest.onHeadersReceived.addListener(function (details) {
    var headers = details.responseHeaders;
    for (var i = 0; i < headers.length; i++) {
        // Override any existing Cache-Control header on image responses
        if (headers[i].name.toLowerCase() === 'cache-control') {
            headers[i].value = 'max-age=31536000'; // one year, in seconds
            return {responseHeaders: headers};
        }
    }
    // No Cache-Control header present, so add one
    headers.push({name: 'Cache-Control', value: 'max-age=31536000'});
    return {responseHeaders: headers};
},
{urls: ['https://*/*'], types: ['image']},
['blocking', 'responseHeaders']);
PS: I have not run and tested the code so please excuse the typos.
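For completeness, the listener above also needs the webRequest permissions declared in the extension's manifest. A minimal manifest v2 sketch (the extension name and script file name are illustrative):
{
    "name": "Cache images",
    "version": "1.0",
    "manifest_version": 2,
    "permissions": ["webRequest", "webRequestBlocking", "https://*/*"],
    "background": { "scripts": ["background.js"] }
}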
EDIT (after @RobW's comment)
No, this is not possible as of now (22 March 2014). Adding Cache-Control this way has no influence on Chrome's caching behavior. Check out this answer for more details.

Although this is an old question, I stumbled upon it recently. I then found the Chrome extension "Speed-Up Browsing", which seems to do exactly what the OP asked for.
It worked for me.
https://chrome.google.com/webstore/detail/speed-up-browsing/hkhnldpdljhiooeallkmlajnogjghdfb?hl=en

You could use Fiddler (https://www.telerik.com/fiddler, free) as a proxy that adds caching headers for selected URLs or patterns. I have done that before.

Related

ASP.NET MVC Page with ResponseCache on action DOES return new content instead of cache

I followed this tutorial (https://learn.microsoft.com/en-us/aspnet/core/performance/caching/response?view=aspnetcore-2.1) to implement ResponseCache on my controller action.
In short, I added
services.AddResponseCaching(); and app.UseResponseCaching(); in Startup, and the attribute [ResponseCache(Duration = 30)] on my controller.
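For reference, here is a minimal sketch of that setup, assuming standard ASP.NET Core 2.1 Startup conventions (names are illustrative):
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddResponseCaching(); // registers the caching middleware's services
        services.AddMvc();
    }

    public void Configure(IApplicationBuilder app)
    {
        app.UseResponseCaching(); // must come before MVC so it can intercept responses
        app.UseMvcWithDefaultRoute();
    }
}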
Then I added <h2>@DateTime.Now</h2> in my view, and what I expected was the same DateTime.Now for 30 seconds.
But it doesn't work: it just shows the new time on every reload (F5).
I made sure 'Disable cache' is not checked in the Chrome DevTools.
It happens both with and without the Chrome DevTools open, on my local machine, and now also in a brand-new .NET Core MVC project.
One thing I noticed (with DevTools open) is that the request has this header: Cache-Control: max-age=0. Does this influence the behaviour?
I figured it must mean something, because it looks like the request says 'no cache', but that strikes me as weird: I didn't put that header in, and I wouldn't expect Chrome's default behaviour to be to ignore caches.
A Cache-Control: max-age=0 response header effectively disables all caching: resources expire as soon as they come off the wire, so they are always fetched again. The max-age=0 you noticed, however, is on the request: browsers send Cache-Control: max-age=0 as a request header on a normal reload (F5), asking caches along the way to revalidate instead of serving stored content.
Assuming you haven't accidentally disabled response caching in some way, the likeliest explanation is that you're doing something the Response Caching Middleware will never cache. The documentation lists the following conditions that must all be satisfied before a response is cached, regardless of what you do:
The request must result in a server response with a 200 (OK) status code.
The request method must be GET or HEAD.
Terminal middleware, such as Static File Middleware, must not process the response prior to the Response Caching Middleware.
The Authorization header must not be present.
Cache-Control header parameters must be valid, and the response must be marked public and not marked private.
The Pragma: no-cache header must not be present if the Cache-Control header isn't present, as the Cache-Control header overrides the Pragma header when present.
The Set-Cookie header must not be present.
Vary header parameters must be valid and not equal to *.
The Content-Length header value (if set) must match the size of the response body.
The IHttpSendFileFeature isn't used.
The response must not be stale as specified by the Expires header and the max-age and s-maxage cache directives.
Response buffering must be successful, and the size of the response must be smaller than the configured or default SizeLimit.
The response must be cacheable according to the RFC 7234 specifications. For example, the no-store directive must not exist in request or response header fields. See Section 3: Storing Responses in Caches of RFC 7234 for details.
If the response itself carried max-age=0, that would point to a misconfiguration somewhere, where you set this max-age value and either forgot or overlooked it. Note, though, that the middleware also honors the Cache-Control: max-age=0 a browser sends on an F5 reload, so a plain refresh fetches a fresh response either way; see the next answer for a workaround.
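For comparison, a minimal action that should satisfy the list above (the controller and action names are made up for illustration):
using System;
using Microsoft.AspNetCore.Mvc;

public class DemoController : Controller
{
    // Duration = 30 produces "Cache-Control: public,max-age=30" on the response.
    // A plain 200 GET like this, with no cookies or Authorization header involved,
    // satisfies the middleware's conditions above.
    [ResponseCache(Duration = 30)]
    public IActionResult Time()
    {
        return Content($"Generated at {DateTime.Now:O}");
    }
}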
This works for me in a 3.1 app to keep F5/Ctrl+F5 or the developer tools in Firefox or Chrome from bypassing the server cache for a full response.
In Startup, add this little middleware before UseResponseCaching():
// Middleware that fixes server caching on F5/Reload
app.Use(async (context, next) =>
{
    const string cc = "Cache-Control";
    if (context.Request.Headers.ContainsKey(cc))
    {
        context.Request.Headers.Remove(cc);
    }
    const string pragma = "Pragma";
    if (context.Request.Headers.ContainsKey(pragma))
    {
        context.Request.Headers.Remove(pragma);
    }
    await next();
});
app.UseResponseCaching();
Haven't noticed any problems...

Disable cache in ExtLib REST control (which uses dojox.data.JsonRestStore)

In my XPage I have an xe:djxDataGrid (dojox.grid.DataGrid) which uses an xe:restService, which in turn seems to use dojox.data.JsonRestStore.
Everything works fine without a proxy, but my client accesses the application through a proxy because of corporate policy. After a user updates data in the DataGrid, it shows old values when accessed behind the proxy.
When the REST control/JsonRestStore sends an ajax GET request to fetch data, there is no Cache-Control header in the request, and Domino does not place an Expires header in the response. I believe that's why the old version of the GET response gets cached by the proxy.
We have tried disabling the cache in browsers, but that does not help, which indicates the proxy is caching the requests.
I believe this could be solved either by:
Setting a Cache-Control header on the request, OR
Setting an Expires header on the response
But I haven't found a way to set either of these. For the XPage itself Domino sets an Expires: -1 response header, but not for the ajax GET request, which is:
/mypage.xsp/?$$viewid=!ddrg6o7q1z!&$$axtarget=view:_id1:_id2:callback1:restService1
This returns the JSON data to JsonRestStore and gets cached by the proxy.
One option is to try to get an exception added to the proxy so that requests to this site bypass the proxy cache. But such exceptions are generally not easy to get through.
Any ideas? Thanks.
Update1
My colleague suggested that I could intercept the xhr GET requests made by dojox.data.JsonRestStore and add a time parameter to the URL to prevent caching. Here is my question about that:
Prevent cache in every Dojo xhr request on page
Update2
@SvenHasselbach has a great solution for preventing caching in all xhrs:
http://openntf.org/XSnippets.nsf/snippet.xsp?id=cache-prevention-for-dojo-xhr-requests
It seems to work perfectly: a &dojo.preventCache= parameter is added to the URLs, and the requests return correct JSON with this parameter too. But the DataGrid stops working when I use that code; every xhr causes an error.
Tried with Firefox and Chrome. The first page of data still loads, because the xhr interception is not yet in place at that point, but subsequent pages show only "..." in each cell.
The solution is Sven Hasselbach's code from the comment section of Julian Buss's blog, which needs to be slightly modified.
I changed xhrPost to xhrGet and did not place the code inside dojo.addOnLoad; when placed there, it did not take effect before the first XHR made by the DataGrid/Store.
I also removed the headers modification, because it overrides the existing headers. When the REST control requests data from the server with xhrGet, the URL is always the same and the requested rows are specified in an HTTP header like this:
Range: items=0-9
This (and other) headers disappear when the original code is used. To merely add headers we would have to take the existing headers from args and append to them. I saw no need for that, because adding the parameter to the URL should be enough. Here is the extremely simple code I'm using:
if (!dojo._xhrGet) {
    // Keep a reference to the original xhrGet so we can delegate to it
    dojo._xhrGet = dojo.xhrGet;
}
dojo.xhrGet = function (args) {
    // Let dojo append its cache-busting dojo.preventCache parameter to every GET
    args.preventCache = true;
    return dojo._xhrGet(args);
};
Now I'm getting all rows, and all XHR GET URLs have the &dojo.preventCache= parameter, which is exactly what I wanted. Next we'll test in the customer's environment to see whether this solves their problem.
Update3
As Julian points out in his blog, I could also use a Web Site Rule to set Expires or Cache-Control HTTP response headers.
Update4
The customer reports it's working now for them!

Serving content depending on HTTP Accept header - caching problems?

I'm developing an application which is supposed to serve different content for "normal" browser requests and for AJAX requests to the same URL.
(In fact, it encapsulates the response HTML in a JSON object if the request is AJAX.)
For this purpose I'm detecting AJAX requests on the server side and processing the response accordingly; see the pseudocode below:
function process_response(request, response)
{
    if request.is_ajax
    {
        response.headers['Content-Type'] = 'application/json';
        response.headers['Cache-Control'] = 'no-cache';
        response.content = JSON( some_data... )
    }
}
The problem: when the first AJAX request for the currently viewed URL is made, strange things happen in Google Chrome. If, right after the response arrives and is processed by the JavaScript, the user clicks a link (a static one, redirecting to another page) and then clicks the browser's Back button, he sees the returned JSON code instead of the rendered website (watching the server log, I can say that no new request is made). It seems that Chrome stores the latest response for the specific URL and doesn't take into account that it has a different content type, etc.
Is this a bug in Chrome, or am I misusing the HTTP protocol?
--- update 12 11 2012, 12:38 UTC
Following PatrikAkerstrand's answer, I've found the following Chrome bug: http://code.google.com/p/chromium/issues/detail?id=94369
Any ideas how to avoid this behaviour?
You should also include a Vary header:
response.headers['Vary'] = 'Content-Type'
Vary is a standard way to control caching context in content negotiation. Unfortunately it also has buggy implementations in some browsers; see Browser cache vary broken.
I would suggest using unique URLs.
Depending on your framework's capabilities, you can redirect (302) the browser to URL + .html to force the response format and make the cache key unique within the browser session. For AJAX requests you can then keep the suffix-less URL. Alternatively, you may suffix the AJAX URL with .json instead.
Other options are: prefixing AJAX requests with /api, or adding a cache-busting query parameter such as ?rand=1234, as sketched below.
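A tiny client-side sketch of the cache-busting variant (the URL and parameter name are made up for illustration):
// Illustrative only: give the AJAX request a unique URL so it can never
// collide with the HTML page's entry in the cache
var pageUrl = '/articles/42';                  // hypothetical page URL
var ajaxUrl = pageUrl + '?rand=' + Date.now(); // unique per request

var xhr = new XMLHttpRequest();
xhr.open('GET', ajaxUrl);
xhr.onload = function () {
    var data = JSON.parse(xhr.responseText);   // the JSON-wrapped HTML
    console.log(data);
};
xhr.send();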
Setting Cache-Control to no-store did it in my case, while no-cache didn't. This may have unwanted side effects, though.
no-store: The response may not be stored in any cache. Although other directives may be set, this alone is the only directive you need in preventing cached responses on modern browsers.
Source: Mozilla Developer Network - HTTP Cache-Control
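In the pseudocode from the question, that is simply:
response.headers['Cache-Control'] = 'no-store';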

How to set nginx cache headers to never expire?

Right now I'm using this:
location ~* \.(js|css)$ { # |png|jpg|jpeg|gif|ico
    expires max;
    #log_not_found off; # what's this for?
}
And this is what I see in Firebug (the response is a 304 Not Modified):
Did it work? If I understand correctly, my browser is asking for the file again and nginx is answering 'Not Modified', so my browser uses its cache. But I thought the browser shouldn't even ask for the file: it already knows it will never expire.
Any thoughts?
Do not use F5 to reload the page. Click in the URL bar and press Enter, or click a link. That's how I got only one request.
Clearly, your file is not stale: its max-age and Expires date are still valid, so the browser will not contact the server. The browser doesn't ask for the file unless it is stale, i.e. its Cache-Control max-age has run out or its Expires date has passed. In that case it asks the server whether its copy is still valid: if yes, the server answers 304 and the same copy is served; if not, it fetches a new one.
Update:
Here is the thing: F5/refresh always makes the browser ask the server whether anything has been modified, sending an If-Modified-Since request header. This is different from navigating the site, coming back to pages, and clicking links, where the browser does not ask the server and loads silently from the cache (no server call). Also, if you are testing on Firefox, Live HTTP Headers will show you exactly what is requested, and Firebug will always show If-Modified-Since. Safari's developer menu should show a load time of 0. Hope it helps.
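If you also want modern browsers to skip the revalidation on a plain reload, something along these lines may work. This is a sketch, not a tested config: the immutable Cache-Control extension and its browser support are an assumption to verify, and add_header here emits a second Cache-Control header next to the one expires produces:
location ~* \.(js|css|png|jpg|jpeg|gif|ico)$ {
    expires max;                                  # far-future Expires plus max-age
    add_header Cache-Control "public, immutable"; # immutable: don't revalidate on reload
}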

WP7 - Prevent RestSharp from caching

I use RestSharp in my Windows Phone 7.1 project.
My problem is that RestSharp always caches response data.
Example:
The first time I send a request, it returns data correctly. After some delete operations I send the same request again, but the response is identical to the first one; nothing has changed.
If I stop debugging and press F5 to start again, it works perfectly as expected.
I also tried request.AddParameter("cache-control", "no-cache", ParameterType.HttpHeader); with no luck.
How can I fix this problem?
I had the same issue, so I just added a header that tells the server not to cache response data.
Here client is my RestClient with the base URL; add a default Cache-Control header with the value no-cache:
client.AddDefaultHeader("Cache-Control", "no-cache");
I found the solution in Rico Suter's comment, thanks! I will mark this as the accepted answer.
It's a hack, but try something like url = originalUrl + "&nocache=" + DateTime.Now.Ticks
The "Cache-Control" header should do the trick!
I think HTTP Headers are case-insensitive, but the server may not agree with me there! You should try using Cache-Control instead of cache-control...
Also, I would also add the Pragma header with no-cache value to the request (some old servers don't use the "Cache-Control" header, but they will sure recognize this one)!
And I would try to use Fiddler to debug the comms and check that the headers are really being sent to the server as expected!
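Putting those two suggestions together, a small RestSharp sketch (the base URL is a placeholder):
// Send both headers on every request made by this client
var client = new RestClient("http://example.com/api"); // placeholder base URL
client.AddDefaultHeader("Cache-Control", "no-cache");   // for HTTP/1.1 caches
client.AddDefaultHeader("Pragma", "no-cache");          // for legacy HTTP/1.0 caches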
Another solution can be to set the "If-Modified-Since" header to the value of DateTime.Now:
client.AddDefaultParameter("If-Modified-Since", DateTime.Now, ParameterType.HttpHeader);
