We are seeing some odd errors when a customer tests our ASP.NET web apps. There is a cart counter at the top of every page that tells you how many items are in the shopping cart. She reports that this number changes as she moves from one page to the next. We cannot reproduce this.
Is it possible that her corporate proxy server is caching the whole page and never actually contacting our server? This is a staging site served over HTTP; her production site is on HTTPS.
Revision: The page gets cached over HTTPS as well. It shows a completely cached version of our shopping cart page. If the user clicks the refresh button they get a current version of the page, but that new version becomes the cached version.
It's certainly possible that an intermediate proxy (corporate or otherwise) is caching your pages, although I don't see how that would explain the cart number changing as she navigates. If you don't want any caching to take place, send the appropriate HTTP headers with each response you don't want cached:
Cache-Control: private, no-store, max-age=0
Expires: <some date in the past>
Pragma: no-cache
The first line above is for HTTP/1.1 clients, and the last two are for HTTP/1.0 clients. Check out section 14.9 of the HTTP/1.1 spec (RFC 2616) for all the gory details.
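For the ASP.NET apps in the question, here's a minimal sketch of emitting those headers from a code-behind (classic System.Web cache policy; note that HttpCacheability.NoCache produces no-cache rather than the private shown above, and ASP.NET fills in the matching HTTP/1.0 Pragma and Expires headers itself):

using System;
using System.Web;

public static class NoCacheHelper
{
    // Call from Page_Load (or an HttpModule) for any response that must not be cached.
    public static void Apply(HttpResponse response)
    {
        // Emits Cache-Control: no-cache plus the corresponding HTTP/1.0 headers.
        response.Cache.SetCacheability(HttpCacheability.NoCache);
        // Adds no-store: the response must not be written to any cache.
        response.Cache.SetNoStore();
    }
}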
There's also a setting in IE that could cause this behavior. Go to "Tools" > "Internet Options". On the "General" tab, click "Settings" under "Browsing History". Make sure "Check for newer versions of stored pages" is set to "Automatically". This is the default value.
I had a user who changed this to "Never" and was wondering why he always saw old content. :)
I'm using Cloudflare in front of a server hosting our CRM, but I'm facing a problem: Cloudflare is caching the whole page, so when I add new data the updates don't appear until I press Ctrl+F5; pressing F5 returns the old page from before the update.
I think Cloudflare is caching everything. What should I do?
Purge the cache in Cloudflare. (The original answer walked through the dashboard steps in screenshots, which are omitted here.)
Edit: You can also create a page rule so the specific page isn't cached at all; the example rule shown lets the browser cache it for 30 minutes.
In Cloudflare's page rule settings, bypass caching for all URLs, then add another rule that caches only static content (CSS, JS, images, etc.) and place that rule at the top of the order.
Also check the Cache-Control settings on your response headers; your problem looks like a cache-control header issue.
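If the backend happens to be ASP.NET, here's a minimal sketch of that header fix (the base-page class is illustrative): Cloudflare honors origin Cache-Control headers, so marking the dynamic CRM pages uncacheable keeps the edge from serving stale copies.

using System;
using System.Web;
using System.Web.UI;

public class CrmBasePage : Page
{
    protected override void OnLoad(EventArgs e)
    {
        base.OnLoad(e);
        // no-cache + no-store: both the edge and the browser must always
        // come back to the origin for these pages.
        Response.Cache.SetCacheability(HttpCacheability.NoCache);
        Response.Cache.SetNoStore();
    }
}

Static assets (CSS, JS, images) can keep a long max-age via the page rule described above.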
We have an HTTPS site that brings up a page from a different site of ours that's HTTP.
In IE (9), we get the message at the bottom of the page:
“Only secure content is displayed. What’s the risk? [Show all content]”.
When the button is clicked, it closes the lightbox-ish control that's open and returns to the page it was overlaid on.
Does anyone know how to avoid this?
In the HTTP site’s page, one guy here had the idea to add, at the end of On_Load, the following to turn off cross-site scripting protection:
this.Response.Headers.Add("X-XSS-Protection", "0");
Both sites are C# / ASP.NET 4.0.
Thanks in advance!
Add the URL to your trusted sites; it's the only way if you don't send all data through HTTPS.
Internet Options -> Security -> Trusted Sites -> Sites.
If this is something that needs to be company wide, I would recommend pushing out the rule via a group policy.
Alternatively, make the control accessible over HTTPS on the other site (if you can) and reference that - the warning will disappear.
The real setting to enable here is "Display mixed content" for the zone of the site you want. If the site is on your Intranet, select the Intranet zone in the Security settings, then Custom level; if it's an Internet site, select the Internet zone and go to Custom level.
There, you should see the "Display mixed content" setting, and simply select "Enable", then "OK" your way out of the dialogs.
Reference: https://www.mydigitallife.net/how-to-disable-only-secure-content-is-displayed-in-ie-always-show-all-mixed-content/
As I understand it, this is how browser caching works. Assume a far-future expiry has been set, say to a year, and foo.js is set to be cached. Here are some scenarios:
On the first visit to the page, the server returns a 200 and foo.js is cached for a year.
On the next visit, the browser checks the cache but has to ask the server whether foo.js has been modified. If not, the server returns a 304 - Not Modified.
The user is already on the page (foo.js is in cache) and clicks a link to another page; the browser serves the cached foo.js without a round trip to the server and reports a 200 (from cache).
The user is already on the page (foo.js is in cache) and for some reason hits F5/Reload; the browser has to make a round trip to the server to check whether foo.js has been modified. If not, the server returns a 304.
As you can see, whenever the page is refreshed, the browser always makes a trip to the server to check whether the file has been modified. I know this isn't much, and the server only returns header info, but round-trip time can be extremely important in some cases.
The question is: is there a way to avoid this, given that I'm already setting an expiration for the files? I just want the browser to always fetch from the cache until the expiry passes, or until I replace the file with something else (by versioning it).
From what I understand, pressing F5/Ctrl+R is a browser-specific action, so control rests with the browser.
And what if the user clears the cache before the next action? Even if the HTTP specification offered a way to force cache use on F5, there would be no guarantee you could achieve what you need.
Simply configure and code for as much caching as possible and leave the rest to the user.
It looks like when you navigate to a page (that is, entering an address in the URL bar or clicking a link), resources are fetched from the cache without any request to the server. But when you refresh the page, the browser sends a conditional request to revalidate, and so pays the round trip.
This is clearer in the Network tab of IE's Developer Tools: the Initiator column says navigate for the first case and refresh for Ctrl+R or F5.
You can override the F5 and Ctrl+R behavior by adding an event listener for those keys, doing a window.location = window.location, and preventing the default behavior with event.preventDefault() or something similar. This causes a page navigation instead of a refresh.
Also, I didn't test the case where the cached resource has actually changed on the server. If that turns out to be a problem, you can solve it by version-numbering the resources and generating HTML with URLs that point to the latest version (a sketch follows below; this is kind of like the cache-manifest approach of HTML5 offline applications).
EDIT: This doesn't solve the problem if the user clicks the browser's refresh button, however; the onbeforeunload event may help in that case.
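A minimal sketch of that versioning idea, assuming an ASP.NET site (the helper name is hypothetical): derive a version token from the file's last write time so the URL, and therefore the cache entry, changes whenever the file does.

using System;
using System.IO;
using System.Web;

public static class StaticUrl
{
    // Emits e.g. "/Scripts/foo.js?v=638412..." so a far-future expiry can
    // stay on while updated files still reach users immediately.
    public static string Versioned(string relativePath)
    {
        string physicalPath = HttpContext.Current.Server.MapPath(relativePath);
        long version = File.GetLastWriteTimeUtc(physicalPath).Ticks;
        return VirtualPathUtility.ToAbsolute(relativePath) + "?v=" + version;
    }
}

In a page: <script src="<%= StaticUrl.Versioned("~/Scripts/foo.js") %>"></script>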
I am working on a new website. While testing some of the functionality I had a number of debug statements and was watching the logs. It seems that Firefox (at least) loads the "next" page in the menu as well as the page I have clicked on. If I have menu items A B C D E and click on B then I see a request for mysite.com/B and then a request for mysite.com/C in the logs, and so on.
Is this some kind of look-ahead performance thing? Is there any way to avoid it (setting an attribute on the link, maybe)? The problem is that the second page in my menu is somewhat heavier, as it loads a lot of data from a web service. I'm happy for people to do that if they want to use the functionality, but would rather not have every visitor to the front page load it unnecessarily. Is this behaviour consistent across browsers?
Yes, Firefox will prefetch links to improve perceived performance for the user. You can read more about the functionality in Firefox here: https://developer.mozilla.org/en-US/docs/Link_prefetching_FAQ
It isn't possible for you to disable this in your visitors' browsers; however, the request should include the header X-moz: prefetch, which you can use to determine whether it is in fact a prefetch request, and potentially return a blank page for prefetch requests. You can then use Cache-Control: must-revalidate to make sure the page loads properly when actually requested by the user.
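A minimal sketch of that check, assuming ASP.NET on the server (Global.asax.cs):

using System;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        // Firefox marks prefetch requests with "X-moz: prefetch".
        if (string.Equals(Request.Headers["X-moz"], "prefetch",
                          StringComparison.OrdinalIgnoreCase))
        {
            // Force revalidation so the real visit isn't served this empty response.
            Response.AppendHeader("Cache-Control", "must-revalidate");
            Response.StatusCode = 204; // No Content: skip the expensive page entirely
            Response.End();            // stop the pipeline for this request
        }
    }
}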
If you happen to be using WordPress for your site, you can disable the tags with the prefetch information by using:
Wordpress 3.0+
//remove auto loading rel=next post link in header
remove_action('wp_head', 'adjacent_posts_rel_link_wp_head');
Older versions:
//remove auto loading rel=next post link in header
remove_action('wp_head', 'adjacent_posts_rel_link');
Yes, it's called prefetch, and it can be turned off in the client; see the FAQ:
https://developer.mozilla.org/en-US/docs/Link_prefetching_FAQ
I'm not aware of a way to turn it off via the server.
How can I use Fiddler to confirm that HTTP caching is working? Is there another better way?
You can confirm caching by having a page fetch a resource and noting that no request for the resource appears in Fiddler. I can't think of a better way to do it. Works for me.
Right-click the URL in Fiddler and click Properties; you can check the cache info in that popup under "WININET CACHE INFO".
Browse the site with Fiddler as the proxy. In each response's details there's a "Caching" tab. This shows useful info about the response headers - e.g., what the different Cache-Control and Expires values mean.
I think the best way is the method demonstrated in most caching tutorials: put a label on the page that displays the current server time. If the page is cached, you will not see that value update on subsequent refreshes until the cache expires and the page is regenerated (a sketch follows below).
If your requirement is more complex (you need to use Fiddler), Anthony's suggestion is the one I have used successfully in the past.
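A minimal sketch of that timestamp approach, assuming an ASP.NET Web Forms page (the markup is illustrative):

<%@ Page Language="C#" %>
<%@ OutputCache Duration="60" VaryByParam="none" %>
<!-- Output-cached for 60 seconds: the time below only changes when the
     cached copy expires and the page is regenerated on the server. -->
<html>
  <body>
    <p>Server time: <%= DateTime.Now.ToString("HH:mm:ss") %></p>
  </body>
</html>

Refresh repeatedly: if caching is working, the timestamp stays frozen for 60 seconds at a time.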
Fiddler will definitely help with this. You'll either see the server respond with an HTTP 304 (Not Modified - which tells the client that the cached item is still valid), or, for content that has its expiry set correctly, you won't see a request at all.
In fact, you'll find Firefox plus Firebug will do this for you too.