Sporadic page load failure - Firefox

An issue has started recently in Chrome and reportedly Firefox: pages load fine and browsing works as normal, then suddenly a page fails to load (it keeps spinning as if still loading). The page that fails is often not the same one.
If I refresh the page or try to go to another page on the domain within the same browser, the browser doesn't even try to resolve the name or make a connection, and the page never loads.
Swapping to another browser, I am back to browsing the domain normally, while the original browser (in most cases Chrome) will not load the pages until it is restarted.
This has happened with 3 different people on 3 different machines in both Chrome and Firefox.
The domain it is running on has a lot of AJAX calls on certain pages; I am not sure whether the server is tripping up because of the number of requests from the one client.
I am not sure if this is a server, client or script issue, as I cannot personally reproduce it, so I can do little to debug it or work out what is causing it or how to fix it.
As you can see I am not sure of a lot with this problem :) so I am throwing it out to Stack Overflow in the hope that someone has had similar experiences or can suggest directions I could look in.
Cheers,
Brendan

If the page is making many requests in a short time, your firewall (router) may block it. I've noticed this behavior on my own router, and had to set it to a less restrictive level to make things work.
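If that turns out to be the case, one client-side mitigation is to cap how many AJAX requests are in flight at once. A rough sketch, assuming the pages already use jQuery (1.5+ for the deferred methods); the MAX_PARALLEL value is just a guess to tune:

    // Funnel calls through a small queue so only a few requests run at a time,
    // which keeps a rate-limiting router/firewall from tripping.
    var pending = [];
    var inFlight = 0;
    var MAX_PARALLEL = 4; // assumption: adjust for your environment

    function queuedGet(url, done) {
      pending.push({ url: url, done: done });
      drain();
    }

    function drain() {
      while (inFlight < MAX_PARALLEL && pending.length > 0) {
        var job = pending.shift();
        inFlight++;
        $.get(job.url)
          .done(job.done)
          .always(function () { inFlight--; drain(); });
      }
    }

Calling queuedGet instead of $.get on the chatty pages would smooth out the bursts without any server changes.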

Is there a recommended strategy for migrating from appcache to ServiceWorker in Firefox

I am working on a trip planning application that used to implement offline support with a combination of appcache and localStorage. As soon as Service Worker became a viable option we started using it. The transition went without a hitch in Chrome (Chromium, Opera etc.), but Firefox (both 44 and 45) is posing some issues: Firefox registers the new service worker but still loads pages in its scope from the outdated appcache.
In other words, if you are lucky enough never to have stumbled upon our website before, and you open it for the first time in FF 44/45, you get a shiny new service worker that takes care of all your offline needs. Life is great.
However, if you had the misfortune of using Firefox before Service Worker was enabled, you'd still have our website's older version in your appcache:
you go to the welcome page - the service worker gets activated and it (supposedly) takes care of handling everything for the entire scope
you log in, which redirects you to one of the pages in SW scope (/ui) - this should still be handled by the service worker, but instead Firefox suddenly remembers that it has that old appcache and, without even trying to load anything from the network, it loads the old content from the appcache
I would be OK with that (although my reading of the Mozilla docs tells me that appcache should be ignored in a scope controlled by a service worker) if it happened only once. Sadly, Firefox does not even try to GET the manifest to check whether that old appcache is up to date. If it did attempt to GET the manifest, it would receive a 404, which would invalidate the appcache (as it did in Chrome). I do not see anything like that on the wire (or on the server side).
To add insult to injury, the Firefox console proudly announces: "The Application Cache API (AppCache) is deprecated and will be removed at a future date. Please consider using ServiceWorker for offline support." :-)
A simple refresh (F5) does load the current version of the page through the service worker. Sadly it only works once: after closing and reopening the tab the whole dance replays itself - the service worker takes care of all the pages in the scope, with the exception of the ones that used to have an appcache manifest declaration.
Clearing the appcache (appcache clear in the developer console, or through the Settings UI) does remedy the situation, but I cannot possibly suggest it to all our Firefox users.
I tried to find something in Firefox's Bugzilla without much luck. If someone can find a relevant issue that would be great.
For now we just had to disable SW support for Firefox.
Is there any way of signaling to Firefox that it should ignore the old appcache when in Service Worker scope?
Why can't you just modify your appcache .manifest file? Provide an empty file, a garbage link, or some dummy page, just enough to get the new .manifest accepted, and the old one should be history.
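On top of that, the page itself can nudge Firefox into re-checking the (now dummy or missing) manifest through the standard window.applicationCache API. This is only a sketch of an experiment, not a confirmed fix - whether Firefox honours it in this half-migrated state is exactly what is in question:

    // Ask the browser to re-validate the old manifest and adopt the result.
    if (window.applicationCache) {
      var cache = window.applicationCache;
      cache.addEventListener('updateready', function () {
        cache.swapCache();        // switch to the new (dummy/empty) cache
        window.location.reload(); // reload so the service worker takes over
      });
      cache.addEventListener('obsolete', function () {
        window.location.reload(); // manifest now 404s, appcache was dropped
      });
      try {
        cache.update();           // throws if no cache is attached to this page
      } catch (e) {}
    }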

MVC 3 with IE, poor bundle performance

We have deployed an MVC 3 website on an IIS6 box.
Everything runs fine, but the performance is abysmal.
Can anyone help me understand:
why I am getting 20-second response times when fetching a script bundle?
why bundled scripts are not cached by IE even though the Expires header is set?
The site is several times faster in Chrome (where I have noticed the cache behaviour is correct), but we cannot force customers to use it.
Any help would be great. I'm wondering whether a server-side setting is forcing the bundle to be recompiled on every request, or whether it's just IE acting as usual.
Edit: as requested in the comments, I'm also including the bundle request headers:
If you see different download times for a full reload between the two browsers, it could be that you are doing intense computation with a client-side framework like AngularJS (I have seen big performance differences between the two browsers for highly complex AngularJS apps).
If both browsers show the same download time, it is either a network issue or a server issue.
The IE caching could be a separate issue; break your problem into two parts and look for the cause of the slow downloads first.
All I can do now is suggest an approach to finding the issue.
Summary of what you know
It looks like you have:
The server sends an Expires header set one year in the future
When you reload the page (i.e. you don't force a full refresh using Ctrl+F5), IE takes no notice of the cache header, and the new request it sends doesn't include If-Modified-Since or If-None-Match
Chrome behaves differently and respects the Expires and/or ETag response headers (it doesn't even request the bundle again)
EDIT 1: You also seem to be saying (though it would be good to see a timeline from Chrome) that Chrome downloads the files faster, implying it is not a server-side problem. Your latest comment states that Chrome's downloads are also slow. (end edit)
And you also seem to be saying that this behaviour is consistent (i.e. 100 requests in IE and 100 requests in Chrome all show the above behaviour, with no deviations).
Approach
You should break this problem into two parts:
Why is the download so slow?
Is there a server-side performance problem? Look for common download times in IE and Chrome, and Firefox (it could be due to bundling/minification/compression on the server).
Is there a network connectivity issue (dropped packets, for instance)? Look for inconsistent download times, Start times, Request times, between requests in a given browser and the same behaviour across all browsers.
Is a script slowing down IE, but not Chrome (this is not uncommon, I maintain legacy sites where the scripts don't run well in IE but do in Chrome) - look at different profile results between browsers.
Why is the javascript not being cached in IE? Troubleshoot (1) first, then worry about this.
It is possible that the two are related, but approaching them separately will be a start. Number 1 is far easier to diagnose than number 2; most of what the web has to say about caching JavaScript in IE is about preventing it, in order to help with development.
Root cause diagnosis
EDIT 1 The first thing to do is try the site from a browser on the server, or very close to the server to see if you have a network issue. (end edit)
Tools like Fiddler, the browser developer tools, timeline and script profiler, and YSlow are your friend. Compare each of the following between Chrome and IE (and see what happens in Firefox as well) and spot the difference. Note: you may need to clear the browser cache between tests.
browser developer tools -> script profile: see if you have a slow running script in IE compared to Chrome
similar analysis in a tool like YSlow (look for comparisons between the two browsers, not script improvements)
request and response headers, and timeline from a normal (i.e. not full reload) page load
request and response headers, and timeline from a full page reload (Ctrl+F5)
Start and Request durations for every js file for a given browser, and between browsers (this may point to network issues)? I note that the Start and Request alone are taking 0.6s and 1s each in IE - that is very very poor performance.
5 requests, and 5 full reloads with cache clearing between (that is, don't chase a ghost - be consistent in your test methodology)
Download times should be no different between Chrome and IE with no scripts actually running so also add a control test. Assuming that your bundle files don't "do anything" (i.e. they contain functions that the page calls rather than kicking off long processes by themselves) then create a blank page in your site which references exactly the same javascript files - not just the bundle, but every single js reference.
With the control test you can compare pure download times and caching behaviour in IE to Chrome, without any client side javascript running (use the developer tools profiler to verify no scripts are running). If your bundle files do kick off long running things, just temporarily disable those things by putting return statements at the top of the script and concentrate only on the download into the browser.
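One extra data point worth collecting during those comparisons: a snippet in the control page (or any page) that logs per-script wait and download times from the Resource Timing API. This assumes the browsers involved support it (IE10+ and Chrome do; older IE versions won't expose these entries):

    // Log wait and download time for every script the page loaded.
    window.addEventListener('load', function () {
      var entries = window.performance.getEntriesByType('resource');
      for (var i = 0; i < entries.length; i++) {
        var e = entries[i];
        if (e.initiatorType === 'script') {
          console.log(e.name,
            'wait:', Math.round(e.requestStart - e.startTime) + 'ms',
            'download:', Math.round(e.responseEnd - e.responseStart) + 'ms');
        }
      }
    });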

Blank page on Azure

I have an application running in Azure (trial account). So far so good, everything has been nice, except for long deploy times (10-15 minutes).
I did a deploy recently and got a lot of weird bugs I cannot trace. For example, if I log in and thus a cookie is created (I use FormsAuthentication), all I get from the application is a blank page - absolutely nothing is sent to the browser. The application works fine in the ASP.NET Web Dev Server, IIS Express, even the Azure emulator!
What could be the issue? Searching the web hasn't been much help, with only a couple of unrelated issues.
I tried logging into the site (if I understood one of the comments correctly, the URL is versulo.com) and I didn't get any blank page with a 404 status code.
However, there is another problem I spotted: your site seems to be implementing caching inappropriately. The main page, the one from which you trigger the login and which is dynamic in nature, is served with an Expires header set 5 minutes after the page's first load. That means that each call or redirect to that page within 5 minutes of it first being loaded will be served from the browser's cache.
Because of that, after I log into your application I am redirected back to the home page, which looks as if I am not logged in. If I force an F5 refresh in the browser, the page does indeed show me as logged in.
If instead of refreshing I try to log in again (which is what I did in my first trials, since it looked like the login hadn't worked the first time), I get an error page with the following message:
Sorry, there has been an error on the server.
500
The page looks like an application error page and, even though it displays the number 500, it is actually served with an HTTP 200.
So, while I am not 100% sure this is also the cause of the problem you described, you should remove the Expires headers from the dynamic pages your application serves.
This can happen when you combine Forms Authentication with multiple instances. Are you using multiple instances? If so, could you:
Try to change it to 1 instance. Does this fix the issue?
Try to make the following change to the web.config (configure machineKey): http://msdn.microsoft.com/en-us/library/ff649308.aspx
some partial views are not rendered at all;
Do you mean some pages are working fine, but others are not? It would be better if you could point out a pattern of what's working and what's not. For now, please make sure all referenced assemblies (except the default .NET assemblies and the Windows Azure runtime) have Copy Local set to true. For example, the MVC assemblies are considered extensions to .NET, so please set Copy Local to true for them. In addition, you can also use Fiddler to monitor the requests and see what's returned from the server.
Best Regards,
Ming Xu.
Could you provide a link to the application, or perhaps some source code?
When you say 'blank page', what is actually returned, a 404 / 500?
Have you inspected the IIS logs, or added some trace information to your code?
Have you tried accessing the service using its IP address rather than the domain name?

Chrome caching like a mad browser

I've got a web service that, like most others, uses js and css files. I use the old trick of appending a version number to the js and css files, like ?v=123, which gets changed every time we update the service in production.
Now, this works fine in all browsers except Chrome. Chrome seems to prefer its cached version over getting the new one and therefore seems to ignore the appended variable. In some cases, forcing it to refresh the cache (Cmd+R / Ctrl+F5) wasn't enough, so I had to go into the options and clear the cache for it to load the new content.
Has anyone experienced this issue with Chrome? And if so, what was the resolution to the problem?
Chrome should certainly treat requests with varying query strings as different requests; a cached result for style.css?v=123 should never be used for style.css?v=124. If you're seeing different behavior, please file a bug at http://new.crbug.com/ and post the bug ID here.
That said, I'd first check to see whether the page was cached longer than you expected. If a new version of the page itself wasn't downloaded, then it would still be requesting ?v=123 as the HTML wouldn't have changed. If you're sending long-lived cache headers with the page, it's certainly possible that Chrome is caching it more aggressively than you expected. If that's the behavior you're seeing, please star http://crbug.com/8742 for updates.
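A quick way to test the cached-page theory on an affected machine is to check which version the live DOM is actually referencing, straight from the console:

    // List the script URLs the currently rendered page points at.
    Array.prototype.forEach.call(document.scripts, function (s) {
      if (s.src) { console.log(s.src); }
    });
    // If these still show ?v=123 after a deploy that bumped the token,
    // the HTML page itself is being cached, not the script responses.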
I have also had the same experience.
You can use Ctrl + Shift + R for a cache-free reload in both Chrome and Firefox.
I have had this experience as well.
I run a membership site which displays content such as "You must be logged in as a Gold member in order to see this content" if the visitor is not logged in or is trying to view content not allowed by their membership level. But even when the user is logged in, they would still see "You need to log in", due to Google Chrome's aggressive caching. In Firefox, however, it works fine as I test logging in and out of all 5 levels of membership - each displaying the proper content.
While Chrome's caching problem can be solved by clearing the cache every time the user logs in and out, it would be really annoying to take that approach.
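As a stopgap while the cache headers get sorted out on the server, the logged-in-only fragment could be fetched with caching explicitly disabled. A rough sketch, assuming jQuery is available; the endpoint and container names are made up for illustration:

    // cache: false makes jQuery append a throwaway _=timestamp parameter,
    // so the browser cannot reuse a stale cached copy of the fragment.
    $.ajax({
      url: '/membership/content-fragment', // hypothetical endpoint
      cache: false
    }).done(function (html) {
      $('#gated-area').html(html);         // hypothetical container
    });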

IE6 cannot download javascript files over HTTPS

I'm totally stumped here, so any ideas would be appreciated.
I have a RichFaces application, which recently became non-functional when used from IE6. The problem began when I included the following line in my main template:
<a4j:loadScript src="resource://jquery.js"/>
This results in the following generated HTML:
<script src="/AgriShare/a4j/g/3_3_3.Finaljquery.js.jsf" type="text/javascript"></script>
By "non-functional" I mean that pages no longer load, b/c the first page appears to hang the browser for a long time, and then all references to jQuery say that the object was not defined. Eventually this appears to put IE6 in a state where further clicks do nothing.
After a lot of trial and error I have established the following:
The app still works in Chrome, Firefox and IE8
The app still works in IE6 if I switch to HTTP. So the problem appears to be related to HTTPS, which I can't do away with.
I further narrowed down the problem by manually requesting 3_3_3.Finaljquery.js.jsf in the IE6 address bar. It asks me if I want to save the file (so it can see the file is there), but when I say 'Save' it hangs for about 5 seconds and then says:
Internet Explorer cannot download 3_3_3.Finaljquery.js.jsf from [host_name].
The connection with the server was reset.
Doing the same download over HTTP succeeds.
Gradually reducing the size of the file, I noticed that the download eventually succeeds over HTTPS once I get the file size below roughly 110 KB. There is no specific size at which it starts working, though; I tried the same trick with prototype.js and it worked at a different size.
I can't trace the SSL session, because I cannot get access to the certificate's private key, so now I have absolutely no clue what to try next.
Any ideas would be greatly appreciated.
Try using Fiddler for debugging. It can handle SSL.
You might also want to consider hosting the server yourself and taking a look at the server log.
The problem was solved by turning off compression of javascript files in Web Cache.
Sounds like the problem might be related to this: http://support.microsoft.com/default.aspx?scid=kb;en-us;327286
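While the real cause is being tracked down, a defensive check placed in an inline script right after the generated jQuery include can at least make the failure visible instead of leaving the page half-initialised. A sketch only - the fallback path is made up, and this works around the symptom rather than the reset connection:

    // If the HTTPS download was reset, window.jQuery never gets defined.
    if (typeof window.jQuery === 'undefined') {
      // Retry from a fallback location (the path here is illustrative only).
      document.write('<scr' + 'ipt src="/AgriShare/fallback/jquery.js"><\/scr' + 'ipt>');
    }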
