Multiple Ajax requests for same URL - ajax

I make asynchronous calls to the same URL multiple times, but the responses come back sequentially. Please see the attached image: each request starts only after the previous one completes, and the same happens for all subsequent requests.
But if the URLs are different, the responses are not sequential. Can someone confirm: if one URL is already being requested, won't Firefox make another request for the same URL?

Necko's cache can only handle one writer per cache entry. So if you make multiple requests for the same URL, the first one will open the cache entry for writing and the later ones will block on the cache entry open until the first one finishes. There's work ongoing to rearchitect the cache to remove this restriction.
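One common client-side workaround, sketched here in TypeScript, is to keep identical requests from mapping to the same cache entry, either by bypassing the HTTP cache or by making each URL unique with a cache-busting parameter; the function names and the _ query parameter are illustrative only.

// Bypass the HTTP cache entirely, so no cache entry is opened for writing.
async function fetchUncached(url: string): Promise<Response> {
  return fetch(url, { cache: "no-store" });
}

// Make each URL unique so each request gets its own cache entry.
async function fetchCacheBusted(url: string): Promise<Response> {
  const separator = url.includes("?") ? "&" : "?";
  return fetch(`${url}${separator}_=${Date.now()}`);
}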

Generally, browsers limit how many concurrent requests they will make to a particular hostname. In the old days this was 2, but most browsers have since raised the limit. Defining a proxy can also affect these limits in some browsers. Again, these limits are enforced client-side, but they should allow more than one concurrent request in any case. It is also possible that Firebug is reducing the limit; check about:config for the concurrent-connection settings.
Are you sure you are not waiting until the callback has executed before launching the next request? Given that you only ever have one request in flight, I suspect that is what is really happening.
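To rule out that possibility, here is a minimal sketch (the URL is a placeholder) that issues all requests up front instead of waiting for a callback before starting the next one:

// Placeholder URL; every request is started immediately.
const url = "/api/data";

const requests = Array.from({ length: 3 }, () =>
  fetch(url).then((response) => response.json())
);

// Each response is handled as it arrives; Promise.all just collects the results.
Promise.all(requests).then((results) => console.log(results));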

Related

Problem with caching in CloudFront (CDN - AWS): cache and collapse forwarding are acting the same

Users send me requests for information. Sometimes this info is personalized, and sometimes it's common to all users. When it's common to all, I want the CDN to cache the answer. I distinguish between the users by query params.
The problem is when I want them to stop using the cache and each to get their personalized content.
I thought that if I sent the response with caching disabled (max-age=0), the users' requests wouldn't use the cache, the requests would come to me, and I would give each user their personalized answer.
But in that case the CDN does collapse forwarding, and all the users keep getting non-personalized answers.
I couldn't find a way to disable the collapse forwarding, nor a way to keep serving clients personalized content once they have started using the cache.
Any ideas?
I had the same issue and chatted with support (detailed here). It turns out that you can't stop CloudFront from returning non-personalized results just by using the Cache-Control header. Instead you have to define separate "Behaviors", namely a CachingDisabled behavior for the routes where you need personalized responses.
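As an illustration of that approach, the client can route personalized calls through a dedicated path prefix that the CachingDisabled behavior is configured to match; this TypeScript sketch assumes the hypothetical prefixes /api/personalized and /api/shared, which you would map to the corresponding CloudFront behaviors.

// Hypothetical prefixes: /api/shared/* is matched by a caching behavior,
// /api/personalized/* by a behavior with caching disabled.
async function fetchInfo(userId: string, personalized: boolean): Promise<unknown> {
  const prefix = personalized ? "/api/personalized" : "/api/shared";
  const response = await fetch(`${prefix}/info?user=${encodeURIComponent(userId)}`);
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  return response.json();
}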

CloudFront queues parallel requests - high and sequential time-to-first-byte (TTFB)

I have a web application that requests a lot of media assets in parallel using AJAX. All assets are coming from the same CloudFront origin, which is itself directly plugged into an S3 bucket.
I'm seeing requests to CloudFront with TTFBs on the order of seconds. Even more oddly, those requests seem to be queued until a previous request has been served:
Those two requests are initiated in parallel, and you can see that it's not Chrome queueing them, but CloudFront not answering anything to the second (2 KB) request until the first request has finished downloading. This is slowing down my application by a huge margin, and I cannot figure out what is going wrong... I see the same behavior when I check with Safari too.
Here are the details of the two requests.
As you can see, they are also both a Hit from CloudFront.
Finally, as it might be relevant, I'm using a Lambda function in my origin's behavior to add the proper Vary headers, to prevent Chrome from reusing cached responses without the CORS headers, which would make subsequent CORS requests fail (see details here).
Here are my complete origin behavior settings:
Any help is appreciated, and please feel free to ask more details if needed! Thanks a lot in advance.
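For reference, the kind of origin-response Lambda@Edge handler the question describes might look like the following sketch in TypeScript; the event shape is the standard CloudFront origin-response event, and varying on Origin is an assumption.

// Sketch of an origin-response handler that adds a Vary header so a cached
// copy without CORS headers is not reused for a request that needs them.
export const handler = async (event: any): Promise<any> => {
  const response = event.Records[0].cf.response;
  response.headers["vary"] = [{ key: "Vary", value: "Origin" }]; // assumed value
  return response;
};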

HTTP request not hitting controller

I currently have a problem where I send an asynchronous ajax request to a .NET controller in order to start a database search. This request makes it to the server which kicks off the search and immediately (less than a second) replies to the callback with a search ID, at which point I begin sending ajax requests every 10 seconds to check if the search has finished. This method works fine, and has been tested successfully with multiple users sending simultaneous requests.
If I send a second search request from the same user before the first search is finished, this call will not make it to the controller endpoint until after the first search has completed, which can take up to a minute. I can see the request leave chrome (or FF/IE) in the dev tools, and using Fiddler as a proxy I can see the request hit the machine that the application is running on, however it will not hit the breakpoint on the first line of the endpoint until after the first call returns.
At the point this call is blocking, there are typically up to 3 pending requests from the browser. Does IIS or the .NET architecture have some mechanism that is queuing my request? Or if not, what else would be between the request leaving the proxy and entering the controller? I'm at a bit of a loss for how to debug this.
I was able to find the issue. It turns out that despite my endpoint being defined asynchronously, ASP.NET controllers by default synchronize by session. So while my endpoints were able to be executed simultaneously across sessions, within the same session it would only allow one call at a time. I was able to fix the issue by setting the controller SessionState attribute to Read Only, allowing my calls to come through without blocking.
[SessionState(System.Web.SessionState.SessionStateBehavior.ReadOnly)]
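For context, the client-side start-then-poll pattern described in the question might look like the sketch below; the endpoint paths are hypothetical, and the 10-second interval is taken from the question.

// Hypothetical endpoints: /search/start kicks off a search and returns an ID,
// /search/status/<id> reports whether it has finished.
async function startAndPollSearch(query: string): Promise<unknown> {
  const startResponse = await fetch("/search/start", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query }),
  });
  const { searchId } = await startResponse.json();

  // Poll every 10 seconds until the server reports the search is done.
  return new Promise<unknown>((resolve, reject) => {
    const timer = setInterval(async () => {
      try {
        const statusResponse = await fetch(`/search/status/${searchId}`);
        const status = await statusResponse.json();
        if (status.finished) {
          clearInterval(timer);
          resolve(status);
        }
      } catch (err) {
        clearInterval(timer);
        reject(err);
      }
    }, 10_000);
  });
}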

How can I cancel a request in Django

I have written search functionality on my site and I am now trying to make it search as I type. So I am sending many AJAX requests, each containing different text to search for, one after another, and every new request has to wait until the previous one has finished. I don't actually need the old requests to be answered; I only need the response to the latest request.
How can I kill the queue of stale requests in Django?
Does anybody know the answer?
On the server side, it's probably too late to cancel requests, but you can ignore the responses on the client side. I would suggest aborting a pending AJAX request before sending a new one.
Here is how:
Abort Ajax requests using jQuery
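The linked answer uses jQuery's xhr.abort(); an equivalent sketch using fetch and AbortController (function and endpoint names here are illustrative) looks like this:

// Keep a handle to the in-flight request so it can be aborted
// before a new search request is sent.
let currentController: AbortController | null = null;

async function search(term: string): Promise<unknown | null> {
  // Abort the previous pending request, if any; its response is simply ignored.
  currentController?.abort();
  currentController = new AbortController();
  try {
    const response = await fetch(`/search/?q=${encodeURIComponent(term)}`, {
      signal: currentController.signal,
    });
    return await response.json();
  } catch (err) {
    // An AbortError just means a newer request superseded this one.
    if ((err as Error).name === "AbortError") return null;
    throw err;
  }
}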
An easier way to do this could be to wait a bit before sending your request to the server. After each keystroke, set up a timer that cancels the previous one (setTimeout/clearTimeout) and only send the request once the timeout elapses.
If a request has already been sent and has not yet returned, you can still abort it as suggested in the other answer.
I'm not aware of any way to stop requests from within Django once they are on the server -- and I hope it's not even possible, since it would be a security threat if requests could be killed by others.
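A minimal debounce sketch along those lines, assuming an arbitrary 300 ms delay:

let debounceTimer: ReturnType<typeof setTimeout> | undefined;

function onSearchInput(term: string): void {
  // Cancel the previously scheduled request; only the last keystroke
  // within the delay window actually triggers a request.
  if (debounceTimer !== undefined) {
    clearTimeout(debounceTimer);
  }
  debounceTimer = setTimeout(() => {
    // search() could be the abort-aware helper sketched in the previous answer.
    void search(term);
  }, 300);
}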

How does an XMLHttpRequest response get routed to the right browser-callback?

I have made a webpage that uses Ajax to update some values without reloading the page. I am using an XMLHttpRequest object to send a POST request, and I assign a callback function that gets called when the response arrives, and it works just fine.
But... how in the world does the browser know that some data coming from some ip:port should be sent to this particular callback function? I mean, in a worst-case scenario, if I have Firefox and IE making POST requests at roughly the same time to the same server, and even making subsequent POST requests before the responses to the previous ones arrive, how does the incoming data get routed to the right callback functions?
Each HTTP request made is on a separate TCP connection. The browser simply waits for data to come back on that connection and then invokes your callback function.
At a lower level, the TCP implementation on your OS keeps track of which packets belong to each socket (i.e. connection) by using a different "source port" for each one. There is a lookup table mapping source ports to open sockets.
It is worth noting that the number of simultaneous connections a browser makes to any one server is limited (typically to 2). This was sensible back in the old days when pages reloaded to send and receive data, but in these enlightened days of AJAX it is a real nuisance. See this page for an interesting discussion of the problem.
Each request has its own connection. That means that for a single connection you get a single response, and that response is delivered to the callback you registered for that request.
The general idea is that your browser opens a new connection entirely, makes a request to the server, and waits for a response, all on one connection that is managed by the browser via a JavaScript API. The connection is not severed and then picked up again later when the server pushes something down, so the browser, having originated the request, knows what to do when the request finishes.
What truly makes things asynchronous is that these connections can be handled separately in the background, which allows multiple requests to go out and return while waiting for responses. This gives you the nice AJAX effect of the server appearing to return something at a later time.
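To make the routing concrete, here is a small sketch (the URLs and payloads are placeholders): each XMLHttpRequest object carries its own callback, so whichever response arrives is handed to the handler registered on that particular object, regardless of arrival order.

// Each response is delivered to the callback attached to its own
// XMLHttpRequest object, no matter which request finishes first.
function post(url: string, body: string, onDone: (responseText: string) => void): void {
  const xhr = new XMLHttpRequest();
  xhr.open("POST", url);
  xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
  xhr.onload = () => onDone(xhr.responseText);
  xhr.send(body);
}

post("/update/prices", "region=eu", (text) => console.log("prices:", text));
post("/update/stock", "region=eu", (text) => console.log("stock:", text));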
