When calling cy.visit("https://my-page.com") my page will fetch a number of external libraries/scripts. One of them is for sentry.io. Sometimes this particular fetch will just hang forever. The website is perfectly usable without it. Is there a way to make cy.visit not wait for these fetch requests from certain domains?
Some things to try, depending on how the app responds:
Controlling the response
req.destroy() - destroy the request and respond with a network error
cy.intercept('https://sentry.io/*', req => req.destroy())
req.reply() - stub out a response requiring no dependency on a real back-end
cy.intercept('https://sentry.io/*', {}) // stubbed, never goes to the server
You may have to tweak {} to include fake properties if the app expects them.
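A minimal sketch of the stubbed variant, wrapped in a helper so the shape is easy to see in isolation. The sentry.io URL pattern comes from the question; the statusCode and the fake id property are assumptions about what the page might read from the response:

```javascript
// Sketch: stub every sentry.io request so the page's fetch resolves
// immediately instead of hanging. The body property is made up --
// adjust it to whatever your app actually reads from the response.
function stubSentry(cy) {
  cy.intercept('https://sentry.io/*', {
    statusCode: 200,
    body: { id: 'fake-event-id' }, // hypothetical property
  });
}
```

Calling `stubSentry(cy)` in a `beforeEach` gives every test the stub; if the page tolerates a failed request, the `req.destroy()` variant above is even simpler.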
Related
I am using axios-mock-adapter in order to mock requests to the backend.
It intercepts the requests before they are even sent to the network, so cy.intercept cannot catch them in any way, and thus I cannot wait for them to finish as shown here.
Is it okay to wait manually (stated as an anti-pattern by Cypress) for these mock requests to come back, or should I look into another solution?
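To see why the two tools cannot interact, here is a toy simulation (illustrative names only, not the real axios-mock-adapter API): the "adapter" answers from an in-memory table, so the request promise resolves entirely inside the page's JavaScript and nothing ever reaches the network layer where cy.intercept listens.

```javascript
// Toy model of an in-process mock adapter (not the axios-mock-adapter API).
function makeClient(adapter) {
  // The client delegates every GET to whatever adapter it was given.
  return { get: (url) => adapter(url) };
}

// "Mock adapter": resolves from a lookup table instead of issuing a
// network request -- which is why no network-level tool can observe it.
const mocks = { '/api/user': { name: 'stubbed' } };
const mockAdapter = (url) =>
  url in mocks
    ? Promise.resolve({ status: 200, data: mocks[url] })
    : Promise.reject(new Error('no mock for ' + url));

const client = makeClient(mockAdapter);
```

Since there is no network traffic to wait on, waiting on the mocked call itself is moot; asserting on the UI state the mocked response produces is the usual alternative.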
In my Cypress test, I am trying to wait for a GET request and validate its response. However, the test is timing out because the request never occurs.
Here is the request I am trying to wait for:
Here are some details around the test:
The URL of the app I am visiting in the test is https://ts-e2e-challenge.netlify.app/list.
In my test, an action is performed that sends a GET request to the URL https://bookshelf.jk/api/list-items, as you can see in the screenshot.
And here is my test code:
cy.intercept('GET', '/list-items').as('getListItems')
cy.wait('@getListItems').then((interception) => {
});
Full error message:
Timed out retrying after 5000ms: cy.wait() timed out waiting 5000ms
for the 1st request to the route: getListItems. No request ever occurred
I assume the URL that I am trying to intercept is incorrect. I have tried updating it to the full path https://bookshelf.jk/api/list-items, but Cypress still reports that no request occurred.
Can someone please point out what the request URL should be based on the above screenshot?
I think the issue here is that your request is not intercepted properly. You can try this:
cy.intercept('GET', '**/api/list-items').as('getListItems')
cy.wait('@getListItems').then((interception) => {
  // Do something
})
There's a service worker intercepting the network requests before Cypress can intercept them.
If you put https://bookshelf.jk/api/list-items into a normal browser, it can't be reached. That's because the service worker is acting as the (non-existent) API server.
For the record, the full URL https://bookshelf.jk/api/list-items should work. Also /api/list-items would work if your baseUrl was common to the web page and the api (as you surmised).
As far as I can see, there's no simple adjustment to fix it. You may be able to hack the loading of the service worker, but if you're just trying out cy.intercept() it's not worth the effort.
I'd look for another site to test against, https://jsonplaceholder.typicode.com is a good one.
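If you do want to try the hack, one common approach (a sketch, assuming the Service Worker API is available in the test browser) is to unregister any service workers before the page's scripts run:

```javascript
// Sketch: unregister all service workers so requests go to the real
// network, where cy.intercept can see them. Returns a promise that
// resolves once every registration has been removed.
function disableServiceWorkers(win) {
  if (!win.navigator || !win.navigator.serviceWorker) {
    return Promise.resolve([]);
  }
  return win.navigator.serviceWorker
    .getRegistrations()
    .then((regs) => Promise.all(regs.map((reg) => reg.unregister())));
}

// Usage in a spec (the URL is the one from the question):
// cy.visit('https://ts-e2e-challenge.netlify.app/list', {
//   onBeforeLoad: disableServiceWorkers,
// });
```

Note that a worker already controlling the page may still serve the first load, and for this particular app, which has no real API server behind the worker, unregistering it would also break the page — which is why switching to another practice site is the pragmatic answer.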
I currently have a problem where I send an asynchronous ajax request to a .NET controller in order to start a database search. This request makes it to the server which kicks off the search and immediately (less than a second) replies to the callback with a search ID, at which point I begin sending ajax requests every 10 seconds to check if the search has finished. This method works fine, and has been tested successfully with multiple users sending simultaneous requests.
However, if I send a second search request from the same user before the first search is finished, the call will not make it to the controller endpoint until after the first search has completed, which can take up to a minute. I can see the request leave Chrome (or FF/IE) in the dev tools, and using Fiddler as a proxy I can see the request hit the machine that the application is running on; however, it will not hit the breakpoint on the first line of the endpoint until after the first call returns.
At the point this call is blocking, there are typically up to 3 pending requests from the browser. Does IIS or the .NET architecture have some mechanism that is queuing my request? Or if not, what else would be between the request leaving the proxy and entering the controller? I'm at a bit of a loss for how to debug this.
I was able to find the issue. It turns out that despite my endpoint being defined asynchronously, ASP.NET controllers by default synchronize by session. So while my endpoints were able to be executed simultaneously across sessions, within the same session it would only allow one call at a time. I was able to fix the issue by setting the controller SessionState attribute to Read Only, allowing my calls to come through without blocking.
[SessionState(System.Web.SessionState.SessionStateBehavior.ReadOnly)]
I make asynchronous calls to the same URL multiple times, but the responses come back sequentially. Please see the attached image: the second request starts only after the first request completes, and the same happens for subsequent requests.
But if the URLs are different, the responses are not sequential. Can someone confirm: while one URL is being requested, won't Firefox make another request for the same URL?
Necko's cache can only handle one writer per cache entry. So if you make multiple requests for the same URL, the first one will open the cache entry for writing and the later ones will block on the cache entry open until the first one finishes. There's work ongoing to rearchitect the cache to remove this restriction.
Generally browsers limit how many concurrent requests they will make to a particular hostname; in the old days this was 2, but most browsers have since raised it. Defining a proxy can also affect these limits in some browsers. Again, these are enforced client-side, but the limit should be more than one in any case. It may be possible that Firebug is also reducing the limit; check about:config for the relevant concurrency settings.
Are you sure you are not waiting until the callback is executed before launching the next request? Given that only one request appears at a time, I suspect this is what is really happening.
I have made webpage that uses Ajax to update some values without reloading the page. I am using an XMLHttpRequest object to send a POST request, and I assign a callback function that gets called when the response arrives, and it works just fine.
But... how in the world does the browser know that data coming from some ip:port should be sent to this particular callback function? I mean, in a worst-case scenario, if I have Firefox and IE making POST requests to the same server at roughly the same time, and even making subsequent POST requests before the responses to the previous ones arrive, how does the incoming data get routed to the right callback functions?
Each HTTP request is made on a separate TCP connection. The browser simply waits for data to come back on that connection, then invokes your callback function.
At a lower level, the TCP implementation on your OS will keep track of which packets belong to each socket (i.e. connection) by using a different "source port" for each one. There will be some lookup table mapping source ports to open sockets.
It is worth noting that the number of simultaneous connections a browser makes to any one server is limited (typically to 2). This was sensible back in the old days, when pages reloaded to send and receive data, but in these enlightened days of AJAX it is a real nuisance. See the page for an interesting discussion of the problem.
Each request has its own connection. That means that if you have a single connection, you will of course have a single response, and that response will arrive in your callback.
The general idea is that your browser opens a new connection entirely, makes a request to the server, and waits for a response. This all happens on one connection, which is managed by the browser via a JavaScript API. The connection is not severed and then picked up again later when the server pushes something down; the browser, having originated the request, knows what to do when the request finishes.
What truly makes things asynchronous is that these connections can be made separately in the background, which allows multiple requests to go out and return while waiting for responses. This gives you the nice AJAX effect of the server appearing to return something at a later time.