I am new to Angular and want to use it to send data to my app's backend. On several occasions, I have to make several HTTP POST calls that should either all succeed or all fail. This is the scenario that's causing me a headache: given two HTTP POST calls, what if one call succeeds but the other fails? This will lead to inconsistencies in the database. I want to know if there's a way to undo the calls that succeeded if at least one call has failed. Thanks!
Without knowing more about your specific situation, I would urge you to use promise error handling if you are not already doing so. The only situation I know of where you can cancel a request that has already been sent is by using the timeout option of $http (look at this SO post), but you can definitely prevent future requests. When you make an $http call, it returns a promise object (look at $q here). That promise exposes two methods you can chain onto your $http request, success and error, so it looks like $http(...).success({...stuff...}).error({...more stuff...}). So if you do have error handling in each of these scenarios and you hit the error callback, don't make the next call.
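For reference, here is a minimal sketch of the timeout-based cancellation mentioned above; $http and $q are the injected AngularJS services, and the URL and payload are placeholders:
var canceler = $q.defer();
var payload = {}; // placeholder data
// Resolving the deferred aborts the in-flight request
$http.post('/some/url', payload, { timeout: canceler.promise })
    .success(function(data) {
        // proceed with the next call
    })
    .error(function() {
        // abort any other pending request that shares this canceler
        canceler.resolve();
    });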
You can cancel the next requests in the chain, but the previous ones have already been sent. You need to provide the necessary backend functionality to reverse them.
If every step is dependent on the others and causes changes in your database, it might be better to do the whole process in the backend, triggered by a single POST request. I think it is easier to model this process synchronously, and that is easier to do on the server than in the client.
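As an illustration only (the endpoint name and payload are made up), the client then makes a single call and atomicity becomes the server's job:
// Hypothetical endpoint that applies all steps in one database transaction
var payload = { firstStepData: {}, secondStepData: {} };
$http.post('/api/process-all', payload).then(function(response) {
    // everything succeeded
}, function(error) {
    // nothing should have been committed on the server
});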
However, if you must do the post requests in the client side, you could define each request step as a separate function, and chain them via then(successCallback, errorCallback) (Nice video example here: https://egghead.io/lessons/angularjs-chained-promises).
In your case, at each step you can check whether it failed and take action to reverse the previous one by using the error callback of then:
var firstStep = function(initialData) {
    return $http.post('/some/url', initialData).then(function(dataFromServer) {
        // Do something with the data
        return {
            dataNeededByNextStep: processedData,
            dataNeededToReverseThisStep: moreData
        };
    });
};

var secondStep = function(dataFromPreviousStep) {
    return $http.post('/some/other/url', dataFromPreviousStep.dataNeededByNextStep).then(function(dataFromServer) {
        // Do something with the data
        return {
            dataNeededByNextStep: processedData,
            dataNeededToReverseThisStep: moreData
        };
    }, function(rejection) {
        // On error, reverse the previous step...
        reversePreviousStep(dataFromPreviousStep.dataNeededToReverseThisStep);
        // ...and keep the chain rejected so later steps don't run
        return $q.reject(rejection);
    });
};

var thirdStep = function() { ... };
...

firstStep(initialData).then(secondStep)
    .then(thirdStep)
    ...
If any of the steps in the chain fails, its promise will be rejected and the next steps will not be executed.
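A related option, in case the calls are independent of each other: $q.all runs them in parallel and rejects as soon as any of them fails. Note that this only tells you the set failed; you would still need to reverse the calls that had already succeeded. A minimal sketch, with dataA and dataB as placeholders:
var dataA = {}, dataB = {}; // placeholder payloads
$q.all([
    $http.post('/some/url', dataA),
    $http.post('/some/other/url', dataB)
]).then(function(responses) {
    // every call succeeded
}, function(rejection) {
    // at least one call failed; reverse the successful ones here
});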
Related
I want to be able to call an HTTP endpoint (that I own) from an Azure Function at the end of the Azure Function request.
I do not need to know the result of the request
If there is a problem in the HTTP endpoint that is called I will log it there
I do not want to hold up the return to the client calling the initial Azure Function
Offloading the call of the secondary WebApi onto a background job queue is considered overkill for this requirement
Do I simply call HttpClient.PutAsync without an await?
I realise that the dependencies I have used up until the point that the call is made may well not be available when the call returns. Is there a safe way to check if they are?
My answer may cause some controversy, but you can always start a background task and execute it that way.
For anyone reading this answer, this is far from recommended. The OP has been very clear that they don't care about exceptions or understanding what sort of result the request is returning ...
// Fire-and-forget: queue the request on the thread pool and return
// immediately; any exception inside the task is intentionally unobserved
Task.Run(async () =>
{
    using (var httpClient = new HttpClient())
    {
        await httpClient.PutAsync(...);
    }
});
If you want to ensure that the call has fired, it may be worth waiting for a second or two after the call is made to ensure it's actually on its way.
await Task.Delay(1000);
If you're worried about dependencies in the call, be sure to construct your payload (i.e. serialise it, etc.) outside the Task.Run; basically, minimise any work the background task does.
I am using Cypress to test our web application.
On certain pages there are different endpoint requests that are executed multiple times (e.g. GET /A, GET /B, GET /A).
What would be the best practice in Cypress to wait for all requests to finish and guarantee that the page has fully loaded?
I don't want to use a ton of cy.wait() commands to wait for all requests to be processed (there are a lot of different sets of requests on each page).
You can use the cy.route() feature of Cypress. Using this you can intercept all your GET requests and wait till all of them are executed:
cy.server()
cy.route('GET', '**/users').as('getusers')
cy.visit('/')
cy.wait('@getusers')
I'm sure this is not recommended practice, but here's what I came up with. It effectively waits until there's been no response for a certain amount of time:
function debouncedWait({ debounceTimeout = 3000, waitTimeout = 4000 } = {}) {
    cy.intercept('/api/*').as('ignoreMe');
    let done = false;

    const recursiveWait = () => {
        if (!done) {
            // Set a timeout so that if no response arrives within debounceTimeout,
            // a dummy request is sent to satisfy the current wait
            const x = setTimeout(() => {
                done = true; // end recursion
                fetch('/api/blah');
            }, debounceTimeout);

            // Wait for a response
            cy.wait('@ignoreMe', { timeout: waitTimeout }).then(() => {
                clearTimeout(x); // cancel this wait's timeout
                recursiveWait(); // wait for the next response
            });
        }
    };

    recursiveWait();
}
According to the Cypress FAQ there is no definite way. But I will share some solutions I use:
Use the jQuery syntax supported by Cypress:
$(document).ready(function() {
    // Code to run after it is ready
});
The problem is that after the initial load, some action on the page can initiate a second load.
Select an element, like an image or a select, and wait for it to load. The problem with this method is that some other element might need more time.
Decide on a mandatory time you will wait for the API requests (I personally use 4000 ms for my app) and place a cy.wait(mandatoryWaitTime) where you need your page to be loaded.
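For the last option, a minimal sketch (the constant name and the 4000 ms value are just examples):
const mandatoryWaitTime = 4000; // ms, tune per app
cy.visit('/');
cy.wait(mandatoryWaitTime); // hard wait: simple, but adds a fixed delay to every run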
I faced the same issue with our large Angular application doing tens of requests as you navigate through it.
At first I tried what you are asking: to automatically wait for all requests to complete. I used https://github.com/bahmutov/cypress-network-idle as suggested by @Xiao Wang in this post. This worked and did the job, but I eventually realized I was over-optimizing my tests. Tests became slow, waiting for all kinds of calls to finish, even those that weren't needed at that point in time (like 3rd-party analytics, etc.).
So I'd suggest not trying to wait for everything at a step, but instead finding the key API calls (you don't need to know the full path; even api/customers is enough) in your test step, using cy.intercept() and creating an alias for them. Then use cy.wait() with your alias. The result is that you wait only when needed and only for the calls that really matter.
// At this point, there are lots of GET requests that need to finish in order to continue the test
// Intercept calls that contain a GET request with a request path containing /api/customer/
cy.intercept({ method: 'GET', url: '**/api/customer/**' }).as("customerData");
// Wait for all the GET requests with path containing /api/customer/ to complete
cy.wait("#customerData");
// Continue my test knowing all requested data is available..
cy.get(".continueMyTest").click()
TLDR;
I managed to simplify my question after a good night's sleep. Here's the simpler question.
I want to upload N files to a server, which would process them together and return a single response (e.g. Total foobars in all files combined = XYZ).
What's the best way to send this single response back to the client?
Thanks.
&
Below is the old question, left behind as a lesson for me.
I'm using Dropzone.js to build D&D functionality into my app.
Please note: I know there are a couple of questions already that discuss multifile uploads. But they are different from my question. They talk about how to get a single callback call instead of multiple ones.
My issue is related to the situation where I drag and drop multiple files into the dropzone, but am seeing the single server response being duplicated multiple times. Here is my config:
Dropzone.options.inner = {
    init: function() {
        this.on("dragenter", function(e) {
            $('#inner').addClass('drag-over');
            //// TODO - find out WTF this isn't working (low priority)
        });
        this.on("completemultiple", function(file, resp) {
            //// TODO
        });
    },
    url: "php/...upload...php",
    timeout: 120000, // 2m
    uploadMultiple: true,
    autoProcessQueue: false,
    clickable: false,
};
//// ... Some other stuff
//// ...
$(document).ready(function() {
    $('#inner').click(function() {
        Dropzone.forElement('.dropzone').processQueue();
    });
});
In the beginning I intercepted the "complete" event, rather than "completemultiple". That resulted in its handler being invoked multiple separate times (once for each file), even though the server-side php was only being invoked once. Each invocation returned a duplicate copy of the same server-side message.
I didn't want that, so I changed it to "completemultiple", and now I can confirm that the handler only gets called once with an array of files, but the single server response is now buried within each file object returned - each has a duplicate copy of the exact same response.
It doesn't ultimately matter, because it is the same message after all. But the whole aesthetics of the thing now seems off, which indicates to me I'm doing something wrong: the response seems to indicate two independent uploads, yet they were part of a single invocation of the server-side php. Why make the client "believe" there were two separate upload requests when the server-side script only has one opportunity to respond? (I.e. the php is not sending back different messages for each file. Should it? And if so, what's the best way to do it?)
How can I make it so that if I have a scenario in which it's all-or-none, I get a single response back from the php script?
This is especially important to me because my server response will contain the status and some other data. The script does more than simply receiving the uploaded files (hence the longer timeout).
I thought maybe that's a sign that I should separate the uploading part from the processing part and trigger the processing once the upload is complete.
But that means the server-side upload script can't clean up after itself; it needs to persist data beyond its own life. It also now needs to return a handle to this data back to the client, which would dispatch the server-side processor in a different AJAX call, passing it this handle, and that subsequent call needs to clean up the files left by the uploader after it is done processing them.
This seems like the less elegant solution. Is this something I just need to get used to? Or is there a better way of accomplishing what I want?
Also, any other free tips and hints from the front-end gurus in my network will be gratefully accepted.
Thanks.
&
The following approach works, until something better can be found.
Dropzone.options.inner = {
    // . . .
    init: function() {
        this.on("completemultiple", function(file) {
            var code = JSON.parse(file[0].xhr.response).code;
            var data = { "code": code };
            $.post('php/......php', data, function(res) {
                // TODO - surface the res back to the user
            });
        });
    },
};
&
I have just implemented AJAX calls using fetch in react-native.
However, I haven't managed to implement queuing of these AJAX calls well.
Can anyone help me?
If you want to make parallel requests, you can use the fact that fetch returns a promise, and then use Promise.all to wait for the completion of all of the promises.
For example:
var urls = ['http://url1.net', 'http://url2.net'];
var requests = [];

urls.forEach((url) => {
    var request = fetch(url); // You can also pass options or any other parameters
    requests.push(request);
});

// Then wait for all the promises to finish. They will run in parallel.
Promise.all(requests).then((results) => {
    // results holds an array with the result of each promise
}).catch((err) => {
    // Promise.all is fail-fast: if one request fails, the catch is called immediately
});
I noticed that you added the 'multithreading' tag. It is important to notice that this code won't be doing any threading for you, as JS (generally) runs in only one thread.
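If by "queuing" you mean running the calls one after another instead of in parallel, a minimal sketch (reusing the same placeholder URLs) chains them with reduce:
var urls = ['http://url1.net', 'http://url2.net'];

// Each fetch starts only after the previous one has resolved
urls.reduce((prev, url) => {
    return prev.then((results) => {
        return fetch(url).then((res) => {
            results.push(res);
            return results;
        });
    });
}, Promise.resolve([])).then((results) => {
    // results holds the responses, in request order
});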
I'm trying to synchronize my POSTs to an endpoint in Angular. I did see some examples of doing a synchronized GET but had trouble understanding the examples well enough to apply them to POSTs.
The POSTs are pretty simple, at least from my perspective as the front-end developer. I send an object with a parent group ID and a sub group ID to a /parentgroups endpoint. On the backend, however, async calls cause the data to get overwritten.
Apologies for the lack of an example, but I am pretty far from having one that's close to working how I need. My code is still async and loops over calls to $http.post().
You actually cannot do truly synchronous (as in blocking) HTTP calls in Angular; it forces you to use async. If you can't do it with callbacks, then you have a problem with your architecture that the entire team should focus on solving ASAP. If your current architecture requires the frontend to make blocking calls, then your architecture is quite simply broken and needs to be fixed.
Anyway, while I recommend against it, you could always register your requests in a list, and then in each callback you pop the next request from the list and run it. That way you can keep pushing requests into the list without knowing how many there will be. Something like this (untested, but the general principle should work):
var requestList = [];

requestList.push(function() {
    $http.post('/someUrl', {})
        .success(function(data, status, headers, config) {
            // Remove the next request from the list and call it (if any are left)
            var next = requestList.shift();
            if (next) next();
        });
});

requestList.push(function() {
    $http.post('/someOtherUrl', {})
        .success(function(data, status, headers, config) {
            // Remove the next request from the list and call it (if any are left)
            var next = requestList.shift();
            if (next) next();
        });
});

// Start the first request
requestList.shift()();
This is fairly clean, but still a bit of a hack. It would probably work fine, but I would take a good long look at why the API forces you to do something like this.
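As a sketch of a slightly more conventional variant (same placeholder URLs), you could also chain the posts directly, since $http returns a promise; each request starts only after the previous one resolves:
$http.post('/someUrl', {})
    .then(function(res1) {
        // First POST finished; start the second one
        return $http.post('/someOtherUrl', {});
    })
    .then(function(res2) {
        // Both POSTs are done, in order
    });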