I have two nested for loops with an HTTP call inside them.
for (i = 0; i < m; i++) {
    for (j = 0; j < n; j++) {
        $http call that uses i and j as GET parameters
            .success(/* something */)
            .error(/* something more */)
    }
}
The problem is that this makes around 200-250 AJAX calls, depending on the values of m and n, which crashes the browser when the page is accessed from a mobile device.
I would like to know if there is a way to issue the HTTP requests in batches (n requests at a time) and, once those calls finish, move on to the next batch, and so on.
You could always use a proper HTTP batch module such as angular-http-batcher, which takes all of the requests and combines them into a single HTTP POST before sending it to the server, reducing 250 calls to 1. The module is here: https://github.com/jonsamwell/angular-http-batcher and a detailed explanation of it is here: http://jonsamwell.com/batching-http-requests-in-angular/
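If memory serves, wiring it up looks roughly like this (the module and provider names are recalled from the README linked above, so treat them as assumptions and verify there; the URLs are placeholders):

// Register the batcher module, then tell it which root URL to batch
// and which endpoint should receive the combined POST.
angular.module('myApp', ['jcs.angular-http-batch'])
    .config(['httpBatchConfigProvider', function (httpBatchConfigProvider) {
        httpBatchConfigProvider.setAllowedBatchEndpoint(
            'https://api.example.com',       // root of the calls to batch
            'https://api.example.com/batch'  // batch endpoint on the server
        );
    }]);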
Yes, use the async library found here: https://github.com/caolan/async
First, use the loops to create your tasks:
var tasks = []; // array to hold the tasks

for (i = 0; i < m; i++) {
    for (j = 0; j < n; j++) {
        // we add a function to the array of "tasks";
        // async will pass that function a standard callback(error, data)
        tasks.push(function (cb) {
            // because of the way closures work, you may not be able to rely on i and j here;
            // if i/j don't work here, create another closure and store them as params
            $http call that uses i and j as GET parameters
                .success(function (data) { cb(null, data); })
                .error(function (err) { cb(err); });
        });
    }
}
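To sidestep the closure caveat in the comments above, one option (a sketch; the '/endpoint' URL and the exact $http call are assumptions standing in for the original request) is to build each task via a helper that captures i and j by value:

// makeTask returns a fresh closure holding its own copies of i and j.
function makeTask(i, j) {
    return function (cb) {
        $http.get('/endpoint', { params: { i: i, j: j } }) // placeholder URL
            .success(function (data) { cb(null, data); })
            .error(function (err) { cb(err); });
    };
}

for (var i = 0; i < m; i++) {
    for (var j = 0; j < n; j++) {
        tasks.push(makeTask(i, j));
    }
}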
Now that you've got an array of callback-ready functions, use async to execute them. async has a great feature that limits the number of simultaneous requests, which gives you the batching behavior:
async.parallelLimit(tasks, 10, function (error, results) {
    // results is an array with each task's result.
    // Don't forget to use $scope.$apply or $timeout to trigger a digest.
});
In the above example, 10 tasks run in parallel at a time.
async has a ton of other amazing options as well: you can run things in series or parallel, map arrays, etc. It's worth noting that you might achieve greater efficiency by using a single worker function with async's eachLimit, as sketched below.
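That variant could look something like this (a sketch under the same assumptions as above):

// Flatten the i/j pairs, then let async run one worker function
// over them with at most 10 requests in flight.
var pairs = [];
for (var i = 0; i < m; i++) {
    for (var j = 0; j < n; j++) {
        pairs.push({ i: i, j: j });
    }
}

async.eachLimit(pairs, 10, function (pair, cb) {
    $http.get('/endpoint', { params: pair }) // placeholder URL
        .success(function () { cb(null); })
        .error(function (err) { cb(err); });
}, function (err) {
    // called once everything finishes, or on the first error
});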
Here is the way I did it (this helps when you want to issue HTTP requests in batches of n at a time):
var batchedHTTP = function (i) {
    /* terminating condition (in this case, i === m) */
    if (i >= m) { return; }

    var promisesArray = [];
    for (var j = 0; j < n; j++) {
        var promise = $http call with i and j GET parameters
            .success(/* do something */)
            .error(/* do something else */);
        promisesArray.push(promise);
    }

    /* wait for this batch of n requests, then start the next one */
    $q.all(promisesArray).then(function () {
        batchedHTTP(i + 1);
    });
};

batchedHTTP(0); /* start with the first batch */
I'm creating multiple web workers that make HTTP calls. I have to limit the number of web workers, so I'm trying to wait for some of the workers to finish.
Here's an example of what I thought might work, using Promises:
anArray.map(async contact => {
    await new Promise((res, rej) => {
        const worker = new Worker('./http.worker', { type: 'module' });
        worker.onmessage = () => {
            res();
        };
        worker.postMessage(contact);
    });
});
I thought this would wait for each promise to resolve before moving on to the next item... but it doesn't.
What could I do to make this work? I've also thought of building an array of workers and running a recursive loop that checks/waits for one to become available... I'm open to general suggestions for solving this.
.map() is not promise-aware. It does not look at the return value of each iteration to see whether it's a promise, and it never pauses the loop; it just blindly runs all the iterations one after another. When you return promises from .map(), as you do with the async callback, .map() simply produces an array of promises, and all your loop iterations end up "in flight" at the same time rather than sequenced.
If you want to iterate and pause at each iteration until a promise resolves, use a regular for loop:
async function someFunc() {
    for (let contact of anArray) {
        await new Promise((res, rej) => {
            const worker = new Worker('./http.worker', { type: 'module' });
            worker.onmessage = () => {
                res();
            };
            worker.postMessage(contact);
        });
    }
}
FYI, HTTP calls in JavaScript are non-blocking and asynchronous, so it's not entirely clear why you're doing them in web workers. Unless you need CPU-intensive processing of the results, you can make the HTTP requests from the main thread just fine without blocking it.
Also FYI, for a number of different options for processing an array with only N requests in flight at a time (where you decide what N is), see these various answers:
runN(fn, limit, cnt, options): Loop through an API on multiple requests
pMap(array, fn, limit): Make several requests to an api that can only handle 20 at a time
rateLimitMap(array, requestsPerSec, maxInFlight, fn): Proper async method for max requests per second
mapConcurrent(array, maxConcurrent, fn): Promise.all() consumes all my ram
There are also features to do this built into the Bluebird promise library and the Async-promises library.
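As a generic illustration of the pattern those answers implement (not the exact code behind any of the links), a concurrency limiter can be sketched like this:

// Run fn(item) over array with at most `limit` promises in flight;
// resolves with results in the original order, rejects on the first error.
function mapConcurrent(array, limit, fn) {
    let index = 0, inFlight = 0;
    const results = new Array(array.length);
    return new Promise((resolve, reject) => {
        function runNext() {
            if (index >= array.length && inFlight === 0) {
                return resolve(results);
            }
            while (inFlight < limit && index < array.length) {
                const i = index++;
                inFlight++;
                Promise.resolve(fn(array[i])).then(val => {
                    results[i] = val;
                    inFlight--;
                    runNext(); // a slot freed up; launch the next item
                }, reject);
            }
        }
        runNext();
    });
}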
I'm working on something that records data coming from a queue. It was easy enough to expose the queue as an Observable, so that multiple endpoints in my code can receive the information in the queue.
Furthermore, I can be sure the information arrives in order; the Observable ensures that. The tricky bit is that I don't want the Observer to be notified of the next item until it has completed processing the previous one, and the processing the Observer does is asynchronous.
As a more concrete example that should be simple to follow: imagine my queue contains URLs, which I expose as an Observable in my code. Then I subscribe an Observer whose job is to fetch each URL and write the content to disk (this is a contrived example, so don't take issue with the specifics). The important point is that fetching and saving are async. My problem is that I don't want the Observer to be handed the "next" URL from the Observable until it has completed the previous processing.
But next on the Observer interface returns void, so there is no way for the Observer to communicate back that it has actually completed the async task.
Any suggestions? I suspect some kind of operator could be written that withholds future values (queueing them up in memory?) until it somehow knows the Observer is ready for them. But I was hoping something like that already existed, following some established pattern.
I ran into a similar use case before:
// Demo-page setup: swallow keydown events.
window.document.onkeydown = (e) => {
    return false;
};

let count = 0;

// Each task is an Observable that emits one value and completes after `time` ms.
let asyncTask = (name, time) => {
    time = time || 2000;
    return Rx.Observable.create(function (obs) {
        setTimeout(function () {
            count++;
            obs.next('task:' + name + count);
            console.log('Task:', count, ' ', time, 'task complete');
            obs.complete();
        }, time);
    });
};

let subject = new Rx.Subject(); // (unused in this example)
let queueExec$ = new Rx.Subject();

// Each button click pushes a new task Observable onto the queue.
Rx.Observable.fromEvent(btnA, 'click').subscribe(() => {
    queueExec$.next(asyncTask('A', 4000));
});
Rx.Observable.fromEvent(btnB, 'click').subscribe(() => {
    queueExec$.next(asyncTask('B', 4000));
});
Rx.Observable.fromEvent(btnC, 'click').subscribe(() => {
    queueExec$.next(asyncTask('C', 4000));
});

// concatMap runs the queued tasks one at a time, in order.
queueExec$.concatMap(value => value)
    .subscribe(function (data) {
        console.log('onNext', data);
    },
    function (error) {
        console.log('onError', error);
    }, function () {
        console.log('completed');
    });
What you describe sounds like "backpressure". You can read about it in the RxJS 4 documentation: https://github.com/Reactive-Extensions/RxJS/blob/master/doc/gettingstarted/backpressure.md. Note that it covers operators that don't exist in RxJS 5; in particular, have a look at "controlled observables", which sound like exactly what you need.
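For reference, a minimal sketch of what that looked like in RxJS 4 (written from memory of the RxJS 4 API, so verify against the linked docs; doAsyncWork is a hypothetical promise-returning helper):

// RxJS 4: controlled() holds values back until they are requested.
var source = Rx.Observable.range(1, 10).controlled();

source.subscribe(function (x) {
    doAsyncWork(x).then(function () { // doAsyncWork is hypothetical
        source.request(1);            // pull the next value only when done
    });
});

source.request(1); // request the first value to start things off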
I think you could achieve the same with concatMap and an instance of Subject:
const asyncOperationEnd = new Subject();

source.concatMap(val => asyncOperationEnd
        .mapTo(void 0)
        .startWith(val)
        .take(2) // that's `val` and the `void 0` that ends this inner Observable
    )
    .filter(Boolean) // always ignore `void 0`
    .subscribe(val => {
        // do some async operation...
        // call `asyncOperationEnd.next()` and let `concatMap` process another value
    });
From your description, it actually seems like the "observer" you mention works like a Subject, so it might make more sense to write a custom Subject class that you could use in any Observable chain.
Isn't this just concatMap?
// Requests are coming in a stream, with small intervals or without any.
const requests = Rx.Observable.of(2, 1, 16, 8, 16)
    .concatMap(v => Rx.Observable.timer(1000).mapTo(v));

// Fetch, it takes some time.
function fetch(query) {
    return Rx.Observable.timer(100 * query)
        .mapTo('!' + query).startWith('?' + query);
}

requests.concatMap(q => fetch(q));
https://rxviz.com/v/Mog1rmGJ
If you want to allow multiple fetches to run simultaneously, use mergeMap with its concurrency parameter, as below.
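For instance (a sketch reusing the fetch helper above; the numeric argument caps how many inner Observables run at once):

// Up to 4 fetches in flight at a time; results may interleave.
requests.mergeMap(q => fetch(q), 4)
    .subscribe(x => console.log(x));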
In my frontend I have an input field that sends an AJAX request for every character typed (using Vue.js), to get real-time filtering (I can't use a Vue filter because of pagination).
Everything works smoothly in my test environment, but could this lead to performance issues with (a bigger amount of) real data, and if so, what can I do to prevent it?
Is it problematic?
Yes.
The client will send a lot of requests. Depending on the network connection and browser, this can produce a perceptible feeling of lag for the user.
The server will receive a lot of requests, potentially leading to degraded performance for all clients, and extra usage of resources on the server side.
Responses have a higher chance of arriving out of order. The faster you send requests, the more apparent this becomes (e.g. displaying autocomplete results for "ab" when the user has already typed "abc").
Overall, it's bad practice mostly because it's not necessary to do that many requests.
How to fix it?
As J. B. mentioned in his answer, debouncing is the way to go.
The debounce function (copied below) ensures that a given function doesn't get called more than once every X milliseconds. Concretely, it lets you send a request only once the user hasn't typed anything for, say, 200ms.
Here's a complete example (try typing text very fast in the input):
function debounce(func, wait, immediate) {
    var timeout;
    return function () {
        var context = this, args = arguments;
        var later = function () {
            timeout = null;
            if (!immediate) func.apply(context, args);
        };
        var callNow = immediate && !timeout;
        clearTimeout(timeout);
        timeout = setTimeout(later, wait);
        if (callNow) func.apply(context, args);
    };
}

var sendAjaxRequest = function (inputText) {
    // do your ajax request here
    console.log("sent via ajax: " + inputText);
};
var sendAjaxRequestDebounced = debounce(sendAjaxRequest, 200, false); // 200ms

var el = document.getElementById("my-input");
el.onkeyup = function (evt) {
    // user pressed a key
    console.log("typed: " + this.value);
    sendAjaxRequestDebounced(this.value);
};
<input type="text" id="my-input">
For more details on how the debounce function works, see this question.
I actually discuss this exact scenario in my Vue.js training course. In short, you may want to wait until the user clicks a button, or something of that nature, to trigger sending the request. Another approach to consider is the lazy modifier, which delays the update until the change event fires (see the snippet below).
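For reference, the lazy modifier is applied directly in the template; query here is a hypothetical model property:

<!-- v-model.lazy syncs on the change event instead of after every keystroke -->
<input v-model.lazy="query">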
It's hard to know the correct approach without knowing more about the app's goals, but the two options above are worth considering.
I hope this helps.
The mechanism I was searching for is called debouncing.
I used this approach in the application.
I am wondering if there's a way to build a promise chain from a series of if statements and somehow trigger it at the end. For example:
// Get response from some call
callback = function (response) {
    var chain = Q(response.userData);
    if (!response.connected) {
        chain = chain.then(connectUser);
    }
    if (!response.exists) {
        chain = chain.then(addUser);
    }
    // etc...
    // Finally, somehow trigger the chain
    chain.trigger().then(successCallback, failCallback);
};
A promise represents an operation that has already started. You can't trigger() a promise chain, since the promise chain is already running.
While you can get around this by creating a deferred, queuing around it, and resolving it later, that is not optimal. If you just drop the .trigger from the last line, I suspect your code will work as expected; the only difference is that the operations are queued and started right away rather than waiting:
var q = Q();
if (false) {
    q = q.then(function (el) { return Q.delay(1000, "Hello"); });
} else {
    q = q.then(function (el) { return Q.delay(1000, "Hi"); });
}
q.then(function (res) {
    console.log(res); // logs "Hi"
});
The key points here are:
A promise represents an already started operation.
You can append .then handlers to a promise even after it has resolved, and they will still execute predictably (see the sketch below).
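A quick illustration of the second point (a minimal sketch using Q):

var p = Q(42); // a promise that is already resolved

setTimeout(function () {
    p.then(function (v) {
        console.log(v); // still logs 42, one second later
    });
}, 1000);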
Good luck, and happy coding
As Benjamin says ...
... but you might also like to consider something slightly different. Try turning the code inside-out: build the then chain unconditionally and perform the tests inside the .then() callbacks.
function foo(response) {
    return Q().then(function () {
        return (response.connected) ? null : connectUser(response.userData);
    }).then(function () {
        return (response.exists) ? null : addUser(response.userData); // assuming addUser() accepts response.userData
    });
}
I think you will get away with returning nulls; if null doesn't work, then try Q() (in both places).
If my assumption about what is passed to addUser() is correct, then you don't need to worry about passing data down the chain; response remains available in the closure formed by the outer function. If this assumption is incorrect, no worries: simply arrange for connectUser to return whatever is necessary and pick it up in the second .then().
I regard this approach as more elegant than conditional chain building, even though it is less efficient. That said, you are unlikely to ever notice the difference.
I want to poll data from the web server (PHP) at an interval of 15 seconds, for 50 to 100 times (or, let's say, in an infinite loop until a stopFlag variable is set to true).
For this polling, I am going to send asynchronous AJAX requests to the web server.
How can I achieve this?
I have tried to solve this puzzle myself, but failed, as JavaScript has no keyword for pausing script execution.
Is there any way to make this work, or any workaround? Kindly let me know, or share your experience if you have already faced this issue.
You have to use a callback with the timeout; the function recursively schedules its next invocation.
You can also use jQuery, which can help make your code more compact. The result might look something like this:
var finished = false;

function keepTrying() {
    if (finished) {
        return;
    }
    $.ajax(params);
    setTimeout(function () {
        keepTrying();
    }, 15000);
}
And in params you would have a success function like this:
function success() {
    finished = true;
}
Just call keepTrying() the first time; it will loop until the request succeeds. This code is a bit rough, but hopefully you get the idea.
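For completeness, here is roughly what params might contain (the URL and handler bodies are placeholders, not from the original answer):

var params = {
    url: '/poll.php', // hypothetical endpoint
    success: function (data) {
        // got what we needed; stop polling
        finished = true;
    },
    error: function () {
        // leave `finished` false so keepTrying() polls again
    }
};

keepTrying(); // start the 15-second polling loop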