I'm creating multiple web workers that make HTTP calls. I need to limit the number of concurrent workers, so I'm trying to wait for some of them to finish.
Here's an example of what I thought might work using Promises:
anArray.map(async contact => {
  await new Promise((res, rej) => {
    const worker = new Worker('./http.worker', { type: 'module' });
    worker.onmessage = () => {
      res();
    };
    worker.postMessage(contact);
  });
});
I thought this would wait for each promise to resolve before moving on to the next item... but it doesn't.
What could I do to make this work? I've also thought of building an array of workers and running a recursive loop that checks/waits for one to be available. I'm open to general suggestions for solving this.
.map() is not promise-aware. It does not look at the return value of each iteration to see whether it's a promise and then pause the loop. Instead, it blindly runs all the iterations one after another. Since your async callback returns a promise, your .map() produces an array of promises, and all your loop iterations are "in-flight" at the same time, not sequenced.
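If running everything in parallel is acceptable and you only want to know when all of it has finished, you can keep the .map() and await the resulting array of promises. A minimal sketch (inside an async function, reusing the question's setup):
// Run all workers in parallel, then wait for every one to reply
const promises = anArray.map(contact => new Promise((res, rej) => {
  const worker = new Worker('./http.worker', { type: 'module' });
  worker.onmessage = () => res();
  worker.onerror = rej;
  worker.postMessage(contact);
}));
await Promise.all(promises); // resolves once every worker has posted back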
If you want to iterate a loop and pause the loop in each iteration until a promise resolves, then use a regular for loop:
async function someFunc() {
  for (const contact of anArray) {
    await new Promise((res, rej) => {
      const worker = new Worker('./http.worker', { type: 'module' });
      worker.onmessage = () => {
        res();
      };
      worker.postMessage(contact);
    });
  }
}
FYI, HTTP calls in JavaScript are non-blocking and asynchronous, so it's not entirely clear why you're doing them in web workers. Unless you have CPU-intensive processing of the results, you can make the HTTP requests from the main thread just fine without blocking it.
Also FYI, for a number of different options for processing an array with only N requests in flight at the same time (where you decide the value of N), see these various answers:
runN(fn, limit, cnt, options): Loop through an API on multiple requests
pMap(array, fn, limit): Make several requests to an api that can only handle 20 at a time
rateLimitMap(array, requestsPerSec, maxInFlight, fn): Proper async method for max requests per second
mapConcurrent(array, maxConcurrent, fn): Promise.all() consumes all my ram
There are also features to do this built into the Bluebird promise library and the Async-promises library. A sketch of the general pattern follows.
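For illustration, here's a minimal sketch of such a concurrency limiter; this is a hypothetical mapConcurrent, not the exact code from the answers linked above. It keeps at most maxConcurrent promises in flight and resolves with all results in their original order:
// Hypothetical sketch: run fn(item) over array with at most maxConcurrent in flight
function mapConcurrent(array, maxConcurrent, fn) {
  let index = 0;
  let inFlight = 0;
  const results = new Array(array.length);
  return new Promise((resolve, reject) => {
    function runNext() {
      if (index === array.length && inFlight === 0) {
        resolve(results); // everything finished
        return;
      }
      while (inFlight < maxConcurrent && index < array.length) {
        const i = index++;
        inFlight++;
        Promise.resolve(fn(array[i])).then(result => {
          results[i] = result;
          inFlight--;
          runNext(); // a slot freed up: start the next item, or finish
        }, reject);
      }
    }
    runNext();
  });
}
For example, mapConcurrent(anArray, 3, contact => someHttpCall(contact)) would keep three requests running at a time (someHttpCall being whatever function returns your promise).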
TLDR: A working example is in the last code block of this question. Check out @bryan60's answer for a working example using concat rather than mergeMap.
I'm trying to run a number of remote requests sequentially, but only the first observable is executed.
The number of requests varies, so I can't use a dodgy solution where I nest observables within each other.
I'm using the following code:
const observables = [
  observable1,
  observable2,
  ...
];

from(observables).pipe(
  mergeMap(ob => {
    return ob.pipe(map(res => res));
  }, undefined, 1)
).subscribe(res => {
  console.log('Huzzah!');
});
In the past (RxJS 5.5) I've used the following:
let o = Observable.from(observables).mergeMap((ob) => {
  return ob;
}, null, 1);

o.subscribe(res => {
  console.log('Huzzah!');
});
I'm not sure what I'm doing wrong, can anybody shed some light?
An additional request would be to only print 'Huzzah!' once on completion of all requests rather than for each individual Observable.
EDIT:
Removing undefined from my original code makes it work; however, there was another issue causing only the first observable to be executed.
I'm using Angular's HttpClient for remote requests. My observable code looked like this:
const observables = [];
// Only the first observable would be executed
observables.push(this.http.get(urla));
observables.push(this.http.get(urlb));
observables.push(this.http.get(urlc));
Adding .pipe(take(1)) to each observable results in each observable being executed:
const observables = [];
// All observables will now be executed
observables.push(this.http.get(urla).pipe(take(1)));
observables.push(this.http.get(urlb).pipe(take(1)));
observables.push(this.http.get(urlc).pipe(take(1)));
The code I ended up using, which executes all observables in sequential order and only triggers Huzzah! once is:
const observables = [];
observables.push(this.http.get(urla).pipe(take(1)));
observables.push(this.http.get(urlb).pipe(take(1)));
observables.push(this.http.get(urlc).pipe(take(1)));
from(observables).pipe(
  mergeMap(ob => {
    return ob.pipe(map(res => res));
  }, 1),
  reduce((all: any, res: any) => all.concat(res), [])
).subscribe(res => {
  console.log('Huzzah!');
});
Thanks to @bryan60 for helping me with this issue.
If these are HTTP requests that complete, I think your bug is caused by a change to the mergeMap signature that removed the result selector. It's hard to be sure without knowing exactly which version you're on, as the result selector was there, then removed, then added again, and it's being removed once more, for good, in v7.
If you want to run them sequentially, this is all you need:
// concat runs input observables sequentially
concat(...observables).subscribe(res => console.log(res))
If you want to wait until they're all done before emitting, do this:
concat(...observables).pipe(
// this will gather all responses and emit them all when they're done
reduce((all, res) => all.concat([res]), [])
// if you don't care about the responses, just use last()
).subscribe(allRes => console.log(allRes))
In my personal rxjs utility lib, I always include a concatJoin operator that combines concat and reduce like this.
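A minimal sketch of what such an operator might look like (this is an assumption about the helper's shape, not its actual source):
// Hypothetical concatJoin: run inputs sequentially, emit one array of results at the end
const concatJoin = (...observables) =>
  concat(...observables).pipe(
    reduce((all, res) => all.concat([res]), [])
  );

// usage: concatJoin(obs1, obs2, obs3).subscribe(allRes => console.log(allRes));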
The only trick is that concat requires each observable to complete before it moves on to the next one, but the same is true for mergeMap with concurrent subscriptions set to 1, so that should be fine. Things like HTTP requests are fine, as they complete naturally after one emission; websockets, subjects, or event emitters behave a bit differently and have to be completed manually, either with operators like first or take, or at the source.
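For example, a sketch with a hypothetical someSubject source: a Subject never completes on its own, so cap it before queueing it with concat.
// take(1) makes this inner observable complete after one emission,
// which is what lets concat move on to the next source
const first$ = someSubject.pipe(take(1));
concat(first$, this.http.get(urla)).subscribe(res => console.log(res));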
If you are not concerned about the order of execution and just want 'Huzzah!' printed once all the observables have completed, forkJoin can also be used. Note that forkJoin subscribes to all its sources at once, so the requests run in parallel rather than sequentially. Try this:
forkJoin(...observables).subscribe(res => console.log('Huzzah!'));
I have a list of multiple inputs (dynamically generated - unknown number).
- I want each to trigger an ajax request on every keystroke.
- I want these ajax requests to be queued up, so only one is sent to the server at a time, and the next one is sent only after getting a response to the previous one.
- If new requests are triggered from an input that already has requests in the queue, I want the old ones associated with the same input to be cancelled.
- If new requests are triggered from an input that does not already have requests in the queue, I want the new requests to just be added to the end of the queue without cancelling anything.
I'm told that RxJS makes these kinds of complicated async operations easy, but I can't seem to wrap my head around all the RxJS operators.
I have queueing working with a single input below, but I don't really understand why the defer is necessary, or how to queue requests for separate inputs while keeping the switchMap-like behavior I think I want for the individual inputs themselves.
Rx.Observable.fromEvent(
  $("#input"),
  'keyup'
)
.map((event) => {
  return $("#input").val();
}) // note: no semicolon here, or the chain below breaks
.concatMap((inputVal) => {
  return Rx.Observable.defer(() => Rx.Observable.fromPromise(
    fetch(myURL + inputVal)
  ))
  .catch(() => Rx.Observable.empty());
})
.subscribe();
First of all, you have to create some sort of function that manages each input, something along the following lines:
requestAtKeyStroke(inputId: string) {
  return Rx.Observable.fromEvent(
    $(inputId),
    'keyup'
  )
  .map((event) => {
    return $(inputId).val(); // read this input's value, not a hard-coded "#input"
  })
  .filter(value => value.length > 0)
  .switchMap((inputVal) => Rx.Observable.fromPromise(fetch(myURL + inputVal)));
}
Such a function deals with your third requirement, cancelling requests still in flight when a new one arrives. The key here is the switchMap operator.
Then what you can do is merge all the Observables corresponding to your inputs into one Observable. One way could be the following:
Observable.from(['#input1', '#input2']).map(input => requestAtKeyStroke(input)).mergeAll()
This does not fulfill all your requirements, since you may still have more than one request executing at the same time, coming from different inputs. I am not sure, though, whether it is possible to fulfill all your requirements at the same time.
I'm working on something that is recording data coming from a queue. It was easy enough to process the queue into an Observable so that I can have multiple endpoints in my code receiving the information in the queue.
Furthermore, I can be sure that the information arrives in order. That bit works nicely as well since the Observables ensure that. But, one tricky bit is that I don't want the Observer to be notified of the next thing until it has completed processing the previous thing. But the processing done by the Observer is asynchronous.
As a more concrete example that is probably simple enough to follow: imagine my queue contains URLs. I'm exposing those as an Observable in my code. Then I subscribe an Observer whose job is to fetch the URLs and write the content to disk (this is a contrived example, so don't take issue with the specifics). The important point is that fetching and saving are async. My problem is that I don't want the observer to be given the "next" URL from the Observable until it has completed the previous processing.
But the call to next on the Observer interface returns void, so there is no way for the Observer to communicate back that it has actually completed the async task.
Any suggestions? I suspect there is probably some kind of operator that could be coded up that would basically withhold future values (queue them up in memory?) until it somehow knew the Observer was ready for it. But I was hoping something like that already existed following some established pattern.
Here's a similar use case I ran into before:
window.document.onkeydown = (e) => {
  return false;
};

let count = 0;
let asyncTask = (name, time) => {
  time = time || 2000;
  return Rx.Observable.create(function (obs) {
    setTimeout(function () {
      count++;
      obs.next('task:' + name + count);
      console.log('Task:', count, ' ', time, 'task complete');
      obs.complete();
    }, time);
  });
};
let subject = new Rx.Subject();
let queueExec$ = new Rx.Subject();

Rx.Observable.fromEvent(btnA, 'click').subscribe(() => {
  queueExec$.next(asyncTask('A', 4000));
});
Rx.Observable.fromEvent(btnB, 'click').subscribe(() => {
  queueExec$.next(asyncTask('B', 4000));
});
Rx.Observable.fromEvent(btnC, 'click').subscribe(() => {
  queueExec$.next(asyncTask('C', 4000));
});

queueExec$.concatMap(value => value)
  .subscribe(function (data) {
    console.log('onNext', data);
  }, function (error) {
    console.log('onError', error);
  }, function () {
    console.log('completed');
  });
What you describe sounds like "backpressure". You can read about it in the RxJS 4 documentation: https://github.com/Reactive-Extensions/RxJS/blob/master/doc/gettingstarted/backpressure.md. However, it mentions operators that don't exist in RxJS 5. For example, have a look at "Controlled Observables", which refer to what you need.
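For reference, a minimal sketch of what that looked like in RxJS 4 (controlled() does not exist in RxJS 5, and processAsync is a hypothetical async handler returning a promise):
// RxJS 4: a controlled observable only emits as many items as the consumer requests
var controlled = source.controlled();
controlled.subscribe(function (url) {
  processAsync(url).then(function () {
    controlled.request(1); // pull the next value once processing is done
  });
});
controlled.request(1); // request the first value to start things off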
I think you could achieve the same with concatMap and an instance of Subject:
const asyncOperationEnd = new Subject();

source.concatMap(val => asyncOperationEnd
    .mapTo(void 0)
    .startWith(val)
    .take(2) // that's `val` and the `void 0` that ends this inner Observable
  )
  .filter(Boolean) // always ignore `void 0`
  .subscribe(val => {
    // do some async operation...
    // call `asyncOperationEnd.next()` and let `concatMap` process another value
  });
From your description, it actually seems like the "observer" you're mentioning works like a Subject, so it might make more sense to write a custom Subject class that you could use in any Observable chain.
Isn't this just concatMap?
// Requests are coming in a stream, with small intervals or without any.
const requests = Rx.Observable.of(2, 1, 16, 8, 16)
  .concatMap(v => Rx.Observable.timer(1000).mapTo(v));

// Fetch, it takes some time.
function fetch(query) {
  return Rx.Observable.timer(100 * query)
    .mapTo('!' + query).startWith('?' + query);
}

requests.concatMap(q => fetch(q));
https://rxviz.com/v/Mog1rmGJ
If you want to allow multiple fetches simultaneously, use mergeMap with its concurrency parameter.
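For instance, a sketch in the same RxJS 5 chained style as above:
// At most 2 fetches in flight at a time; results may interleave out of order
requests.mergeMap(q => fetch(q), 2);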
I am wondering whether the following is defined behavior per the Promise specification:
var H = function (c) {
  this.d_p = Promise.resolve();
  this.d_c = c;
};

H.prototype.q = function () {
  var s = this;
  return new Promise(function (resolve) {
    s.d_p = s.d_p.then(function () { // (1)
      s.d_c({
        resolve: resolve
      });
    });
  });
};

var a,
    h = new H(function (args) { a = args; }),
    p;

Promise.resolve()
  .then(function () {
    p = h.q();
  })
  .then(function () { // (2)
    a.resolve(42);
    return p;
  });
The question is whether it's guaranteed that the then callback marked (1) is called before the then callback marked (2).
Note that both promises in question are instantly resolved, so it seems to me like the (1) then callback should be scheduled as part of calling h.q(), which should be before the promise used to resolve (2) is resolved, so it should be before (2) is scheduled.
An example jsfiddle to play about with: https://jsfiddle.net/m4ruec7o/
It seems that this is what happens with bluebird >= 2.4.1, but not prior versions. I tracked the change in behavior down to this commit: https://github.com/petkaantonov/bluebird/commit/6bbb3648edb17865a6ad89a694a3241f38b7f86e
Thanks!
You can guarantee that h.q() will be called before a.resolve(42) is called, because chained .then() handlers do execute in order.
If you were asking about code within h.q(), then s.d_p.then() is part of a completely different promise chain, and the promise specifications do not provide ordering guarantees between separate chains. They are free to execute with their own asynchronous timing, and in fact I've seen differences in the execution of independent promise chains in different JavaScript environments.
If you want to direct the execution order between two independent promise chains, then you will have to link them somehow so one operation does not run until some other operation has completed. You can either link the two chains directly or you can do something more complicated involving an intermediate promise that is inserted into one chain so it blocks that chain until it is resolved.
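A minimal sketch of the linking idea (doA and doB are hypothetical placeholders that return promises): chain B starts from chain A's promise, so its callbacks cannot run before A's have finished.
// Derive chainB from chainA's promise to force an ordering between the two chains
var chainA = doA();
var chainB = chainA.then(function () {
  return doB(); // runs only after chainA has settled
});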
You may find this answer useful: What is the order of execution in javascript promises, which provides a line-by-line analysis of the execution order of both chained and independent promises and discusses how to make execution order predictable.
I have two for loops and an HTTP call inside them.
for (i = 0; i < m; i++) {
  for (j = 0; j < n; j++) {
    $http call that uses i and j as GET parameters
      .success(//something)
      .error(//something more)
  }
}
The problem with this is that it makes around 200-250 AJAX calls, depending on the values of m and n, which crashes the browser when the page is accessed from a mobile device.
I would like to know if there is a way to call HTTP requests in batched form (n requests at a time) and once these calls are finished, move to next batch and so on.
You could always use a proper HTTP batching module like angular-http-batcher, which takes all of the requests and turns them into a single HTTP POST request before sending it to the server, reducing 250 calls to 1. The module is at https://github.com/jonsamwell/angular-http-batcher and a detailed explanation of it is at http://jonsamwell.com/batching-http-requests-in-angular/
Yes, use the async library found here: https://github.com/caolan/async
First, use the loop to create your tasks:
var tasks = []; // array to hold the tasks
for (let i = 0; i < m; i++) {
  for (let j = 0; j < n; j++) {
    // we add a function to the array of "tasks";
    // async will pass that function a standard callback(error, data)
    tasks.push(function (cb) {
      // `let` gives each iteration its own i and j, so the closure is safe;
      // with `var` you would need an extra wrapper function to capture them
      $http.get(url, { params: { i: i, j: j } }) // the $http call that uses i and j as GET parameters
        .success(function (data) { cb(null, data); })
        .error(function (err) { cb(err); });
    });
  }
}
Now that you've got an array full of callback-ready functions, you use async to execute them; async has a great feature to "limit" the number of simultaneous requests and therefore "batch".
async.parallelLimit(tasks, 10, function (error, results) {
  // results is an array with each task's results.
  // Don't forget to use $scope.$apply or $timeout to trigger a digest.
});
In the above example you will run 10 tasks at a time in parallel.
Async has a ton of other amazing options as well: you can run things in series or parallel, map arrays, etc. It's worth noting that you might achieve greater efficiency by using a single function and the "eachLimit" function of async, as sketched below.
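A minimal sketch of that eachLimit variant (url and the parameter names are placeholders, as above):
// Build one flat list of (i, j) pairs, then let async.eachLimit run a single
// iteratee over it with at most 10 requests in flight at a time
var pairs = [];
for (let i = 0; i < m; i++) {
  for (let j = 0; j < n; j++) {
    pairs.push({ i: i, j: j });
  }
}

async.eachLimit(pairs, 10, function (p, cb) {
  $http.get(url, { params: { i: p.i, j: p.j } }) // same GET call as above
    .success(function (data) { cb(null); })
    .error(function (err) { cb(err); });
}, function (err) {
  // called once everything is done, or as soon as the first error occurs
});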
The way I did it is as follows (this will help when one wants to make HTTP requests in batches of n requests at a time):
var batchedHTTP = function (i) {
  /* terminating condition (in this case, i = m) */
  if (i >= m) { return; }
  var promisesArray = [];
  for (var j = 0; j < n; j++) {
    var promise = $http.get(url, { params: { i: i, j: j } }) // $http call with i and j GET parameters
      .success(function () { /* do something */ })
      .error(function () { /* do something else */ });
    promisesArray.push(promise);
  }
  $q.all(promisesArray).then(function () {
    batchedHTTP(i + 1); // next batch of n requests
  });
};

batchedHTTP(0); // start with i = 0