Rx : Force observable to take at least N seconds to complete - rxjs

I am making a splash screen for my app. I want it to last at least N seconds before going to the main screen.
I have an Rx variable myObservable that returns data from the server or from my local cache. How do I force myObservable to complete in at least N seconds?
myObservable
// .doStuff to make it last at least N seconds
.subscribe(...)

You can use forkJoin to wait until two Observables complete:
Observable.forkJoin(myObservable, Observable.timer(N), data => data)
.subscribe(...);
For RxJS 6 without the deprecated result selector function:
forkJoin(myObservable, timer(N)).pipe(
  map(([data]) => data),
).subscribe(...);
Edit: As mentioned in the comments, Observable.timer(N) with just one parameter will complete after emitting one item, so there's no need to use take(1).
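A self-contained sketch of that pattern in current RxJS; the loadData$ stand-in and the 2000 ms figure are placeholders for your own observable and minimum splash time:
import { forkJoin, timer, of } from 'rxjs';
import { delay, map } from 'rxjs/operators';

// stand-in for the observable that loads from the server or the local cache
const loadData$ = of({ items: [1, 2, 3] }).pipe(delay(300));

const MIN_SPLASH_MS = 2000; // note: timer takes milliseconds

forkJoin([loadData$, timer(MIN_SPLASH_MS)])
  .pipe(map(([data]) => data)) // keep the data, drop the timer's value
  .subscribe(data => {
    // runs no earlier than MIN_SPLASH_MS and no earlier than the data arriving,
    // so this is where you leave the splash screen
    console.log(data);
  });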

Angular 7+ example of forkJoin
I like to build in a higher delay on my development system since I assume production will be slower. Observable.timer doesn't seem to be available any longer but you can use timer directly.
forkJoin(
  // any observable such as your service that handles server coms
  myObservable,
  // or http will work like this
  // this.http.get( this.url ),
  // tune values for your app so very quick loads don't look strange
  timer( environment.production ? 133 : 667 ),
).subscribe( ( response: any ) => {
  // since we aren't remapping the response you could have multiple
  // and access them in order as an array
  this.dataset = response[0] || [];
  // the delay is only really useful if some visual state is changing once loaded
  this.loading = false;
});

Related

Deleting rows from a table after testing, generic function

I have some tests on an HTML table which add, modify, delete. I'd like a generic function I can apply to clean up previous data to start clean each time.
I currently reset the page, but there are quite a few steps to take to get to the start of testing, so an "undo" function would be very useful for keeping the tests fast.
This is currently what I have (simplified) for a single row
cy.get('tr').should('have.length', 3).eq(0).click()
cy.get('tr').should('have.length', 2)
Now I need to enhance it to handle any number of rows. I tried looping but it didn't work - the test seems to run too fast for the page to keep up, if that makes sense?
To delete rows from a table is tricky if the DOM gets re-written each time you delete.
At minimum use a .should() assertion on the number of rows after each delete, to ensure each step is complete before the next one.
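A minimal sketch of that approach, assuming the table starts with a known number of rows (the fixed count here is hypothetical):
// asserting the count before each click makes Cypress wait for the previous
// delete to be reflected in the DOM before acting on the next row
const startingRows = 3
for (let remaining = startingRows; remaining > 0; remaining--) {
  cy.get('tr').should('have.length', remaining).first().click()
}
cy.get('tr').should('not.exist') // all rows gone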
To be really safe, use a recursive function which controls the process, for example
const clearTable = (attempt = 0) => {
  if (attempt === 100) throw 'Too many attempts' // guards against too many steps
  cy.get('tbody').then($tbody => {
    if ($tbody.find('tr').length === 0) return; // exit condition tested here
    cy.get('tr').then($rows => {
      cy.wrap($rows).first().click() // action to delete
      cy.then(() => {
        clearTable(++attempt) // next step queued using then()
      })
    })
  })
}
clearTable()
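Hypothetical usage, e.g. running the cleanup before each test once the page with the table has been visited:
beforeEach(() => {
  cy.visit('/my-table-page') // hypothetical route
  clearTable()
})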

Is the observable setup async?

Let's consider the following example:
// Subject sources have been created from Subjects
const one$ = scheduled([firstSubjectSource$, secondSubjectSource$], asyncScheduler)
  .pipe(
    mergeAll(),
    share(),
  );
const two$ = scheduled([thirdSubjectSource$, fourthSubjectSource$], asyncScheduler)
  .pipe(
    mergeAll(),
    share(),
  );
const final$ = scheduled([one$, two$], asyncScheduler)
  .pipe(
    combineLatestAll(),
    map(() => { /* Some mapping */ }),
  );
return final$;
The final$ is created, returned and can be subscribed to.
I have observed that the marble tests work perfectly, i.e. by the time the tests run, all the observables have been set up and subscribed to correctly. But in the actual execution environment (iOS 15 JavaScriptCore), this doesn't seem to be the case. Values are forwarded to the ...SubjectSource$ observables after subscription to final$, but final$ never emits anything. Adding tap console logs to one$ and two$ shows that they also don't emit anything. My current hypothesis is that the internal subscription process hasn't finished. I have combed through some rxjs code, but it doesn't look like the subscription process is async.
AFAIK, the asyncScheduler shouldn't make the internal subscription async. It should only affect how the input values are processed and forwarded.
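For reference, a minimal sketch (not the original code) of the timing that this hypothesis describes, assuming standard RxJS 7 behaviour of scheduled with an array input:
import { Subject, scheduled, asyncScheduler } from 'rxjs';
import { mergeAll } from 'rxjs/operators';

const subjectSource$ = new Subject<number>();

// scheduled(...) with asyncScheduler emits the array items (the inner sources)
// on later macrotasks, so mergeAll() subscribes to subjectSource$ asynchronously
const out$ = scheduled([subjectSource$], asyncScheduler).pipe(mergeAll());

out$.subscribe(v => console.log('got', v));

// pushed synchronously, before the scheduled inner subscription exists:
// a plain Subject does not replay, so this value would likely be missed
subjectSource$.next(1);

// pushed on a later tick, after the inner subscription has been set up
setTimeout(() => subjectSource$.next(2), 10); // logs "got 2"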
If the return statement is changed to below, then everything works fine. However, putting an arbitrary wait time doesn't seem like the correct thing to do either.
setTimeout(() => cb(final$), 100);
Is the internal setup of observables one$, two$ and final$ async or sync?
Is there an event that I 'need to'/'can' wait on before returning final$ for use?
How do I make sure that the observables are actually ready for use before I return it?

using forkJoin multiple times

I am working on a project where our client generates almost 500 requests simultaneously. I am using forkJoin to get all the responses as an array.
But after 40-50 requests the server blocks the requests or sends only errors. I have to split these 500 requests into chunks of 10 requests, loop over the chunks array, call forkJoin for each chunk, and convert the observable to a Promise.
Is there any way to get rid of this for loop over the chunks?
If I understand your question correctly, I think you are in a situation similar to this
const clientRequestParams = [params1, params2, ..., params500]
const requestAsObservables = clientRequestParams.map(params => {
  return myRequest(params)
})
forkJoin(requestAsObservables).subscribe(
  responses => {
    // do something with the array of responses
  }
)
and probably the problem is that the server cannot handle so many requests in parallel.
If my understanding is right and if, as you write, there is a limit of 10 concurrent requests, you could try the mergeMap operator, specifying its concurrent parameter as well.
A solution could therefore be the following
const clientRequestParams = [params1, params2, ..., params500]
// use the from function from rxjs to create a stream of params
from(clientRequestParams).pipe(
  mergeMap(params => {
    return myRequest(params)
  }, 10) // 10 here is the concurrent parameter which limits the number of
         // concurrent requests on the fly to 10
).subscribe(
  responseNotification => {
    // do something with the response that you get from one invocation
    // of the service in the server
  }
)
If you adopt this strategy, you limit the concurrency but you are not guaranteed the order in the sequence of the responses. In other words, the second request can return before the first one has returned. So you need to find some mechanism to link the response to the request. One simple way would be to return not only the response from the server, but also the params which you used to invoke that specific request. In this case the code would look like this
const clientRequestParams = [params1, params2, ..., params500]
// use the from function from rxjs to create a stream of params
from(clientRequestParams).pipe(
  mergeMap(params => {
    return myRequest(params).pipe(
      map(resp => {
        return {resp, params}
      })
    )
  }, 10)
).subscribe(
  responseNotification => {
    // do something with the response that you get from one invocation
    // of the service in the server
  }
)
With this implementation you would create a stream which notifies both the response received from the server and the params used in that specific invocation.
You can also adopt other strategies, e.g. return the response together with a sequence number representing that request, or something similar.
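A possible sketch of the sequence-number variant, reusing the hypothetical clientRequestParams and myRequest from the snippets above; mergeMap's project function already receives the index of the source value, and toArray can restore the forkJoin-like "everything as one array" shape:
import { from } from 'rxjs';
import { mergeMap, map, toArray } from 'rxjs/operators';

from(clientRequestParams).pipe(
  mergeMap(
    // the second argument of the project function is the emission index,
    // which works as the sequence number of the request
    (params, index) => myRequest(params).pipe(map(resp => ({ resp, index }))),
    10 // concurrency limit, as before
  ),
  // collect all notifications into a single array, similar to what forkJoin
  // gives, then restore the original request order
  toArray(),
  map(results => results.sort((a, b) => a.index - b.index))
).subscribe(orderedResponses => {
  // one array with all the responses, in the order the requests were created
});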

Rxjs buffer the emitted values for specified time after source emitted values

I have a source$ observable that collects a stream of data when certain events trigger. I want to collect the data that occurs within a specified time into an array.
const eventSubject = new Subject();
eventSubject.next(data);
const source$ = eventSubject.asObservable();
source$.pipe(takeUntil(destroyed$)).subscribe(
  data => {
    console.log(data);
  }
);
The above source$ handles emitted data immediately.
Now I want to improve this so that it waits for a few seconds, collects all the data that happened in that time span, and emits once. So I modified it to use bufferTime, like below:
const source$ = eventSubject.asObservable();
source$.pipe(takeUntil(destroyed$), bufferTime(2000)).subscribe(
  data => {
    console.log(data);
  }
);
After testing with bufferTime, I found that it emits every 2s even when the source is not receiving data; in that case it emits an empty array.
What I want is: only when source$ receives data, start buffering for 2s, then emit the value. If source$ is not receiving data, it shouldn't emit anything.
I checked bufferWhen, windowWhen and windowTime, but none of them meet my requirements; they all emit at every specified time interval.
Is there another operator that can do what I want?
Thanks a lot.
You can just add a filter operator to ignore the empty array emissions:
const source$ = eventSubject.asObservable();
source$.pipe(takeUntil(destroyed$), bufferTime(2000), filter(arr => arr.length > 0)).subscribe(
  data => {
    console.log(data);
  }
);
I'd go for connect(shared$ => ...) and buffer(signal$).
I think something along these lines:
source$.pipe(
  connect(shared$ => shared$.pipe(
    buffer(shared$.pipe(
      debounceTime(2000)
    ))
  ))
)
connect creates a shared observable so that you can take multiple subscriptions off the source without actually opening multiple subscriptions to it.
Inside it I run buffer, whose closing notifier is the debounceTime of the same shared source, so it debounces for that long (i.e. it emits the array when source$ doesn't emit for more than 2 seconds).
Maybe what you need is throttleTime(2000, { leading: false, trailing: true }) instead of debounceTime. It depends on your use case.
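A hedged sketch of that variant (note that throttleTime takes its config as the third argument, after the scheduler); the eventSubject/source$ names just mirror the snippets above:
import { Subject, asyncScheduler } from 'rxjs';
import { connect, buffer, throttleTime } from 'rxjs/operators';

const eventSubject = new Subject<unknown>();
const source$ = eventSubject.asObservable();

const buffered$ = source$.pipe(
  connect(shared$ => shared$.pipe(
    buffer(shared$.pipe(
      // trailing-only throttle: the buffer closes 2 s after the first emission
      // of a burst, instead of 2 s after the last one as with debounceTime
      throttleTime(2000, asyncScheduler, { leading: false, trailing: true })
    ))
  ))
);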
The optimal solution for this case is to use the buffer operator with a debounceTime notifier.
For example:
const source$ = eventSubject.asObservable();
source$
  .pipe(
    // buffer: accumulates emissions into an array until the closing notifier
    // (the argument below) emits
    buffer(
      // debounceTime: emits once 2 seconds have passed without another emission
      source$.pipe(debounceTime(2000))
    )
  )
  .subscribe(
    // receives everything emitted during those 2 seconds as an array,
    // with the items in the order they were triggered
    data => {
      console.log(data);
    }
  );
buffer:
Buffers the source Observable values until closingNotifier emits.
debounceTime:
Emits a notification from the source Observable only after a particular time span has passed without another source emission.
This solution, in contrast to the filter-operator solution, will not keep an interval/timer alive.
And it's pretty clean and elegant, IMO.

How to tabulate/aggregate a total value from an array of observables using reduce/scan (in NGRX/NGXS)

I am trying to aggregate/tabulate the results of a set of observables. I have an array of observables that each return a number and I want to total up those results and emit that as the value. Each time the source numbers change, I want the end result to reflect the new total. The problem is that I am getting the previous results added to the new total. This has to do with how I am using the reduce/scan operator. I believe it needs to be nested inside a switchMap/mergeMap, but so far I have been unable to figure out the solution.
I mocked up a simple example. It shows how many cars are owned by all users in total.
Initially, the count is correct, but when you add a car to a user, the new total includes the previous total.
https://stackblitz.com/edit/rxjs-concat-observables-3-drfd36
Any help is greatly appreciated.
Your scan works perfectly; the point is that on each update the stream receives all the data again, so the fastest fix, I think, is to create a new instance of the stream in handleClickAddCar.
https://stackblitz.com/edit/rxjs-wrong-count.
I ended up doing this:
this.carCount$ = this.users$.pipe(
  map((users: User[]): Array<Observable<number>> => {
    let requests = users.map(
      (user: User): Observable<number> => {
        return this.store.select(UserSelectors.getCarsForUser(user)).pipe(
          map((cars: Car[]): number => {
            return cars.length;
          })
        );
      }
    );
    return requests;
  }),
  flatMap((results): Observable<number> => {
    return combineLatest(results).pipe(
      take(1),
      flatMap(data => data),
      reduce((accum: number, result: number): number => {
        return accum + result;
      }, 0)
    )
  })
);
I think the take(1) ends up doing the same thing as Yasser was doing above by recreating the entire stream. I think this way is a little cleaner.
I also added another stream below it (in the code) that goes one level deeper in terms of retrieving observables of observables.
https://stackblitz.com/edit/rxjs-concat-observables-working-1
Anyone have a cleaner, better way of doing this type of roll-up of observable results?
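For what it's worth, one possible alternative sketch, mirroring the hypothetical store/selectors from the snippet above rather than a definitive implementation: switchMap into a combineLatest of the per-user counts and sum them with a plain Array.reduce, so no accumulator survives across emissions.
import { combineLatest, of, Observable } from 'rxjs';
import { switchMap, map } from 'rxjs/operators';

this.carCount$ = this.users$.pipe(
  switchMap((users: User[]): Observable<number[]> =>
    users.length
      ? combineLatest(
          users.map(user =>
            this.store
              .select(UserSelectors.getCarsForUser(user))
              .pipe(map((cars: Car[]) => cars.length))
          )
        )
      : of([])
  ),
  // Array.prototype.reduce starts from 0 on every emission, so earlier totals
  // are never carried over; switchMap drops subscriptions to stale user lists
  map(counts => counts.reduce((total, n) => total + n, 0))
);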

Resources