I have the following code:
const observable_one = this.loadFromStorage(key) //fast!
const observable_two = this.http.myRequest() //slow!
my_observable = merge(
  observable_one,
  observable_two
)
Now, when I do the following, it only gives data AFTER the slow second one has finished:
const data = await my_observable.toPromise()
whilst
my_observable.subscribe(data => {
//work with data
})
emits twice, as expected.
How can this be explained? How can toPromise() know there is something coming up?
For some reason, I prefer the await / toPromise() approach.
You can use combineLatest:
combineLatest(observable_one, observable_two).subscribe(data => {
//work with data
})
My situation is as follows: I am performing sequential HTTP requests, where one HTTP request depends on the response of the previous. I would like to combine the response data of all these HTTP requests into one observable. I have implemented this before using an async generator. The code for this was relatively simple:
async function* AsyncGeneratorVersion() {
let moreItems = true; // whether there is a next page
let lastAssetId: string | undefined = undefined; // used for pagination
while (moreItems) {
// fetch current batch (this performs the HTTP request)
const batch = await this.getBatch(/* arguments */, lastAssetId);
moreItems = batch.more_items;
lastAssetId = batch.last_assetid;
yield* batch.getSteamItemsWithDescription();
}
}
I am trying to move away from async generators and towards RxJS Observables. My best (and working) attempt is as follows:
const observerVersion = new Observable<SteamItem>((subscriber) => {
(async () => {
let moreItems = true;
let lastAssetId: string | undefined = undefined;
while (moreItems) {
// fetch current batch (this performs the HTTP request)
const batch = await this.getBatch(/* arguments */, lastAssetId);
moreItems = batch.more_items;
lastAssetId = batch.last_assetid;
const items = batch.getSteamItemsWithDescription();
for (const item of items) subscriber.next(item);
}
subscriber.complete();
})();
});
Now, I believe there must be some way of improving this Observable variant - this code does not seem very reactive to me. I have tried several things using pipe, but these were all unsuccessful.
I found concatMap to come close to a solution. It allowed me to concat the next HTTP request as an observable (done with the this.getBatch method), however I could not find a good way to avoid discarding the response of the current HTTP request.
How can this be achieved? In short I believe this problem could be described as appending data to an observable inside the observable itself. (But perhaps this is not a good way of handling this situation)
TLDR;
Here's a working StackBlitz demo.
Explanation
Here would be my approach:
// Faking an actual request
const makeReq = (prevArg, response) =>
new Promise((r) => {
console.log(`Running promise with the prev arg as: ${prevArg}!`);
setTimeout(r, 1000, { prevArg, response });
});
// Preparing the sequential requests.
const args = [1, 2, 3, 4, 5];
from(args)
.pipe(
// Running the requests sequentially.
mergeScan(
(acc, crtVal) => {
// `acc?.response` will refer to the previous response
// and we're using it for the next request.
return makeReq(acc?.response, crtVal);
},
// The seed (works the same as in `reduce`).
null,
// Making sure that only one request is run at a time.
1
),
// Combining all the responses into one object
// and emitting it after all the requests are done.
reduce((acc, val, idx) => ({ ...acc, [`request${idx + 1}`]: val }), {})
)
.subscribe(console.warn);
Firstly, from(array) will emit each item from the array, synchronously and one by one.
Then, there is mergeScan. It is exactly the result of combining scan and merge. With scan, we can accumulate values (in this case we're using it to access the response of the previous request), and what merge does is allow us to use observables.
To make things a bit easier to understand, think of the Array.prototype.reduce function. It looks something like this:
[].reduce((acc, value) => { return { ...acc }}, /* Seed value */{});
What merge does in mergeScan is allow the accumulator to return an observable, something like (acc, value) => new Observable(...), instead of a plain value like return { ...acc }. The latter implies synchronous behavior, whereas with the former we can have asynchronous behavior.
Let's go a bit step by step:
when 1 is emitted, makeReq(undefined, 1) will be invoked
after the first makeReq (from above) resolves, makeReq(1, 2) will be invoked
after makeReq(1, 2) resolves, makeReq(2, 3) will be invoked and so on...
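For reference, running the snippet above should log roughly the following (one request per second, since each fake request resolves after 1000ms):
// Running promise with the prev arg as: undefined!
// Running promise with the prev arg as: 1!
// Running promise with the prev arg as: 2!
// Running promise with the prev arg as: 3!
// Running promise with the prev arg as: 4!
// { request1: { prevArg: undefined, response: 1 }, request2: { prevArg: 1, response: 2 }, ... }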
Somebody I consulted regarding this matter came up with this solution; I think it's quite elegant:
defer(() => this.getBatch(options)).pipe(
expand(({ more_items, last_assetid }) =>
more_items
? this.getBatch({ ...options, startAssetId: last_assetid })
: EMPTY,
),
concatMap((batch) => batch.getSteamItemsWithDescription()),
);
From my understanding, the use of expand here is very similar to the use of mergeScan in Andrei's answer.
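For readers unfamiliar with expand, here is a tiny, self-contained sketch of what it does (not tied to the getBatch types above): every value it emits is fed back into the projection until the projection returns EMPTY, which is what makes it fit the "keep fetching while more_items is true" pagination pattern:
import { of, EMPTY } from 'rxjs';
import { expand } from 'rxjs/operators';

of(1).pipe(
  // each emitted value is fed back in; returning EMPTY stops the recursion
  expand((n) => (n < 5 ? of(n + 1) : EMPTY))
).subscribe(console.log); // 1, 2, 3, 4, 5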
To be honest, I am a total noob at NGRX and have only limited experience with RxJS. But essentially I have code similar to this:
@Effect()
applyFilters = this.actions$.pipe(
ofType<ApplyFilters>(MarketplaceActions.ApplyFilters),
withLatestFrom(this.marketplaceStore.select(appliedFilters),
this.marketplaceStore.select(catalogCourses)),
withLatestFrom(([action, filters, courses]) => {
return [courses,
this.combineFilters([
this.getCourseIdsFromFiltersByFilterType(filters, CatalogFilterType.TRAINING_TYPE),
this.getCourseIdsFromFiltersByFilterType(filters, CatalogFilterType.INDUSTRIES)
])
];
}),
map(([courses, filters]) => {
console.log('[applyFilters effect] currently applied filters =>', filters);
console.log('courseFilters', filters);
const filteredCourses = (courses as ShareableCourse[]).filter(x => (filters as number[]).includes(+x.id));
console.log('all', courses);
console.log('filtered', filteredCourses);
return new SetCatalogCourses(filteredCourses);
})
);
Helper method:
private combineFilters(observables: Observable<number[]>[]): number[] {
if (!observables.some(x => x)) {
return [];
} else {
let collection$ = (observables[0]);
const result: number[] = [];
for (let i = 0; i < observables.length; i++) {
if (i >= 1) {
collection$ = concat(collection$, observables[i]) as Observable<number[]>;
}
}
collection$.subscribe((x: number[]) => x.forEach(y => result.push(y)));
return result;
}
}
So essentially the store objects get populated and I can read them. I know that the observables returned by this.getCourseIdsFromFiltersByFilterType(args) do work, because the 'filters' show up in the console log. But the timing of the operation is wrong. I have been reading up and am just lost after trying switchMap, mergeMap, forkJoin. Everything seems to look okay, but when I try to actually traverse the collections for the results of the observables from the service, they are not realized yet. I am willing to try anything, but in the simplest form the problem is this:
Two observables need to be called, either in the same order or close to it. Their results are of type number[]. There is a complex class collection with an 'id' property that this number[] should include. This works just fine when the results are not async, or inside a component. (I even dummied static values with variables to check my 'filter' then 'includes' logic, and it works.) But in NGRX I am kind of lost, since the effect needs to return an action, and I am simply not good enough at RxJS to formulate a way to make it happy and ensure the observables are fully resolved so their values from the services can be used appropriately. Again, I can see that my console log of 'filters' is there, yet when I check its 'length' it is always zero, so I know there is a timing problem somewhere. Any help is much appreciated.
If I understand the problem, you may want to try to substitute this
withLatestFrom(([action, filters, courses]) => {
return [courses,
this.combineFilters([
this.getCourseIdsFromFiltersByFilterType(filters, CatalogFilterType.TRAINING_TYPE),
this.getCourseIdsFromFiltersByFilterType(filters, CatalogFilterType.INDUSTRIES)
])
];
}),
with something like this
switchMap(([action, filters, courses]) => {
  return forkJoin(
    this.getCourseIdsFromFiltersByFilterType(filters, CatalogFilterType.TRAINING_TYPE),
    this.getCourseIdsFromFiltersByFilterType(filters, CatalogFilterType.INDUSTRIES)
  ).pipe(
    map(([trainingFilters, industryFilters]) => {
      return [courses, [...trainingFilters, ...industryFilters]];
    })
  );
}),
Now some explanations.
When you exit this
withLatestFrom(this.marketplaceStore.select(appliedFilters),
this.marketplaceStore.select(catalogCourses)),
you pass this array to the next operator: [action, filters, courses].
The next operator has to call some remote APIs and therefore has to create a new Observable. So you are in a situation where an upstream Observable emits something and an operator takes it and creates a new Observable. These are the situations where operators such as switchMap, mergeMap (aka flatMap), concatMap and exhaustMap have to be used. Such operators flatten the inner Observable and return its result, which is why I would use one of these flattening operators. Why switchMap in your case? That is not really a short story. Maybe reading this can cast some light.
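As a minimal, self-contained illustration of what "flattening" means here (not tied to the effect above):
import { of, timer } from 'rxjs';
import { map, switchMap } from 'rxjs/operators';

of('a', 'b', 'c').pipe(
  // each source value is mapped to an inner Observable; switchMap subscribes to it
  // and emits its values, cancelling the previous inner Observable if a new source
  // value arrives before it completes
  switchMap((v) => timer(100).pipe(map(() => `${v} resolved`)))
).subscribe(console.log); // only 'c resolved' is logged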
Now let's look at the function passed to switchMap
return forkJoin(
  this.getCourseIdsFromFiltersByFilterType(filters, CatalogFilterType.TRAINING_TYPE),
  this.getCourseIdsFromFiltersByFilterType(filters, CatalogFilterType.INDUSTRIES)
).pipe(
  map(([trainingFilters, industryFilters]) => {
    return [courses, [...trainingFilters, ...industryFilters]];
  })
);
This function first executes 2 remote API calls in parallel via forkJoin, then takes the result of these 2 calls and maps it to a new array containing both courses and the concatenation of trainingFilters and industryFilters.
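A small, self-contained illustration of the forkJoin + map combination (assuming the RxJS 6.5+ array signature, with of/delay standing in for the two getCourseIdsFromFiltersByFilterType calls):
import { forkJoin, of } from 'rxjs';
import { delay, map } from 'rxjs/operators';

forkJoin([
  of([1, 2, 3]).pipe(delay(300)), // stands in for the TRAINING_TYPE call
  of([4, 5]).pipe(delay(500)),    // stands in for the INDUSTRIES call
]).pipe(
  // forkJoin emits once, after both inner Observables have completed
  map(([trainingFilters, industryFilters]) => [...trainingFilters, ...industryFilters])
).subscribe(console.log); // [1, 2, 3, 4, 5] after ~500ms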
In my case, multiple requests can be performed in parallel at first; after those requests complete, another request is sent using the previous results. The pseudo code would look like:
let uploads$ = [obs1$, obs2$, obs3$];
Observable.forkJoin(uploads$).mergeMap(
res => {
// never gets called if uploads$ = []
let data = someCalculation(res);
return this.http.post('http://endpoint/api/resource', data);
}
).subscribe(
res => {
}
);
If uploads$ = [], the inner mergeMap never got called.
Can someone help? I'm on RxJS 5.4
It's not called because there is no emission from the source observable. To create one when observables is empty, you can use the defaultIfEmpty or toArray operators.
const observables = [];
Rx.Observable.forkJoin(observables)
.defaultIfEmpty([]) // or .toArray()
.mergeMap(results => Rx.Observable.of(results.length))
.subscribe(console.log);
Let's say I have a rather typical use of Rx that performs a request every time some change event comes in (I write this in the .NET style, but I'm really thinking of JavaScript):
myChanges
.Throttle(200)
.Select(async data => {
await someLongRunningWriteRequest(data);
})
If the request takes longer than 200ms, there's a chance a new request begins before the old one is done - potentially even that the new request is completed first.
How to synchronize this?
Note that this has nothing to do with multithreading, and that's the only thing I could find information about when googling for "rx synchronization" or something similar.
You could use the concatMap operator, which starts working on the next item only after the previous one has completed.
Here is an example where events$ values appear at an interval of 200ms and are then processed successively, each with a different (random) duration:
const { Observable } = Rx;
const fakeWriteRequest = data => {
console.log('started working on: ', data);
return Observable.of(data).delay(Math.random() * 2000);
}
const events$ = Observable.interval(200);
events$.take(10)
.concatMap(i => fakeWriteRequest(i))
.subscribe(e => console.log(e));
<script src="https://unpkg.com/rxjs/bundles/Rx.min.js"></script>
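Applied to the original snippet, the same idea would look roughly like this (keeping the asker's placeholder names and the RxJS 5 chained style used above; someLongRunningWriteRequest is assumed to return a Promise, as in the question):
myChanges
  .throttleTime(200)
  .concatMap(data => Rx.Observable.fromPromise(someLongRunningWriteRequest(data)))
  .subscribe();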
I have an SPA that is loading some global/shared data (let's call this APP_LOAD_OK) and page-specific data (DASHBOARD_LOAD_OK) from the server. I want to show a loading animation until both APP_LOAD_OK and DASHBOARD_LOAD_OK are dispatched.
Now I have a problem with expressing this in RxJS. What I need is to trigger an action after each DASHBOARD_LOAD_OK, as long as there had been at least one APP_LOAD_OK. Something like this:
action$
.ofType(DASHBOARD_LOAD_OK)
.waitUntil(action$.ofType(APP_LOAD_OK).first())
.mapTo(...)
Does anybody know how I can express this in valid RxJS?
You can use withLatestFrom since it will wait until both sources emit at least once before emitting. If you use the DASHBOARD_LOAD_OK as the primary source:
action$.ofType(DASHBOARD_LOAD_OK)
.withLatestFrom(action$.ofType(APP_LOAD_OK) /*Optionally*/.take(1))
.mapTo(/*...*/);
This allows you to keep emitting in the case that DASHBOARD_LOAD_OK fires more than once.
I wanted to avoid implementing a new operator, because I thought my RxJS knowledge was not good enough for that, but it turned out to be easier than I thought. I am keeping this open in case somebody has a nicer solution. Below you can find the code.
Observable.prototype.waitUntil = function(trigger) {
const source = this;
let buffer = [];
let completed = false;
return Observable.create(observer => {
trigger.subscribe(
undefined,
undefined,
() => {
buffer.forEach(data => observer.next(data));
buffer = undefined;
completed = true;
});
source.subscribe(
data => {
if (completed) {
observer.next(data);
} else {
buffer.push(data);
}
},
observer.error.bind(observer),
observer.complete.bind(observer)
);
});
};
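A quick usage sketch (assuming the operator has been patched onto Observable.prototype as above). Note that DASHBOARD_LOAD_OK actions arriving before the first APP_LOAD_OK are buffered and replayed once the trigger completes, rather than dropped:
action$
  .ofType(DASHBOARD_LOAD_OK)
  .waitUntil(action$.ofType(APP_LOAD_OK).first())
  .mapTo(/* the follow-up action */)
  .subscribe(/* ... */);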
If you want to receive every DASHBOARD_LOAD_OK after the first APP_LOAD_OK, you can simply use skipUntil:
action$.ofType(DASHBOARD_LOAD_OK)
  .skipUntil(action$.ofType(APP_LOAD_OK).take(1))
.mapTo(...)
This would only start emitting DASHBOARD_LOAD_OK actions after the first APP_LOAD_OK; all actions before that are ignored.