Live: https://stackblitz.com/edit/rxjs-6aikyy?file=index.ts
const sub = new ReplaySubject(1);
const source = combineLatest(
of('World'),
sub,
).pipe(
map(([, n]) => n),
tap(x => console.log('Tap', x)),
publishReplay(1),
refCount(),
);
sub.next(1);
const subscription = source.subscribe(x => console.log('Sub', x));
sub.next(2);
subscription.unsubscribe();
sub.next(3);
console.log('pause');
sub.next(4);
source.subscribe(x => console.log('Sub', x));
/**
* Tap 1 (unwanted)
* Sub 1 (unwanted)
* Tap 2
* Sub 2
* pause
 * Sub 2 (why exactly does this happen?)
* Tap 4
* Sub 4
*/
This is simplified a bit, but it shows my problem well. In my actual code, I combine one BehaviorSubject and two ReplaySubjects, then pipe the result into switchMap to fetch data from the server. I really don't want to keep making HTTP calls unless someone is actually listening.
Does someone know why I get the first notification from the ReplaySubject (Tap 1 & Sub 1), and also why "Sub 2" appears after the pause?
Actual test case
this.currentPage = new BehaviorSubject(1);
// Internal
const folderIDAndPage: Observable<[string, number]> = combineLatest(
/** Angular's BehaviorSubject of URL mapped to id */
activatedRoute.url.pipe(
switchMap(segments => folderService.getFoldersFromSegments(segments)),
distinctUntilChanged(),
),
/** BehaviorSubject/ReplaySubject (either) of QueryString mapped to page */
this.currentPage.pipe(
distinctUntilChanged(),
),
).pipe(
// Prevent repeating HTTP somehow...
);
// Public
this.folders = folderIDAndPage.pipe(
switchMap(([folderID, page]) => folderService.getFolders(folderID, page)),
// Prevent repeating HTTP somehow...
);
// Public
this.files = folderIDAndPage.pipe(
switchMap(([folderID, page]) => fileService.getFiles(folderID, page)),
// Prevent repeating HTTP somehow...
);
As of now, if I don't subscribe to either, no HTTP request is made; but if I subscribe to files or folders, folderService.getFoldersFromSegments seems to get called twice.
Related
This question builds upon this one, where it is shown how to feed an Observable into a Subject. My question is similar, but I want to avoid making the Observable hot unless it is necessary, so that its .pipe() doesn't run needlessly. For example:
const subject$ = new Subject();
const mouseMove$ = Observable.fromEvent(document, 'mousemove')
.pipe(map(it => superExpensiveComputation(it)));
mouseMove$.subscribe(n => subject$.next(n));
Because of the subscription, this will make mouseMove$ hot, and superExpensiveComputation will be run for every mouse move, whether someone is listening for it on subject$ or not.
How can I feed the result of mouseMove$ into subject$ without running superExpensiveComputation unnecessarily?
You can simply use tap instead of subscribe to pass emissions to your subject:
const mouseMove$ = fromEvent(document, 'mousemove').pipe(
map(it => superExpensiveComputation(it)),
tap(subject$)
);
Of course you still need to subscribe to mouseMove$ to make the data flow, but you don't need to have a subscription dedicated to passing the data to your subject.
However, you'll probably want to add share so as not to repeat the expensive logic for multiple subscribers.
const mouseMove$ = fromEvent(document, 'mousemove').pipe(
map(it => superExpensiveComputation(it)),
share(),
tap(subject$)
);
But... then in that case, do you really need a subject at all? Unless you are going to be calling .next() from somewhere else, you probably don't.
Because of the subscription, this will make mouseMove$ hot
Wrong. Subscribing doesn't change the behavior of an observable.
Observable.fromEvent(document, 'mousemove') is already hot.
Here's a naive cold version of it.
The key takeaway: I have to resubscribe every time to get the latest data.
const { of } = rxjs;
const mouseMove$ = of((() => {
let payload = {clientX: 0};
document.addEventListener("mousemove", (ev) => payload.clientX = ev.clientX)
return payload;
})())
let subscriber = mouseMove$.subscribe(pr => console.log(pr));
const loop = () => {
if(subscriber) {
subscriber.unsubscribe();
}
subscriber = mouseMove$.subscribe(pr => console.log(pr));
setTimeout(loop, 1000);
};
setTimeout(loop, 1000);
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/6.5.5/rxjs.umd.js"></script>
and superExpensiveComputation will be run for every mouse move,
whether someone is listening for it on subject$ or not
Then you can get rid of map(it => superExpensiveComputation(it)) and move that lambda into the subscription, so it still executes on every mouse move, but only after it has been subscribed to.
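To illustrate that suggestion without pulling in RxJS, here is a minimal plain-JavaScript sketch (all names here are invented for the example) of a hot source where the expensive work only runs while a subscriber is attached:

```javascript
// A hot source: it fires for every event regardless of subscribers,
// but the expensive computation lives in the subscriber callback.
const listeners = new Set();
const hotSource = {
  emit(ev) { listeners.forEach(fn => fn(ev)); },
  subscribe(fn) { listeners.add(fn); return () => listeners.delete(fn); },
};

let computations = 0;
const superExpensiveComputation = ev => { computations++; return ev * 2; };

hotSource.emit(1);  // no subscriber yet: nothing is computed
const unsub = hotSource.subscribe(ev => console.log(superExpensiveComputation(ev)));
hotSource.emit(2);  // computed once, logs 4
unsub();
hotSource.emit(3);  // unsubscribed again: nothing is computed
console.log(computations); // 1
```

The source stays hot the whole time; only the cost moves to where a listener actually exists.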
I am looking to preload a bunch of images, and have ruled out base64 and createObjectURL, so I'd take a better option, but this is what I have.
Anyway, this is what I am looking at: a function which loads an array of URLs as images.
const urls = ["lol.jpg"];
const images = urls.map((url) => {
const imageElement = document.createElement('img');
const imageComplete = fromEvent(imageElement, 'load');
imageElement.src = url;
return imageComplete;
});
forkJoin(images)
But how do I correctly handle loading errors here? I have added a new fromEvent, but now I have two events where I used to have just one, and furthermore one of them is the special error case.
const urls = ["lol.jpg"];
const images = urls.map((url) => {
const imageElement = document.createElement('img');
const imageComplete = fromEvent(imageElement, 'load');
const imageError = fromEvent(imageElement, 'error');
imageElement.src = url;
return imageComplete; // <--- not good enough now
});
forkJoin(images)
Is it correct to listen for an error here? Ultimately I need to know if any of these fail and consider them all failures, but during my tests a 404 doesn't reach catchError anywhere, which brings me to this question.
The answer, in part, depends on what fromEvent(imageElement, 'error') does here:
fromEvent(imageElement, 'error').subscribe({
next: val => console.log("next: ", val),
error: err => console.log("error:", err)
});
If you do this, and you receive an error, does the event trigger next or error? Either way, I assume you want to remove imageElement if it fails to load.
If it triggers error, then you can do this:
const imageComplete = fromEvent(imageElement, 'load');
const imageError = fromEvent(imageElement, 'error').pipe(
catchError(err => {
imageElement.remove();
return throwError(err);
})
);
imageElement.src = targetURL;
return merge(imageComplete, imageError);
If it triggers next, then you can switch imageError into a stream that errors out by throwing an error like this:
const imageComplete = fromEvent(imageElement, 'load');
const imageError = fromEvent(imageElement, 'error').pipe(
tap(x => {
imageElement.remove();
throw new Error('image failed to load');
})
);
imageElement.src = targetURL;
return merge(imageComplete, imageError);
Now, forkJoin will fail with your error if any of its sources fail. If you don't want that, you need to handle the error before it reaches your forkJoin.
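The all-or-nothing behaviour can be seen with a plain Promise.all analogue (load is a stand-in defined only for this sketch; forkJoin behaves similarly in that one failing source fails the whole combination unless it is handled per-source first):

```javascript
// A stand-in for an image load that either resolves or rejects.
const load = ok => ok ? Promise.resolve('img') : Promise.reject(new Error('404'));

// One rejection fails the whole combination:
Promise.all([load(true), load(false)])
  .catch(err => console.log('whole join failed:', err.message));

// Handling the failure per-source lets the rest succeed:
Promise.all([load(true), load(false).catch(() => 'dummy-image')])
  .then(imgs => console.log(imgs)); // ['img', 'dummy-image']
```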
Retry failed images a few times instead
The trickiest bit here is that you need to create the image element as part of your stream, so that a retry will not just resubscribe to an event on an already-failed element.
The other tricky bit is that you want to create your imageElement as part of subscribing to your stream rather than as part of its definition. That's what defer essentially accomplishes for you.
With that out of the way, not many changes are needed, and you can retry loading your image pretty simply. You can read up on retryWhen if you want to make the process a bit more sophisticated (like delaying a bit before retrying).
const images = urls.map(targetURL =>
defer(() => of(document.createElement('img'))).pipe(
mergeMap(imageElement => {
const imageComplete = fromEvent(imageElement, 'load');
const imageError = fromEvent(imageElement, 'error').pipe(
catchError(err => {
imageElement.remove();
return throwError(err);
})
);
imageElement.src = targetURL;
return merge(imageComplete, imageError);
}),
retry(5),
catchError(err => {
console.log(`Failed to load img (${targetURL}) 5 times`);
// Throw error if you want forkJoin to fail, emit anything else
// if you want only this image to fail and the rest to keep going
return of(new DummyImage());
})
)
);
forkJoin(images);
In this case, since we've dealt with errors before they hit the forkJoin, we can expect the array of images to contain a DummyImage anywhere that an image failed to load 5 times. You can make DummyImage anything, really. Have a low-res default img loaded locally, for example.
If you return throwError(err) instead, then the entire forkJoin will fail the moment any image fails to load 5 times (It'll still retry 5 times though) and you'll not get any images. That might be what you want?
const s1$ = of(Math.random())
const s2$ = ajax.getJSON(`https://api.github.com/users?per_page=5`)
const s3$ = from(fetch(`https://api.github.com/users?per_page=5`))
const click$ = fromEvent(document, 'click')
click$.pipe(
switchMap(() => s1$)
).subscribe(e => {
console.log(e)
})
I was confused by the code above and cannot reason about it properly.
In the first case (s1$), the same result is received every time. It LOOKS fine to me, even though I cannot understand why switchMap does not start a new stream each time. OK, it is fine.
The really weird thing happens when you run s2$ and s3$. They look equivalent, right? WRONG!!! The behaviours are completely different if you try them out!
The result of s3$ is cached somehow, i.e. if you open the network panel, you will see the HTTP request is sent only ONCE. In comparison, the HTTP request is sent each time for s2$.
My problem is that I cannot use something like ajax from RxJS directly, because the HTTP request is hidden in a third-party library. The solution I can come up with is to use an inline stream, i.e. create a new stream every time:
click$.pipe(
switchMap(() => from(fetch(`https://api.github.com/users?per_page=5`)))
).subscribe(e => {
console.log(e)
})
So, how exactly can I explain this behaviour, and what is the correct way to handle this situation?
One problem is that you actually execute Math.random and fetch while setting up your test case.
// calling Math.random() => using the return value
const s1$ = of(Math.random())
// calling fetch => using the return value (a promise)
const s3$ = from(fetch(`https://api.github.com/users?per_page=5`))
Another is that fetch returns a promise, which resolves only once. from(<promise>) then does not need to re-execute the ajax call; it will simply emit the resolved value.
Whereas ajax.getJSON returns a stream which re-executes every time.
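The resolves-only-once point can be demonstrated with plain promises (fakeFetch is a stand-in defined only for this sketch):

```javascript
// A promise caches its settled value, so reusing it re-reads the cache,
// while calling a factory (defer-like) re-runs the work each time.
let calls = 0;
const fakeFetch = () => { calls++; return Promise.resolve({ users: 5 }); };

const once = fakeFetch();            // work happens here, exactly once
once.then(() => {});
once.then(() => {});                 // same settled value, no new call

const everyTime = () => fakeFetch(); // new promise per invocation
everyTime();
everyTime();

console.log(calls); // 3: one for `once`, one per `everyTime()` call
```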
If you wrap the test-streams with defer you get more intuitive behavior.
const { of, defer, fromEvent } = rxjs;
const { ajax } = rxjs.ajax;
const { switchMap } = rxjs.operators;
// defer Math.random()
const s1$ = defer(() => of(Math.random()));
// no defer needed here (already a stream)
const s2$ = ajax.getJSON('https://api.github.com/users?per_page=5');
// defer `fetch`, but `from` is not needed, as a promise is sufficient
const s3$ = defer(() => fetch('https://api.github.com/users?per_page=5'));
const t1$ = fromEvent(document.getElementById('s1'), 'click').pipe(switchMap(() => s1$));
const t2$ = fromEvent(document.getElementById('s2'), 'click').pipe(switchMap(() => s2$));
const t3$ = fromEvent(document.getElementById('s3'), 'click').pipe(switchMap(() => s3$));
t1$.subscribe(console.log);
t2$.subscribe(console.log);
t3$.subscribe(console.log);
<script src="https://unpkg.com/@reactivex/rxjs@6/dist/global/rxjs.umd.js"></script>
<button id="s1">test random</button>
<button id="s2">test ajax</button>
<button id="s3">test fetch</button>
Imagine a stream of messages, each with an associated user id. For each message that comes in, fetch the associated user information (a 'user-fetch' observable). These user-fetch observables will stay alive and monitor any future changes for the target user.
Questions:
How to prevent duplicate 'user-fetch' observables from being created for a given user-id (and reuse the possibly already created observable)?
How to correctly cleanup all user-fetch observables for unsubscribe and/or complete?
Where I'm at:
I wasn't able to find an existing operator or methodology to prevent duplicate observables, so I wrote an operator similar to switchMap. I don't love it. How is this done in practice?
If I can solve 1, I believe the solution to correct cleanup and reuse is refCount().
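The reuse idea in question 1 can be sketched without RxJS (all names here are invented for illustration): a factory memoizes one handle per id, so duplicate requests share the same source, and cleanup evicts the cache entry.

```javascript
// Cache one "user-fetch" handle per user id; duplicates reuse it.
let creations = 0;
const cache = new Map();

function userFetch(id) {
  if (!cache.has(id)) {
    creations++;
    // Placeholder handle; in real code this would wrap a live observable.
    cache.set(id, { id, close() { cache.delete(id); } });
  }
  return cache.get(id);
}

const a = userFetch(42);
const b = userFetch(42); // reused, not recreated
console.log(a === b, creations); // true 1
a.close();               // cleanup evicts the entry so a later fetch recreates it
```

This is essentially what a refCount-style solution automates: the last unsubscribe plays the role of close().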
If I understood the problem correctly, you have one stream that emits ids and, based on that stream's events, another stream that receives some data related to the id from a remote place (server).
The solution I suggest is to create some kind of store to hold the cached data and, upon receiving a message from the id stream, check it and return either the response of a new request or the cached data.
/**
 * callBackEnd$ mocks an http request
*/
let callBackEnd$ = id => {
customLog("___________________");
customLog("Calling the server for " + id);
customLog("___________________");
return of({ id: id, data: `Some data about ${id}` });
};
/**
 * idStream$ mocks the stream of ids to be requested through http
*/
let idStream$ = from([1, 2, 2, 3, 1, 5, 3, 4, 5]);
/**
* We use reqStore$ to cache the already retrieved data
*/
let reqStore$ = new BehaviorSubject([]);
/**
* 1. We subscribe to the message stream ( the stream that will tell us what to load )
*
 * 2. With `withLatestFrom` we take the current store and check for any cached data, and return
* the cached data or the response of the new request
*
* 3. If the response of the `switchMap` doesn't exist in our store we add it.
*/
idStream$
.pipe(
tap(message => customLog(`Receiving command to retrieve : ${message}`)),
withLatestFrom(reqStore$),
switchMap(([e, store]) => {
let elementSaved = store.find(x => x.id === e);
return elementSaved ? of(elementSaved) : callBackEnd$(e);
}),
withLatestFrom(reqStore$),
tap(([response, store]) => {
if (!store.find(x => x.id === response.id)) {
reqStore$.next([...store, response]);
}
})
)
.subscribe(([currentResponse, currentStore]) => {
customLog("Receiving response for " + currentResponse.data);
});
Here is a live demo at CodeSandbox. I hope that helps you out :)
I'm trying to implement a togglable auto-save feature using RxJS streams. The goal is to:
While auto-save is enabled, send changes to the server as they come.
While auto-save is disabled, buffer the changes and send them to the server when auto-save is re-enabled.
Here is what I came across with:
autoSave$ = new BehaviorSubject(true);
change$ = new Subject();
change$.pipe(
bufferToggle(
autoSave$.pipe(filter(autoSave => autoSave === false)),
() => autoSave$.pipe(filter(autoSave => autoSave === true)),
),
concatMap(changes => changes),
concatMap(change => apiService.patch(change)),
).subscribe(
() => console.log('Change sent'),
(error) => console.error(error),
);
Thanks to bufferToggle, I'm able to buffer the changes while autoSave is off and send them when it's re-enabled.
The problem is that while autoSave is enabled, nothing passes through. I understand it's because bufferToggle ignores incoming values while its opening observable hasn't emitted.
I feel that I should have a condition there to bypass the bufferToggle while autoSave is enabled, but all my attempts miserably failed.
Any idea to achieve this?
We can buffer events in-between autosave on and off using bufferToggle(on, off), and open a filtering window between off and on using windowToggle(off, on). And then we merge those together:
const on$ = autoSave$.filter(v=>v);
const off$ = autoSave$.filter(v=>!v);
const output$ =
  Observable.merge(
    changes$
      .bufferToggle(
        off$,
        () => on$
      ),
    changes$
      .windowToggle(
        on$,
        () => off$
      )
  )
  .flatMap(x => x) // <- flatten buffer and window
Play with this example at https://thinkrx.io/gist/3d5161fc29b8b48194f54032fb6d2363
* Please note that since buffer wraps values in an Array, I've used another flatMap(v=>v) in the example to unwrap buffered values. You might want to disable this particular line to get arrays from buffers mixed with raw values.
Also, check my article "Pausable Observables in RxJS" to see more examples.
Hope this helps
Another solution.
Just one observable to play / pause
export type PauseableOptions = 'paused' | 'playing'
export function pauseableBuffered(pauser$: Observable<PauseableOptions>) {
return function _pauseableBuffer<T>(source$: Observable<T>): Observable<T> {
let initialValue: PauseableOptions = 'paused'
// if a value is already present (say, from a BehaviorSubject), use it as the initial value
const sub = pauser$.subscribe(v => initialValue = v)
sub.unsubscribe()
const _pauser$ = pauser$.pipe(startWith(initialValue), distinctUntilChanged(), shareReplay(1))
const paused$ = _pauser$.pipe(filter((v) => v === 'paused'))
const playing$ = _pauser$.pipe(filter((v) => v === 'playing'))
const buffer$ = source$.pipe(bufferToggle(paused$, () => playing$))
const playingStream$ = source$
.pipe(
withLatestFrom(_pauser$),
filter(([_, state]) => state === 'playing'),
map(([v]) => v)
)
return merge(
buffer$.pipe(
mergeMap(v => v)
),
playingStream$
)
}
}
const stream$ = new Subject<number>()
const playPause$ = new BehaviorSubject<PauseableOptions>('playing')
const result: number[] = []
const sub = stream$.pipe(pauseableBuffered(playPause$))
.subscribe((v) => result.push(v))
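For readers who want the semantics without the RxJS machinery, here is a plain-JavaScript sketch (names invented here) of the buffer-while-paused behaviour the operator above implements: values queue while paused and flush, in order, on resume.

```javascript
// Minimal pauseable sink: delivers immediately while playing,
// buffers while paused, flushes the buffer on resume.
function pauseableSink(consume) {
  let paused = false;
  const buffer = [];
  return {
    pause() { paused = true; },
    resume() { paused = false; while (buffer.length) consume(buffer.shift()); },
    next(v) { paused ? buffer.push(v) : consume(v); },
  };
}

const out = [];
const sink = pauseableSink(v => out.push(v));
sink.next(1);   // playing: delivered immediately
sink.pause();
sink.next(2);   // buffered
sink.next(3);   // buffered
sink.resume();  // flushes 2, 3 in order
sink.next(4);
console.log(out); // [1, 2, 3, 4]
```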