I have an external hot source pushing values before observers can subscribe. Upon subscription, the late observers should receive the latest value and every value from that point on. For this I used the following code (the relevant line is marked with '<<<'; the Subject s is only here to create the simplest possible sample, in reality the hot source works differently):
// irrelevant, just to send values
const s = new Subject();
// make the observable cache the last value
const o = s.pipe(shareReplay(1)); // <<<
// now, before subscription, values start coming in
s.next(1);
s.next(2);
s.next(3);
o.subscribe(n => console.warn('!!!', n));
This doesn't work (I expected it to print !!! 3 but nothing happens), but I found a way to make it work:
// irrelevant, just to send values
const s = new Subject();
const r = new ReplaySubject(1);
s.subscribe(r);
const o = r.asObservable();
s.next(1);
s.next(2);
s.next(3);
o.subscribe(n => console.warn('!!!', n));
i.e. instead of using shareReplay(1), I create a ReplaySubject(1) and use it as a bridge. With this code, I do get the coveted !!! 3.
While I'm happy it works, I would like to understand why the first snippet doesn't. I always thought shareReplay was pretty much equivalent to the second approach and was actually more or less implemented this way. What am I missing?
When you use s.pipe(shareReplay(1)) you're only adding an operator to the chain (much like extending the chain's prototype). There's no subscription yet, and shareReplay doesn't subscribe to its source while it has no observers of its own. So it isn't caching anything, because nothing has subscribed to the source Observable, even though the source is "hot".
However, when you use s.subscribe(r) you're making a regular subscription to s, so r starts receiving items right away and the ReplaySubject caches them.
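To illustrate this (a minimal sketch using the same Subject-based setup as the question): as soon as the shared observable gets its first subscriber, shareReplay(1) subscribes to the source and starts caching, so a later subscriber does receive the latest value.
import { Subject } from 'rxjs';
import { shareReplay } from 'rxjs/operators';

const s = new Subject<number>();
const o = s.pipe(shareReplay(1));

// An early (even empty) subscription forces shareReplay to subscribe
// to the source, so its internal buffer starts filling.
o.subscribe();

s.next(1);
s.next(2);
s.next(3);

// The late subscriber now receives the cached value: !!! 3
o.subscribe(n => console.warn('!!!', n));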
I have two observables
baseObs$ = serviceCall(); // this returns Observable<Foo>
secondObs$ = serviceCall(args); // this returns Observable<Bar>
args in this example is a public variable defined somewhere else. It doesn't need to be, though, if that makes this easier.
baseObs I can call whenever I want, but secondObs I can only call after baseObs has been successfully called and handled (I don't know the right words, so an example follows):
I have now something like
baseObs$.subscribe(x => {
  const args = x.args; // just an example. Point is, I need x to build args.
  serviceCall(args).subscribe(y => {
    console.log(y); // This is fine
  });
});
This suits my needs but I got feedback that no subscribe should live inside another subscribe. How would you achieve same thing using baseObs$ and secondObs$ defined above?
PS. This is all pseudocode, but hopefully I didn't make too many typos. I think the idea should be clear, though.
In the simple case that the first observable only emits once (like a typical HTTP request), any one of switchMap, mergeMap etc. will do:
serviceCall().pipe(
  switchMap(x => serviceCall(x.args))
).subscribe(console.log);
If that assumption is not true, you're going to want to read their respective documentation to understand how behavior will change between them. In fact, I'd recommend reading up on them to know their differences even just in general, as it's very valuable information when dealing with reactive code.
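For instance (a rough sketch, not tied to the original services; fakeServiceCall is a hypothetical stand-in): with a source that emits several times, switchMap cancels the in-flight inner call when a new value arrives, concatMap queues the inner calls one after another, and mergeMap runs them concurrently.
import { of } from 'rxjs';
import { switchMap, delay } from 'rxjs/operators';

// Hypothetical stand-in for serviceCall(args): resolves after a short delay.
const fakeServiceCall = (args: number) => of(`result for ${args}`).pipe(delay(100));

of(1, 2, 3).pipe(
  // switchMap: only the inner call for the latest value (3) survives,
  // so this logs just "result for 3".
  switchMap(x => fakeServiceCall(x))
  // concatMap(x => fakeServiceCall(x)) // would run 1, 2, 3 one after another
  // mergeMap(x => fakeServiceCall(x))  // would run 1, 2, 3 concurrently
).subscribe(console.log);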
I am trying to implement top level await in my react native & expo environment so I can pull data from AsyncStorage and store it in a variable to use globally in my app.
I'm running the app on an Android emulator. All I want to do is get the data and use it in my application, but the variable always returns an empty object when called outside the async function. It seems a value is assigned to the variable before the data from AsyncStorage has finished loading.
async function getData() {
  try {
    const value = await AsyncStorage.getItem('redditInsights')
    console.log('----------- getStore insights ---------')
    const insights = JSON.parse(value)
    console.log(insights) // -------------> returns desired data
    return insights
  }
  catch (error) {
    console.log(error)
  }
}
const outsights = getData()
console.log("-------------------- outsights---------------------")
console.log(outsights) // ----------------------> returns empty object
I know I can pull the data inside an async function, but then I can't use that data anywhere except inside that function. And if I assign it to a variable declared outside the async function, the variable gets read before the data has even been pulled.
Ideally I would simply store the data into a variable to use wherever I want without an async function like so, but this requires top level await support:
const value = await AsyncStorage.getItem('redditInsights')
const insights = JSON.parse(value)
I tried implementing top level await through numerous flags (--harmony-top-level-await, --experimental-repl-await, --experimental-top-level-await) at npm start, like so: npm start --flag, to no avail. I also upgraded to Node v14.15.1, which is supposed to support top level await, but I still get an error.
How do I implement top level await in my React Native & Expo environment so I can use the data from my AsyncStorage?
Or if anyone knows a better way to get my data from AsyncStorage, I'm more than willing to try it out!!
The simple answer is this is not possible.
There's a more detailed answer about top level await in React Native here, which says that React Native doesn't meet the requirements for top level await (it uses CommonJS-style modules, not native ECMAScript modules).
But your code sample has a mistake: you could declare a top-level variable insights and assign the result of your async work to it. The mistake is using the return value of getData. Instead, you would declare a top-level variable with let and then assign the result to it, so instead of return insights you would do something like topScopeInsights = insights after a let topScopeInsights.
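In code, that pattern would look roughly like this (a sketch; topScopeInsights is just an illustrative name, and the AsyncStorage import depends on your setup):
import AsyncStorage from '@react-native-async-storage/async-storage';

// Module scope: declared here, but undefined until the async work finishes.
let topScopeInsights;

async function getData() {
  try {
    const value = await AsyncStorage.getItem('redditInsights');
    topScopeInsights = value != null ? JSON.parse(value) : null; // assign instead of returning
  } catch (error) {
    console.log(error);
  }
}

getData();
// Right here topScopeInsights is still undefined; it only gets a value
// once the promise resolves.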
But this doesn't really help, because you won't know whether the data has arrived yet. The app will boot with the variable undefined, and you'll then need to check whether it has been set or not. That's what promises are for.
So, back to my original answer, this is not possible. :-(
What I am really interested in with DataLoader is the per-request caching. For example, say my GraphQL query needs to call getUser("id1") three times; I would like something to dedupe that call.
However, it seems that with DataLoader I need to pass an array of keys into my batch function, and multiple requests will be batched into one API call.
This has me making a few assumptions that I don't like:
1.) That each service I am calling has a batch API (some of the ones I'm dealing with do not).
2.) What if multiple calls get batched into one API call, and that call fails because one of the items was not found? Normally I could handle this by returning null for that field, and that could be a valid case. Now, however, my entire call may fail if the batch API decides to throw an error because one item was not found.
Is there any way to use DataLoader with single-key requests?
Both assumptions are wrong because the implementation of the batch function is ultimately left up to you. As indicated in the documentation, the only requirements when writing your batch function are as follows:
A batch loading function accepts an Array of keys, and returns a Promise which resolves to an Array of values or Error instances.
So there's no need for the underlying data source to also accept an array of IDs. And there's no need for one failed lookup to cause the whole function to throw, since you can return either null or an Error instance for any particular ID in the array you return. In fact, your batch function should never throw; it should always resolve to an array of values and/or Error instances.
In other words, your implementation of the batch function might look something like:
async function batchFn (ids) {
  const result = await Promise.all(ids.map(async (id) => {
    try {
      const foo = await getFooById(id)
      return foo
    } catch (e) {
      // either return null or the error
      return e
    }
  }))
  return result
}
It's worth noting that it's also possible to set the maxBatchSize to 1 to effectively disable batching. However, this doesn't change the requirements for how your batch function is implemented -- it always needs to take an array of IDs and always needs to return an array of values/errors of the same length as the array of IDs.
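For completeness, a rough sketch of that option (batchFn is the function above, and getFooById remains a hypothetical single-item fetch):
import DataLoader from 'dataloader';

// With maxBatchSize: 1 the batch function is always called with a
// single-element key array, but its signature stays the same.
const fooLoader = new DataLoader(batchFn, { maxBatchSize: 1 });

// Duplicate keys within one request are still deduped by the per-request
// cache: these three loads trigger only one call to getFooById('id1').
Promise.all([
  fooLoader.load('id1'),
  fooLoader.load('id1'),
  fooLoader.load('id1'),
]).then(console.log);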
Daniel's solution is perfectly fine and is in fact what I've used so far, after extracting it into a helper function.
But I just found another solution that does not require that much boilerplate code.
new DataLoader<string, string>(async ([key]) => [await getEntityById(key)], {batch: false});
When we set batch: false, we always get a key array of size one passed as the argument. We can therefore simply destructure it and return a one-element array with the data. Notice the brackets around the return value! If you omit those, this can go horribly wrong, e.g. for string values.
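To make the point about the brackets concrete (a sketch; getEntityById is the same hypothetical single-item fetch as above):
import DataLoader from 'dataloader';

declare function getEntityById(id: string): Promise<string>;

// Correct: the batch function resolves to a one-element array of results.
const loader = new DataLoader<string, string>(
  async ([key]) => [await getEntityById(key)],
  { batch: false }
);

// Wrong: without the brackets the batch function no longer resolves to an
// array of results, so DataLoader cannot match results back to keys.
// async ([key]) => await getEntityById(key)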
Coming from AngularJS, I'm struggling to solve the following problem. I need a function that returns an object (let's call it A), but this object cannot be returned until all the requests contained in that function are resolved. The process should be like:
1. The object A is downloaded from a remote server.
2. Using A, we do operations over another object, B:
   - B is downloaded from the server.
   - B is patched using some attributes from A.
3. Using A and the result of B, we do operations over a third object, C:
   - C is downloaded from the server.
   - C is patched using some attributes from A and B.
4. After B and C are processed, the function must return A.
I'd like to understand how to do something like this using rxjs, but with Angular 6 most of the examples around the internet seem to be deprecated, and the tutorials out there are not really helping me. And I cannot modify the backend to make this a bit more elegant. Thanks a lot.
Consider the following Observables:
const sourceA = httpClient.get(/*...*/);
const sourceB = httpClient.get(/*...*/);
const sourceC = httpClient.get(/*...*/);
Where httpClient is Angular's HttpClient.
The sequence of the operations you described may look as follows:
const A = sourceA.pipe(
  switchMap(a => sourceB.pipe(
    map(b => {
      // do some operation using a and b.
      // Return both a and b in an array, but you can
      // also return them in an object if you wish.
      return [a, b];
    })
  )),
  switchMap(([a, b]) => sourceC.pipe(
    map(c => {
      // do some operations using a, b, and/or c.
      return a;
    })
  ))
);
Now you just need to subscribe to A:
A.subscribe(a => console.log(a));
You can read more about RxJS operators in the RxJS documentation.
Well, first of all, it appears to me that this function-call, as described, would be somehow expected to block the calling process until all of the specified events have occurred – which of course is unreasonable in JavaScript.
Therefore, first of all, I believe that your function should require, as its perhaps-only parameter, a callback that will be invoked when everything has finally taken place.
Now, as to how to handle steps 1, 2, and 3 elegantly: what immediately comes to mind is the notion of a finite-state machine (FSM) algorithm.
Let's say that your function-call causes a new "request" to be placed on some request-table queue, and, if necessary, a timer-request (set to go off in 1 millisecond) to service that queue. (This entry will contain, among other things, a reference to your callback.) Let's assume also that the request is given a random-string "nonce" that will serve to uniquely identify it: this will be passed to the various external requests and must be included in their corresponding replies.
The FSM idea is that the request will have a state attribute, such as DOWNLOADING_FROM_B, B_DOWNLOADS_COMPLETE, DOWNLOADING_FROM_C, C_REQUESTS_COMPLETE, and so on, so that each callback that plays a part in this fully asynchronous process can (1) locate the request entry by its nonce, and then (2) unambiguously know what to do next, and what new state (if any) to assign, based solely on the entry's current state.
For instance, when the state reaches C_REQUESTS_COMPLETE, it would be time to invoke the callback that you originally provided, and to delete the request-table entry.
You can easily map-out all of the "state transitions" that might occur in an arbitrarily-complex scenario (what states can lead to what states, and what to do when they do), whether or not you actually create a data-structure to represent that so-called "state table," although sometimes it is even-more elegant(!) when you do. (Possibly-messy decision logic is simply pushed to a simple table-lookup.)
This is, of course, a classic algorithm that is applicable to – and, has been used in – "every programming language under the sun." (Lots of hardware devices use it, too.)
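A very rough sketch of such a request table and its state transitions (all names and states here are purely illustrative, following the labels above):
type State = 'DOWNLOADING_FROM_B' | 'DOWNLOADING_FROM_C' | 'C_REQUESTS_COMPLETE';

interface RequestEntry {
  nonce: string;
  state: State;
  callback: (a: unknown) => void; // the caller-supplied callback
  a: unknown;                     // object A, already downloaded
  b?: unknown;                    // object B, filled in later
}

const requestTable = new Map<string, RequestEntry>();

// Called by every external-reply handler: look the entry up by its nonce,
// decide what to do based solely on its current state, then advance it.
function onReply(nonce: string, payload: unknown) {
  const entry = requestTable.get(nonce);
  if (!entry) return;

  switch (entry.state) {
    case 'DOWNLOADING_FROM_B':
      entry.b = payload;                  // B arrived; patch it using entry.a here
      entry.state = 'DOWNLOADING_FROM_C'; // then issue the request for C
      break;
    case 'DOWNLOADING_FROM_C':
      entry.state = 'C_REQUESTS_COMPLETE'; // C arrived; patch it using a and b here
      entry.callback(entry.a);             // everything done: hand A back
      requestTable.delete(nonce);
      break;
  }
}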
I set up a subject and then chained some methods on it. It seems to work as intended until it gets to .switch(), which I thought would simply keep track of the last call. I get the error Property 'subscribe' does not exist on type 'ApiChange'. It seems to convert it from an observable to the type ApiChange. I don't understand this behavior. Should I be using a different operator?
Service:
private apiChange = new Subject<ApiChange>();
apiChange$ = this.apiChange.asObservable().distinctUntilChanged().debounceTime(1000).switch();
Component:
this.service.apiChange$.subscribe(change => {
this.service.method(change);
});
.debounceTime(1000) will already ensure that you only get one value out of your observable chain after each one-second quiet period. All the values emitted before the 1-second quiet time has elapsed are discarded.
With a simple Subject (not a ReplaySubject), past values are not provided to subscribers anyway.
You probably just want to skip the .switch() and enjoy the chain without it.
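In current pipeable-operator syntax that chain would look roughly like this (a sketch; your dot-chained version uses the older RxJS 5 patch operators, and ApiChange below is just a stand-in for the question's own type):
import { Subject } from 'rxjs';
import { debounceTime, distinctUntilChanged } from 'rxjs/operators';

type ApiChange = unknown; // stand-in for the question's own ApiChange type

export class ApiService {
  private apiChange = new Subject<ApiChange>();

  // No switch(): the emitted values are plain ApiChange objects, not inner
  // Observables, so there is nothing to flatten.
  apiChange$ = this.apiChange.asObservable().pipe(
    distinctUntilChanged(),
    debounceTime(1000)
  );
}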