How to set up a pinia store to wait for the state - async-await

I have a (composition) store that is used in a couple of modules; they are unaware of each other and can be initialized at the same moment as well as at different times.
The store retrieves data from an API and saves it in the state. All of the modules call the store's fetchData method and await the promise it returns.
The problem is that when the modules are initialized at the same moment, both perform fetchData, resulting in two requests being fired. The ideal situation would be to:
1. Allow only one request to be fired (that's pretty easy to do, just save the request state in a ref, e.g. "pending").
2. Expose the state as a promise, so that a module awaits the state being loaded from the API and receives its data only thereafter.
All this with the requirement that the modules remain unaware of the store's implementation - the less code on their side the better.
How do you handle such cases?
Is there anything wrong with this approach? If not, how would you implement it?
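For reference, a minimal sketch of the approach described above (the store name, the /api/data endpoint, and the data shape are placeholders): the store keeps the in-flight promise in a ref and hands the same promise to every caller, so concurrent modules share a single request.

// dataStore.ts - hypothetical Pinia composition ("setup") store
import { defineStore } from 'pinia'
import { ref, shallowRef } from 'vue'

export const useDataStore = defineStore('data', () => {
  const data = ref<unknown | null>(null)
  // Holds the in-flight request so concurrent callers reuse it
  const pending = shallowRef<Promise<unknown> | null>(null)

  function fetchData(): Promise<unknown> {
    // Already loaded: resolve immediately from the cached state
    if (data.value !== null) return Promise.resolve(data.value)
    // Request already in flight: every module awaits the same promise
    if (pending.value) return pending.value

    pending.value = fetch('/api/data') // assumed endpoint
      .then((res) => res.json())
      .then((json) => {
        data.value = json
        return json
      })
      .finally(() => {
        pending.value = null
      })

    return pending.value
  }

  return { data, fetchData }
})

Each module then just does await useDataStore().fetchData(); whether that resolves from the cache, joins an in-flight request, or fires a new one is invisible to the caller.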

Related

How to cancel http requests made by Apollo (angular) client?

I noticed that when I unsubscribe from a query, the HTTP request is still executing and is not canceled. I also tried to use AbortController, but without any luck. How does one cancel HTTP requests made by the Apollo client?
This is an old question, but since I just wanted to do the same thing and managed to do it with the latest Apollo Client (3.4.13) and Apollo-Angular (2.6.0): you need to make sure that you're using watchQuery() instead of query(), and then call unsubscribe() on the Apollo subscription returned by the previous request. The latter of course implies that you store the subscription object you want to abort somewhere.
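A minimal sketch of that idea with apollo-angular (the query and component are made up): keep the Subscription returned by subscribing to watchQuery(...).valueChanges and call unsubscribe() on it when you want to drop the request.

// Hypothetical Angular component using apollo-angular
import { Apollo, gql } from 'apollo-angular';
import { Subscription } from 'rxjs';

const GET_ITEMS = gql`query GetItems { items { id name } }`; // made-up query

export class ItemsComponent {
  private sub?: Subscription;

  constructor(private apollo: Apollo) {}

  load(): void {
    // watchQuery() (not query()) so the underlying request can be torn down
    this.sub = this.apollo
      .watchQuery({ query: GET_ITEMS })
      .valueChanges.subscribe((result) => console.log(result.data));
  }

  cancel(): void {
    // Unsubscribing from valueChanges tears down the watched query
    this.sub?.unsubscribe();
  }
}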
This is an old question, but I spent two days on this bananas problem and I want to share for posterity.
We're using Angular and GraphQL (apollo-angular and codegen to generate GraphQL services) and we opted for an event-driven architecture using NgRx to send events and then perform HTTP calls. When sending multiple identical events (but with different property values) we noticed we got stale data in some cases, especially in edge cases like when 20+ of these identical events were sent. Obviously not common, but not ideal, and a hint of possibly poor scaling, since we were going to need many more events in the future.
The way we resolved this issue was by using .watch() instead of .fetch() on the generated GraphQL services. Initially, since .fetch() returned the same Observable as .watch().valueChanges, we thought it was easier and simpler to just use .fetch(), but their behavior is quite different. We were never able to cancel HTTP requests performed by .fetch(). But after changing to .watch().valueChanges, the Observable acted exactly as HTTP request Observables do, complete with (thankfully) cancelation.
So in NgRx, we swapped our generic mergeMap operator for the switchMap operator. This ensures that previous effects listening for dispatched events are canceled. We needed no extra overhead, no .next-ing to Subjects, no extra Subscriptions. Just change .fetch() to .watch().valueChanges and then switchMap to your heart's content. The takeUntil operator will now also cancel these requests, which is our preferred method of unsubscribing from Observables.
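As a rough sketch of the resulting effect (the action names and the generated GetItemsGQL service are invented stand-ins for your codegen output):

// items.effects.ts - hypothetical NgRx effect
import { Injectable } from '@angular/core';
import { Actions, createEffect, ofType } from '@ngrx/effects';
import { map, switchMap } from 'rxjs/operators';
import { loadItems, loadItemsSuccess } from './items.actions'; // assumed actions
import { GetItemsGQL } from './graphql/generated';             // assumed codegen service

@Injectable()
export class ItemsEffects {
  loadItems$ = createEffect(() =>
    this.actions$.pipe(
      ofType(loadItems),
      // switchMap cancels the previous watch() request when a new event arrives
      switchMap(({ filter }) =>
        this.getItemsGQL.watch({ filter }).valueChanges.pipe(
          map((result) => loadItemsSuccess({ items: result.data.items }))
        )
      )
    )
  );

  constructor(private actions$: Actions, private getItemsGQL: GetItemsGQL) {}
}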
Side note: I'm amazed that this information was this hard to come by; honestly, this question and one GitHub issue were all I could find that hinted at this discrepancy. Even now I don't quite understand why anyone would want .fetch() if all it does is perform an HTTP call that will always resolve and then return an Observable that does not behave the way you expect Observables to behave.

How to cache a reactive publisher like Observable?

I'm trying to understand what happens if I cache the result of a method returning a cold Observable. The flow has not been materialized yet, so what does the cache actually contain? I tried to find out using Hazelcast and Spring Boot, but couldn't get the cache working.
Edit:
When I say the cache is not working, I am speaking based on what I see in Hazelcast Management Center. Depending on the cache config (I tried many things), either the cache shows up but has no entries, or the cache doesn't show up at all.
Example:
@javax.cache.annotation.CacheResult
Observable<Integer> random() {
    // Do I get a new number every time?
    return Observable.just(new Random().nextInt());
}
From the rx-java wiki (source here):
A cold Observable emits a particular sequence of items, but can begin emitting this sequence when its Observer finds it to be convenient, and at whatever rate the Observer desires, without disrupting the integrity of the sequence. For example if you convert a static Iterable into an Observable, that Observable will emit the same sequence of items no matter when it is later subscribed to or how frequently those items are observed. Examples of items emitted by a cold Observable might include the results of a database query, file retrieval, or web request.
With a cold Observable, like in your example, the request is made at subscribe time, for each subscriber. Even without a cache, if you subscribe twice to the same Observable, the request will occur twice. The Observable is not bound to a specific stream; it is just a contract describing how to access data.
Caching the result of a method returning an Observable is, I think, somewhat similar to storing the result in a local property: you just avoid recreating the Observable object later. But you cache only the 'getter', not the data.
RxJava provides some tools to achieve caching on its own. You could have a look at Subject or ConnectableObservable.
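To illustrate the concept only (this sketch is RxJS in TypeScript rather than RxJava, so the API differs from the question's Java code): caching the emitted value rather than the Observable can be done by sharing a replayed subscription, which is roughly what replay()/ConnectableObservable gives you in RxJava.

// Concept sketch in RxJS: cache the value, not the Observable
import { defer, Observable } from 'rxjs';
import { shareReplay } from 'rxjs/operators';

// Cold source: the "request" runs once per subscription...
const random$: Observable<number> = defer(() =>
  Promise.resolve(Math.floor(Math.random() * 1000))
);

// ...unless the subscription is shared and the last value replayed
const cachedRandom$ = random$.pipe(shareReplay(1));

cachedRandom$.subscribe((n) => console.log('first subscriber', n));
cachedRandom$.subscribe((n) => console.log('second subscriber, same value', n));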

Is it a good practice to call action within another action (in flux)

I have an action as follows:
SomeActions.doAction1 = function () {
    // ...dispatch event "started"...
    // ...do some processing...
    FewActions.doAnotherAction(); // CAN WE DO THIS?
    // ...do something more...
    // ...dispatch event "completed"...
};
While the above works with no problems, I'm just wondering whether it is valid according to the Flux pattern/standard, or whether there is a better way.
Also, I guess calling Actions from Stores is a bad idea. Correct me if I am wrong.
Yes, calling an Action within another Action is a bad practice. Actions should be atomic; all changes in the Stores should be in response to a single action. They should describe one thing that happened in the real world: the user clicked on a button, the server responded with data, the screen refreshed, etc.
Most people get confused by Actions when they are thinking about them as imperative instructions (first do A, then do B) instead of descriptions of what happened and the starting point for reactive processes.
This is why I recommend to people that they name their Action types in the past tense: BUTTON_CLICKED. This reminds the programmer of the fundamentally externally-driven, descriptive nature of Actions.
Actions are like a newspaper that gets delivered to all the stores, describing what happened.
Calling Actions from Stores is almost always the wrong thing to do. I can only think of one exception: when the Store responds to the first Action by starting up an asynchronous process. When the async process completes, you want to fire off a second Action. This is the case with an XHR call to the server. But the better way is to put the XHR handling code in a Utils module. The store can then respond to the first Action by calling a method in the Utils module, and the Utils module contains the code that calls the second Action when the server response comes back.
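A rough sketch of that Utils-module arrangement (all file, function, and action names here are illustrative):

// TodoWebAPIUtils.ts - hypothetical XHR layer; the only place that fires the follow-up action
import * as TodoActions from './TodoActions'; // assumed action creators

export function fetchTodos(): void {
  fetch('/api/todos') // assumed endpoint
    .then((res) => res.json())
    .then((todos) => TodoActions.todosReceived(todos))   // second action: "server responded"
    .catch((err) => TodoActions.todosRequestFailed(err)); // or: "request failed"
}

// The store responds to the first action by calling TodoWebAPIUtils.fetchTodos()
// instead of dispatching another action itself.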

Tracking ajax request status in a Flux application

We're refactoring a large Backbone application to use Flux to help solve some tight coupling and event/data flow issues. However, we haven't yet figured out how to handle cases where we need to know the status of a specific ajax request.
When a controller component requests some data from a flux store, and that data has not yet been loaded, we trigger an ajax request to fetch the data. We dispatch one action when the request is initiated, and another on success or failure.
This is sufficient to load the correct data, and update the stores once the data has been loaded. But, we have some cases where we need to know whether a certain ajax request is pending or completed - sometimes just to display a spinner in one or more views, or sometimes to block other actions until the data is loaded.
Are there any patterns that people are using for this sort of behavior in Flux/React apps? Here are a few approaches I've considered:
1. Have a 'request status' store that knows whether there is a pending, completed, or failed request of any type. This works well for simple cases like 'is there a pending request for workout data', but becomes complicated if we want to get more granular, e.g. 'is there a pending request for workout id 123'.
2. Have all of the stores track whether the relevant data requests are pending or not, and return that status data as part of the store API - i.e. WorkoutStore.getWorkout would return something like { status: 'pending', data: {} }. The problem with this approach is that this sort of state shouldn't be mixed in with the domain data, as it's really a separate concern. Also, every consumer of the workout store API now needs to handle this 'response with status' instead of just the relevant domain data.
3. Ignore request status - either the data is there and the controller/view acts on it, or the data isn't there and the controller/view doesn't act on it. Simpler, but probably not sufficient for our purposes.
The solutions to this problem vary quite a bit based on the needs of the application, and I can't say that I know of a one-size-fits-all solution.
Often, #3 is fine, and your React components simply decide whether to show a spinner based on whether a prop is null.
When you need better tracking of requests, you may need this tracking at the level of the request itself, or you might instead need this at the level of the data that is being updated. These are two different needs that require similar, but slightly different approaches. Both solutions use a client-side id to track the request, like you have described in #1.
If the component that calls the action creator needs to know the state of the request, you create a requestID and hang on to that in this.state. Later, the component will examine a collection of requests passed down through props to see if the requestID is present as a key. If so, it can read the request status there, and clear the state. A RequestStore sounds like a fine place to store and manage that state.
However, if you need to know the status of the request at the level of a particular record, one way to manage this is to have your records in the store hold on to both a clientID and a more canonical (server-side) id. This way you can create the clientID as part of an optimistic update, and when the response comes back from the server, you can clear the clientID.
Another solution that we've been using on a few projects at Facebook is to create an action queue as an adjunct to the store. The action queue is a second storage area. All of your getters draw from both the store itself and the data in the action queue. So your optimistic updates don't actually update the store until the response comes back from the server.
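As a rough illustration of the requestID idea (the store shape and action type names are invented):

// RequestStore.ts - hypothetical store tracking request status by client-side id
type RequestStatus = 'pending' | 'success' | 'failure';

const requests: { [requestID: string]: RequestStatus } = {};

// Called from the dispatcher for request-related actions
export function handleAction(action: { type: string; requestID: string }): void {
  switch (action.type) {
    case 'REQUEST_STARTED':
      requests[action.requestID] = 'pending';
      break;
    case 'REQUEST_SUCCEEDED':
      requests[action.requestID] = 'success';
      break;
    case 'REQUEST_FAILED':
      requests[action.requestID] = 'failure';
      break;
  }
}

export function getStatus(requestID: string): RequestStatus | undefined {
  return requests[requestID];
}

// A component keeps the requestID it generated in this.state, and later checks
// the request collection passed down through props (or getStatus) to decide
// whether to show a spinner or clear its local state.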

Asynchronous data loading in flux stores

Say I have a TodoStore. The TodoStore is responsible for keeping my TODO items. Todo items are stored in a database.
I want to know what is the recommended way for loading all todo items into the store and how the views should interact with the store to load the TODO items on startup.
The first alternative is to create a loadTodos action that retrieves the Todos from the database and emits a TODOS_LOADED event. Views call the loadTodos action, listen for the TODOS_LOADED event, and then update themselves by calling TodoStore.getTodos().
Another alternative is to not have a loadTodos action, and instead have TodoStore.getTodos() return a promise with the existing TODO items. If the TodoStore has already loaded the TODO items, it just returns them; if not, it queries the database and returns the retrieved items. In this case, even though the store has now loaded the TODO items, it will not emit a TODOS_LOADED event, since getTodos isn't an action.
function getTodos() {
    if (loaded) {
        return Promise.resolve($todoItems);
    }
    return fetchTodoItemsFromDatabase().then(function (todoItems) {
        loaded = true;
        $todoItems = todoItems;
        return $todoItems;
    });
}
I'm sure many will say that this breaks the Flux architecture, because the getTodos function is changing the store state, and store state should only be changed through actions sent in from the dispatcher.
However, if you consider that the state of the TodoStore is the set of existing TODO items in the database, then getTodos isn't really changing any state. The TODO items are exactly the same, hence no view needs to be updated or notified. The only difference is that the store has now retrieved the data, so it is cached in the store. From the view's perspective, it shouldn't care how the store is implemented. It shouldn't care whether the store still needs to retrieve data from the database or not. All views care about is that they can use the store to get the TODO items and that the store will notify them when new TODO items are created, deleted, or changed.
Hence, in this scenario, views should just call TodoStore.getTodos() to render themselves on load, and register an event handler on TODO_CHANGE to be notified when they need to update themselves due to an addition, deletion, or change.
What do you think about these two solutions? Are there any other solutions?
The views do not have to be the entities that call loadTodos(). This can happen in a bootstrap file.
You're correct that you should try your best to restrict the data flow to actions inside the dispatch payload. Sometimes you need to derive data based on the state of other stores, and this is what Dispatcher.waitFor() is for.
What is Flux-like about your fetchTodoItemsFromDatabase() solution is that no other entity is setting data on the store. The store is updating itself. This is good.
My only serious criticism of this solution is that it could result in a delay in rendering if you are actually getting the initial data from the server. Ideally, you would send down some data with the HTML. You would also want to make sure to call for the stores' data within your controller-views' getInitialState() method.
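For example, a bootstrap file might look roughly like this (the file layout and the loadTodos action creator are assumptions, not prescriptions):

// bootstrap.ts - hypothetical entry point: kick off data loading before rendering
import * as React from 'react';
import * as ReactDOM from 'react-dom';
import { loadTodos } from './TodoActions'; // assumed action creator
import { TodoApp } from './TodoApp';       // assumed root controller-view

// Initial data could instead be embedded in the HTML and dispatched here,
// which avoids a round trip to the server before the first render.
loadTodos();

ReactDOM.render(React.createElement(TodoApp), document.getElementById('app'));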
Here is my opinion about this, very close to yours.
I maintain the state of my application in the Store via Immutable.Record and Immutable.OrderedMap from Immutable.js.
I have a top-level controller-view component that gets its state from the Store.
Something like the following:
function getInitialState() {
    return {
        todos: TodoStore.getAll()
    };
}
The TodoStore.getAll method will retrieve the data from the server via an APIUtils.getTodos() request if its internal _todos map is empty. I advocate for reads being triggered in the Store and writes being triggered in ActionCreators.
While the request is in flight, my component renders a simple loading spinner or something like that.
When the request resolves, APIUtils triggers an action such as TODO_LIST_RECEIVE_SUCCESS or TODO_LIST_RECEIVE_FAIL, depending on the status of the response.
My TodoStore responds to these actions by updating its internal state (populating its internal Immutable.OrderedMap with Immutable.Records created from the action payloads).
If you want to see an example through a basic implementation, take a look at this answer about React/Flux and xhr/routing/caching.
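A condensed sketch of the store described above (the Immutable.js shapes and the APIUtils/action names are approximations of that description, not code from a real project):

// TodoStore.ts - hypothetical: reads trigger the fetch, writes arrive via actions
import { OrderedMap, Record } from 'immutable';
import * as APIUtils from './APIUtils'; // assumed: fires TODO_LIST_RECEIVE_* actions

const Todo = Record({ id: '', text: '', done: false });
let _todos = OrderedMap<string, any>();
let _requested = false;

export function getAll() {
  if (_todos.isEmpty() && !_requested) {
    _requested = true;
    APIUtils.getTodos(); // async; the store stays empty until the action arrives
  }
  return _todos; // the controller-view renders a spinner while this is empty
}

// Called from the dispatcher when TODO_LIST_RECEIVE_SUCCESS arrives
export function receiveTodos(payload: Array<{ id: string; text: string; done: boolean }>): void {
  payload.forEach((item) => {
    _todos = _todos.set(item.id, Todo(item));
  });
  // ...emit a change event so controller-views re-render...
}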
I know it's been a couple of years since this was asked, but it perfectly sums up the questions I've been struggling with this week. So, to help any others who may come across this question, I found a blog post by Nick Klepinger that really helped me out: "ngrx and Tour of Heroes".
It specifically uses Angular 2 and @ngrx/store, but it answers your question very well.
