How is state immutability actually used in Redux?

I am trying to understand how immutability is actually (if at all) used in Redux. Every tutorial/article/document I found states that reducers should never alter the state object but instead create a new copy of the changed data (i.e. the reducers must be pure functions). I understand this, but I could not find any place that explains how this guideline is actually used by Redux's internal implementation.
Is this just a strong recommendation, or will something inside Redux break if I make the reducers non-pure?
If it is the latter, then what exactly will break?
I did find several places where Dan says that in some (very rare) cases the reducers may be non-pure, but that it is risky (again, with no explanation of what exactly the risk is).

Redux, when used with React, is typically wired up via connect from react-redux. Wrapping a component with connect makes it subscribe to changes in the Redux store by registering a change handler, which is invoked after an action is dispatched and the reducers have run. When the change handler is invoked, connect compares the previous state of the store with the new state using an identity comparison (previousState !== newState) – which is faster than a shallow or deep comparison – and only when the two states are not identical does it update the wrapped component using setState. Mutating the state directly does not change the reference, so the wrapped component will not re-render.
This is how connect determines if it should call setState:
if (!pure || prevStoreState !== storeState) {
  this.hasStoreStateChanged = true
  this.setState({ storeState })
}
connect also provides an option to override this behaviour by specifying that the wrapped component is not pure using:
connect(mapStateToProps, mapDispatchToProps, mergeProps, {pure: false} )
Identity comparison is also used by pure components by implementing shouldComponentUpdate to prevent unnecessary calls to render.
TL;DR: a component wrapped with connect will not re-render if a store's state changes due to mutation.
Edit: Nothing in Redux itself will break since it is so minimal that it doesn't try to inspect the state returned by reducers.
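To make the identity check concrete, here is a minimal sketch (the state shape, action type and todo payload are made up for illustration) of a reducer that mutates state versus one that returns a new object, and why only the second one passes the prevStoreState !== storeState test above:

// Mutating reducer: the returned object is the SAME reference, so the
// identity check in connect sees no change and skips setState.
function todosMutating(state = { items: [] }, action) {
  if (action.type === 'ADD_TODO') {
    state.items.push(action.payload) // mutation
    return state                     // same reference as before
  }
  return state
}

// Pure reducer: a new object (and a new array) is returned, so the
// identity check sees a different reference and the component re-renders.
function todosPure(state = { items: [] }, action) {
  if (action.type === 'ADD_TODO') {
    return { ...state, items: [...state.items, action.payload] }
  }
  return state
}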

Related

How to set up a pinia store to wait for the state

I have a (composition) store that is used in a couple of modules; they are unaware of each other and can be initialized at the same moment or at different times.
The store retrieves data from an API and saves it in the state. All of the modules call the store's fetchData method and await the promise it returns.
The problem is that when the modules are initialized at the same moment, both perform fetchData, resulting in two requests being fired. The ideal situation would be to:
Allow only one request to be fired (that's pretty easy to do, just save the request state in a ref, e.g. "pending")
Export the state as a promise, so that a module awaits the state being loaded from the API and only receives its data after that.
All of this with the requirement that the modules stay unaware of the store implementation - the less code on their side the better.
How do you handle such cases?
Is there anything wrong with this approach? If not - how should it be implemented?
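One way to get both properties (a sketch only, assuming Pinia's setup-style defineStore and a hypothetical /api/data endpoint) is to cache the in-flight promise inside the store, so every module awaits the same request and later callers simply receive the already-resolved promise:

// composables/useDataStore.js - minimal sketch, names are illustrative
import { defineStore } from "pinia";
import { ref } from "vue";

export const useDataStore = defineStore("data", () => {
  const data = ref(null);
  let pending = null; // cached in-flight (or resolved) promise

  // Every caller awaits the same promise; only the first call fires a request.
  function fetchData() {
    if (!pending) {
      pending = fetch("/api/data")
        .then(res => res.json())
        .then(json => (data.value = json));
    }
    return pending;
  }

  return { data, fetchData };
});

// In any module, without knowing the store internals:
// const store = useDataStore();
// await store.fetchData(); // resolves once the shared data is loaded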

What's the best way to propagate data through your Vue3 App?

Is there a recommended way to propagate data through your Vue components? What I'm trying to do is fetch the data from the backend once and propagate it everywhere in my project, but I can't find the right strategy.
sessionStorage: Works great and resets on refresh/close window but as soon as you need to create target="_blank" anchor tags, it will not propagate your data to new tabs.
localStorage: Requires, in my opinion, more work than sessionStorage because you need to delete data manually to keep things tidy. One big problem for me is that it looks like you can't pass markdown and arrays properly, at least not without JSON.stringify. I've built a project with localStorage and ended up sending AJAX requests from most of my components because I couldn't propagate the data through my app the way I wanted. At this point my frontend basically is the backend.
My personal problems with localStorage: I am using the marked package to display Markdown, but it throws errors if passed undefined. This gets problematic when I want to use it in reactive state, because instead of resulting in undefined, it throws an error and crashes the whole app. The point I am trying to make is that when you pass an undefined localStorage value to marked in an either/or expression like so:
const state = reactive({
  value: marked(localStorage.value) || ""
})
it crashes your app if localStorage.value is empty.
Another problem is that I fetch text content depending on a locale and store it in localStorage. This is great until the user changes the locale and all content strings have to be replaced by the translated strings. It gets really tricky if I want to use one component as a template to load different locales.
vuex: I've tried Vuex briefly and found it useful, but didn't see the benefit over just using localStorage for my purposes. I'll probably give it another go.
How do you propagate data through your app?
There are a few good arguments for why Vuex is better than localStorage:
https://www.quora.com/What-is-the-benefit-of-using-Vuex-over-LocalStorage-to-store-the-state-of-an-application-in-Vue-js
You can also try composables. They are reusable functions (similar to mixins) in the Composition API (you need the composition-api plugin in Vue 2; in Vue 3 it is built-in). A composable can also be the place where you store your data. It can be easier and more intuitive than Vuex.
First, create a /composables directory and add a JavaScript file (it's good practice to start the file name with the word use), useState.js:
import { reactive, toRefs } from "vue";

const state = reactive({
  isMenuOpened: false
});

const toggleMenuState = () => {
  state.isMenuOpened = !state.isMenuOpened;
};

export default {
  ...toRefs(state),
  toggleMenuState
};
toRefs converts the reactive object into a plain object whose properties are refs pointing back into the original reactive state.
Now you can use the composable in Vue components:
<script>
import useState from "./composables/useState";

export default {
  setup() {
    const { isMenuOpened, toggleMenuState } = useState;
    return {
      isMenuOpened,
      toggleMenuState,
    };
  },
};
</script>
Demo:
https://codesandbox.io/s/happy-chandrasekhar-o05uv?file=/src/App.vue
About composition api and composables:
https://v3.vuejs.org/guide/composition-api-introduction.html
Since you have mentioned local storage and session storage, I believe you must be trying to share your state across tabs/windows rather than just different components on a single page. At this scale I don't think this is necessarily a VueJS specific issue/pattern. Generally speaking, you want data to be shared pretty much across process boundaries.
Session Storage used to be one of the most sensible options because it is shared between one window and all the child windows it has created, until all of them are closed, at which point the storage is discarded as well. However, depending on your use cases, Chrome (within the past year) made a change to NOT inherit the session storage from the original window if the popup window is opened as noopener, so if you are relying on noopener (and its performance implications), session storage is no longer usable for this purpose.
Vuex does not solve this issue either. In fact, Vuex is pretty much irrelevant here. Given the application architecture implied, the state management capability Vuex brings to your app will likely be redundant, because any state mutation will probably be submitted to your backend anyway. In some sense the Vuex store is on your backend rather than your frontend.
So we typically take one of three approaches:
directly broadcast from backend to all frontend tabs, i.e. there is no state syncing directly between frontend tabs. Every single tab (child window) communicates directly with the server: it mutates the state by submitting actions to the server, and only the server can change the state and broadcast the changes back to all the tabs in real time (again, conceptually it feels like the Vuex store is on your backend)
use SharedWorker (a minimal sketch follows below). A SharedWorker is shared by all browsing contexts with the same origin. It is activated the moment the first browsing context (of a certain origin) is created, and is kept alive until the last browsing context is destroyed. In some sense its sharing semantics are similar to those of the old session storage. You can use the SharedWorker as the single entity that communicates with your backend. State can be maintained by the SharedWorker and accessed from the tabs in an RPC fashion, or state can be maintained separately in each tab while the SharedWorker just broadcasts the changes to the tabs.
if you actually do not have a backend, but you just want to build a multi-window single-page application, you can make one of your tabs special and have it act as the owner of the state store. For all the child windows created from this "master" window, their local store will be a proxy - the actions they perform against the local store are proxied over to the master window; the master window performs the action in its store and broadcasts the changes to all the child windows.
By the way, I have used the word "store" many times, but I do not necessarily mean the Vuex store. The store is just a shared place where you keep your state.
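As a minimal, hypothetical sketch of the SharedWorker approach (the state-worker.js filename and the message shapes are made up for illustration), the worker keeps the latest state and broadcasts every change to all connected tabs:

// state-worker.js - loaded as a SharedWorker
let state = {};    // latest shared state
const ports = [];  // one MessagePort per connected tab

onconnect = (event) => {
  const port = event.ports[0];
  ports.push(port);
  // A newly connected tab immediately receives the current state.
  port.postMessage({ type: "state", state });
  port.onmessage = (e) => {
    if (e.data.type === "mutate") {
      state = { ...state, ...e.data.patch };
      // Broadcast the new state to every tab.
      ports.forEach(p => p.postMessage({ type: "state", state }));
    }
  };
};

// In each tab:
const worker = new SharedWorker("state-worker.js");
worker.port.onmessage = (e) => {
  if (e.data.type === "state") {
    // e.g. copy e.data.state into a reactive object or a store here
    console.log("shared state changed", e.data.state);
  }
};
worker.port.start();
worker.port.postMessage({ type: "mutate", patch: { isMenuOpened: true } });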

Can a React-Redux app really scale as well as, say Backbone? Even with reselect. On mobile

In Redux, every change to the store triggers a notify on all connected components. This makes things very simple for the developer, but what if you have an application with N connected components, and N is very large?
Every change to the store, even if unrelated to the component, still runs a shouldComponentUpdate with a simple === test on the reselected paths of the store. That's fast, right? Sure, maybe once. But N times, for every change? This fundamental change in design makes me question the true scalability of Redux.
As a further optimization, one can batch all notify calls using _.debounce. Even so, having N === tests for every store change and handling other logic, for example view logic, seems like a means to an end.
I'm working on a health & fitness social mobile-web hybrid application with millions of users and am transitioning from Backbone to Redux. In this application, a user is presented with a swipeable interface that allows them to navigate between different stacks of views, similar to Snapchat, except each stack has infinite depth. In the most popular type of view, an endless scroller efficiently handles the loading, rendering, attaching, and detaching of feed items, like a post. For an engaged user, it is not uncommon to scroll through hundreds or thousands of posts, then enter a user's feed, then another user's feed, etc. Even with heavy optimization, the number of connected components can get very large.
Now on the other hand, Backbone's design allows every view to listen precisely to the models that affect it, reducing N to a constant.
Am I missing something, or is Redux fundamentally flawed for a large app?
This is not a problem inherent to Redux IMHO.
By the way, instead of trying to render 100k components at the same time, you should try to fake it with a lib like react-infinite or something similar, and only render the visible (or nearly visible) items of your list. Even if you manage to render and update a 100k-item list, it's still not performant and it takes a lot of memory. Here is some related advice from LinkedIn.
This answer assumes that you still want to render 100k updatable items in your DOM, and that you don't want 100k listeners (store.subscribe()) to be called on every single change.
2 schools
When developing a UI app in a functional way, you basically have 2 choices:
Always render from the very top
It works well but involves more boilerplate. It's not exactly the suggested Redux way but it is achievable, with some drawbacks. Notice that even if you manage to have a single Redux connection, you still have to call a lot of shouldComponentUpdate in many places. If you have an infinite stack of views (like a recursion), you will have to render all the intermediate views as virtual DOM as well, and shouldComponentUpdate will be called on many of them. So this is not really more efficient even if you have a single connect.
If you don't plan to use the React lifecycle methods but only use pure render functions, then you should probably consider other similar options that will only focus on that job, like deku (which can be used with Redux)
In my own experience, doing so with React is not performant enough on older mobile devices (like my Nexus 4), particularly if you link text inputs to your atom state.
Connecting data to child components
This is what react-redux suggests by using connect. When the state changes and the change only concerns a deeper child, you re-render only that child and do not have to render the top-level components every time, like the context providers (redux/intl/custom...) or the main app layout. You also avoid calling shouldComponentUpdate on other children because that check is already baked into the listener. Calling a lot of very fast listeners is probably faster than re-rendering intermediate React components every time, and it also cuts down a lot of prop-passing boilerplate, so for me it makes sense when used with React.
Also notice that identity comparison is very fast, and you can do a lot of these comparisons cheaply on every change. Remember Angular's dirty checking: some people did manage to build real apps with that! And identity comparison is much faster.
Understanding your problem
I'm not sure I understand your whole problem perfectly, but I gather that you have views with something like 100k items in them, and you wonder whether you should use connect on all those 100k items, because calling 100k listeners on every single change seems costly.
This problem seems inherent to the nature of doing functional programming with the UI: the list was updated, so you have to re-render the list, but unfortunately it is a very long list and that seems inefficient... With Backbone you could hack something together to render only the child. Even if you render that child with React, you would trigger the rendering in an imperative way instead of just declaring "when the list changes, re-render it".
Solving your problem
Obviously connecting the 100k list items seems convenient but is not performant, because it means calling 100k react-redux listeners, even if they are fast.
Now if you connect the big list of 100k items instead of each item individually, you only call a single react-redux listener, and then have to render that list in an efficient way.
Naive solution
Iterating over the 100k items to render them, leading to 99999 items returning false in shouldComponentUpdate and a single one re-rendering:
list.map(item => this.renderItem(item))
Performant solution 1: custom connect + store enhancer
The connect method of React-Redux is just a Higher-Order Component (HOC) that injects the data into the wrapped component. To do so, it registers a store.subscribe(...) listener for every connected component.
If you want to connect 100k items of a single list, it is a critical path of your app that is worth optimizing. Instead of using the default connect you could build your own one.
Store enhancer
Expose an additional method store.subscribeItem(itemId,listener)
Wrap dispatch so that whenever an action related to an item is dispatched, you call the registered listener(s) of that item.
A good source of inspiration for this implementation can be redux-batched-subscribe.
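A minimal sketch of such an enhancer (the action shape action.payload.itemId is an assumption; adapt it to however your actions identify items):

// Store enhancer exposing store.subscribeItem(itemId, listener) and
// notifying only that item's listeners when a matching action is dispatched.
function itemSubscribeEnhancer(createStore) {
  return (reducer, preloadedState) => {
    const store = createStore(reducer, preloadedState);
    const itemListeners = new Map(); // itemId -> Set of listeners

    function subscribeItem(itemId, listener) {
      if (!itemListeners.has(itemId)) itemListeners.set(itemId, new Set());
      itemListeners.get(itemId).add(listener);
      return () => itemListeners.get(itemId).delete(listener); // unsubscribe
    }

    function dispatch(action) {
      const result = store.dispatch(action);
      const itemId = action.payload && action.payload.itemId;
      if (itemId !== undefined && itemListeners.has(itemId)) {
        itemListeners.get(itemId).forEach(listener => listener());
      }
      return result;
    }

    return { ...store, dispatch, subscribeItem };
  };
}

// Usage: const store = createStore(reducer, itemSubscribeEnhancer)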
Custom connect
Create a Higher-Order component with an API like:
Item = connectItem(Item)
The HOC can expect an itemId prop. It can grab the enhanced Redux store from the React context and then register its listener: store.subscribeItem(itemId, callback). The source code of the original connect can serve as inspiration.
The HOC will only trigger a re-render if the item changes.
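For illustration only, a simplified version of such an HOC (assuming the enhanced store is passed down as a store prop and that item data lives at state.items[itemId]; a real implementation would read the store from context like connect does):

import React, { Component } from "react";

function connectItem(WrappedItem) {
  return class ConnectedItem extends Component {
    state = { item: this.props.store.getState().items[this.props.itemId] };

    componentDidMount() {
      // Only this item's listener fires when this particular item changes.
      this.unsubscribe = this.props.store.subscribeItem(
        this.props.itemId,
        () => this.setState({
          item: this.props.store.getState().items[this.props.itemId],
        })
      );
    }

    componentWillUnmount() {
      this.unsubscribe();
    }

    render() {
      return <WrappedItem {...this.props} item={this.state.item} />;
    }
  };
}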
Related answer: https://stackoverflow.com/a/34991164/82609
Related react-redux issue: https://github.com/rackt/react-redux/issues/269
Performant solution 2: listening for events inside child components
It is also possible to listen for Redux actions directly in components, using redux-dispatch-subscribe or something similar, so that after the first list render, you listen for updates directly inside the item component and override the original data coming from the parent list.
import React, { Component } from "react";

class MyItemComponent extends Component {
  state = {
    itemUpdated: undefined, // Will store the locally updated item
  };

  componentDidMount() {
    this.unsubscribe = this.props.store.addDispatchListener(action => {
      const isItemUpdate =
        action.type === "MY_ITEM_UPDATED" &&
        action.payload.item.id === this.props.itemId;
      if (isItemUpdate) {
        this.setState({ itemUpdated: action.payload.item });
      }
    });
  }

  componentWillUnmount() {
    this.unsubscribe();
  }

  render() {
    // Initially use the data provided by the parent, but once it's been
    // updated by some event, use the updated data instead.
    const item = this.state.itemUpdated || this.props.item;
    return (
      <div>
        {...}
      </div>
    );
  }
}
In this case redux-dispatch-subscribe may not be very performant as you would still create 100k subscriptions. You'd rather build your own optimized middleware similar to redux-dispatch-subscribe with an API like store.listenForItemChanges(itemId), storing the item listeners as a map for fast lookup of the correct listeners to run...
Performant solution 3: vector tries
A more performant approach would consider using a persistent data structure like a vector trie:
If you represent your 100k-item list as a trie, each intermediate node gets the possibility to short-circuit the rendering sooner, which makes it possible to avoid a lot of shouldComponentUpdate calls in the children.
This technique can be used with ImmutableJS and you can find some experiments I did with ImmutableJS: React performance: rendering big list with PureRenderMixin
It has drawbacks, however, as libs like ImmutableJS do not yet expose public/stable APIs to do that (issue), and my solution pollutes the DOM with some useless intermediate <span> nodes (issue).
Here is a JsFiddle that demonstrates how an ImmutableJS list of 100k items can be rendered efficiently. The initial rendering is quite long (but I guess you don't initialize your app with 100k items!), but afterwards you can notice that each update only leads to a small number of shouldComponentUpdate calls. In my example I only update the first item every second, and you'll notice that even though the list has 100k items, it only requires something like 110 calls to shouldComponentUpdate, which is much more acceptable! :)
Edit: it seems ImmutableJS is not so good at preserving its internal tree structure on some operations, like inserting/deleting items at a random index. Here is a JsFiddle that demonstrates the performance you can expect depending on the operation on the list. Surprisingly, if you want to append many items at the end of a large list, calling list.push(value) many times seems to preserve the tree structure much better than calling list.concat(values).
By the way, it is documented that the List is efficient when modifying the edges. I don't think this poor performance on adding/removing at a given index is related to my technique, but rather to the underlying ImmutableJS List implementation.
Lists implement Deque, with efficient addition and removal from both the end (push, pop) and beginning (unshift, shift).
This may be a more general answer than you're looking for, but broadly speaking:
The recommendation from the Redux docs is to connect React components fairly high in the component hierarchy. See this section. This keeps the number of connections manageable, and you can then just pass updated props into the child components.
Part of the power and scalability of React comes from avoiding the rendering of invisible components. For example, instead of setting an invisible class on a DOM element, in React we just don't render the component at all. Re-rendering components that haven't changed isn't much of a problem either, since the virtual DOM diffing process optimizes the low-level DOM interactions.

What is *like* a promise but can resolve multiple times?

I am looking for a pub/sub mechanism that behaves like a promise but can resolve multiple times, and behaves like an event except that if you subscribe after a notification has happened, it fires with the most recent value.
I am aware of notify, but deferred.notify is order-sensitive, so in that way it behaves just like an event, e.g.:
d.notify('notify before'); // Not observed :(
d.promise.progress(function(data){ console.log(data) });
d.notify('notify after'); // Observed
setTimeout(function(){ d.notify('notify much later') }, 100); // Also Observed
fiddle: http://jsfiddle.net/foLhag3b/
The notification system I'd like is a good fit for a UI component that should update to reflect the state of the data behind it. In these cases, you don't want to care about whether the data has arrived yet or not, and you want updates when they come in.
Maybe this is similar to Immediate mode UIs, but is distinct because it is message based.
The state of the art for message based UI updating, as far as I'm aware, is something which uses a promise or callback to initialize, then also binds an update event:
myUIComponent.gotData(model.data);
model.onUpdate(myUIComponent.gotData); // doing two things is 2x teh workz :(
I don't want to have to do both. I don't think anyone should have to, the use case is common enough to abstract.
model.whenever(myUIComponent.gotData); // yay one intention === one line of code
I could build a mechanism to do what I want, but I wanted to see if a pub/sub mechanism like this already exists. A lot of smart people have done a lot in CS and I figure this concept must exist, I just am looking for the name of it.
To be clear, I'm not looking to change an entire framework, say to Angular or React. I'm looking only for a pub/sub mechanism. Preferably an implementation of a known concept with a snazzy name like notifr or lemme-kno or touch-base.
You'll want to have a look at (functional) reactive programming. The concept you are looking for is known as a Behavior or Signal in FRP. It models the change of a value over time, and can be inspected at any time (continuously holds a value, in contrast to a stream that discretely fires events).
var ui = state.map(render); // whenever state updates, so does ui with render() result
Some popular libraries in JS are Bacon and Rx, which however use their own terminology: there you'll find properties and observables.
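As a concrete example, RxJS's BehaviorSubject matches the requirement almost exactly: it can emit multiple times, and a late subscriber immediately receives the most recent value. A minimal sketch (the model/UI names echo the ones in the question and are only illustrative):

import { BehaviorSubject } from "rxjs";

const myUIComponent = { gotData: data => console.log("gotData", data) };

// Holds the latest model data; starts with an initial value.
const data$ = new BehaviorSubject(null);

// "Resolve" as many times as you like.
data$.next({ items: [1, 2, 3] });

// Subscribing late still delivers the most recent value immediately,
// plus every future update - effectively model.whenever(ui.gotData).
data$.subscribe(data => myUIComponent.gotData(data));

data$.next({ items: [1, 2, 3, 4] }); // the UI is updated again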

Model View Controller and Callbacks

I'm currently developing a multiplayer card game for Android, with libgdx as the game engine. My question is more general, though.
I'm not sure what the best practice is for handling callbacks in this architecture. My controller is a big state machine that checks inputs over and over while being called from the render() method of the game engine.
I have two main callback sources: user input from the GUI, and network callbacks from the Android Google Play Services part.
Currently these callback methods / input listeners just set member variables, which are checked by getter methods from the controller/state machine. For example, I call this from the controller over and over, check whether the result is != null, and proceed if it is:
@Override
public Boolean allPlayersConnected() {
    Boolean allConnected = null;
    if (startGame != null) {
        allConnected = startGame;
        startGame = null;
    }
    return allConnected;
}
The startGame "flag" is set by callbacks from the Google Play Services API.
I don't know if this is good practice; it doesn't look like it.
I could call controller methods from the Google Play Services callbacks that set a controller member variable and check it in each render loop, but that's just moving the variable around.
I could also design the controller as an observer of those events, but what would I do in the controller's update method that gets called when an event happens? I don't think I want to change state in there, even if I can access the current state. I'd be spreading state code all over the place with this, some in different parts of a huge update method and some in the actual state machine code. Just setting a member variable in the update method is quite similar to the thing I did above.
Another option would be to directly change the controller state from the callback methods. That would be less code, fewer variables, and a little faster, but I think I'd destroy the MVP concept, because I'd take control away from the controller and let, e.g., the GUI change the state of the controller.
Any input on this?
Edit:
The more I think about it, the more I think a combination of the observer and command patterns is the way to go.
So I could indeed cut a big part of the current state machine and pack it into the observer's update() method. Instead of sending the commands through a big command enum, I could create command objects with the information available and pass them to the observer (the controller), where I check that the command is viable and call execute() with whatever it needs to execute, e.g. the model interface.
First, I think whether your commands are enums or command objects is independent of the main problem here -- which is how to connect user and network input to state management.
The most common game architecture I've seen is an update loop that checks input, iterates the game simulation, and then renders a frame. In the MVC world, this structure just synchronizes those steps; you still have an encapsulated view and data model, with the controller (the game loop) serving as a bridge between those two worlds.
Input, whether from the local user or one over the net, is generally treated as a request to modify game state. That is, the controller (as the first part of its loop) reads in pending input messages and processes them, modifying state as it goes. This way, the code that changes state is in one place: that controller. You are right, spreading state-modification code throughout the app is a bad practice; basically, it's not MVC.
In other words, all of your callbacks should convert the input to commands and stick them into a queue. You don't want to synchronize the controller -- whose job it is to modify state -- with those input callbacks. You have no idea when input will occur relative to the game loop, so it's best to decouple them. Serializing input processing with game simulation should also make your logic simpler.
You have some choice in how to connect the callbacks to the controller; a shared queue (where one side writes into it and the other reads out from it) is a strong pattern and easy to make thread-safe.
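A language-agnostic sketch of that shared-queue pattern (shown in JavaScript for brevity; in a libgdx app this would be Java and the queue would need to be thread-safe, e.g. a ConcurrentLinkedQueue; all names here are illustrative):

// Callbacks only enqueue commands; they never touch game state directly.
const commandQueue = [];

function onAllPlayersConnected() {          // Google Play Services callback
  commandQueue.push({ type: "START_GAME" });
}
function onCardPlayed(card) {               // GUI callback
  commandQueue.push({ type: "PLAY_CARD", card });
}

// Called once per frame from render(): drain the queue, then simulate.
function update(gameState) {
  while (commandQueue.length > 0) {
    const command = commandQueue.shift();
    applyCommand(gameState, command);       // the only place state changes
  }
  stepSimulation(gameState);
}

// Hypothetical helpers, purely for illustration.
function applyCommand(gameState, command) {
  if (command.type === "START_GAME") gameState.phase = "PLAYING";
  if (command.type === "PLAY_CARD") gameState.table.push(command.card);
}
function stepSimulation(gameState) { /* advance timers, animations, etc. */ }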
