Spartacus storefront | Wrong NgRx state update | SetDeliveryMode action in checkout removes custom feature state in addition to updating its own PROCESS state - ngrx-store

I have been creating a new feature in which the delivery mode contains custom delivery slots. I have added new actions (StateUtils.EntityLoadAction), effects, reducers, etc. for the new custom delivery slot, which is working fine and creating a feature state (LoaderState) "SelectedDeliverySlot".
The custom slot feature is working fine and its state is being captured correctly.
But when I switch the delivery mode, the "SetDeliveryMode" action updates its PROCESS state and also removes the newly created, independent feature state.
I cannot understand why my custom feature state is being removed on selection of a delivery mode.
Some code excerpts:
Newly Created SetDeliverySlot action
export class SetDeliverySlot extends StateUtils.EntityLoadAction {
  readonly type = "[DeliverySlot] set delivery slot";
  constructor(
    // public payload: { userId: string; cartId: string; selectedDate: string; selectedModeId: string, selectedSlotId: string }
    public payload: { userId: string; cartId: string; selectedDate: string; modeId: string, selectedSlotId: string }
  ) {
    super("process", "setDeliverySlot");
  }
}
Custom feature state
App State
When switching to another mode, the "SetDeliveryMode" action is called; here is its diff (it updates the application state and somehow also removes the deliverySlot state):
Diff for SetDeliveryMode
My understanding is that each action is responsible for updating its own state, so why is the deliveryMode action removing another state, and where in the code does this removal happen?

One thing you can double-check is the type field of your custom actions. If one of your custom actions has the same string value in its type field as one of the delivery-mode-related actions, the delivery mode action would also be handled by your custom feature reducer, producing the kind of behaviour you observe.
On the develop branch, you can see the delivery mode actions in this source file. You can switch to the release branch that matches your Spartacus version for the most accurate comparison.
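To illustrate the failure mode, here is a minimal sketch using plain reducer functions (not actual Spartacus or NgRx code; the action type strings and state shapes are hypothetical). Every reducer in a combined store receives every dispatched action and matches on the `type` string alone, so a duplicated string makes two unrelated reducers react to one action:

```typescript
interface Action { type: string; payload?: unknown }

const SET_DELIVERY_MODE = "[Checkout] Set Delivery Mode";

// Checkout reducer: the "legitimate" handler for the action.
function checkoutReducer(state = { mode: "" }, action: Action) {
  switch (action.type) {
    case SET_DELIVERY_MODE:
      return { mode: action.payload as string };
    default:
      return state;
  }
}

// Custom feature reducer that accidentally reuses the same type string
// (e.g. via copy-paste) and resets its slice whenever it sees the action.
const RESET_SLOT = "[Checkout] Set Delivery Mode"; // duplicate by mistake!

function deliverySlotReducer(state = { slot: "10am" }, action: Action) {
  switch (action.type) {
    case RESET_SLOT:
      return { slot: "" }; // wipes this slice on every SetDeliveryMode
    default:
      return state;
  }
}

// Dispatching the checkout action clobbers the unrelated slot slice too.
const action = { type: SET_DELIVERY_MODE, payload: "express" };
const checkout = checkoutReducer(undefined, action);
const slot = deliverySlotReducer({ slot: "10am" }, action);
```

If this is the cause, renaming the custom action's type string (or scoping it with a unique feature prefix) makes the two reducers independent again.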

Related

Retrieve Plugin not getting triggered

We are on Dynamics CRM 2016 On-Premise. Using a plugin, I'm trying to automatically update a field when a user opens the CRM Account form, in this example to the value "5". Here's my code:
var targetEntity = (Entity)context.OutputParameters["BusinessEntity"];
if (targetEntity == null)
    throw new InvalidPluginExecutionException(OperationStatus.Failed, "Target Entity cannot be null");
var serviceFactory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
var service = serviceFactory.CreateOrganizationService(context.UserId);
if (targetEntity.Attributes.Contains("MyField"))
    fedTaxId = targetEntity.Attributes["MyField"].ToString();
targetEntity["MyField"] = "5";
service.Update(targetEntity);
I registered this with message type 10 (Before Main Operation Outside Transaction).
In Plugin Registration I listed this as Post Operation stage and Synchronous.
However, when I open the Account form, the page blinks once but the value does not get automatically populated. There is no JavaScript that would have manipulated this form or value either.
Any suggestions? Thanks.
Two options:
Add a script on the form setting the field value on load. Keep in mind this script should only do its thing if the form type = 2.
(Not recommended) Register a plugin on the synchronous post retrieve message for the entity. Make sure this step sets the field value on the entity object in the OutputParameters collection. Now, keep in mind your form will not be aware of the fact that this field has been modified, so it will not be flagged dirty and it will not automatically be submitted when record changes are being saved. So, in this scenario you would still need to add some JavaScript OR you would need an extra plugin registered on the pre update message of the entity setting the field as desired.
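Option 1 above can be sketched as an onLoad form script. The snippet below stubs out the `Xrm.Page` client API so the logic is self-contained and testable; in CRM you would use the real `Xrm.Page` (or `formContext` in later versions), and the attribute name "myfield" is hypothetical:

```typescript
// Stub of the pieces of the Xrm.Page client API used below; in CRM these
// objects are provided by the form runtime.
interface Attribute { setValue(v: string): void; getValue(): string | null }

const values: Record<string, string | null> = { myfield: null };
const Xrm = {
  Page: {
    ui: { getFormType: () => 2 }, // 2 = Update form (existing record)
    getAttribute: (name: string): Attribute => ({
      setValue: (v) => { values[name] = v; },
      getValue: () => values[name],
    }),
  },
};

// onLoad handler: only populate the field on an existing record's Update form.
function onFormLoad() {
  if (Xrm.Page.ui.getFormType() === 2) {
    Xrm.Page.getAttribute("myfield").setValue("5");
  }
}

onFormLoad();
```

Because the form itself sets the value, the field is flagged dirty and submitted on save, which avoids the extra-plugin workaround described in option 2.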

Make mandatory fields only when moving from specific field

Make fields mandatory only when moving from a specific field, for example State from Implement to Review.
Can I do that through UI Policies?
This is definitely possible with client scripts in ServiceNow:
function onChange(control, oldValue, newValue, isLoading, isTemplate) {
    if (newValue == "review" && oldValue == "implement") {
        g_form.setMandatory("your_field", true);
    }
}
In the state model, the only transition available from the Implement state is to the Review state, so you can reach Review only directly from Implement.
So yes, this is possible via a UI Policy.
You can have a UI Policy condition that says State is Review.
Then add the required UI Policy actions to set Mandatory = true on the fields.

CRM 4 to 2016 migration, Plugin events

I'm migrating CRM 4 to 2016 and I need to clarify something about plugin execution. In both CRM versions we have account and quote entities. Account to quote is connected with a parental 1:N relationship. In CRM 4, when you assigned an account to a different owner, first an Assign and then an Update message was fired, but only on the account entity.
In CRM 2016 I observed that the Update message (only Update, not Assign) is also fired on quote (and on other child entities if the relationship is set to parental). Also, if the quote has child entities connected with a parental relationship, the Update message is fired on those child entities, and so on. Is there any way to recognize this situation (cascade update) inside a plugin?
There should be a parent context referring to the event source. You can traverse the IPluginExecutionContext.ParentContext property back to the root to find the origin of the trigger. If you cannot find it there (e.g. when synchronous and asynchronous operations are mixed), there is no other option, I'm afraid.
Technically the updates on the related entities are executed in the plugin's child pipeline. In CRM 4.0 we can only register plugin steps in the child pipeline for Create, Update and Delete messages. In CRM 2011 the event model was 'simplified' and since that version it is no longer possible to specify a pipeline. Instead, plugins registered on the PreOperation and PostOperation stages for the Create, Update and Delete messages are always registered in the child pipeline.
There are two solutions to this problem. The first one (and I think this is how I should have been doing it from the beginning) is to make use of filtering attributes. As you can read here, this is:
A list of entity attributes that, when changed, cause the plug-in to execute. A value of null causes the plug-in to execute if any of the attributes change.
The second is partially using the ParentContext mentioned by Henk - thanks for pointing me in the right direction! You must perform checks as shown in the method below. Anyone who wants to use this method should remember to test it first: it works in my case for my plugins, but your plugins may be registered on different steps, messages, and entities, and this method may not work for you.
public static Boolean IsInternalParentAssign(IPluginExecutionContext context)
{
    Boolean result = false;
    if (context.ParentContext != null)
    {
        IPluginExecutionContext parentContext = context.ParentContext;
        if (parentContext.MessageName == "Assign"
            && context.Depth == 1
            && parentContext.PrimaryEntityId != context.PrimaryEntityId)
        {
            result = true;
        }
    }
    return result;
}

How to fire AJAX calls in response to the state changes with Redux?

I'm converting an existing state model to Redux and it has been painless for the most part. However the one point I'm having trouble with is converting "observed" state ajax requests. Essentially, I have certain ajax requests "linked" to other pieces of state, so no matter who modifies them they'll always be issued correctly. I can get similar behavior by subscribing to the Redux store updates, but firing actions in the listener feels like a hack.
A possible solution is to move logic to the action creator via the thunk pattern. Problem is that I'd either have to duplicate fetching logic across actions (since multiple actions could modify "observed" state), or pull most reducer logic to the action creator level. The action creator also shouldn't be aware of how the reducers will respond to issued actions.
I could batch "sub-actions" so I only need to place the appropriate fetching logic in each action "block", but this seems to violate the concept of actions producing a valid state. I'd rather have this liability at the action creator level.
Are there any generally accepted rules surrounding this? This is not a simple application where ad hoc ajax requests are made as components are interacted with, most data is shared between multiple components and requests are optimized and fetched in reaction to state change.
TLDR;
I want to fire ajax requests in response to changes in state, not when a specific action happens. Is there a better, "Redux specific" way of organizing action/actionCreators to mock this behavior, other than firing these actions in a subscribe listener?
Using store.subscribe()
The easiest way is to simply use the store.subscribe() method (note that prevState must be initialized before the first notification, or the comparison will throw):
let prevState = store.getState()
store.subscribe(() => {
  const state = store.getState()
  if (state.something !== prevState.something) {
    store.dispatch(something())
  }
  prevState = state
})
You can write a custom abstraction that lets you register conditions for side effects so they are expressed more declaratively.
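One such abstraction is an "observe a slice, run a side effect" helper. The sketch below hand-rolls a tiny store so the example is self-contained; only the getState/subscribe/dispatch contract matters, and all names here are illustrative, not part of Redux's API:

```typescript
type Listener = () => void;

interface MiniStore<S> {
  getState(): S;
  dispatch(action: { type: string }): void;
  subscribe(listener: Listener): () => void;
}

// Hand-rolled stand-in for a Redux store (same observable contract).
function createMiniStore<S>(reducer: (s: S | undefined, a: { type: string }) => S): MiniStore<S> {
  let state = reducer(undefined, { type: "@@INIT" });
  const listeners: Listener[] = [];
  return {
    getState: () => state,
    dispatch(action) {
      state = reducer(state, action);
      listeners.forEach((l) => l());
    },
    subscribe(l) {
      listeners.push(l);
      return () => { listeners.splice(listeners.indexOf(l), 1); };
    },
  };
}

// observeStore: invoke onChange only when the selected slice actually changes.
function observeStore<S, T>(store: MiniStore<S>, select: (s: S) => T, onChange: (v: T) => void) {
  let current = select(store.getState());
  return store.subscribe(() => {
    const next = select(store.getState());
    if (next !== current) {
      current = next;
      onChange(next); // fire the AJAX request (or dispatch a thunk) here
    }
  });
}

// Usage: only the dispatch that changes the observed slice fires the effect.
const store = createMiniStore<{ filter: string }>((s = { filter: "" }, a) =>
  a.type === "SET_FILTER" ? { filter: "new" } : s
);
const seen: string[] = [];
observeStore(store, (s) => s.filter, (v) => seen.push(v));
store.dispatch({ type: "NOOP" });       // slice unchanged: no effect fired
store.dispatch({ type: "SET_FILTER" }); // slice changed: effect fires once
```

This keeps the "fire in response to state change, not to a specific action" semantics while hiding the prevState bookkeeping behind one reusable function.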
Using Redux Loop
You might want to look at Redux Loop, which lets you describe effects (such as AJAX calls) together with state updates in your reducers.
This way you can “return” those effects in response to certain actions just like you currently return the next state:
export default function reducer(state, action) {
  switch (action.type) {
    case 'LOADING_START':
      return loop(
        { ...state, loading: true },
        Effects.promise(fetchDetails, action.payload.id)
      );
    case 'LOADING_SUCCESS':
      return {
        ...state,
        loading: false,
        details: action.payload
      };
    default:
      return state;
  }
}
This approach is inspired by the Elm Architecture.
Using Redux Saga
You can also use Redux Saga, which lets you write long-running processes ("sagas") that take actions, perform some asynchronous work, and put result actions to the store. Sagas watch specific actions rather than state updates, which is not what you asked for, but I figured I'd still mention them just in case. They work great for complicated async control flow and concurrency.
function* fetchUser(action) {
  try {
    const user = yield call(Api.fetchUser, action.payload.userId);
    yield put({ type: "USER_FETCH_SUCCEEDED", user: user });
  } catch (e) {
    yield put({ type: "USER_FETCH_FAILED", message: e.message });
  }
}

function* mySaga() {
  yield* takeEvery("USER_FETCH_REQUESTED", fetchUser);
}
No One True Way
All these options have different tradeoffs. Sometimes people use one or two, or even all three of them, depending on what turns out to be most convenient for testing and describing the necessary logic. I encourage you to try all three and pick what works best for your use case.
You can use a middleware to fire your remote actions in response to local actions.
Let's say I have a local action (note the parentheses around the object literal, so the arrow function returns it):
const updateField = (val) => ({ type: UPDATE_FIELD, val })
And an input field with:
<input type='text' onChange={this.props.updateField.bind(this.val)}>
So in a nutshell, when you type inside the field, it fires your action, which in turn changes the state via the reducer. Let's just forget how this action was passed to the component or what this.val is - we assume this has already been solved and is working.
All is fine with this setup, but it only changes your state locally. To update the server you will have to fire another action. Let's build it:
const updateFieldOnServer = (val) => {
  return (dispatch) => {
    MAKE_AJAX.done(
      FIRE_SOME_ACTIONS_ON_SUCCESS
    ).failure(
      FIRE_SOME_ACTIONS_ON_FAILURE
    )
  }
}
This is just a simple async thunk action that somehow makes an AJAX request, returns a promise, and does something else on success or failure.
So the problem we have now is that I want both of these actions to be fired when I change the state of my input, but I can't have onChange take two functions. So I will create a middleware named ServerUpdatesMiddleware:
import _ from 'lodash'
import {
  UPDATE_FIELD,
} from 'actionsPath'
import { updateFieldOnServer } from 'actionsPath' // or wherever the thunk lives

export default ({ dispatch }) => next => action => {
  if (_.includes([UPDATE_FIELD], action.type)) {
    switch (action.type) {
      case UPDATE_FIELD:
        dispatch(updateFieldOnServer(action.val))
    }
  }
  return next(action)
}
I can add it to my stack:
import ServerUpdatesMiddleware from 'pathToMe'
const createStoreWithMiddleware = applyMiddleware(
  ServerUpdatesMiddleware,
  thunkMiddleware,
  logger
)(createStore);
And now, every single time the updateField action is dispatched, it will automatically dispatch the updateFieldOnServer action as well.
This is just an example that I think describes the problem easily - it can be solved in many other ways, but I think this one nicely fits the requirements. It is just how I do things - hope it helps you.
I use middlewares all the time and have many of them - I have never had any problem with this approach, and it simplifies the application logic: you only have to look in a single place to find out what's going on.
Having modules that subscribe to the state updates and then launch AJAX requests (firing actions as they go) seems fine to me, since it puts the stores/reducers firmly in charge of triggering requests. In my large app, ALL AJAX requests and other async behaviours are done this way, so all actions can be just payloads, with no concept of 'action creators'.
If possible, avoid cascading sync actions. My async handlers never fire actions synchronously, but only once the request completes.
In my view, this is a much more functional approach than async action creators, which you may or may not prefer!
componentWillReceiveProps in the React lifecycle is a good place to do this: it receives the new props (and you still have the old ones on this.props), so you can check for the change and dispatch your action, which in turn fires the AJAX call.
The catch is that the state object you are checking needs to be added to the component's props via mapStateToProps, so that it gets passed to componentWillReceiveProps. Hope it helps!
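As a sketch of this pattern, the class below keeps only the prop-comparison logic and leaves React itself out (a real component would extend React.Component and React would invoke the hook); the prop names selectedId and fetchDetails are hypothetical:

```typescript
interface Props {
  selectedId: string;                 // slice of state mapped in via mapStateToProps
  fetchDetails: (id: string) => void; // bound action creator that fires the AJAX call
}

class DetailsLoader {
  constructor(public props: Props) {}

  // React would call this with the incoming props before re-rendering.
  componentWillReceiveProps(nextProps: Props) {
    if (nextProps.selectedId !== this.props.selectedId) {
      nextProps.fetchDetails(nextProps.selectedId); // dispatch only on a real change
    }
    this.props = nextProps; // React does this assignment itself; simulated here
  }
}

// Simulate two prop updates: only the changed id triggers a fetch.
const fetched: string[] = [];
const fetchDetails = (id: string) => fetched.push(id);
const loader = new DetailsLoader({ selectedId: "1", fetchDetails });
loader.componentWillReceiveProps({ selectedId: "1", fetchDetails }); // unchanged: no fetch
loader.componentWillReceiveProps({ selectedId: "2", fetchDetails }); // changed: fetch fires
```

The equality check is essential; without it the component would re-fetch on every store update, not just when the observed slice changes.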

Prism 2.1 Publish/Subscribe with weak reference?

I am building a Prism 2.1 demo by way of getting up to speed with the technology. I am having a problem with CompositePresentationEvents published and subscribed via the Event Aggregation service. The event subscription works fine if I set a strong reference (KeepSubscriberReferenceAlive = true), but it fails if I set a weak reference (KeepSubscriberReferenceAlive omitted).
I would like to subscribe with a weak reference, so that I don't have to manage unsubscribing from the event. Is there any way to do that? Why is a strong reference required here? Thanks for your help!
Here are the particulars: My demo app is single-threaded and has two regions, Navigator and Workspace, and three modules, NavigatorModule, WorkspaceAModule, and WorkspaceBModule. The NavigatorModule has two buttons, 'Show Workspace A' and 'Show Workspace B'. When one of these buttons is clicked, an ICommand is invoked that publishes a CompositePresentationEvent called ViewRequested. The event carries a string payload that specifies which workspace module should be shown.
Here is the event declaration, from the app's Infrastructure project:
using Microsoft.Practices.Composite.Presentation.Events;
namespace Prism2Demo.Common.Events
{
public class ViewRequestedEvent : CompositePresentationEvent<string>
{
}
}
Here is the event publishing code, from the Navigator module:
// Publish ViewRequestedEvent
var eventAggregator = viewModel.Container.Resolve<IEventAggregator>();
var viewRequestedEvent = eventAggregator.GetEvent<ViewRequestedEvent>();
viewRequestedEvent.Publish(workspaceName);
Here is the event subscription code, which each Workspace module includes in its Initialize() method:
// Subscribe to ViewRequestedEvent
var eventAggregator = m_Container.Resolve<IEventAggregator>();
var viewRequestedEvent = eventAggregator.GetEvent<ViewRequestedEvent>();
viewRequestedEvent.Subscribe(this.ViewRequestedEventHandler, ThreadOption.PublisherThread, true);
The Subscribe() statement is shown with a strong reference.
Thanks again for your help.
A couple of things to check:
Make sure that your EventAggregator instance is being correctly registered with the container or it may itself be garbage collected:
container.RegisterType<IEventAggregator, EventAggregator>(new ContainerControlledLifetimeManager());
Also make sure that you have a strong reference to the subscribed object held somewhere (this in your subscription code).
