Deserialization Exception in Plugin when using Early Bound Entities - dynamics-crm

I wrote several plugins for previous CRM versions, most of them using early-bound entities.
Right now I am writing a plugin for version 2015 with just one custom entity; the plugin contains the generated entity definition (early-bound entity class).
As soon as I attempt to retrieve an entity using the Organization Service in the Plugin Pipeline I get the following exception:
Element 'http://schemas.microsoft.com/xrm/2011/Contracts:Entity' contains data from a type that maps to the name 'http://schemas.microsoft.com/xrm/2011/Contracts:new_TestEntity'. The deserializer has no knowledge of any type that maps to this name. Consider changing the implementation of the ResolveName method on your DataContractResolver to return a non-null value for name 'new_TestEntity' and namespace 'http://schemas.microsoft.com/xrm/2011/Contracts'.
The plugin assembly has the ProxyTypesAssembly attribute.
I am attempting to intercept the RetrieveMultiple and Retrieve messages, and the interception itself works; it faults as soon as I attempt to execute a Retrieve within the plugin execution context (using the pipeline's organization service).

It looks like you have changed your entity model without regenerating the early-bound classes. Try regenerating your early-bound entities.
This looks like the same issue as described in CRM 2015 SDK: The deserializer has no knowledge of any type that maps to this name.
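If regenerating the classes does not resolve it, a pattern that at least keeps early-bound instances away from the pipeline's serializer is to work late-bound at the service boundary and convert in memory with ToEntity<T>. A minimal sketch, assuming the custom entity's logical name is new_testentity; the new_name attribute is hypothetical:
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class TestEntityHelpers
{
    public static new_TestEntity RetrieveTestEntity(IOrganizationService service, Guid id)
    {
        // Late-bound retrieve: the organization service only ever returns a plain Entity.
        Entity raw = service.Retrieve("new_testentity", id, new ColumnSet("new_name"));

        // In-memory conversion to the generated early-bound class; nothing early-bound
        // crosses the serialization boundary.
        return raw.ToEntity<new_TestEntity>();
    }
}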

Related

Keeping service member up-to-date in other parts of backend application

I have a Spring + Kotlin backend. This application has a service that holds a member of type MutableMap<String, Model>, where the key is a String representing the objectId of a MongoDB document and Model is the document entity itself. My goal with this MutableMap is to quickly retrieve an entity by its objectId for further processing / business logic. This member will get updated fairly regularly over the course of the application's lifespan: new Models will be added, and unnecessary Models will be removed / deleted from the database via the MongoRepository and should therefore also disappear from the MutableMap.
This service is responsible for keeping the repository and this MutableMap member up to date. On the other hand, I'm planning on using this MutableMap in other services. My problem is that I don't know how to keep the MutableMap updated in other parts of my application.
What I mean by that is: if, for example, this MutableMap is a member of ServiceA and I update it via a method like
ServiceA.addModelToMutableMap(model: Model) {...}
How do I detect changes to this member in other services? How can I always have an up-to-date copy of this MutableMap in other services / parts of my application?
I suspect I have made a poor design choice that I'm not aware of, don't know which exact technology/dependency to use, am unaware of a specific Spring feature, or simply don't know what to google for.

AWS Lambda C# EF Core Serialization Error

I created a .NET 6 minimal API project with EF Core that uses DI to create repositories with a Scoped lifetime. The API project uses MediatR to send requests to the proper handler. The handlers get injected with DB repositories. This works when I run the project directly.
I am migrating that project to an AWS Lambda project using the new AWS .NET 6 templates in the Visual Studio toolkit. For whatever reason, the exact same code that runs fine in the minimal API project now throws an error because the injected repositories dispose their connections before the end of the request.
This error occurs any time I run a command against the database.
I believe this is happening because of a serialization error that occurs in Entity Framework Core. I'm guessing the issue doesn't occur in my regular project because it uses a different serializer to handle the serialization of entities.
The errors being thrown are:
System.Text.Json.JsonException: A possible object cycle was detected. This can either be due to a cycle or if the object depth is larger than the maximum allowed depth of 32. Consider using ReferenceHandler.Preserve on JsonSerializerOptions to support cycles.
Cannot access a disposed context instance
If I update the JSON serializer that .NET is using to handle cycles, then the first error turns into: "System.NotSupportedException: Serialization and deserialization of 'System.Type' instances are not supported".
This looks like some sort of conflict between Pomelo Entity Framework Core and the way the .NET 6 Lambda templated project is set up.
EDIT:
After looking at this more, I think the issue is with whatever serialization library the AWS Lambda template project uses versus whatever serialization library Pomelo normally uses to handle things.
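One way to sidestep both errors, whichever serializer the Lambda template ends up using, is to keep EF entity graphs out of the response entirely and return flat DTO projections from the endpoint. A rough sketch under that assumption; the Product, Order, ProductDto and AppDbContext names are hypothetical:
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

// Hypothetical model: the Product/Order back-references are the kind of cycle
// System.Text.Json complains about when raw entities are serialized.
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; } = "";
    public List<Order> Orders { get; set; } = new();
}
public class Order
{
    public int Id { get; set; }
    public Product? Product { get; set; }
}
public class AppDbContext : DbContext
{
    public DbSet<Product> Products => Set<Product>();
}

// Flat shape that is safe to hand to any serializer.
public record ProductDto(int Id, string Name);

public static class ProductEndpoints
{
    // Returning a projection instead of the tracked entity graph means the
    // response serializer never sees navigation properties or EF internals.
    public static Task<List<ProductDto>> GetProducts(AppDbContext db) =>
        db.Products.AsNoTracking()
                   .Select(p => new ProductDto(p.Id, p.Name))
                   .ToListAsync();
}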

How to generate Kotlin and/or Java data model from GraphQL schema

My current project has switched to GraphQL APIs and I wish to automate the generation of model objects that match both Query/Mutation requests and responses.
All I require is the model classes; I do not want to use tools such as Apollo at runtime in my application.
I require the model classes to be either Java or Kotlin.
I found this: https://www.graphql-java-kickstart.com/tools/schema-definition/
However, this appears to require me to create the model classes myself, based on this statement: "GraphQL Java Tools will expect to be given three classes that map to the GraphQL types: Query, Book, and Author. The Data classes for Book and Author are simple:"
What am I missing?
When I attempt to use apollo-cli to download my schema I get this error:
~ - $ npx apollo-cli download-schema $https://my.graphql.end.point/graphql --output schema.json
Error while fetching introspection query result: only absolute urls are supported
Surely this is a basic requirement when employing GraphQL.
So if I understand you correctly, what you are trying to do is (a) download and locally create the schema from an existing GraphQL endpoint, and (b) create Java model objects from this schema.
To download the schema you can use the graphql-cli. First install it via npm install -g graphql-cli and run graphql init to set up your .graphqlconfig. Finally, run graphql get-schema to download the schema from the defined endpoint.
Next you want to leverage a Java code generator that takes the GraphQL schema and creates:
Interfaces for GraphQL queries, mutations and subscriptions
Interfaces for GraphQL unions
POJO classes for GraphQL types
Enum classes for each GraphQL enum
There are various options depending on your setup / preferences (e.g. gradle vs maven):
https://graphql-maven-plugin-project.graphql-java-generator.com/index.html
https://github.com/kobylynskyi/graphql-java-codegen-gradle-plugin
https://github.com/kobylynskyi/graphql-java-codegen-maven-plugin
I recommend checking out the first option, since it looks very well documented and also provides full flexibility after generating the desired helpers:
graphql-java-generator generates the boilerplate code, and lets you concentrate on what's specific to your use case. Then, the running code doesn't depend on any dependencies from graphql-java-generator. So you can get rid of graphql-java-generator at any time: just put the generated code in your SCM, and that's it.
When in client mode, you can query the server with just one line of code. For instance:
Human human = queryType.human("{id name appearsIn homePlanet friends{name}}", "180");
In this mode, the plugin generates: one Java class for the Query object, one Java class for the Mutation object (if any), and one POJO for each standard object of the GraphQL object. All the necessary runtime is actually attached as source code into your project: the generated code is stand-alone. So, your project, when it runs, doesn't depend on any external dependency from graphql-java-generator.

Entity Framework POCO Serialization

I will start coding a new web application soon. The application will be built using ASP.NET MVC 3 and Entity Framework 4.1 (Database First approach). Instead of using the default EntityObject classes, I will create POCO classes using the ADO.NET POCO Entity Generator.
When I create POCOs using this tool, it automatically adds the Virtual keyword to all properties for change tracking and navigation properties for lazy loading.
I have, however, read and seen from demonstrations that Julie Lerman (EF guru!) seems to turn off lazy loading and also modifies her POCO template so that the virtual keyword is removed from her POCO classes. Julie states that she does this because she is writing applications for WCF services, and using the virtual keyword there causes a serialization issue. She says that as an object is getting serialized, the serializer touches the navigation properties, which then triggers lazy loading, and before you know it you are pulling the whole database across the wire.
I think Julie was perhaps exaggerating when she said this could pull the whole database across the wire, but even so, this thought scares me!
My question is (finally): should I also remove the virtual keyword from my POCO classes for my MVC application, and use DetectChanges for my change tracking and eager loading to request navigation properties?
Your help with this would be greatly appreciated.
Thanks as ever.
Serialization can indeed trigger lazy loading because the getter of the navigation property doesn't have a way to detect if the caller is the serializer or user code.
This is not the only issue: whether you make only the navigation properties virtual or all properties virtual, EF will create a proxy type at runtime for your entities, so the entity instances the serializer has to deal with at runtime will typically be of a type different from the one you defined.
Julie's recommendations are the simplest and most reasonable way to deal with the issues, but if you still want to work with the capabilities of proxies most of the time and only sometimes serialize them with WCF, there are other workarounds available:
You can use a DataContractResolver to map the proxy types to be serialized as the original types
You can also turn off lazy loading only when you are about to serialize a graph
More details are contained in this blog post: http://blogs.msdn.com/b/adonet/archive/2010/01/05/poco-proxies-part-2-serializing-poco-proxies.aspx
Besides this, my recommendation would be that you use the DbContext template and not the POCO template. DbContext is the new API we released as part of EF 4.1 with the goal of providing greater productivity. It has several advantages, such as the fact that it will automatically perform DetectChanges, so in general you won't need to worry about calling the method yourself. Also, the POCO entities we generate for DbContext are simpler than the ones we generate with the POCO templates. You should be able to find lots of MVC examples using DbContext.
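To illustrate the workaround of turning off lazy loading only when you are about to serialize a graph, using the DbContext API recommended above, here is a rough sketch; the Blog, Post and BlogContext types are invented for the example:
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;

public class Blog
{
    public int Id { get; set; }
    public string Title { get; set; }
    // virtual enables proxies and lazy loading in normal use.
    public virtual ICollection<Post> Posts { get; set; }
}
public class Post
{
    public int Id { get; set; }
    public string Body { get; set; }
}
public class BlogContext : DbContext
{
    public DbSet<Blog> Blogs { get; set; }
}

public static class BlogSerialization
{
    public static Blog GetBlogForSerialization(int id)
    {
        using (var context = new BlogContext())
        {
            // Switch off proxies and lazy loading for this unit of work only,
            // so the serializer sees plain POCO instances.
            context.Configuration.ProxyCreationEnabled = false;
            context.Configuration.LazyLoadingEnabled = false;

            // Eagerly load exactly the graph that should cross the wire.
            return context.Blogs.Include(b => b.Posts).Single(b => b.Id == id);
        }
    }
}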
Well, it depends on your needs. If you are going to serialize your POCO classes, then yes, you should remove the virtual keywords (for example, when using WCF services or basically anything that will serialize your entire object). But if you are just building a web app that needs to access your classes, then I would leave them in, since you control through your code which objects you access.

struts2: accessing external service from type converter

Is it possible to inject a service reference into a custom type converter?
My situation is in fact quite typical: I have a combo box which binds to a collection of entities. On submit I get only the ID of the selected entity and have to refetch the real object in my action. I was thinking about a more elegant way to do this, and it seems like an ID-to-entity custom converter that performs the fetching would be a good idea.
But I failed when trying to map a converter to a Spring bean in the same fashion as actions...
Interesting question. Are you using the Spring plugin?
It is supposed to take care of service-object creation (and wiring with other services) for Struts2, and this should be able to include type converters. From here:
By using the struts2-spring-plugin in conjunction with type conversion, developers easily can use dependency injection to provide a converter with services
But I have not used that feature.
