Scenario: the application has a long-running method that returns the data and builds the final entity of a slowly changing dimension. In previous versions of .NET, the data could be set up for interception and caching with the CacheInterceptionBehavior while registering the interface with Unity.
Is there a corresponding way to do something similar in .NET Core?
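For reference, here is a hedged sketch of the classic setup being described, assuming CacheInterceptionBehavior is a custom or third-party IInterceptionBehavior, with hypothetical ISlowDimensionService/SlowDimensionService types (namespaces match classic Unity 3/4):

using Microsoft.Practices.Unity;
using Microsoft.Practices.Unity.InterceptionExtension;

var container = new UnityContainer();
container.AddNewExtension<Interception>();

// CacheInterceptionBehavior is assumed to implement IInterceptionBehavior
// and cache the results of the long-running method; the service types
// here are placeholders.
container.RegisterType<ISlowDimensionService, SlowDimensionService>(
    new Interceptor<InterfaceInterceptor>(),
    new InterceptionBehavior<CacheInterceptionBehavior>());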
I created a .NET 6 minimal API project with EF Core that uses DI to create repositories with a scoped lifetime. The API project uses MediatR to send each request to the proper handler, and the handlers get injected with DB repositories. This works when I run the project directly.
I am migrating that project to an AWS Lambda project using the new AWS .NET 6 templates in the Visual Studio toolkit. For whatever reason, the exact same code that runs fine in the minimal API project now throws an error because the injected repositories dispose their connections before the end of the request.
This error occurs anytime I run a command against the database.
I believe this is happening because of a serialization error that occurs in Entity Framework Core. The issue doesn't occur in my regular project, and I'm guessing that's because it uses a different serializer to handle the serialization of entities.
The errors being thrown are:
System.Text.Json.JsonException: A possible object cycle was detected. This can either be due to a cycle or if the object depth is larger than the maximum allowed depth of 32. Consider using ReferenceHandler.Preserve on JsonSerializerOptions to support cycles.
Cannot access a disposed context instance
If I update the JSON serializer that .NET is using so that it handles cycles, then the first error turns into: "System.NotSupportedException: Serialization and deserialization of 'System.Type' instances are not supported".
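For context, "updating the JSON serializer to handle cycles" means roughly the following in a .NET 6 minimal API, where builder is the WebApplicationBuilder in Program.cs (a sketch, not the Lambda template's own configuration):

using System.Text.Json.Serialization;

// Configure the System.Text.Json options used by minimal API endpoints
// so that object cycles are preserved instead of throwing.
builder.Services.Configure<Microsoft.AspNetCore.Http.Json.JsonOptions>(options =>
    options.SerializerOptions.ReferenceHandler = ReferenceHandler.Preserve);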
This looks like some sort of conflict between Pomelo Entity Framework Core and the way the .NET 6 Lambda templated project is set up.
EDIT:
After looking at this more, I think the issue is a mismatch between whatever serialization library the AWS Lambda template project uses and whatever serialization library is normally used by Pomelo to handle things.
The Unity DI container has an extension for ASP.NET MVC (based on the older .NET Framework) that includes PerRequestLifetimeManager. This allows you to have one instance of an object that lives for the duration of a single HTTP request.
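For reference, a minimal sketch of that classic registration (IMyService/MyService are placeholder types; exact namespaces vary across Unity versions):

using Microsoft.Practices.Unity;
using Microsoft.Practices.Unity.Mvc;

var container = new UnityContainer();
// One instance of MyService per HTTP request in classic ASP.NET MVC.
container.RegisterType<IMyService, MyService>(new PerRequestLifetimeManager());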
I am building an ASP.NET Core MVC project and would like to use Unity, but there does not seem to be an equivalent to PerRequestLifetimeManager in the Unity.Microsoft.DependencyInjection extension. Is anyone aware if such a thing exists?
I can find a lot of articles on how to enable DbContext pooling in ASP.NET Core through the AddDbContextPool function.
But what about .NET Framework 4.7.2, where this is not available? How can I reuse a DbContext from the pool?
Thanks
AddDbContextPool sets up services in the dependency injection container to make it more efficient to get DbContext instances for each request.
If you are not using ASP.NET Core, but your application follows a similar model (e.g. DbContext instances are needed to process web or service requests) and the number of requests per second is very large, then my first recommendation would be to set up DI using Microsoft.Extensions.DependencyInjection and create DI scopes per request, like ASP.NET Core does. Once you do that, you should be able to call AddDbContextPool the same way you would in an ASP.NET Core application, and resolve your DbContext through DI with all the benefits of DbContext pooling.
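As a minimal sketch of that setup, assuming a hypothetical YourDbContext and the SQL Server provider (any provider works the same way):

using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;

var services = new ServiceCollection();
services.AddDbContextPool<YourDbContext>(options =>
    options.UseSqlServer("your-connection-string"));
var provider = services.BuildServiceProvider();

// Per request: create a scope, resolve the context from it, and let
// disposing the scope return the context to the pool.
using (var scope = provider.CreateScope())
{
    var context = scope.ServiceProvider.GetRequiredService<YourDbContext>();
    // ... process the request using the context ...
}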
Besides that option, in theory you could replicate manually what AddDbContextPool achieves through DI.
For example, first create a singleton pool of type DbContextPool&lt;YourDbContext&gt;. Then, for every instance of YourDbContext you need, get a lease using pool.Lease, and then get the context using lease.Context.
You have to make sure that you dispose both the context and the lease when you are done using them.
Caveat: this approach requires direct usage of low-level APIs that are in internal namespaces and that therefore could change or disappear in any future minor or major release of EF Core.
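With that caveat in mind, here is a rough sketch of the manual approach (the exact shape of DbContextPool&lt;TContext&gt; and its lease type differs between EF Core versions; this follows older releases):

using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Internal;

// Create the pool once and hold onto it as a singleton.
var pool = new DbContextPool<YourDbContext>(
    new DbContextOptionsBuilder<YourDbContext>()
        .UseSqlServer("your-connection-string")
        .Options);

// For each unit of work, take a lease and use its context; disposing
// the lease hands the context back to the pool.
using (var lease = new DbContextPool<YourDbContext>.Lease(pool))
{
    var context = lease.Context;
    // ... use the context ...
}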
If your application doesn't work like that (for example, if it is not a web application or a web service that needs to process large numbers of requests), then there is no advantage in using DbContext pooling.
(I had a hard time titling the question so feel free to suggest edits)
Here's the situation: we have just started building a system composed of two integrated MVC 3 web applications running on Azure with a shared Azure SQL database. There are many reasons for running two apps instead of one, and I'd rather not get into that...
Originally, the database was created code-first from MVC application "A". Roughly 75% of all the entities created will be relevant to application "B", and application "B" will also need a few entities specific to it.
Currently, the entity-defining classes have been extracted into a class library within the application "A" solution to allow for reuse in application "B". But I am still unsure how to go about adding the entities required for application "B"...
The question is: what is the best way to manage database development in this situation? Specifically, where should the definition of the entities live? Should we just have a separate DB project defining the database and work DB-first? (This is my preferred option at this stage.)
Since both of the devs working on this (me and one other) are new to MVC and EF, any advice would be much appreciated.
Without seeing what you have, it's not all mapping here in my brain, but I think I may have an idea on this.
You could create an additional project containing your models (the data access layer) that has your Entity Framework EDMX (or code-first model) and POCO templates installed. This project is shared by both applications, i.e. both projects reference this assembly and both have the EF connection string in their web.config files.
Another approach is to put all the code-first classes into a single project (whatever.domain, whatever.models, etc.). Your mapping code then goes into your DataAccess project:
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    // Drop whichever built-in conventions you don't want; the pluralizing
    // table-name convention is shown here as an example.
    modelBuilder.Conventions.Remove<PluralizingTableNameConvention>();
    // Register each entity's mapping configuration.
    modelBuilder.Configurations.Add(new CustomerMap());
    ...
}
You now have shared POCO classes and a single data access layer.
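For completeness, here is a hedged sketch of what a mapping class like the CustomerMap above typically looks like in EF code-first (Customer and its properties are hypothetical):

using System.Data.Entity.ModelConfiguration;

// Hypothetical mapping for a Customer POCO; the table and column
// details are illustrative only.
public class CustomerMap : EntityTypeConfiguration<Customer>
{
    public CustomerMap()
    {
        ToTable("Customers");
        HasKey(c => c.Id);
        Property(c => c.Name).HasMaxLength(200).IsRequired();
    }
}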
Some treat their POCO classes as their domain objects (there's no problem with this) and put their business logic in the POCO classes. This is fine as long as your POCO objects themselves remain persistence ignorant; ideally you don't want to reference implementation-specific components in your POCO classes. For a good writeup, see:
POCO - if POCO means pure .net class with only properties, where i can write validations in MVC
Personally, I like going DB-first and then reverse engineering the database with the EF Power Tools to get a code-first model; that way, if you ever want to integration test it, you can simply create the DB for your integration tests and remove it when done.
Can anyone help me with converting my project to use PetaPoco?
Here is my issue: the backend is a SQL 2010 database, on .NET Framework 4.0.
I have an existing 3-tier Windows app in C# that uses a custom DAL. Each data call uses stored procedures with parameters and returns either a dataset or a specific value as needed; each call accepts a dataset reference parameter and a base-class parameter (the base class is identical to the DB table schema, well, mostly).
I want to replace my custom DAL with PetaPoco but keep the 3-tier layout.
The app relies on predefined base classes as DTOs to pass info between the UI, BAL, and DAL.
Does anyone have a sample/example of an app solution layout showing how to use PetaPoco in a 3-tier environment? A code example would be very helpful.
Thanks in advance...
Vlad
Example not really needed
All you have to do is get acquainted with the PetaPoco library. The best way is its documentation. It's not a complicated library, so you should get up to speed with it quite quickly.
If you also have your application broken down into projects for each layer (UI, BL, DAL), then the easiest thing to do is create a new DAL project that implements all the used functionality of the existing DAL, but uses PetaPoco instead. Then just change your project references and voila, that's it. You can keep your POCOs/DAOs. If you've used IoC, it will be even easier, because instantiating DAL repositories (or whatever you're using) is probably done via some DI container.
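For illustration, a hedged sketch of what such a PetaPoco-backed repository might look like (Customer, the table and column names, and the connection string name are all placeholders):

using PetaPoco;

public class CustomerRepository
{
    // "MyConnStringName" names a connection string in app.config/web.config.
    public Customer GetById(int id)
    {
        using (var db = new Database("MyConnStringName"))
        {
            return db.SingleOrDefault<Customer>(
                "SELECT * FROM Customer WHERE Id = @0", id);
        }
    }

    public void Save(Customer customer)
    {
        using (var db = new Database("MyConnStringName"))
        {
            // PetaPoco's Save inserts or updates based on the primary key.
            db.Save("Customer", "Id", customer);
        }
    }
}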
Layering and PetaPoco
PetaPoco has nothing to do with application layering. If you use it in a 3-tier application, that's fine.
What are you using now?
You didn't mention which DAL library (if any) you're using right now. If you aren't using one, then moving to PetaPoco will result in fewer lines of code and much simpler object mapping.