Can anyone help me with converting my project to use PetaPoco?
Here is my situation: the backend is a SQL 2010 database and the app targets .NET Framework 4.0.
I have an existing 3-tier Windows app in C# that uses a custom DAL. Each data call invokes a stored procedure with parameters and returns either a DataSet or a specific value as needed. Each call accepts a DataSet passed by reference and a base-class parameter (the base class mirrors the DB table schema, well, mostly).
I want to replace my custom DAL with PetaPoco but keep the 3-tier layout
the app is relying on predefined base classes as DTO to pass info between UI-BAL-DAL
Does anyone have a sample/example of an app solution layout showing how to use PetaPoco in a 3-tier environment? A code example would be very helpful.
thanks in advance...
Vlad
Example not really needed
All you have to do is get acquainted with the PetaPoco library. The best way is to read its documentation. It's not a complicated or complex library, so you should get up to speed with it quite quickly.
If you also have your application broken down into projects per layer (UI, BL, DAL), then the easiest thing to do is to create a new DAL project and implement all the used functionality of the existing DAL, but using PetaPoco in this one. Then just change your project references and voila. That's it. You can keep your POCOs/DTOs. If you've used IoC it will be even easier, because instantiating DAL repositories (or whatever you're using) is probably done via some DI container.
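That said, a replacement repository in the new DAL project might look something like this. It's only a minimal sketch: the connection string name, the Customer POCO, and the stored procedure names are all assumptions, and the leading semicolon is the classic PetaPoco idiom for suppressing its auto-select behavior when calling a proc.

using System.Collections.Generic;
using PetaPoco;

// Hypothetical PetaPoco-backed repository exposing the same calls as the old custom DAL
public class CustomerRepository
{
    // connection string name assumed to be defined in app.config
    private readonly Database _db = new Database("MainDb");

    public List<Customer> GetAll()
    {
        // the leading ';' stops PetaPoco from prepending an auto-generated SELECT
        return _db.Fetch<Customer>(";EXEC usp_Customer_GetAll");
    }

    public Customer GetById(int id)
    {
        return _db.SingleOrDefault<Customer>(";EXEC usp_Customer_GetById @0", id);
    }
}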
Layering and PetaPoco
PetaPoco has nothing to do with application layering. If you use it in a 3-tier application, that's fine.
What are you using now?
You didn't mention which DAL library (if any) you're using right now. If you're not using one, then switching to PetaPoco will result in fewer lines of code and much simpler object mapping.
Related
I've started using EF and LINQ in a project and I'm trying to decide on the best approach/pattern to use. Until now I've been using a custom persistence framework based on DataSets and XML configuration. Basically it was a VS custom tool that would read the XML configuration file and the DataSets and generate object-oriented classes with all the necessary properties/associations/methods. These auto-generated classes were then used from the UI, and I had the flexibility to expose only what the UI needed.
Now with EF and LINQ, I'm not comfortable with the idea that the UI can directly use the auto-generated classes and all the LINQ machinery. This approach seems to couple the UI very tightly to the database.
So I'm looking for some pattern that would "hide" all the EF and LINQ goodies and basically limit what the UI can do. Is there any standard way to do this?
What you're looking for is an n-tier application. It's not so much a pattern as an architecture. You break your app up into two or more pieces, typically three: UI, business, and data. You might implement this through other patterns such as the Facade or Repository patterns to keep a strong separation of concerns.
You might also use a Service Layer, which could be implemented by a facade or as a web service.
You would, ideally, pass data through objects called DTOs, or Data Transfer Objects, and you might adapt those DTOs using a view model in your UI (not to be confused with MVVM, which another poster erroneously mentioned).
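As a rough illustration of the DTO idea (all names here are hypothetical), the UI would only ever see something like this, never the EF entities themselves:

using System.Collections.Generic;

// Hypothetical DTO handed to the UI; EF entities stay behind the service boundary
public class OrderDto
{
    public int Id { get; set; }
    public string CustomerName { get; set; }
    public decimal Total { get; set; }
}

// The UI programs against this contract, not against EF or LINQ
public interface IOrderService
{
    OrderDto GetOrder(int id);
    IList<OrderDto> GetOrdersForCustomer(int customerId);
}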
Beyond that, much of it depends on the type of app you're building: desktop app, server app, web app, etc.
The pattern you're looking for is, in general, Model-View-ViewModel, or MVVM.
Here's a tutorial that seems to hit on the high points of the design pattern: http://csharperimage.jeremylikness.com/2010/04/model-view-viewmodel-mvvm-explained.html
I would like to create a DataAccess/DataLayer project and encapsulate EF there, so that my MVC project doesn't know that I'm actually using EF. I may decide to use NHibernate in the future, and the out-of-the-box MVC project created by Visual Studio adds an EF reference/DLL to the web project.
I cannot access the DbContext from MVC, of course, because it needs an EF reference.
As a result I wouldn't be able to use Code First data annotations either, since they also require EF.
Is it worth creating a repository, or should I keep it "simple" and add EF reference to my MVC project?
It just doesn't make sense to me that I need to add a reference to EF to all my projects, tests and clients that use the context/database.
Thanks
What you are trying to create is the typical layered pattern. At the top you have the presentation layer, in the middle your business layer, and at the bottom your DAL.
How you design your layers is completely a matter of opinion and need, but the way I described it above requires three different projects: an MVC project, a logic project, and a DAL project. The DAL project contains your EF reference and your repository objects. It's then up to you to convert your DbContext/ObjectContext items to POCOs to use them in the business layer. The business layer would know about EF (depending on how you pass your EF objects around), but it would then pass its own objects (mapped from your DAL-layer objects) to MVC, thus completely decoupling EF from the MVC layer.
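A minimal sketch of that mapping step, assuming a hypothetical ShopContext in the DAL and a CustomerDto owned by the business layer:

// Business-layer method: the EF entity never leaves the DAL boundary
public CustomerDto GetCustomer(int id)
{
    using (var context = new ShopContext())
    {
        var entity = context.Customers.Find(id);
        // map the EF entity onto the business layer's own object
        return new CustomerDto { Id = entity.Id, Name = entity.Name };
    }
}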
If you are going to use this type of pattern, you should go a step further and include dependency injection with a bootstrapped container (a cross-cutting project using the Unity framework, or something like that).
See Microsoft Patterns & Practices, http://msdn.microsoft.com/en-us/library/ff650706 (Chapter 25 is a good example of layering).
HTH
I choose to implement the Repository pattern in just about all my projects, for the exact purpose of decoupling my DAL from any one data access technology or even a specific database.
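That decoupling usually boils down to a contract like the following; the member names are just illustrative:

using System.Collections.Generic;

// Technology-agnostic repository contract; an EF, NHibernate, or in-memory fake can implement it
public interface IRepository<T> where T : class
{
    T GetById(int id);
    IEnumerable<T> GetAll();
    void Add(T entity);
    void Remove(T entity);
}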
(I had a hard time titling the question so feel free to suggest edits)
Here's the situation: we have just started building a system comprised of two integrated MVC 3 web applications running on Azure with a shared Azure SQL database. There are many reasons for running two apps instead of one and I'd rather not get into that...
Originally, the database was created code-first from MVC application "A". About 75% of all the entities created will be relevant to application "B", plus application "B" will need a few entities specific to it.
Currently, the entity-defining classes have been extracted into a class library within the application "A" solution to allow for reuse in application "B". But I am still unsure how to go about adding the entities required for application "B"...
The question is: what is the best way to manage the database development/management in this situation? Specifically, where should the definition of the entities live? Should we just have a separate DB project defining the database and work db-first? (This is my preferred option at this stage.)
Since both of the devs (me and the other dev) working on this are new to MVC and EF, any advice would be much appreciated.
Without seeing what you have, it's not entirely mapping here in my brain, but I think I may have an idea on this.
You could create an additional project containing your models (the data access layer) that holds your Entity Framework EDMX (or Code First classes) and POCO templates. This project would be shared by both applications, i.e. both projects reference this assembly and both have the EF connection string in their web.config.
Another approach is to put all the Code First classes into a single project (Whatever.Domain, Whatever.Models, etc.). Your mapping code then goes into your DataAccess project:
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    // Remove<T>() needs a convention type, e.g. the pluralizing-table-name convention
    // (from System.Data.Entity.ModelConfiguration.Conventions)
    modelBuilder.Conventions.Remove<PluralizingTableNameConvention>();
    modelBuilder.Configurations.Add(new CustomerMap());
    // ...one Configurations.Add per mapped entity
}
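The CustomerMap referenced above would be an EntityTypeConfiguration class living in the same DataAccess project; a rough sketch, with the Customer properties being assumptions:

using System.Data.Entity.ModelConfiguration;

// Hypothetical mapping class for the Customer POCO
public class CustomerMap : EntityTypeConfiguration<Customer>
{
    public CustomerMap()
    {
        ToTable("Customer");
        HasKey(c => c.CustomerId);
        Property(c => c.Name).IsRequired().HasMaxLength(100);
    }
}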
You now have shared POCO classes and a single data access layer.
Some treat their POCO classes as their domain objects (there's no problem with this) and put their business logic in the POCO classes. This is fine as long as your POCO objects remain persistence ignorant; ideally you don't want to reference implementation-specific components in your POCO classes. For a good write-up here, see:
POCO - if POCO means pure .net class with only properties, where i can write validations in MVC
Personally I like going DB first and then reverse engineering it with the EF Power Tools into a Code First model; that way, if you ever want to integration test it, you can simply create the database for your integration tests and drop it when done.
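The create-then-drop approach looks roughly like this with the DbContext API (ShopContext and the connection string name are assumptions):

// Integration-test setup: build a throwaway database from the Code First model, drop it afterwards
using (var context = new ShopContext("IntegrationTestDb"))
{
    if (context.Database.Exists())
        context.Database.Delete();  // start from a clean slate
    context.Database.Create();      // create the schema from the model

    // ...exercise the repositories against the real database here...

    context.Database.Delete();      // clean up when done
}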
I will start coding a new web application soon. The application will be built using ASP.NET MVC 3 and Entity Framework 4.1 (the Database First approach). Instead of using the default EntityObject classes, I will create POCO classes using the ADO.NET POCO Entity Generator.
When I create POCOs using this tool, it automatically adds the virtual keyword to all properties for change tracking and to navigation properties for lazy loading.
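The generated shape is roughly this (the names are illustrative, not from the generator):

// Roughly what the POCO template emits
public class Order
{
    public virtual int OrderId { get; set; }        // virtual scalars allow change-tracking proxies
    public virtual Customer Customer { get; set; }  // virtual navigations allow lazy loading
}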
However, I have read and seen from demonstrations that Julie Lerman (EF guru!) seems to turn off lazy loading, and she also modifies her POCO template so that the virtual keyword is removed from her POCO classes. Julie states that the reason she does this is that she writes applications for WCF services, and using the virtual keyword there causes a serialization issue: as an object is being serialized, the serializer touches the navigation properties, which triggers lazy loading, and before you know it you are pulling the whole database across the wire.
I think Julie was perhaps exaggerating when she said this could pull the whole database across the wire; even so, the thought scares me!
My question is (finally): should I also remove the virtual keyword from my POCO classes for my MVC application, use DetectChanges for my change tracking, and use eager loading to fetch navigation properties?
Your help with this would be greatly appreciated.
Thanks as ever.
Serialization can indeed trigger lazy loading, because the getter of a navigation property has no way to detect whether the caller is the serializer or user code.
This is not the only issue: whether you mark only the navigation properties as virtual or all properties as virtual, EF will create a proxy type at runtime for your entities, so the entity instances the serializer has to deal with at runtime will typically be of a different type from the one you defined.
Julie's recommendations are the simplest and most reasonable way to deal with these issues, but if you still want to work with the capabilities of proxies most of the time and only occasionally serialize them with WCF, there are other workarounds available:
You can use a DataContractResolver to map the proxy types so they are serialized as the original types
You can also turn off lazy loading only when you are about to serialize a graph (see the sketch after the link below)
More details are contained in this blog post: http://blogs.msdn.com/b/adonet/archive/2010/01/05/poco-proxies-part-2-serializing-poco-proxies.aspx
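A minimal sketch of that second workaround using the DbContext API; ShopContext, the Orders set, and OrderId are assumptions:

using System.Linq;

// Fetch a graph with proxies and lazy loading disabled so it is safe to serialize
public Order GetOrderForSerialization(int id)
{
    using (var context = new ShopContext())
    {
        context.Configuration.ProxyCreationEnabled = false; // queries return plain POCOs, not proxies
        context.Configuration.LazyLoadingEnabled = false;   // the serializer can no longer trigger loads
        return context.Orders
                      .Include("Customer")                  // eager-load only what the client needs
                      .First(o => o.OrderId == id);
    }
}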
Besides this, my recommendation would be to use the DbContext template and not the POCO template. DbContext is the new API we released as part of EF 4.1 with the goal of providing greater productivity. It has several advantages, such as the fact that it automatically performs DetectChanges, so in general you won't need to worry about calling that method yourself. Also, the POCO entities we generate for DbContext are simpler than the ones generated by the POCO templates. You should be able to find lots of MVC examples using DbContext.
Well, it depends on your needs. If you are going to serialize your POCO classes, then yes, you should remove the virtual keywords (for example, when using WCF services or basically anything that will serialize your entire object). But if you are just building a web app that accesses your classes directly, then I would leave them in, since through your code you control which objects you access.
After plenty of reading, I still don't understand Unity for MVC 3.
Specific points
Why use it? I can create a controller that, in its constructor, takes a new EF context for testing.
How? I keep seeing bits and pieces, but is there an end-to-end walkthrough on implementing Unity in MVC 3 (the released version)? There seem to be plenty for the beta and RC builds, but that code always seems to have a problem on the released framework.
Currently this is not impacting my unit testing, since my controllers have overloaded constructors, as does my EF context.
If you have a small project, you may not benefit from IoC.
Lifetime management is a plus for me. I don't have to dispose of a repository (or service layer) in every controller; it thins out my code and creates the objects for me. In addition, I know I have a clean separation in case I ever need to change things; it almost forces me to. For example, I use an IRepository backed by Entity Framework, and for testing I use a fake IRepository implementation. So sure, I could create it manually in my application, but that leads to bad practices in larger projects and I lose the benefits of having the interface.
I have a basic demo from a super-short (15-minute) talk I did recently on MVC and Unity for dependency injection, using the Unity.Mvc3 NuGet package:
http://completedevelopment.blogspot.com/2011/12/using-dependency-injection-with-mvc.html
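The wiring with that package boils down to a few lines in Global.asax; a rough sketch, where ICustomerRepository/EfCustomerRepository are placeholders for your own types:

using System.Web.Mvc;
using System.Web.Routing;
using Microsoft.Practices.Unity;
using Unity.Mvc3;

// Hypothetical Application_Start in Global.asax.cs
protected void Application_Start()
{
    var container = new UnityContainer();
    // HierarchicalLifetimeManager scopes instances to the HTTP request and disposes them for you
    container.RegisterType<ICustomerRepository, EfCustomerRepository>(new HierarchicalLifetimeManager());
    DependencyResolver.SetResolver(new UnityDependencyResolver(container));

    AreaRegistration.RegisterAllAreas();
    RegisterGlobalFilters(GlobalFilters.Filters);
    RegisterRoutes(RouteTable.Routes);
}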
By the way, Dependency Injection in .NET is the best book on the subject, without a doubt.
It's very useful for wiring up all your dependencies, handling the life cycle of your objects, error handling, transaction handling, testing, and more.
All of the above points are advantages of using an IoC framework, but I strongly recommend Ninject. It has a very friendly DSL for binding modules, ships with an out-of-the-box library and extensions for ASP.NET MVC, and is an open-source, lightweight DI framework.
It also has many extensions.