We use an HTML5/Angular SPA with Web API as the service layer, which communicates with a DAL for data access operations.
The layer flow is:
presentation (HTML5/Angular controllers/services) --> Web API --> DAL --> DB.
We do not have a BLL project as such; we are considering making the DAL a combination of BLL + DAL. We use DTO objects generated through T4 templates to transfer data between the client, the Web API, and the DAL (we don't use EF; we use ADO.NET as the underlying provider).
Should we have a separate BLL project, or is it acceptable to combine BLL and DAL into one project, given that the result should be testable and extensible?
As mentioned, DTO objects are used throughout. Do we need any model other than the DTOs to transfer data between the client and the Web API/DAL?
DAL:
public List<CustomerDTO> GetCustomers() { ... } - this uses data access helper classes to fetch the customers and convert them to DTOs.
The above CustomerDAL.GetCustomers is called by the Web API project. At the moment, business logic for, say, a customer is sometimes written in the Web API project and sometimes in the DAL project. We are thinking of moving it into one project for consistency and testability.
Any insights on this would be helpful.
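For reference, a minimal sketch of the current shape described above (CustomerDAL.GetCustomers and the use of DTOs are from the question; the CustomerDTO properties and the controller are illustrative assumptions):

using System.Collections.Generic;
using System.Linq;
using System.Web.Http;

public class CustomerDTO
{
    public int Id { get; set; }
    public string Name { get; set; }
    public bool IsActive { get; set; }
}

// DAL: uses ADO.NET helper classes internally and returns DTOs.
public class CustomerDAL
{
    public List<CustomerDTO> GetCustomers()
    {
        // ... ADO.NET data access, map rows to CustomerDTO ...
        return new List<CustomerDTO>();
    }
}

// Web API controller: today some business rules live here and some in the DAL.
public class CustomersController : ApiController
{
    private readonly CustomerDAL _dal = new CustomerDAL();

    public IEnumerable<CustomerDTO> Get()
    {
        // A business rule embedded in the controller; this is the kind of
        // logic the question is asking where to put.
        return _dal.GetCustomers().Where(c => c.IsActive);
    }
}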
The greatest value that I get out of having a separate BLL is that the most important / expensive bits of my application (the business logic), are in an area that has no dependencies on databases or web/http frameworks. It means that when the next big thing (database, platform, etc) comes along, I can reuse my business layer.
More importantly, DAL and UI layers are MUCH more expensive to test. When I'm writing unit tests at the UI or DAL layer, I'll end up testing 1-2 scenarios per function... When I'm testing at the BLL, I'll create many times more scenarios, because it's so cheap (effort-wise). This gives me much better coverage for much less cost.
Perhaps your applications don't have much business logic. If they are purely CRUD wrappers around database tables, it might not justify the expense. Most applications contain far more business logic than the developers want to admit though. Look through your validations that you run in your WebAPI... Those are likely all business rules. Look at your security constraints, those are likely business rules as well.
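To make that concrete, here is a minimal sketch (illustrative names, reusing the CustomerDTO from the question) of business logic pulled into a class that depends only on an interface, so it can be unit tested without a database or a web host:

using System.Collections.Generic;
using System.Linq;

public interface ICustomerRepository
{
    List<CustomerDTO> GetCustomers();   // implemented by the ADO.NET-based DAL
}

public class CustomerService
{
    private readonly ICustomerRepository _repository;

    public CustomerService(ICustomerRepository repository)
    {
        _repository = repository;
    }

    // The business rule lives here, not in the Web API controller or the DAL.
    public List<CustomerDTO> GetActiveCustomers()
    {
        return _repository.GetCustomers()
                          .Where(c => c.IsActive)
                          .ToList();
    }
}

In a unit test, the repository is replaced by an in-memory fake, so many scenarios can be covered cheaply, which is the cost difference described above.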
Whether to use DTOs or a more complex domain model depends on your design, environment, and team constraints, and is not something I would feel comfortable addressing in a fifteen-minute posting. Fowler has some strong opinions, calling it the Anemic Domain Model antipattern, but I've seen it used quite successfully for large-scale projects. One of the nice aspects of this model is that you don't need quite as coherent a picture of the application model, which is often the case with large, dispersed teams.
Related
In a three-layered architecture, where is the DAO pattern located? Is it in the business logic layer or in the data layer?
I'm not sure that thinking in terms of layering is useful anymore.
We used to have 2-tier client-server, with all the logic in the client and a database running on a server.
We evolved to 3-tier, usually associated with MVC (model-view-controller). There was no mention of data access objects in the original Smalltalk MVC pattern.
Now I think view and controller generally go together, splitting rendering of the user interface between client and server. Controllers have business logic and interact with many web and data access services. Data access objects would be used by controllers to deal with data sources. Call that whatever layer you wish.
I don't think of microservices as a layer. Perhaps the usefulness of the layering concept has diminished.
I have a project in which I am using NHibernate and ASP.Net MVC. The application is intended to allow users to track certain data and then produce views of statistics based upon the data entered. The structure of my application thus far looks something like this:
NHibernate Layer: Contains Repository<T> and UnitOfWork classes, as well as entity mapping definitions.
Core/Service Layer: Contains generic EntityService class. At the moment, this simply defines transaction scope via IUnitOfWork and interfaces with IRepository to provide higher-level data access services.
Presentation Layer (MVC Application): Not yet implemented, but contains the usual stuff plus dependency injection.
I have a couple of questions:
Is it poor design to allow my MVC application to handle dependency injection for ALL layers? For example, as well as dependency injection of EntityService instances into controllers, it will handle the dependency injection of IRepository into the EntityService classes. Should the service layer handle this itself, even though this would mean performing dependency injection in two distinct places?
Where should I produce my statistics? This business logic doesn't seem to belong in my service layer, which, at present, only contains entity type definitions and an interface for modifying and accessing entity properties. I have a few thoughts on this, but I'm not sure which I like best:
Keep my service layer as is and create a separate Statistics project - this is completely independent of the entity types for which it will be used, meaning my MVC controllers will have to pass raw numerical information between my business entities and my (presumably static) statistics classes. This is quite a neat separation but potentially means a lot of business logic still remaining in the presentation layer.
Create a Statistics project; however, create a tight coupling between the classes in this project and my business entities. For example, instead of passing a Reading object's values into a method, I will pass the entire object (or define them as extension methods). This will shift business logic out of my MVC app but the tight coupling seems a bit messy.
Keep all of my business logic inside my service layer. Define strongly-typed subclasses of EntityService, so my services contain both entity-specific business methods and data storage methods, while keeping the entity classes themselves as pure data containers. Create a separate Statistics project for any generic statistical processing and call its methods via my derived service classes. My service classes effectively merge business functions with the storage functionality provided by IRepository<T>.
I am erring toward the third option but does anyone have any thoughts? Alternative suggestions?
Thanks in advance!
Preliminary observation:
I like the way you described your project; I just don't get why your data access layer is called the "NHibernate Layer". It's at odds with the rest of your description, where you (correctly) avoided technology names for logical layers. I suggest renaming it DAL and using it to abstract your app from NHibernate.
My opinions about your questions:
Absolutely not. It is good to apply dependency injection across all layers. A couple of reasons why:
1.1 Testing: you can mock the DAL interfaces and unit test the service layer without the DAL by using a separate DI configuration. In the same way you can mock the services for the web controllers layer, and so on (a sketch follows below).
1.2 Different DAL implementations: suppose you need a different DAL implementation (NoSQL, raw SQL, or LINQ instead of NHibernate, etc.) for different deployments of your project, or to scale in the future. You can do that easily by maintaining different DI configurations.
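As a rough illustration of both points (the generic EntityService<T> and repository shapes below are assumptions, not the question's exact classes):

using System.Collections.Generic;

public interface IRepository<T>
{
    IList<T> GetAll();
}

public class EntityService<T>
{
    private readonly IRepository<T> _repository;
    public EntityService(IRepository<T> repository) { _repository = repository; }
    public int Count() { return _repository.GetAll().Count; }
}

// 1.1 A fake repository lets the service layer be unit tested without NHibernate or a database.
public class FakeRepository<T> : IRepository<T>
{
    private readonly IList<T> _items;
    public FakeRepository(IList<T> items) { _items = items; }
    public IList<T> GetAll() { return _items; }
}

// 1.2 An NHibernate-backed IRepository<T> and, say, a NoSQL-backed one can be
// swapped by changing only the DI configuration, never the service layer.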
You can have the same layer deployed in different projects, and likewise a single project containing different layers. I think the relationship is orthogonal: a project describes a physical (development-time and run-time) packaging, while layers are logical. So initially I would keep it simple and go with the third option.
I just don't understand why you say the following regarding this option:

Create a separate Statistics project for any generic statistical processing and call its methods via my derived service classes. My service classes effectively merge business functions with the storage functionality provided by IRepository.
I see Statistics as one or more services, so you can implement it as a namespace with classes inside your service layer. As with any other service, you can inject the DAL repository classes into it. And, as with any other service or DAL class, the model classes can be shared between the different services and DAL classes.
StatsService.AverageReadingFor(Person p, DateTime start, DateTime end) sounds good.
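A minimal sketch of that suggestion, reusing the assumed IRepository<T> shape from the sketch above (the Person and Reading property names here are illustrative guesses, not taken from the question):

using System;
using System.Linq;

public class Person  { public int Id { get; set; } }
public class Reading { public int PersonId { get; set; } public DateTime TakenOn { get; set; } public double Value { get; set; } }

public class StatsService
{
    private readonly IRepository<Reading> _readings;

    public StatsService(IRepository<Reading> readings)
    {
        _readings = readings;
    }

    public double AverageReadingFor(Person p, DateTime start, DateTime end)
    {
        // Plain LINQ-to-objects over whatever the repository returns;
        // option 1 below would push the averaging down into the database instead.
        return _readings.GetAll()
                        .Where(r => r.PersonId == p.Id && r.TakenOn >= start && r.TakenOn <= end)
                        .Average(r => r.Value);
    }
}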
There are several implementation options:
Using underlying repository features (for example, SQL's AVG function)
Using the Observer pattern, which can also be implemented via dependency injection
Using Aspect Oriented Programming. See that Spring.Net chapter as an example.
If you have more than one service layer instance (more than one server), then options 2 and 3 must be adapted for out-of-process communication using a messaging system.
Just an update - regarding my second question, I have decided to define an IStatsService<T> which expects an IEntityService<T> to be passed into its constructor. I'll use this for generic statistical processing of business entities and create further interfaces that extend IStatsService<T> where I need more type-specific information.
Hopefully this will help someone who has been scratching their head about a similar problem!
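For anyone following the same route, a rough sketch of that shape (the interface members below are assumptions; only the constructor dependency is described in the update):

using System;
using System.Collections.Generic;
using System.Linq;

public interface IEntityService<T>
{
    IEnumerable<T> GetAll();                      // assumed member
}

public interface IStatsService<T>
{
    double Average(Func<T, double> selector);     // assumed member
}

public class StatsService<T> : IStatsService<T>
{
    private readonly IEntityService<T> _entityService;

    // As described in the update: an IEntityService<T> is injected via the constructor.
    public StatsService(IEntityService<T> entityService)
    {
        _entityService = entityService;
    }

    public double Average(Func<T, double> selector)
    {
        return _entityService.GetAll().Average(selector);
    }
}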
I've started using EF and LINQ in a project and I'm trying to decide on the best approach/pattern to use. Until now I've been using a custom persistence framework based on DataSets and XML configuration. Basically it was a VS custom tool that would read the XML configuration file and the DataSets and generate object-oriented classes with all the necessary properties/associations/methods. These auto-generated classes were then used by the UI, and I had the flexibility to expose only what the UI needed.
Now with EF and LINQ, I'm not comfortable with the idea of the UI using the auto-generated classes and all the LINQ machinery directly. That approach seems to couple the UI very tightly to the database.
So I'm looking for some pattern that would "hide" all the EF and LINQ goodies and basically limit what the UI can do. Is there any standard way to do this?
What you're looking for is an n-tier application. It's not so much a pattern as an architecture: you break your app up into two or more pieces, typically three, composed of UI, business, and data. You might implement this through other patterns such as the Facade or Repository patterns to keep a strong separation of concerns.
You might also use a Service Layer, which could be implemented by a facade or as a web service.
You would, ideally, pass data through Data Transfer Objects (DTOs), and you might adapt those DTOs using a view model in your UI (not to be confused with MVVM, which another poster erroneously mentioned).
Beyond that, much of it depends on the type of app you're building: desktop app, server app, web app, etc.
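A minimal sketch of the facade/repository idea under those constraints (all names here are placeholders and an EF6-style DbContext is assumed):

using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;

// EF entity and context (assumed names).
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public bool IsActive { get; set; }
}

public class StoreContext : DbContext
{
    public DbSet<Product> Products { get; set; }
}

// DTO exposed to the UI; no EF types leak out of this layer.
public class ProductDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Facade/repository that hides the EF context and the LINQ queries.
public class ProductRepository
{
    public IList<ProductDto> GetActiveProducts()
    {
        using (var db = new StoreContext())
        {
            return db.Products
                     .Where(p => p.IsActive)
                     .Select(p => new ProductDto { Id = p.Id, Name = p.Name })
                     .ToList();
        }
    }
}

Because the UI references only ProductRepository and ProductDto, it cannot compose arbitrary LINQ-to-Entities queries against the model, which is exactly the limitation being asked for.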
The pattern you're looking for is, in general, Model-View-ViewModel, or MVVM.
Here's a tutorial that seems to hit on the high points of the design pattern: http://csharperimage.jeremylikness.com/2010/04/model-view-viewmodel-mvvm-explained.html
I'm trying to architect my MVC web project and I'm running into a bit of a problem.
I am using EF4.1. I've created a DataAccess project with the EDMX file. Then I use the dbContext generator to make my POCO .tt classes.
As it is right now, my Business logic layer can access the POCO classes just fine, but the presentation layer cannot.
I think that I'm supposed to create another level of abstraction and put the dbContext .tt files into their own project so that both the BusinessLogic layer and the Presentation layer can access the POCO classes, but only the BusinessLogic has access to the entity framework. The presentation layer shouldn't need to know anything about EF.
Something like this...
POCO Classes --- DataAccess
      |              |
      |---------Business Logic
      |              |
      |_________Presentation
Am I on the right track here, and if so, do I simply cut/paste the .tt files into the new project or is there a way to force the dbContext add-on to create these in my other project?
Your presentation layer doesn't have to know anything about EF. Just reference that project from your presentation layer to access the models.
However, your presentation layer ideally shouldn't be using those POCO models directly; it should be using ViewModels. I don't necessarily believe in DTOs here, as DTOs have a specific purpose. Your repository/data access can return models, but generally those get returned to a service layer. The service layer would then return your ViewModel representation to your controller.
This sets you up nicely for dependency injection as well: into your controller you inject your service layer, and into your service you inject whatever repositories you need, and so on.
Ironically I think I may be working on a book for this exact subject shortly : )
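A rough sketch of that controller -> service -> repository direction (every name here is illustrative; the point is only which layer sees which types):

using System.Collections.Generic;
using System.Linq;
using System.Web.Mvc;

public class Order          { public int Id { get; set; } public decimal Total { get; set; } }   // EF POCO
public class OrderViewModel { public string Description { get; set; } }

public interface IOrderRepository
{
    IList<Order> GetAll();
}

public interface IOrderService
{
    IList<OrderViewModel> GetOrderList();
}

public class OrderService : IOrderService
{
    private readonly IOrderRepository _repository;
    public OrderService(IOrderRepository repository) { _repository = repository; }

    public IList<OrderViewModel> GetOrderList()
    {
        // The service returns ViewModels, so the controller never touches EF POCOs.
        return _repository.GetAll()
                          .Select(o => new OrderViewModel { Description = "Order " + o.Id + ": " + o.Total })
                          .ToList();
    }
}

public class OrdersController : Controller
{
    private readonly IOrderService _service;
    public OrdersController(IOrderService service) { _service = service; }   // injected by the DI container

    public ActionResult Index()
    {
        return View(_service.GetOrderList());
    }
}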
Consider sending Data Transfer Objects between your Business Logic and Presentation layers. This would allow you to shape the data for your views and prevent information from leaking into the Presentation layer (e.g. if you have a field in your POCO that is needed for your business logic but doesn't need to be available in your Presentation layer).
The question is, how would you move data to and from the presentation layer? Put another way, should the presentation layer hold a reference to the domain model assembly? (In an Entity Framework scenario, the domain model assembly is just the DLL created out of the EDMX file.)

From a pure design perspective, DTOs are a solution really close to perfection. DTOs help to further decouple presentation from the service layer and the domain model. When DTOs are used, the presentation layer and the service layer share data contracts rather than classes.

A layer of DTOs isolates the domain model from the presentation, resulting in both loose coupling and optimized data transfer.
If you go this route, also check out AutoMapper to help with mapping your DTOs to POCOs and vice versa.
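For illustration, the instance-based configuration in recent AutoMapper versions looks roughly like this (the Customer/CustomerDto pair and property names are placeholders):

using AutoMapper;

public class Customer    { public int Id { get; set; } public string Name { get; set; } }   // EF POCO
public class CustomerDto { public int Id { get; set; } public string Name { get; set; } }

public static class MappingExample
{
    public static void Run()
    {
        // One-time configuration, typically done at application startup.
        var config = new MapperConfiguration(cfg =>
        {
            cfg.CreateMap<Customer, CustomerDto>();   // POCO -> DTO
            cfg.CreateMap<CustomerDto, Customer>();   // DTO  -> POCO
        });
        var mapper = config.CreateMapper();

        var poco = new Customer { Id = 1, Name = "Ann" };
        CustomerDto dto = mapper.Map<CustomerDto>(poco);   // properties copied by matching names
    }
}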
So there are several ways to structure your project. What you are referring to is one way, in which you share POCOs between all layers.
Another way is to have your POCOs live in the data and business layers, then create a similar object model that's shared between the UI and the business layer. Finally, you might also create a third model for the UI only, called the ViewModel.
It all really depends on your needs. If your object model is very complex, then you might need to simplify it with ViewModels.
Is it possible to have a layout for a web-based architecture based on MVC where SOA is the architectural style? Or, to rephrase: can services be part of the M, V, or C of MVC? If so, what kinds of services can be included in each of them? Also, can you give me a real-world example?
In a SOA application you typically do not include the front end (presentation layer). You consume those services in your MVC application, or better yet in a separate "model" project that the MVC application uses.
Data Access -> Business Logic -> Services -> Models -> MVC
The point is to use the services to create an abstraction around the base of your application to allow for multiple clients to consume those services.
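For concreteness, a rough sketch of such a separate "model" project, assuming the services are exposed over HTTP and Json.NET is available (every name and URL here is a placeholder):

using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json;

public class InvoiceModel
{
    public int Id { get; set; }
    public decimal Total { get; set; }
}

public interface IInvoiceClient
{
    Task<IList<InvoiceModel>> GetInvoicesAsync(int customerId);
}

public class HttpInvoiceClient : IInvoiceClient
{
    private static readonly HttpClient Http = new HttpClient();

    public async Task<IList<InvoiceModel>> GetInvoicesAsync(int customerId)
    {
        var json = await Http.GetStringAsync(
            "https://services.example.com/invoices?customerId=" + customerId);
        return JsonConvert.DeserializeObject<List<InvoiceModel>>(json);
    }
}

The MVC application, or any other client, depends only on IInvoiceClient, which is the abstraction the answer describes.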
I tend to term the model as represented in the client/presentation layer the ViewModel: it is simply the presentation layer's view of the model, not the actual domain model. This is needed in a SOA because the context of the consumer of the model changes often.
In SOAs we try to get to a canonical schema for the contract, as it is quite likely that not all clients, now and in the future, will require exactly the same view of the model.
Thus, whether it is a web client, a service client, or a desktop client, thinking of the Model in MVC as the ViewModel lets you abstract presentation-layer concerns away from service-layer concerns, and you get closer to a canonical schema.
So, as an example: View >> Controller >> ViewModel (Model) >> Data Contract >> Service
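A small sketch of that chain in an ASP.NET MVC controller (the data contract, ViewModel, and service client are all illustrative placeholders):

using System.Collections.Generic;
using System.Linq;
using System.Web.Mvc;

public class CustomerContract       // data contract returned by the service
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

public class CustomerViewModel      // shaped for this particular presentation layer
{
    public string DisplayName { get; set; }
}

public interface ICustomerService   // proxy/client for the remote service
{
    IEnumerable<CustomerContract> GetCustomers();
}

public class CustomersController : Controller
{
    private readonly ICustomerService _service;
    public CustomersController(ICustomerService service) { _service = service; }

    public ActionResult Index()
    {
        var viewModel = _service.GetCustomers()
            .Select(c => new CustomerViewModel { DisplayName = c.FirstName + " " + c.LastName })
            .ToList();
        return View(viewModel);     // the view binds to the ViewModel, not the data contract
    }
}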
Examples of how to build a service stack like this can be found here:
SOA Design Pattern
The decision of whether to go with a REST architecture or a full WS-* SOAP stack is a separate concern and should not affect your choice of MVC as a presentation pattern.
There may of course be other constraints that preclude the use of one or the other.
Choosing a presentation pattern for new or enterprise web development on the Microsoft platform is a daunting task. In my opinion there are only three candidates: View Model, Model-View-Presenter (MVP), and ASP.NET MVC (a Model2 derivative).
You can read the full article here ASP.NET MVC Patterns
This depends on what you mean by SOA. If you are referring to WS-* standards, I would not recommend MVC, as you will need to write a lot of plumbing to get it to work.
If you are looking for something like a REST service, then the MVC pattern actually works quite well. The request is the HTTP location of the resource, which gets routed to the controller; the controller loads the data via the model and then passes it to the view, which returns it in whatever form is needed (JSON, XML, binary, etc.). Or you can often return the result directly, depending on what framework you use.
Erick
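A rough sketch of that REST-style flow using ASP.NET Web API 2 conventions (all names are placeholders, and the repository implementation and DI wiring are assumed to exist elsewhere):

using System.Web.Http;

public class Book
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public interface IBookRepository    // the "model" side; implementation not shown
{
    Book Find(int id);
}

public class BooksController : ApiController
{
    private readonly IBookRepository _books;
    public BooksController(IBookRepository books) { _books = books; }   // supplied by a DI-aware resolver

    // GET /api/books/5 : the resource's HTTP location routes to the controller,
    // which loads the data via the model; the framework then serializes the
    // result as JSON or XML based on content negotiation.
    public IHttpActionResult Get(int id)
    {
        var book = _books.Find(id);
        if (book == null)
            return NotFound();
        return Ok(book);
    }
}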