How to use Spring 3.0 without a service layer? - spring

I am using Spring 3.0 with annotations. Currently I create a DAO interface, then a DAO implementation class,
then a service interface, then a service implementation class,
and finally use the service implementation object in the controller. This process is too lengthy.
Can I use only the DAO class and use it directly in the controller?
@Service
@Controller
public class MyController { ... }

You can but you really really really shouldn't. If you have any plans of putting this code into production, then you need to keep your layers well defined. This is because each layer has a different responsibility.
Your DAO layer is responsible for all database access. It doesn't care what object you want, what you want to do with it, and when you want it done. It only cares about how it's done.
Your Service layer is responsible for what you want to do with your objects. It contains all of your business logic and convenience methods, and may use multiple DAOs inside a single service. Your service layer is also where you handle your database transactions.
Your controller layer is responsible for how you want to show your data to the user. It only contains services and minimal logic concerned with how to display the data returned by your service layer.
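As a rough illustration of those responsibilities (PersonDao, PersonService and PersonController are hypothetical names, not from the question), the wiring usually looks something like this:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.*;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.web.bind.annotation.*;

class Person { long id; String name; }              // stand-in entity for the sketch

@Repository
class PersonDao {
    // knows only HOW data is read and written (JPA/JDBC/Hibernate details)
    Person findById(long id) { /* stubbed; a real DAO would query the database */ return new Person(); }
    void save(Person p)      { /* persist the entity */ }
}

@Service
class PersonService {
    @Autowired private PersonDao personDao;

    // business logic and the transaction boundary live here
    @Transactional
    public void rename(long id, String newName) {
        Person p = personDao.findById(id);
        p.name = newName;
        personDao.save(p);
    }
}

@Controller
class PersonController {
    @Autowired private PersonService personService;

    // decides only how the outcome is shown to the user
    @RequestMapping("/person/rename")
    public String rename(@RequestParam long id, @RequestParam String name) {
        personService.rename(id, name);
        return "redirect:/person/" + id;
    }
}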

I think there are two distinct questions being asked here:
Is it really necessary to have all three layers: web layer, service layer, and DAO?
and:
Do Spring beans necessarily need to implement an interface and be (auto)wired by interface rather than by concrete class?
The answer to both is the same: it is not a necessity, but it is strongly recommended for anything other than trivial projects.
For the first question: yes, it is better to separate concerns into different layers. However, I must admit that my controllers sometimes access the DAOs directly for simple read-only tasks in some projects. Sometimes the service layer seems like too much when all it does is mirror the DAO methods. However, I always use a service layer for transactional business logic that actually modifies the data.
For the second question: Spring can, without problem, instantiate and inject beans that don't implement any interface. However, you will start to have problems if you use more advanced features linked to aspect-oriented programming, the most common being the @Transactional annotation. In this case Spring has to create proxies for your objects, and it is simpler to create proxies of interfaces. Otherwise it has to manipulate bytecode, which introduces limitations.
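For example, a minimal sketch (with made-up names) of wiring by interface, so that Spring can hand the caller a JDK dynamic proxy that opens and commits the transaction around the real implementation:

import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

// the controller (or any other caller) depends only on this interface
interface InvoiceService {
    void closeMonth(int year, int month);
}

@Service
class InvoiceServiceImpl implements InvoiceService {
    @Override
    @Transactional   // Spring can proxy the interface, no bytecode manipulation needed
    public void closeMonth(int year, int month) {
        // business logic that must run inside one transaction
    }
}

// elsewhere: @Autowired private InvoiceService invoiceService;  // the proxy is injected here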
Perhaps more importantly, "programming to an interface" is a good practice regardless of whether you are using Spring or not, or even Java or another language. A simple Google search will bring up many good articles on this, or SO questions such as this one.
So again, while you can live well without both things in the short term, both will make your life much better in the long term.

Related

How to implement DDD using Spring Crud/Jpa Repository

I want to create an app by implementing DDD using Spring. Let's say I have a business entity Customer and an interface CustomerRepository.
Since Spring provides CrudRepository and JpaRepository to perform basic CRUD operations and other operations like finder methods by default, I want to use them. So my interface becomes:
@Repository
public interface CustomerRepository extends JpaRepository<Customer, Long> {
}
But according to DDD, interfaces should be in domain layer and the implementation should be in Infrastructure layer.
Now my question is: which layer does the CustomerRepository belong to?
Short answer: although there should not be any dependencies on Infrastructure in the Domain layer, for the sake of KISS you may allow this one. If you want to be a DDD purist, you define a plain CustomerRepository interface in the Domain AND an implementation in the Infrastructure that implements both interfaces.
Long and boring answer: In general, the Domain should not care or know about infrastructure, as in, it should not have dependencies to other layers (Infrastructure, Application, Presentation or whatever architecture you are using). Following this rule leads to a cleaner architecture.
In particular, the domain should not care about the persistence, it should behave as it runs in memory. From the Domain point's of view, the Entities mutate and that's all, no persistence is needed.
The Write side of the Domain code doesn't in fact need persistence. When the Aggregates execute commands, they are already fully loaded. And after the command is executed, the Aggregates just return the changes or the new state. The Aggregates don't persist the changes themselves. They are pure, with no observable side effects.
We, the architects, need persistence because we need to ensure that the data persist between restarts and that we can run the same code on multiple machines on the same time.
There is, however, another need in the Domain code, in particular on the Read and Reactive side of the Domain (Sagas/Process Managers). These components of the Domain need to query and filter the Domain Entities: the Read models need to return Entities to the caller, and the Sagas/Process Managers need to correctly identify the right Aggregates to which to send the Commands.
The solution is to define only the Interfaces in the Domain layer and to have implementations in the Infrastructure. In this way the Domain owns the Interface so, according to the Dependency Inversion Principle, it does not depend on the Infrastructure.
In your case, although the Domain layer depends on something from the infrastructure part of the Spring framework, that something is only an interface. It is still a dependency on JPA, because your domain will use methods that it doesn't own, but KISS could be more important in this case.
The alternative would be to define an interface that does not extend JpaRepository, with an implementation in the Infrastructure that implements both this interface and the JpaRepository interface.
Which solution you choose is up to you: more code duplication but less dependency, or less code duplication and more dependency on JPA.
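A rough sketch of that purist alternative (the query methods are invented for illustration; assume a Customer entity with matching fields):

import java.util.List;
import org.springframework.data.jpa.repository.JpaRepository;

// Domain layer: the contract the domain owns, with no Spring Data / JPA types in it
interface CustomerRepository {
    Customer findByEmail(String email);
    List<Customer> findByLastName(String lastName);
}

// Infrastructure layer: Spring Data derives the query methods from their names
// and also contributes the generic CRUD operations of JpaRepository
interface JpaCustomerRepository extends CustomerRepository, JpaRepository<Customer, Long> {
}

The domain code injects and uses only CustomerRepository; Spring wires in the JpaCustomerRepository bean behind it.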

Is it appropriate to make /service classes @RestControllers

I understand from the documentation the distinction between entity and service creation. I'm tempted to simply decorate my services with @RestController/@RequestMapping/@[http-method]Mapping in the same way that /web/rest/resources/* are decorated to expose new REST API methods... maybe using "/api/srv/" to distinguish service-level API resources? It seems to work... when I do it, the decorated service methods show up in Swagger, etc. I guess I'm just looking for a sanity check that I'm not violating framework convention or misunderstanding the intended use of /service classes.
This really comes down to separation of responsibility. You can expose any class as a @RestController, no matter whether it is a real Controller, a Service, or even a DAO.
However, most of the time, we want each layer to do its own thing.
The Controller layer handles the HTTP request/response (request params/body, response code/body, etc.) and dispatches to the appropriate Service layer methods.
The Service layer takes care of the real business logic and calls the DAO layer to retrieve the data needed for that logic.
The DAO layer is ultimately responsible for database access, with the query/Hibernate/JPA specifics.
As you can see, each layer does its own thing and isolates its dependencies from the other layers: the Controller layer handles HTTP-related stuff so that other layers do not have to carry Servlet dependencies, and likewise the DAO layer means other layers need not know any details about data persistence.
As for Swagger, it just analyzes your code and annotations and exposes whatever is in the controller layer. :)
All in all, IMHO, it is still recommended to annotate your real Controller classes with @RestController and let your Services handle the real business.
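For instance, a minimal sketch of that conventional split (OrderService and OrderController are invented names):

import java.util.Arrays;
import java.util.List;
import org.springframework.stereotype.Service;
import org.springframework.web.bind.annotation.*;

@Service
class OrderService {
    // real business logic, reusable from a controller, a scheduler, another service...
    List<String> findOrdersFor(long customerId) {
        return Arrays.asList("order-1", "order-2");   // stubbed result
    }
}

@RestController
@RequestMapping("/api/orders")
class OrderController {
    private final OrderService orderService;

    OrderController(OrderService orderService) {
        this.orderService = orderService;
    }

    // only HTTP concerns here: paths, parameters, status codes, serialization
    @GetMapping("/{customerId}")
    List<String> orders(@PathVariable long customerId) {
        return orderService.findOrdersFor(customerId);
    }
}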

MVC / Repository Pattern - Architecture

I have a project in which I am using NHibernate and ASP.Net MVC. The application is intended to allow users to track certain data and then produce views of statistics based upon the data entered. The structure of my application thus far looks something like this:
NHibernate Layer: Contains Repository<T> and UnitOfWork classes, as well as entity mapping definitions.
Core/Service Layer: Contains generic EntityService class. At the moment, this simply defines transaction scope via IUnitOfWork and interfaces with IRepository to provide higher-level data access services.
Presentation Layer (MVC Application): Not yet implemented, but contains the usual stuff plus dependency injection.
I have a couple of questions:
Is it poor design to allow my MVC application to handle dependency injection for ALL layers? For example, as well as dependency injection of EntityService instances into controllers, it will handle the dependency injection of IRepository into the EntityService classes. Should the service layer handle this itself, even though this would mean performing dependency injection in two distinct places?
Where should I produce my statistics? This business logic doesn't seem to belong in my service layer, which, at present, only contains entity type definitions and an interface for modifying and accessing entity properties. I have a few thoughts on this, but I'm not sure which I like best:
Keep my service layer as is and create a separate Statistics project - this is completely independent of the entity types for which it will be used, meaning my MVC controllers will have to pass raw numerical information between my business entities and my (presumably static) statistics classes. This is quite a neat separation but potentially means a lot of business logic still remaining in the presentation layer.
Create a Statistics project; however, create a tight coupling between the classes in this project and my business entities. For example, instead of passing a Reading object's values into a method, I will pass the entire object (or define them as extension methods). This will shift business logic out of my MVC app but the tight coupling seems a bit messy.
Keep all of my business logic inside my service layer. Define strongly-typed subclasses of EntityService, so my services contain both entity-specific business methods and data storage methods, while keeping the entity classes themselves as pure data containers. Create a separate Statistics project for any generic statistical processing and call its methods via my derived service classes. My service classes effectively merge business functions with the storage functionality provided by IRepository<T>.
I am erring toward the third option but does anyone have any thoughts? Alternative suggestions?
Thanks in advance!
Preliminary observation:
I like the way you described your project; I just didn't get why your Data Access Layer (DAL) is called the NHibernate Layer: it is at odds with the rest, where you (correctly) did not use a technology name to describe a logical layer. So I suggest you rename it DAL and use it to abstract your app from NHibernate.
My opinions about your questions:
Absolutely not. It is good to apply Dependency Injection to all layers. A couple of reasons why:
1.1 Testing: you can mock the DAL interfaces and unit test the Service Layer without the DAL using another DI config file. In the same way you can mock the Services for the Web Controllers layer, and so on.
1.2 Different DAL implementations: suppose you need different DAL implementations (NoSQL, SQL, or LINQ instead of NHibernate, etc.) for different deployments of your project, or to scale in the future. You can do that easily by maintaining different DI config files.
You can have the same layer deployed in different projects. In the same way you can have a project containing different layers. I think their relation is orthogonal: a project describes a physical (development-time and run-time) unit, while layers are logical. So initially I would keep it simple and go with the third option.
I just don't understand why you say the following regarding this option:
Create a separate Statistics project for any generic statistical processing and call its methods via my derived service classes. My service classes effectively merge business functions with the storage functionality provided by IRepository<T>.
I see Statistics as one or more services, so you can implement it as a namespace with classes inside your Service Layer. And, as with any other service, you can inject the DAL Repository classes. And, as with any other Service/DAL, the Model classes can be shared between different Service and DAL classes.
StatsService.AverageReadingFor(Person p, DateTime start, DateTime end) sounds good.
There are several implementation options:
Using underlying repository features (for example, an SQL avg function)
Using the Observer pattern, which can also be implemented using Dependency Injection
Using Aspect-Oriented Programming. See the Spring.NET chapter as an example.
If you have more than one Service Layer instance (more than one server), then options 2 and 3 must be adapted for out-of-process communication using a messaging system.
Just an update - regarding my second question, I have decided to define an IStatsService<T> which expects an IEntityService<T> to be passed into its constructor. I'll use this for generic statistical processing of business entities and create further interfaces that extend IStatsService<T> where I need more type-specific behaviour.
Hopefully this will help someone who has been scratching their head about a similar problem!

Example use-cases for using Dependency Injection with the Play Framework

I am a big fan of Dependency Injection and the Play Framework, but am having trouble seeing how the two could be exploited together.
There are modules for Spring and Guice, but the way that Play works makes it hard for me to see how DI could be beneficial beyond some quite simple cases.
A good example of this is that Play expects JPA work to be done by static methods associated with the entity in question:
@Entity
public class Person extends Model {
    public static void delete(long id) {
        // look up the entity by its primary key, then remove it
        em().remove(em().find(Person.class, id));
    }
    // etc.
}
So there is no need for a PersonManager to be injected into controllers in the way it might for a Spring J2EE application. Instead a controller just calls Person.delete(x).
Obviously, DI is beneficial when there are interfaces with external systems, as the concrete implementation can be mocked for testing etc., but I don't see much benefit for a self-contained Play application.
Does anyone have any good examples? Does anyone use it to inject a Manager-style class into Controllers so that a number of operations can be done within the same transaction, for example?
I believe from this sentence you wrote:
"Does anyone have any good examples? Does anyone use it to inject a Manager-style class into Controllers so that a number of operations can be done within the same transaction, for example?"
that before answering the DI question I should note something: transactions are managed automatically by Play. If you check the model documentation you will see that a transaction is automatically created at the beginning of a request, and committed at the end. You can roll it back via JPA or it will be rolled back if an exception is raised.
I mention this because from the wording of your sentence I'm not sure if you are aware of this.
Now, on DI itself, in my (not-so-extensive) experience with DI, I've seen it used mainly to:
Load the ORM (Hibernate) factory/manager
Load Service classes/DAOs into another class to work with them
Sure, there are more scenarios, but these probably cover most of the real usage. Now:
The first one is irrelevant to Play, as you get access to your JPA object and transaction automatically.
The second one is quite irrelevant too, as you mainly work with static methods in controllers. You may have some helper classes that need to be instantiated, and some may even belong to a hierarchy (common interface), so DI would be beneficial. But you could just as well create your own factory class and get rid of the DI jars, as in the sketch below.
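For example, a tiny hand-rolled factory along these lines (all names invented) already gives you swappable implementations without a DI container:

// a minimal hand-rolled alternative to a DI container
interface Notifier {
    void send(String userId, String message);
}

class EmailNotifier implements Notifier {
    public void send(String userId, String message) { /* send an e-mail */ }
}

class NotifierFactory {
    private static Notifier instance = new EmailNotifier();

    static Notifier get() { return instance; }

    // tests (or an alternative deployment) can swap in another implementation
    static void set(Notifier replacement) { instance = replacement; }
}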
There is another matter to consider here: I'm not so sure about Guice, but Spring is not only DI; it also provides a lot of extra functionality that depends on the DI module. So maybe you don't want DI in Play as such, but you want to take advantage of the Spring tools, and they will use DI, albeit indirectly (via XML configuration).
The problem, in my humble opinion, with the static initialization approach of Play! is that it makes testing harder. Once you approach the HTTP-versus-object-orientation problem with static members and objects that carry the HTTP message data (request and response), you trade away loose coupling with the rest of your project classes in exchange for not having to create new instances for each request.
One good example of a different design is servlets: a servlet also extends a base class, but it approaches the problem with a single instance (by default; there are configurations that enable more instances).
I believe a mix of the two approaches would be better: having a singleton of each controller would give the same characteristics as a fully static class while allowing dependency injection of some kinds of objects, though not objects with request or session scope, since then the controller would need to be created for every new request. Moreover, it would improve testability by inverting control of the dependencies, thus allowing arbitrary injection points.
Dependencies would be injected by the container or by a test, probably using mocks for the heavy stuff that most likely has already been tested before.
In my view, this static model pushes developers away from testing controllers, because extending FunctionalTest starts the application server, and you pay the price of heavy objects like repositories, services, crawlers, HTTP clients, etc. I don't want to wait for a lot of objects to be bootstrapped just to check whether some code was executed in the controller; tests should be quick and clear, so that developers love them as their programming assistant/guide.
DI is not the ultimate solution to use everywhere. Don't use DI just because you have it at hand. In Play, you don't need DI to develop controllers, models, etc., but sometimes it makes for a nice design: IMO, you could use it if you have a service with a well-known interface that you would like to develop and test outside Play, and even test your Play project against a dummy service so as NOT to depend on the full service implementation. In that case DI can be interesting: you plug the service loosely into Play. In fact, this is the original use case for DI, AFAIK.
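A minimal sketch of that use case with Guice (names invented; how the instance reaches an actual Play controller is left out):

import com.google.inject.AbstractModule;
import com.google.inject.Guice;
import com.google.inject.Injector;

// the well-known interface, developed and tested outside the Play project
interface PaymentService {
    boolean charge(String account, int cents);
}

class RemotePaymentService implements PaymentService {
    public boolean charge(String account, int cents) { /* call the external system */ return true; }
}

// dummy used while developing or testing the Play project
class FakePaymentService implements PaymentService {
    public boolean charge(String account, int cents) { return true; }
}

class PaymentModule extends AbstractModule {
    @Override protected void configure() {
        // swap this binding (or use a test module) to plug in FakePaymentService
        bind(PaymentService.class).to(RemotePaymentService.class);
    }
}

class Wiring {
    public static void main(String[] args) {
        Injector injector = Guice.createInjector(new PaymentModule());
        PaymentService service = injector.getInstance(PaymentService.class);
        service.charge("acct-42", 1000);
    }
}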
I just wrote a blog post about setting up a Play Framework application with Google Guice. http://geeks.aretotally.in/dependency-injection-with-play-framework-and-google-guice
I see some benefits, especially when a component of your application requires different behavior based on a certain context or something like that. But I definitely believe people should be selective about what goes into a DI context.
It shows again that you should only use dependency injection if you really get a benefit from it. If you have complex services it's useful, but in many cases it's not. Read the chapter about models in the Play documentation.
So, to give you an example of where you can use DI with Play: perhaps you must perform a complex calculation, or you create a PDF with a report engine. There I think DI can be useful, especially for testing. There I think the Guice module and Spring module are useful and can help you.
Niels
As of a year and some change later, Play 2.1 now has support for dependency injection in controllers. Here's their demo project using Spring 3, which lays it out pretty clearly.
Edit: here's another example using Guice and Scala, if that's your poison.

Where should "#Transactional" be placed Service Layer or DAO

Firstly, it is possible that I am asking something that has been asked and answered before, but I could not get a search result back. We define transactional annotations on the service layer; a typical Spring/Hibernate CRUD stack is usually:
Controller -> Manager -> DAO -> ORM.
I now have a situation where I need to choose between domain models based on the client site.
Say client A is using my domain model and all is good, but another client site gives me a web service and does not use our domain model.
Which layer should I be replacing? I believe it has to be the DAO, which will get data from the web service and send it back, i.e. two separately written DAO layers, plugged in based on the scenario.
I have now realized that we have been doing tight coupling (if there is such a thing, or let's say not having loose coupling) when we put @Transactional in the service layer. So many brains can't be wrong, or can they? (I doubt it.)
So the question is: where should "@Transactional" be placed, the service layer or the DAO? And is it the service layer downwards that I should be replacing?
Eleven years on and this is still relevant. If I look back at the project, some things were obviously wrong with my understanding of the domain model back then. I was regarding the ORM layer as the domain model: we wanted to work with the ORM and detached entities and not have any data mapping or DTOs. That was the trend in those days. These days the domain model is not the ORM; having a proper domain model and treating the ORM or web services as data sources takes care of this issue. As many pointed out, yes, the service layer is the right place for @Transactional: have a proper domain model and do not regard JPA (the ORM) as the domain model.
Ideally, the Service layer (Manager) represents your business logic, and hence it should be annotated with @Transactional.
The Service layer may call different DAOs to perform DB operations. Let's assume a situation where you have three DAO operations in a service method. If one DAO operation fails, the other two may already have been committed and you will end up with an inconsistent DB state. Annotating the Service layer saves you from such situations.
You are going to want your services to be transactional. If your DAOs are transactional and you call different DAOs in each service method, then you would have multiple transactions, which is not what you want. Make the service calls transactional, and all DAO calls inside those methods will participate in the transaction of the method.
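A sketch of that point (the DAO names are invented): one @Transactional service method, several DAO calls, one commit or one rollback:

import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

interface AccountDao { void debit(long id, long amount); void credit(long id, long amount); }
interface AuditDao  { void record(long from, long to, long amount); }

@Service
class TransferService {
    private final AccountDao accountDao;
    private final AuditDao auditDao;

    TransferService(AccountDao accountDao, AuditDao auditDao) {
        this.accountDao = accountDao;
        this.auditDao = auditDao;
    }

    @Transactional
    public void transfer(long fromId, long toId, long amount) {
        accountDao.debit(fromId, amount);      // if any of these calls throws,
        accountDao.credit(toId, amount);       // everything done so far in the
        auditDao.record(fromId, toId, amount); // transaction is rolled back together
    }
}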
I suggest putting @Transactional on service-layer methods, since a service method may work with multiple DAOs; by doing this we make our services transactional.
A best practice is to use a generic BasicService to offer common services.
The Service layer is the best place for @Transactional: the service layer should hold the use-case-level behaviour for a user interaction that would logically go in one transaction. This way we maintain separation between web application code and business logic.
There are a lot of CRUD applications that don't have any significant business logic; for them, a service layer that just passes things through between the controllers and the data access objects is not useful. In those cases we can put the transaction annotation on the DAO.
So in practice you can put them in either place, it's up to you.
If your service makes multiple DAO calls, you need @Transactional on the service. Different calls to the service will then execute in separate transactions.
It's a personal choice based on the application type. If the application is layered across many modules and the majority of operations are CRUD based, then having the @Transactional annotation at the service level makes more sense. For engine-type applications like schedulers, job servers, or ETL/report apps, where sessions and the notion of a user do not exist, propagated transactions at the context level are more suitable. We should not end up creating clustered transactions by putting @Transactional everywhere and producing transactional anti-patterns. Anyway, for programmatic transaction control, JTA is the most suitable answer; again, it depends on whether you can use it in a given situation.
You should use @Transactional at the service layer. If you want to change the domain model for client B, where you have to provide the same data in a different model, you can change the domain model without impacting the DAO layer by providing a different service, or by creating an interface, implementing it in the different model, and having the same service populate the model based on the client. This decision is based on the business requirements and the scope of the project.
I heard in a programming class that the DAO layer is responsible for interacting with the database, and that a service is a set of operations that may or may not involve DAOs (and therefore the database); that set of operations runs in one transaction, meaning it is better for the service to be transactional.
