I'm working on a Spring Boot project where there are two separate packages named domain and persistence.
The domain package primarily contains the Domain Classes (designed based on the business requirements) whereas the persistence package contains the repository interfaces defined by extending the repositories provided by Spring Data.
I have used Spring Data JPA annotations inside the domain classes and those classes are directly used when defining the repository interfaces as well. Everything works well here.
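A simplified sketch of what this currently looks like (Customer and CustomerRepository are just placeholder names, not my real model; the imports assume a recent Jakarta-based JPA, older setups would use javax.persistence):

```java
// domain/Customer.java — a domain class carrying the JPA annotations directly
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.Id;

@Entity
public class Customer {

    @Id
    @GeneratedValue
    private Long id;

    private String name;

    protected Customer() { } // no-arg constructor required by JPA

    public Customer(String name) {
        this.name = name;
    }

    public Long getId() { return id; }
    public String getName() { return name; }
}

// persistence/CustomerRepository.java — repository defined directly on the domain class
import java.util.Optional;
import org.springframework.data.repository.CrudRepository;

public interface CustomerRepository extends CrudRepository<Customer, Long> {
    Optional<Customer> findByName(String name);
}
```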
But the issue I have is that one could argue that domain classes do not need to know about the persistence implementation, and that domain classes should be kept clean without being polluted by Spring Data JPA annotations. This makes me think that I should maybe use a different set of classes (let's say Entity classes, with more or fewer attributes) to implement the persistence so that I can keep the domain classes clean. But if I do this:
Spring Data repositories are going to work with these Entity classes, and I will not be able to use the interface-based repositories out of the box, since I will always have to map the Entity objects returned by the repositories to Domain classes.
I believe that at some point I will introduce DTOs as well, and when I reach that level there will be too many mappings (Entity classes to Domain classes, and then Domain classes to DTOs). I personally think this mapping will become an overhead in the long run.
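To illustrate the kind of mapping boilerplate I expect (CustomerEntity, Customer and CustomerDto are hypothetical names, and the hand-written mapper is just for illustration):

```java
// Hypothetical hand-written mapper between a persistence entity, a domain
// class and a DTO. Every new attribute has to be threaded through all of these.
public final class CustomerMapper {

    private CustomerMapper() { }

    public static Customer toDomain(CustomerEntity entity) {
        return new Customer(entity.getId(), entity.getName());
    }

    public static CustomerEntity toEntity(Customer customer) {
        return new CustomerEntity(customer.getId(), customer.getName());
    }

    public static CustomerDto toDto(Customer customer) {
        return new CustomerDto(customer.getId(), customer.getName());
    }
}
```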
Summary -
Should I maintain Domain Model Classes and Entity Classes separately or should I just use Domain Model Classes along with Spring Data JPA annotations and KISS?
I think it is a mistake to separate the repository interfaces from the domain classes. Repositories are a part of the domain. Their implementation isn't, but you are not dealing with the implementation since that is provided by Spring Data (and JPA).
Whether your domain classes and your entity classes should be separate things depends on whether they have different needs.
You might encounter scenarios where you need to model entity classes to accommodate the limitations of JPA (or whatever persistence technology you use) and you don't want to leak that into your domain.
But until you encounter that I don't see the need to separate them.
If you are concerned about annotations on your entities, it might help to realise that annotations are an extremely weak dependency. You can use your entities without the annotations even being on the class path. So from a purist point of view they are a smell, but in reality I have yet to find a situation where they are problematic.
If you really want to get rid of them, you might want to look into jMolecules, which offers technology-agnostic annotations for DDD concepts that then get translated into JPA annotations or whatever you want to use.
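As a rough sketch, assuming the jmolecules-ddd annotations and their ByteBuddy/JPA integration are set up in the build, the entity could then look something like this (Customer is again a placeholder name):

```java
// A domain class expressed with jMolecules' technology-agnostic annotations
// instead of JPA ones; the build-time plugin can translate them into JPA
// annotations so the class stays free of persistence imports.
import org.jmolecules.ddd.annotation.AggregateRoot;
import org.jmolecules.ddd.annotation.Identity;

@AggregateRoot
public class Customer {

    @Identity
    private Long id;

    private String name;

    public Long getId() { return id; }
    public String getName() { return name; }
}
```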
Related
I have two sets of classes in my spring application - DTOs and Entities.
After reading Clean Code by Uncle Bob, I have got hooked on to naming things right more than ever before.
I sat down to refactor one of my Spring projects and I am not sure if adding a DTO suffix to DTO classes is the right thing to do. If not, how do you differentiate between DTO and Entity classes? I do use Service and Repository suffixes for my service classes and repository interfaces.
Merely keeping them under different packages with the same names is not helpful, especially when they are to be used in the same scope.
Note: Not sure if this is a precise question to be asked on Stack Overflow.
If you read Core J2EE Patterns, 2nd Edition, the pattern is called Transfer Object, with all the sample code using the TO suffix. You can also have a look at Oracle's Core J2EE Patterns site.
To sum up: You should use either DTO or TO as a suffix to any transfer object you use in your business tier.
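For example (illustrative names only):

```java
// Customer.java — the entity / business object keeps the plain domain name
public class Customer {
    private Long id;
    private String name;
    // getters and setters omitted for brevity
}

// CustomerDTO.java — the transfer object carries the DTO (or TO) suffix
public class CustomerDTO {
    private Long id;
    private String name;
    // getters and setters omitted for brevity
}
```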
We are creating a web-based application using JSF (PrimeFaces as the presentation library) and Spring Data JPA for the data access tier. The project is Spring Boot enabled.
The project is divided into multiple modules (according to tiers), and one of them is the presentation tier.
Do you suggest creating a dependency from presentation tier to Spring Data (so have access to PageRequest and Slice and ... classes) or not?
Otherwise we would have to re-implement these classes in this tier and convert them to Spring Data classes, which seems somewhat verbose.
Do you suggest creating a dependency from presentation tier to Spring Data (so have access to PageRequest and Slice and ... classes) or not?
Every decision you make will have its pros and cons, and it really depends on your specific situation whether this is a problem or not.
I see the following things in favor of a dependency:
reuse of PageRequest and similar classes. They represent concepts that are needed when working with persistence but aren't really persistence specific. Therefore there is really no point in duplicating them.
On the other hand, Spring Data contains many classes that don't have any business in a presentation layer. For example, those dealing with creating repositories.
Your task is to determine if the risk/damage of having those classes around is bigger than the benefit of having PageRequest and co available.
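To make that benefit concrete, this is roughly the kind of code the dependency buys you in the presentation tier; CustomerListBean, CustomerRepository, Customer and the findByActiveTrue query method are made-up examples, not classes from your project:

```java
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Slice;
import org.springframework.data.domain.Sort;

// Hypothetical presentation-tier backing bean that uses Spring Data's
// pagination types directly instead of re-implementing and converting them.
public class CustomerListBean {

    private final CustomerRepository customers; // assumed Spring Data repository

    public CustomerListBean(CustomerRepository customers) {
        this.customers = customers;
    }

    public Slice<Customer> loadPage(int page, int size) {
        // PageRequest and Slice come straight from spring-data-commons
        return customers.findByActiveTrue(
                PageRequest.of(page, size, Sort.by("name").ascending()));
    }
}
```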
With all the teams and projects I have worked with so far, I'd opt for just having the dependency.
Here is why:
The domain has a dependency on JPA and Spring Data anyway. So by depending on the domain layer, you get a transitive dependency whether you want it or not.
The persistence specific classes inside Spring Data are so specific that I never experienced anybody trying to use them directly.
Note that especially the first point assumes that you are not copying over your JPA entities in separate transport objects, which would kind of negate the benefits of JPA.
I'm a newbie to the Spring Framework, and of course the first thing that comes to mind about Spring is dependency injection. Now I could be wrong, since I just started learning about the Spring Framework (especially about dependency injection), but I think that dependency injection of beans into other objects is not meant for transactional data. Since the bean definitions are declared in, for example, spring.xml (a blueprint), they are meant for static and small amounts of data rather than transactional data. I don't see any way to inject thousands of transactional objects into another object using dynamic XML created at runtime.
So did I get this right? If so, what's the real benefit of dependency injection?
There are several benefits from using dependency injection containers rather than having components satisfy their own dependencies. Some of these benefits are:
Reduced Dependencies
Reduced Dependency Carrying
More Reusable Code
More Testable Code
More Readable Code
These benefits are explained in more detail here.
You are right: transactional data (e.g. data that represents a table row) isn't normally injected declaratively. Spring DI (dependency injection) is commonly used to handle collaboration between multiple classes.
Common examples I've seen are with the DAO (data access object) and MVC (model view controller) patterns. In an enterprise environment it's common to have a project with dozens or hundreds of database tables, and hence dozens or hundreds of DAO classes which get injected into controller classes.
If you don't use DI, you need to consciously manage which DAO should be created first, which DAO should be injected into which controller, etc. (this is a nightmare).
Code refactoring is (or should be) a common thing as well; business requirements change constantly. Without DI, one simple refactoring could result in a massive and tricky untangling of 'which class depends on what and where'.
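A tiny sketch of what that wiring looks like with constructor injection (CustomerDao and CustomerController are made-up names):

```java
// CustomerDao.java — made-up DAO; a real project might have dozens of these
import org.springframework.stereotype.Repository;

@Repository
public class CustomerDao {
    public long countCustomers() {
        return 0; // a real implementation would query the database
    }
}

// CustomerController.java — the controller never instantiates the DAO itself
import org.springframework.stereotype.Controller;

@Controller
public class CustomerController {

    private final CustomerDao customerDao;

    // Spring injects the DAO through the constructor
    public CustomerController(CustomerDao customerDao) {
        this.customerDao = customerDao;
    }

    public String customerCount() {
        return "customers: " + customerDao.countCustomers();
    }
}
```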
Of all the articles I've read about dependency injection, this is by far the best.
If you are using Spring for DI, I would suggest you read about the @Primary annotation. Spring makes it even easier to choose the implementation you want (from multiple implementations) for a given service. This article is good.
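A minimal sketch of how @Primary resolves between two implementations (PaymentService and the two implementation classes are made-up names):

```java
import org.springframework.context.annotation.Primary;
import org.springframework.stereotype.Service;

// Made-up service with two implementations; the @Primary one is chosen
// wherever a plain PaymentService is injected without a qualifier.
public interface PaymentService {
    String pay();
}

@Service
class PaypalPaymentService implements PaymentService {
    public String pay() { return "paid via PayPal"; }
}

@Service
@Primary // wins when no qualifier is specified at the injection point
class CardPaymentService implements PaymentService {
    public String pay() { return "paid via card"; }
}
```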
Dependency Injection and Inversion of Control change the control flow by giving a specific component the responsibility to manage the dependency graph and how it is wired together. The result is great decoupling between dependant and dependency, improving code maintainability, application reliability, and testability.
I have a project in which I am using NHibernate and ASP.Net MVC. The application is intended to allow users to track certain data and then produce views of statistics based upon the data entered. The structure of my application thus far looks something like this:
NHibernate Layer: Contains Repository<T> and UnitOfWork classes, as well as entity mapping definitions.
Core/Service Layer: Contains generic EntityService class. At the moment, this simply defines transaction scope via IUnitOfWork and interfaces with IRepository to provide higher-level data access services.
Presentation Layer (MVC Application): Not yet implemented, but contains the usual stuff plus dependency injection.
I have a couple of questions:
Is it poor design to allow my MVC application to handle dependency injection for ALL layers? For example, as well as dependency injection of EntityService instances into controllers, it will handle the dependency injection of IRepository into the EntityService classes. Should the service layer handle this itself, even though this would mean performing dependency injection in two distinct places?
Where should I produce my statistics? This business logic doesn't seem to belong in my service layer, which, at present, only contains entity type definitions and an interface for modifying and accessing entity properties. I have a few thoughts on this, but I'm not sure which I like best:
Keep my service layer as is and create a separate Statistics project - this is completely independent of the entity types for which it will be used, meaning my MVC controllers will have to pass raw numerical information between my business entities and my (presumably static) statistics classes. This is quite a neat separation but potentially means a lot of business logic still remaining in the presentation layer.
Create a Statistics project; however, create a tight coupling between the classes in this project and my business entities. For example, instead of passing a Reading object's values into a method, I will pass the entire object (or define them as extension methods). This will shift business logic out of my MVC app but the tight coupling seems a bit messy.
Keep all of my business logic inside my service layer. Define strongly-typed subclasses of EntityService, so my services contain both entity-specific business methods and data storage methods, while keeping the entity classes themselves as pure data containers. Create a separate Statistics project for any generic statistical processing and call its methods via my derived service classes. My service classes effectively merge business functions with the storage functionality provided by IRepository<T>.
I am leaning toward the third option, but does anyone have any thoughts? Alternative suggestions?
Thanks in advance!
Preliminary observation:
I like the way you described your project; I just didn't get why your Data Access Layer (DAL) is called the NHibernate Layer. It is at odds with the rest, where you (correctly) didn't use a technology name to describe a logical layer. So I suggest you rename it DAL and use it to abstract your app from NHibernate.
My opinions about your questions:
Absolutely not. It is good to apply Dependency Injection to all layers. A couple of reasons why:
1.1 Testing: you can mock the DAL interfaces and unit test the Service Layer without the DAL by using another DI config file. In the same way you can mock the Service Layer for the Web Controllers layer, and so on.
1.2 Different DAL implementations: suppose you need different DAL implementation technologies (NoSQL, SQL, or LINQ instead of NHibernate, etc.) for different deployments of your project, or to scale in the future. You can do that easily by maintaining different DI config files.
You can have the same layer deployed in different projects. In the same way you can have a project containing different layers. I think their relation is orthogonal: a project describes a physical (development-time and runtime) implementation, while layers are logical. So initially I would keep it simple with the third option.
I just don't understand why you say the following regarding this option:
Create a separate Statistics project for any generic statistical processing and call its methods via my derived service classes. My service classes effectively merge business functions with the storage functionality provided by IRepository.
I see Statistics as one or more services, so you can implement it as a namespace with classes inside your Service Layer. And, like any other service, you can inject the DAL Repository classes. And, as with any other Service/DAL, the Model classes can be shared between the different Service and DAL classes.
StatsService.AverageReadingFor(Person p, DateTime start, DateTime end) sounds good.
There are several implementation options:
Using underlying repository features (for example: SQL avg function)
Using the Observer Pattern, which can also be implemented using Dependency Injection
Using Aspect Oriented Programming. See that Spring.Net chapter as an example.
If you have more than one Service Layer instance (more than one server), then options 2 and 3 must be adapted for out-of-process communication using a messaging system.
Just an update - Regarding my second question, I have decided to define an IStatsService<T> which expects an IEntityService<T> to be passed into its constructor. I'll use this for generic statistical processing of business entities and create further interfaces that implement IStatsService<T> where I need more type-specific information.
Hopefully this will help someone who has been scratching their head about a similar problem!
We are using EJB3 in our application. Our design aim is to separate the persistence layer from the business layer. So we have developed XXXBean classes to be used as SLSBs (stateless session beans) and XXXRepository classes to be used as persistence classes. We also have POJOs that implement reusable non-business logic (getting the list of countries, etc.), which we call service/helper classes.
We use EJB3 JPA (with Hibernate as the provider), and the Repository classes have all the methods for CRUD operations plus the get methods for data access. Currently the XXXRepository classes are all POJOs and we instantiate them directly from the XXXBean classes or from the service objects.
Should the XXXRepository classes be SLSBs? What would be the benefits and pitfalls of converting them to SLSBs?
An EJB is a container-managed bean. This means that the container manages a lot of concerns like transactions, security, and resource access (e.g. the database), and offers possibilities like timers, remote access, or interceptors. Another advantage is the pooling and reuse of instances.
I would say that if you need any of these container-managed options, like an injected entity manager, then use an EJB, in your case an SLSB. But if you don't need any of the provided features, then a plain POJO will do the job.
If the XXXRepository classes are not SLSBs, how do they access the database to perform the CRUD operations? Are you using the Hibernate Session directly? How are transactions managed? It may make sense to turn these classes into SLSBs and use an injected entity manager in that case.
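As a sketch of what such an SLSB repository could look like (the class and entity names are made up, and container-managed transactions are assumed):

```java
import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

// Made-up example of a repository as a stateless session bean: the container
// injects the entity manager and demarcates transactions (CMT by default).
@Stateless
public class CustomerRepository {

    @PersistenceContext
    private EntityManager em;

    public Customer save(Customer customer) {
        return em.merge(customer);
    }

    public Customer findById(Long id) {
        return em.find(Customer.class, id);
    }

    public void delete(Customer customer) {
        em.remove(em.merge(customer));
    }
}
```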
Adam Bien has written a book called Real World Java EE Patterns. In this book he writes about good EJB architectures and also mentions which classes should be EJBs (for example a ServiceFacade as a transaction boundary) and which classes can be used as POJOs.