For a project, we will develop some microservices. They will have many common parts, like configuration classes, filters, and so on.
I am searching for the best practice and method to develop this.
For the moment, I have created a generic microservice that every team forks to develop its module on top of. But this is not convenient, and a modification in one service could make the next merge from the generic microservice difficult.
Do you have any ideas on how to develop this, or sources to inspire me?
Thanks for your responses.
Create one or more libraries containing abstract classes that implement the shared functionality.
Import one or more of the libraries into each individual project.
Extend the abstract classes in the individual projects.
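Here is a minimal sketch of that layout (all class names here are hypothetical): the shared library holds the abstract skeleton, and the concrete subclass in the individual project carries the Spring annotations.

    // In the shared library: no Spring stereotype annotations here.
    public abstract class AbstractHealthEndpoint {

        protected abstract String serviceName();

        public String health() {
            return serviceName() + ": OK";
        }
    }

    // In an individual project: the concrete subclass carries the annotations.
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    public class OrderHealthEndpoint extends AbstractHealthEndpoint {

        @Override
        protected String serviceName() {
            return "order-service";
        }

        @GetMapping("/health")
        public String healthCheck() {
            return health();
        }
    }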
Use Spring annotations (for instance @Component or @RestController) in the individual projects, not in the common libraries.
Use some Spring annotations (for instance @Scheduled) in the common libraries. If you do this, allow the individual projects to override the values in these annotations.
For example,

    @Scheduled(initialDelayString = "${common.thing.initialDelayString:10000}",
               fixedDelayString = "${common.thing.fixedDelayString:60000}")

The example annotation includes default values and allows the individual projects to override them as desired.
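For instance, an individual project could set the following in its application.properties (the values here are just illustrative):

    common.thing.initialDelayString=5000
    common.thing.fixedDelayString=30000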
Related
I'm building a project based on a microservices architecture in Spring Boot. The project is divided into multiple modules, and I use Maven dependency management.
Now I want to use services from one module in another module. I have many Spring applications. For example, I have two applications named A and B. I want to use classes from A in B and classes from B in A. I used Maven dependencies for this, but it is not a complete solution for using one service from another, because I ran into a circular dependency.
What should I do to solve this problem?
It is not a good idea to share classes between microservices: if you want to replace microservice A, you'll have to adapt microservice B.
Every service must implement its own data classes, which hold the fields that service needs.
Microservice A and microservice B can both contain a class Foo, but these classes can differ in their fields. Perhaps both contain the fields 'id' and 'name', but only microservice A also needs a field 'date' to do its work.
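In code that might look like the following sketch (fields are hypothetical, getters and setters omitted), where each service owns its own copy of Foo:

    // In microservice A: needs the date to do its work.
    import java.time.LocalDate;

    public class Foo {
        private Long id;
        private String name;
        private LocalDate date;
    }

    // In microservice B: a smaller view of the same concept.
    public class Foo {
        private Long id;
        private String name;
    }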
If you have classes that need to be in some of your microservices, I think it's better to make a shared library, put your shared classes in it, and use that library in your microservices.
Actually, I think it's a good idea to put classes needed by most of your microservices in a shared library and use that library. But be careful, because it may lead to tight coupling, which isn't a good thing in a microservices architecture.
Personally, I think some configuration classes and some event models that most of your microservices use are good candidates. But I don't think sharing your service classes between your microservices is a good idea. Instead, the microservices should consume each other's services as completely independent, external services.
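For example, an event model that several services consume could live in the shared library; this is just a sketch with hypothetical names:

    import java.time.Instant;

    // In the shared library: a plain, stable event model with no service logic.
    public class OrderCreatedEvent {

        private final String orderId;
        private final Instant createdAt;

        public OrderCreatedEvent(String orderId, Instant createdAt) {
            this.orderId = orderId;
            this.createdAt = createdAt;
        }

        public String getOrderId() {
            return orderId;
        }

        public Instant getCreatedAt() {
            return createdAt;
        }
    }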
Create one common entity application and add that entity application as a dependency. For example, assume you have stored user data in microservice1 (MC1) and need this class (User) in other microservices (like MC2, MC3, MC4, and so on); then you can create one entity application, like a util module, and add this dependency to the microservices that require it.
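Assuming the shared entity module is published under a hypothetical artifact id of user-entities, each microservice that needs the User class would declare it in its pom.xml:

    <dependency>
        <groupId>com.example</groupId>
        <artifactId>user-entities</artifactId>
        <version>1.0.0</version>
    </dependency>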
I have a use case where I would like to build a common interface or service which can update the entities of an application. An example case is shown below:
Now every application has to handle the update functionality of its entities. Rather than implementing update functionality in n application modules, I would like to build a common interface or service in Spring Boot.
The service will be like below:
My question is how to design a service/interface which can be used for the above scenario. Is there any API or tool which can help me achieve this? I don't want to write the update code in every application module.
Thanks in advance.
Last year I was thinking about a concept similar to yours, but in the context of the Apache Camel framework. I didn't have enough time and motivation to do it, but your post encouraged me to give it a try, mostly because I found your concept very similar to mine.
This is how I see it:
So basically I considered an environment with an application that might use N modules/plugins that enrich the application's features, e.g. a processing feature. The application uses a module/plugin when it is available on the classpath (I'm assuming a Java background here). When a module is not available, the application works without its functionality, as if it were never there. Moreover, I wanted to implement it purely using framework capabilities, in this case Spring, without ugly hacks/ifs in the source code.
Three solutions come to my mind:
- using request/response interceptors to modify entities (@ControllerAdvice)
- using Spring AOP to intercept method invocations in *Service proxy classes
- using the Apache Camel framework to create routes for processing entities
Here's a brief overview of the POC that I implemented:
I chose Spring AOP because I had never used it on my own before.
- a simple EmployeeService that simulates saving an employee (EmployeeEntity)
- 3 processors that simulate processing modules which could live outside the application; these three modules change the properties of the EmployeeEntity in some way
- one aspect that intercepts the "save" method in EmployeeService and handles the invocation of the available processors
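The aspect itself can be as small as the following sketch (package and type names are hypothetical; the real code is in the repository linked below). Spring injects every available EntityProcessor bean, so adding or removing a processor from the classpath changes the behavior without touching the service code:

    import java.util.List;

    import org.aspectj.lang.ProceedingJoinPoint;
    import org.aspectj.lang.annotation.Around;
    import org.aspectj.lang.annotation.Aspect;
    import org.springframework.stereotype.Component;

    // Hypothetical plug-in contract implemented by each processing module.
    interface EntityProcessor {
        void process(EmployeeEntity entity);
    }

    @Aspect
    @Component
    public class ProcessingAspect {

        // All EntityProcessor beans found on the classpath are injected here.
        private final List<EntityProcessor> processors;

        public ProcessingAspect(List<EntityProcessor> processors) {
            this.processors = processors;
        }

        @Around("execution(* com.example.EmployeeService.save(..))")
        public Object applyProcessors(ProceedingJoinPoint joinPoint) throws Throwable {
            // Assumes save(EmployeeEntity) takes the entity as its first argument.
            EmployeeEntity entity = (EmployeeEntity) joinPoint.getArgs()[0];
            for (EntityProcessor processor : processors) {
                processor.process(entity); // each module changes the entity in its own way
            }
            return joinPoint.proceed(joinPoint.getArgs());
        }
    }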
As a next step I'd like to externalize these processors so that they become pluggable jar files.
I'm wondering if this is something that you wanted to achieve?
link to Spring AOP introduction here: https://docs.spring.io/spring/docs/5.0.5.RELEASE/spring-framework-reference/core.html#aop
link to repository of mentioned POC: https://github.com/bkpawlowski/spring-aop
I have read that dependency injection is good for testing, in that a class can be tested without its dependencies, but the question that comes to my mind is: if class A depends on class B or C or any other class, then testing class A independently of those classes yields a test result of nothing, not a failed or passed test.
Class A was created to do something, and if it is not fed anything, whether with the new keyword or by setting up the extra files in Spring, class A won't do any work.
About the idea of making code modular, readable and maintainable: the business classes became cleaner, but all we did was shift the confusion from dirty Java business classes to convoluted XML files and the extra interfaces needed to inject into our loosened objects.
In short, it seems we still have to make edits and changes to a file somewhere, right?
Please feel free to put me in my place if my understanding is lacking; I'm just a little irritated with learning Spring because I see the same amount of work, just rearranged.
Dependency injection is good for unit testing because you can test each method individually, without that method depending on anything else. That way each unit test covers exactly one method.
I would say that if the XML is what's annoying you, check out Spring Boot. It's based on Java configuration, so there is no XML, and it simplifies a lot of configuration for you based on your classpath. When I first started with Spring I found the XML very daunting coming from a Java background, but the annotation-based configuration and the auto-configuration done by Spring Boot are extremely helpful for getting applications working quickly.
IMO the biggest advantage of using Spring is dependency injection, which makes your life easy. For example, if you would like to create a new service with three dependencies, you can create such a class very easily using Spring. Without Spring you would end up writing various factory methods to return the instances you are looking for, which makes your code very verbose with static method calls. You may want to take a look at code repositories from before the Spring era.
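For instance, a service with three dependencies stays a one-liner to wire (all the collaborator types here are hypothetical):

    import org.springframework.stereotype.Service;

    @Service
    public class ReportService {

        private final UserRepository users;
        private final MailClient mail;
        private final TemplateEngine templates;

        // Spring supplies all three collaborators at startup; without a container
        // you would write and maintain a factory method for each of them.
        public ReportService(UserRepository users, MailClient mail, TemplateEngine templates) {
            this.users = users;
            this.mail = mail;
            this.templates = templates;
        }
    }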
Again, whether you would like to use Spring or not is your personal call based on project complexity, but its other features/advantages can't be overlooked.
XML files and Java configs are just the two ways of writing Spring configuration; which one you use is a matter of personal taste. The only thing is that you should be consistent across your project.
I would suggest that you read Martin Fowler's great article on Inversion of Control and Dependency Injection to gain a better understanding of why frameworks like Spring can be really useful to solve a well known set of common dependency injection problems when writing software.
As others have mentioned, there is no obligation to use Spring; and whatever you can do with Spring, you can probably do it by other means like abstract factories, factory methods, or service locators.
If your project is small enough, then you probably wouldn't mind solving the dependency injection issues on your own using some design patterns like those mentioned above. However, depending on the size of your project, many would prefer to use a framework or a library that already packs a bunch of solutions to these recurrent head scratchers.
Regarding the advantage of dependency injection frameworks for unit testing: the idea is that you don't need to test the dependencies of your class, only the class itself.
For example, most likely your application has a layered design. It is very common to have a data access class or a repository that you use to retrieve data from a datasource. Logically, you also have a class where you use that DAO.
Evidently, you already wrote unit tests for your DAO, and therefore, when you're testing your business class (where the DAO is being used) you don't care about testing your DAO again.
Fortunately, since Spring requires some form of dependency injection for your DAO, this means your class must provide a constructor or a setter method through which we can inject that DAO into our business class, right?
Well, then during unit testing of your business class, you can conveniently use those injection points to inject your own fake DAO (i.e. a mock object). That way, you can focus on the testing of your business class and forget about retesting the DAO again.
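Here is a sketch of that idea with hypothetical names; the hand-rolled fake returns canned data, so the test exercises only the business class:

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import org.junit.jupiter.api.Test;

    class GreetingServiceTest {

        // Hypothetical collaborators, shown inline to keep the sketch self-contained.
        interface UserDao {
            String findNameById(long id);
        }

        static class GreetingService {
            private final UserDao dao;
            GreetingService(UserDao dao) { this.dao = dao; } // the injection point
            String greet(long id) { return "Hello, " + dao.findNameById(id); }
        }

        // A hand-rolled fake standing in for the real, database-backed DAO.
        static class FakeUserDao implements UserDao {
            public String findNameById(long id) { return "test-user"; } // no database involved
        }

        @Test
        void greetsWithoutTouchingTheDatabase() {
            GreetingService service = new GreetingService(new FakeUserDao());
            assertEquals("Hello, test-user", service.greet(42L));
        }
    }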
Now compare this idea with other solutions you may have done on your own:
You create the dependency directly by instantiating the DAO within your business class.
You use a static factory method within your code to gain access to the DAO.
You use a static method from a service locator within your code to gain access to the DAO.
None of these solutions makes your code easy to test, because there is no simple way to step in and choose exactly which dependency gets injected into your business class while testing it (e.g. how would you change the static factory method to use a fake DAO for testing purposes?).
So, in Spring, using XML configuration or annotations, you can easily have different dependencies being injected into your service object based on a number of conditions. For example, you may have some configurations for testing that evidently would be different than those used in production. And if you have a staging environment, you may even have different XML configurations of dependencies for your application depending on whether it is running in production or integration environments.
This pluggability of dependencies is the key winning factor here in my opinion.
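With annotations, that pluggability can look like the following sketch (the profile names and DAO types are hypothetical): the same business class receives a different UserDao depending on which Spring profile is active.

    import javax.sql.DataSource;

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.context.annotation.Profile;

    @Configuration
    public class DaoConfig {

        @Bean
        @Profile("test")
        public UserDao inMemoryUserDao() {
            return new InMemoryUserDao(); // fast fake for tests
        }

        @Bean
        @Profile("production")
        public UserDao jdbcUserDao(DataSource dataSource) {
            return new JdbcUserDao(dataSource); // the real, database-backed implementation
        }
    }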
So, as I was saying, my suggestion is that you first expand your understanding of what problems Spring core (and dependency injection frameworks in general) is trying to solve and why that matters; that will give you a broader perspective on these problems and let you determine when it is a good idea to use Spring and when it is not.
I have a project in which I am using NHibernate and ASP.Net MVC. The application is intended to allow users to track certain data and then produce views of statistics based upon the data entered. The structure of my application thus far looks something like this:
NHibernate Layer: Contains Repository<T> and UnitOfWork classes, as well as entity mapping definitions.
Core/Service Layer: Contains generic EntityService class. At the moment, this simply defines transaction scope via IUnitOfWork and interfaces with IRepository to provide higher-level data access services.
Presentation Layer (MVC Application): Not yet implemented, but contains the usual stuff plus dependency injection.
I have a couple of questions:
Is it poor design to allow my MVC application to handle dependency injection for ALL layers? For example, as well as dependency injection of EntityService instances into controllers, it will handle the dependency injection of IRepository into the EntityService classes. Should the service layer handle this itself, even though this would mean performing dependency injection in two distinct places?
Where should I produce my statistics? This business logic doesn't seem to belong in my service layer, which, at present, only contains entity type definitions and an interface for modifying and accessing entity properties. I have a few thoughts on this, but I'm not sure which I like best:
Keep my service layer as is and create a separate Statistics project - this is completely independent of the entity types for which it will be used, meaning my MVC controllers will have to pass raw numerical information between my business entities and my (presumably static) statistics classes. This is quite a neat separation but potentially means a lot of business logic still remaining in the presentation layer.
Create a Statistics project; however, create a tight coupling between the classes in this project and my business entities. For example, instead of passing a Reading object's values into a method, I will pass the entire object (or define them as extension methods). This will shift business logic out of my MVC app but the tight coupling seems a bit messy.
Keep all of my business logic inside my service layer. Define strongly-typed subclasses of EntityService, so my services contain both entity-specific business methods and data storage methods, while keeping the entity classes themselves as pure data containers. Create a separate Statistics project for any generic statistical processing and call its methods via my derived service classes. My service classes effectively merge business functions with the storage functionality provided by IRepository<T>.
I am leaning toward the third option, but does anyone have any thoughts? Alternative suggestions?
Thanks in advance!
Preliminary observation:
I like the way you described your project; I just didn't get why your data access layer (DAL) is called the NHibernate Layer. It is at odds with the rest of your description, where you correctly avoided technology names for logical layers. So I suggest you rename it DAL and use it to abstract your app from NHibernate.
My opinions about your questions:
Absolutely not. It is good to apply dependency injection to all layers. A couple of reasons why:
1.1 Testing: you can mock DAL interfaces and unit test the Service Layer without the DAL by using another DI config file. In the same way you can mock the Service Layer for the Web Controllers layer, and so on.
1.2 Different DAL implementations: suppose you need different DAL implementations (NoSQL, SQL or LINQ instead of NHibernate, etc.) for different deployments of your project, or to scale in the future. You can do that easily by maintaining different DI config files.
You can have the same layer deployed in different projects, and in the same way a single project can contain different layers. I think their relationship is orthogonal: a project describes a physical (development-time and runtime) implementation, while layers are logical. So initially I would keep it simple with the third option.
I just don't understand why you say the following regarding this option:

Create a separate Statistics project for any generic statistical processing and call its methods via my derived service classes. My service classes effectively merge business functions with the storage functionality provided by IRepository.
I see Statistics as one or more services, so you can implement it as a namespace with classes inside your Service Layer. As with any other service, you can inject DAL repository classes. And, as with any other service/DAL, the model classes can be shared between different services and DAL classes.
StatsService.AverageReadingFor(Person p, DateTime start, DateTime end) sounds good.
There are several implementation options:
Using underlying repository features (for example, SQL's avg function)
Using the Observer pattern, which can also be implemented using dependency injection
Using aspect-oriented programming; see the Spring.NET AOP chapter as an example.
If you have more than one Service Layer instance (more than one server), then options 2 and 3 must be adapted for out-of-process communication using a messaging system.
Just an update: regarding my second question, I have decided to define an IStatsService<T> which expects an IEntityService<T> to be passed into its constructor. I'll use this for generic statistical processing of business entities and create further interfaces that extend IStatsService<T> where I need more type-specific information.
Hopefully this will help someone who has been scratching their head about a similar problem!
We are writing a new set of services and have decided to make them share a common interface... calling it a BaseService. The idea is that whenever anyone wants to develop a new service in our organization, they should be just able to extend and use this BaseService.
We have written a few other classes which also form part of this base jar; it does things like handle transactions and connect to the database using Hibernate, etc.
Right now all the services that extend the BaseService are part of the same project (Eclipse + Maven), and some of the services depend on each other, but because they are in the same project we don't have a problem with dependencies. However, we expect 40-50 services to be written which would extend BaseService and would also be interdependent.
I am worried that the size of the base project will be huge, and that anyone who wants to use just one service will have to depend on my base jar containing all 50 services.
Is there a way that we can make some projects dynamically dependent on others?
Let's say I have a service A which depends on service B. When I build/compile service A, it should be able to realize that it has a dependency on service B and automatically use the service B jar.
I have heard of OSGi; will it solve my problem, or is there a way I can do it with Maven, or is there a simpler solution?
Sorry about the long post!
Thanks in advance for your suggestions.
It doesn't make any sense to "dynamically" manage project dependencies, since build dependencies are by definition not dynamic.
Your question, at least for the moment, seems to be entirely about how to build your code rather than about how to run it. If you are interested in creating a runtime system in which new service implementations can be dynamically added then OSGi may be the solution to look at. An extra advantage here would be that you could enforce the separation of API from implementation, and prevent the implementing services from invalidly depending on parts of your core module that you do not want them to have visibility of.
You could also use OSGi to manage the evolution of your core service API through versioning; for example, how do you communicate that a non-breaking change has been made to the API versus a breaking change, etc.
I would say there are two options, depending on whether I understand your question correctly. First one: you have already defined an interface (in the Java sense) and now you have different implementations of it. The simple solution with Maven would be to have a module called, for example, service-api, which is released and can be used by the others as a dependency. On their side they simply implement the interface; no problem with the dependencies. If you are talking more about OSGi, then you should take a look at maven-tycho.
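Sketched with hypothetical names: the service-api module contains only the contract and is released on its own, while each implementing project depends on it.

    // In the service-api module: the contract only.
    public interface BaseService {
        void start();
        void stop();
    }

    // In an implementing project, which declares service-api as a Maven dependency.
    public class BillingService implements BaseService {

        @Override
        public void start() { /* service-specific startup */ }

        @Override
        public void stop() { /* service-specific shutdown */ }
    }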