We are developing an application using Eclipse, Spring, DDD and the repository pattern.
Our current scenario is composed of the following plug-ins:
Plug-in Domain.project: contains the interface Repository.class.
Plug-in Repository.project: contains the different implementations of the interface Repository.class, for instance ExampleRepositoryImpl.class. So this plug-in has the Domain.project plug-in among its dependencies.
We have created Service.class in the Domain.project plug-in, which calls, through injection, one of the implementations of Repository provided by the Repository.project plug-in. But the injection is not resolved properly.
We are not able to add a dependency on Repository.project from Domain.project, because this would cause a cyclic dependency error.
Also, since we are following the DDD approach, the other projects can see Domain.project, but not the other way around.
Thank you so much,
Kind regards,
Eclipse, Spring, DDD and the repository pattern
As you said, the repository interfaces are on the domain project.
We have created one project for every implementation of the interfaces included in the domain project.
For example, we have created a project for the JDBCRepository implementation, another one for the PureQueryRepository, another one for the JsonRepository, and so on.
For this reason the repository implementation projects have a dependency on ("see") the domain project, but the domain project doesn't have any dependency on the repository implementation projects.
So, our problem arises when we would like to choose/inject any of these repositories through Spring: since the domain project doesn't see any of the repository implementation projects, we get a ClassNotFoundException.
Kind regards,
Brais CidrĂ¡s.
The domain shouldn't care which implementation it uses - that's why you separate repository interface from repository implementation in the first place.
To decide which implementation to use, think about how dynamic the selection of the implementation needs to be:
Decide at server startup -> use e.g. Spring Profiles: use a profile called "jdbc", another called "json", and so on, and activate the desired profile when starting the application. This way, only the repository implementations of the active profile will be instantiated and injected (see the first sketch below).
Decide at class level -> use e.g. Spring Qualifiers
If one Spring bean needs the "jdbc" implementation of your repository, whereas another one needs the "json" implementation of the same repository, then instantiate each implementation with the respective qualifier name and inject the desired repository implementation by specifying its qualifier (see the second sketch below).
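Here is a minimal sketch of both options, assuming the interface from Domain.project is called Repository and that ExampleRepositoryJdbcImpl / ExampleRepositoryJsonImpl stand in for the implementation classes in Repository.project. First, profile-based selection through a configuration class (which would live in whichever project does the Spring wiring and can see both the interface and the implementations):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;

@Configuration
public class RepositoryConfig {

    @Bean
    @Profile("jdbc")
    public Repository jdbcRepository() {
        // only created when the application is started with -Dspring.profiles.active=jdbc
        return new ExampleRepositoryJdbcImpl();
    }

    @Bean
    @Profile("json")
    public Repository jsonRepository() {
        // only created when the "json" profile is active
        return new ExampleRepositoryJsonImpl();
    }
}

Second, qualifier-based selection: register both implementations (without the profile restriction) and let each consumer name the one it needs, for example by bean name:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.stereotype.Service;

@Service
public class ExampleService {

    private final Repository repository;

    @Autowired
    public ExampleService(@Qualifier("jdbcRepository") Repository repository) {
        // receives the bean registered under the name/qualifier "jdbcRepository"
        this.repository = repository;
    }
}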
Related
I have the following doubt, probably a very basic one, that I have already managed to work out, but I would like to hear whether there is a different approach or whether I am actually getting something wrong.
Background
I have an implementation with Spring Boot using a classic layered approach with Spring stereotypes, wiring it all up using field DI (yes... I am aware it is not the best approach).
Service -> Repository -> (Something else)
In my case (something else) is a third-party REST API which I am calling using a RestTemplate with a specific configuration. The current solution has many services and repositories to deal with each of the third party's domain entities. All of them use the same RestTemplate bean. The bean is injected at the repository level.
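For reference, a rough sketch of that shape (names and the URL are hypothetical): every repository field-injects the single, specifically configured RestTemplate bean and calls the third-party API with it.

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Repository;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestTemplate;

@Service
class CustomerService {

    @Autowired
    private CustomerRepository customerRepository;

    public String loadCustomer(String id) {
        return customerRepository.fetchCustomer(id);
    }
}

@Repository
class CustomerRepository {

    @Autowired
    private RestTemplate restTemplate; // the one shared, specifically configured bean

    public String fetchCustomer(String id) {
        return restTemplate.getForObject("https://third-party.example/customers/{id}", String.class, id);
    }
}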
Problem
So now I have been told by the third-party system that, depending on which business scenario my local services are executing, the repositories need to use one of two different users; therefore, I assume a different RestTemplate config needs to be added. At first glance this drives me to move the decision of which RestTemplate to use even higher: to the service level, not the repository level. So I would need to have, let's say, a service A under a specific context whose dependency (the repository) needs a specific template, and the same service A under another context with a different dependency config.
Approach
The approach that I took is to have a configuration class where I generate different versions of the same service with different dependencies, in particular their repositories using a specific template (Github Example; a rough sketch follows below).
This approach seems odd to me because up till now I have never had to do something like this... and it leaves me with the doubt whether something different can be done to achieve the same.
Another approach would be to inject both RestTemplates into the base repository, with an extra parameter in each method to decide which one to use, at both service and repository level. Which I dislike.
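For what it's worth, a rough sketch of that configuration-class approach (all names, the basic-auth setup and the use of Spring Boot's RestTemplateBuilder are assumptions; it also assumes the service and repository take their dependencies through constructors so the configuration class can instantiate them):

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.web.client.RestTemplateBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.client.RestTemplate;

@Configuration
class ThirdPartyClientConfig {

    @Bean
    RestTemplate userARestTemplate(RestTemplateBuilder builder) {
        // credentials are placeholders; in reality they would come from configuration properties
        return builder.basicAuthentication("userA", "secretA").build();
    }

    @Bean
    RestTemplate userBRestTemplate(RestTemplateBuilder builder) {
        return builder.basicAuthentication("userB", "secretB").build();
    }

    @Bean
    CustomerService customerServiceForUserA(@Qualifier("userARestTemplate") RestTemplate restTemplate) {
        // same service class, wired with a repository that calls the API as user A
        return new CustomerService(new CustomerRepository(restTemplate));
    }

    @Bean
    CustomerService customerServiceForUserB(@Qualifier("userBRestTemplate") RestTemplate restTemplate) {
        return new CustomerService(new CustomerRepository(restTemplate));
    }
}

Callers would then pick a variant with @Qualifier("customerServiceForUserA") or @Qualifier("customerServiceForUserB"), which is essentially the "different versions of the same service" idea described above.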
Is it not possible to include dependencies based on class properties? E.g. if I am building a framework that I want to integrate with any customer system, the type of DB the customer uses could be a variable, but my framework may use it if it can acquire a data source. So in this case, my Maven project should be able to integrate with any DB by declaring the corresponding DB war as a dependency.
E.g.
<dependency>
    <artifactId>${database.artifactId}</artifactId>
    ....
</dependency>
But this database.artifactId will itself be read from a properties file that may be accessible to customer code, so the idea of having the parent pom declare the versions and artifactIds as mentioned here may not suit my case.
Is there a workaround, or is this use case itself just wrong? I strongly think that if we build a framework that is more like a product the customer can integrate with, this flexibility of declaring any runtime dependency based on a property should be there.
Thanks,
Paddy
This is not how it is done with Maven.
If, as in your case, you write a framework that may use different dependencies, then you do not somehow conditionally depend on the concrete implementation. This would not work, as the exact list of dependencies is constructed at build time (i.e. when the Maven artifact of your framework is built and installed).
Rather, there should be a special Maven artifact which describes just the interface of the functionality that you need. This artifact will typically contain the (Java) interfaces your framework needs; this is what your framework will depend on.
Then, when using your framework, a concrete implementation of these interfaces must be included - by the people using your framework, because only they know which implementation they use.
This is explained for example in "Maven by Example", chapter "7.10.1. Programming to Interface Projects".
Example:
JDBC: The interface is part of the JDK, so a framework that uses JDBC does not need to declare any special dependencies (if JDBC were not part of the JDK, you would depend on a Maven artifact like "jdbc-api" or similar). An application that actually uses JDBC will have to depend on whatever JDBC driver it actually uses (Oracle, HSQL etc.).
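To make the "programming to an interface" point concrete, a small sketch (class and table names are made up): the framework code below compiles against the JDBC API only (java.sql) and never names a concrete driver; the application that uses the framework adds whatever driver artifact it needs to its own dependencies.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class UserCountQuery {

    // The JDBC URL decides which driver is used at runtime; the framework never references one.
    public long countUsers(String jdbcUrl, String user, String password) throws SQLException {
        try (Connection connection = DriverManager.getConnection(jdbcUrl, user, password);
             Statement statement = connection.createStatement();
             ResultSet resultSet = statement.executeQuery("SELECT COUNT(*) FROM users")) {
            resultSet.next();
            return resultSet.getLong(1);
        }
    }
}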
I have MailTransport.java and two classes extending it: LiveMailTransport.java and TestMailTransport.java.
LiveMailTransport will really send emails while TestMailTransport will only write them to the log for testing purposes.
Somewhere I do new MailTransport(); and I would like to replace every usage of MailTransport in my server-side code with either Live- or TestMailTransport, depending on the profile used for compiling (local, production, etc.).
(Similar to GWT's "replace-with" on the client side...)
How could I do that with Maven?
Thanks!
What you want is a factory which accepts a system property. If the system property isn't set, create an instance of LiveMailTransport. If the property is there, create an instance of TestMailTransport.
Proposed name of property: com.pany.app.enableTestMails
Boolean.getBoolean(String) is your friend.
Now configure the surefire plugin to set the property and you're done.
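A minimal sketch of such a factory, using the proposed property name (MailTransport, LiveMailTransport and TestMailTransport are the classes from the question):

public final class MailTransportFactory {

    private static final String TEST_MAILS_PROPERTY = "com.pany.app.enableTestMails";

    private MailTransportFactory() {
    }

    public static MailTransport newMailTransport() {
        // Boolean.getBoolean(name) is true only if the system property exists and equals "true",
        // e.g. when the JVM is started with -Dcom.pany.app.enableTestMails=true
        if (Boolean.getBoolean(TEST_MAILS_PROPERTY)) {
            return new TestMailTransport();
        }
        return new LiveMailTransport();
    }
}

Every new MailTransport() in the server-side code would then become MailTransportFactory.newMailTransport(), and Surefire (or the launch configuration of a given environment) sets the property wherever test mails are wanted.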
That sounds like a misuse of Maven, because this looks more like a dependency injection task (Guice, for example) and has no relationship with Maven.
If you're using Spring or some other dependency injection framework, you could change which dependencies are injected based on which configuration is included.
But if you want to do it with a plain bare-bones Java application, you could create multiple factories that create the corresponding instances of your MailTransport and place these factories into different source folders. Then use the build-helper-maven-plugin to add the corresponding source folder based on the active profiles.
I have a Maven 3 project that uses Hibernate 3. In the Hibernate properties file, there is an entry for hibernate.connection.provider_class with the class corresponding to the C3P0 connection provider (org.hibernate.connection.C3P0ConnectionProvider). Obviously, this class is only used at runtime, so I don't need to add the corresponding dependency in my POM with the compile scope. Now, I want to give the possibility to use any connection pooling framework desired, so I also don't add a runtime dependency to the POM.
What is the best practice?
I thought about adding an entry to the classpath corresponding to the runtime dependency (in this case, hibernate-c3p0) when the application is run (for example, using the command line). But, I don't know if it's possible.
This is almost (maybe exactly) the same problem as with SLF4J. I don't know if Hibernate also uses the facade pattern for connection pooling.
Thanks
Since your code doesn't depend on the connection pooling (neither the main code nor the tests need it), there is no point in mentioning the dependency anywhere.
If anyone should mention it, then that would be Hibernate, because Hibernate offers this feature in its config.
But you can add it to your POM with <optional>true</optional> to indicate:
I support this feature
If you use it, then I recommend this framework and this version
That will make life slightly simpler for consumers of your project.
But overall, you should not mention features provided/needed by other projects unless they have some impact on your code (like when you offer a simpler way to configure connection pooling for Hibernate).
[EDIT] Your main concern is probably how to configure the project for QA. The technical term for this new movement is "DevOps" - instead of producing a dumb WAR which the customer (QA) has to configure painstakingly, configuration is part of the development process just like everything else. What you pass on is a completely configured, ready-to-run setup.
To implement this, create another Maven module called "project-qa" which depends on your project and everything else you need to turn the dead code into a running application (so it will depend on DBCP plus it will contain all the necessary config files).
Maven supports WAR overlays, which will allow you to implement this painlessly.
You can mark your dependency as optional. In that case it will not be packaged into archives, and you have to ensure that your container provides the required library.
You could use a different profile for each connection provider. In each profile you put the runtime dependency that corresponds to the connection provider you want to use and change the hibernate.connection.provider_class property accordingly.
For more details about how to configure dependencies in profiles, see Different dependencies for different build profiles in maven.
To see how to change the value of the hibernate.connection.provider_class property see How can I change a .properties file in maven depending on my profile?
In the following scenario:
I'm wrapping an external jar file (read: a dependency I have no control over) in a service to be exposed over RMI.
I'd like my service interface to also be exported as a Maven dependency; however, as it will be returning classes defined in the dependency, this means that the dependency itself will be used as a dependency of my service interface.
Unfortunately the original jar file contains many classes that are not relevant to my exposed service.
Is it possible to depend on just a few classes in that jar file in Maven (possibly by extracting and repackaging the few classes that are relevant)?
uberbig_irrelevant.jar
    com.uberbig.beans <-- Need this package or a few classes in it.
    com.uberbig.everythingElse
The Service project includes all of the uberbig jar, but exposes a service BeanService with a call that returns an instance of com.uberbig.beans.IntrestingLightWeightSerialiasbleBean.
The service interface project needs to have a bean definition that looks like:
interface BeanFetcher {
    public IntrestingLightWeightSerialiasbleBean fetchBeanById(long beanId);
}
So ideally my serviceInterface jar file would only include the BeanFetcher interface, the definition of IntrestingLightWeightSerialiasbleBean, and any direct dependencies of IntrestingLightWeightSerialiasbleBean.
The project is for internal use and won't be publicly exposed, so there should be no problems repackaging as long as the repackaged bean definitions are binary- and serialization-compatible with the external jar file.
Any Suggestions?
Possibly related question Maven depend on project - no jar but classes
Maybe I could use something from the dependency:copy section of the maven-dependency-plugin, but I haven't figured out how to do that.
I think you got the plugin right, but not the goal. You should use dependency:unpack instead.
You should be able to use an inclusion filter to extract only the classes you need, and then repack them into your own jar. (The service interface jar if you do it in the service interface project, but you can just as well set up a separate project.)
Create your own repackaged jar and put it in your local repo. And hope you've actually identified all dependencies, accounting for reflection, etc. IMO not really worth it.
You may be able to do it automatically (with the associated increased risk) by using ProGuard or similar to pull out unused classes. That could be done on your own artifact as well, for example by making an all-in-one jar via jarjar/onejar/etc.