How to create a JNDI environment for unit tests with Spring from an XML configuration?

Is there a simple way to configure a JNDI environment for unit tests from a Spring XML file and export some stuff (e.g. data sources)? There is SimpleNamingContextBuilder for that, but it requires code to set up the JNDI environment, and I would like to be able to just include an XML file in the @ContextConfiguration.
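For reference, the programmatic setup with SimpleNamingContextBuilder looks roughly like this (a minimal sketch; the JNDI name and the in-memory H2 data source are only examples):

import javax.naming.NamingException;
import org.junit.BeforeClass;
import org.springframework.jdbc.datasource.DriverManagerDataSource;
import org.springframework.mock.jndi.SimpleNamingContextBuilder;

public class JndiAwareTestBase {

    // Activates an in-memory JNDI context and binds a DataSource before the
    // Spring context (and any JNDI lookups in the XML) is created.
    @BeforeClass
    public static void setUpJndi() throws NamingException {
        SimpleNamingContextBuilder builder = SimpleNamingContextBuilder.emptyActivatedContextBuilder();
        DriverManagerDataSource dataSource = new DriverManagerDataSource("jdbc:h2:mem:testdb", "sa", "");
        builder.bind("java:comp/env/jdbc/myDataSource", dataSource);
    }
}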

I'd recommend using SimpleJNDI. You can get it from Maven Central.

In many cases you can avoid the underlying problem behind this question with one of these approaches:
Use Spring profiles to encapsulate the JNDI references in one profile and activate another profile in the JUnit tests that declares the replacements. (A slight annoyance is that this introduces test-related settings into the production configuration files.) A Java-config sketch of this idea is shown after this list.
In the unit tests, include an additional XML configuration file after the ones you already use, overriding the bean definitions that declare the JNDI references.
An advantage of these workarounds is that you avoid JVM-wide constructs like JNDI, which might accidentally outlive the test and prevent the tests from running in parallel.
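For illustration only, here is the profile idea as Java config (the profile names, the JNDI name, and the use of an embedded H2 database are all assumptions; the same split can be expressed in XML with <beans profile="...">):

import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;
import org.springframework.jdbc.datasource.lookup.JndiDataSourceLookup;

// Sketch: the "jndi" profile resolves the DataSource from JNDI for production,
// while the "test" profile replaces it with an in-memory database for unit tests.
@Configuration
public class DataSourceConfig {

    @Bean
    @Profile("jndi")
    public DataSource jndiDataSource() {
        return new JndiDataSourceLookup().getDataSource("java:comp/env/jdbc/myDataSource");
    }

    @Bean
    @Profile("test")
    public DataSource testDataSource() {
        return new EmbeddedDatabaseBuilder().setType(EmbeddedDatabaseType.H2).build();
    }
}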

Related

Any way to split Spring Boot configuration into multiple properties files without having to specify an environment variable/system property

New to Spring Boot here, long-time Spring Framework user though.
I'm looking for a way to split my externalised configuration into multiple .properties files, for better readability and manageability.
I already saw this SO answer: the ability to specify a list of configuration file names in spring.config.name (which, by the way, doesn't seem to be mentioned in the Boot reference documentation; correct me if I'm wrong) would solve my problem perfectly. However, that configuration property can only be specified via system properties or environment variables; if I try to specify it inside my application.properties file, it gets ignored. The same happens for spring.config.additional-location. I understand this happens because, by the time application.properties is read, it is too late to tell Spring Boot to search for different externalised configuration file names.
However, this is not a proper solution, because the way I split my configuration should be an "implementation detail" that the consumer of my application shouldn't be aware of; I don't expect the consumer to have to specify an external parameter, otherwise my application breaks out of the box.
I think that a way to do this should be provided. Perhaps some import mechanism for .properties files or the ability to specify spring.config.name even in application.properties (some known and reasonable limitations would be acceptable).
The best I could find is to use @PropertySource, but this is not profile-aware: unless you use some ugly nested-class hack, or you put the spring.profiles.active variable in the resource name (which breaks if multiple profiles are activated), you don't get the benefit that application.properties has with profile-specific files.
I was not able to find an "official way" to do this, apart from some statements from the Spring Boot devs saying that they rather promote the use of a single (possibly giant...) externalised configuration file. That position does not seem very popular, judging from the reactions on GitHub, and IMHO it really looks like a basic missing feature. I have been working with multiple properties files in Spring Framework (using XML configuration) for years and I never felt that a single huge file would have been better.
If I understand it right, in Boot 1.x this was in some way possible using the location attribute of @ConfigurationProperties, which is however missing in Boot 2.x.
Any suggestion?
Have you tried Spring profiles?
What you can do is create application-file1.properties/yml and application-file2.properties/yml, put them in the config location, and then add spring.profiles.active=<your env profiles>,file1,file2.
It will load the files.
This profile entry can go in bootstrap.yml, be passed as JVM arguments to the application, or, in the case of Pivotal Cloud Foundry, live in manifest-<env>.yml. Not sure about AWS and other cloud providers.
Hope this helps.

Spring Resource Loading

Can anyone explain how Spring decides where to look for resources when one uses the ResourceLoader.getResource(...) method?
I am having a problem with a multi-module maven application built using Spring Boot whereby in my integration tests my code is able to find resources using resourceLoader.getResource("templates/") or even resourceLoader.getResource("classpath:templates/"). So far so good...
However, when the module is eventually packaged into the executable JAR and run with embedded Tomcat the resources can no longer be resolved. I also tried resourceLoader.getResource("classpath*:templates/") with no success.
What I find concerning is that when I add a logging statement to output the URL being used in the search, I get a path to one of the other modules in the project (not the one that actually contains the resource in question), e.g. jar:file:/Users/david/exmaple/target/spring-boot-0.0.1-SNAPSHOT.jar!/lib/module1-0.0.1-SNAPSHOT.jar!/templates/ whereas I believe the resource is in jar:file:/Users/david/exmaple/target/spring-boot-0.0.1-SNAPSHOT.jar!/lib/module2-0.0.1-SNAPSHOT.jar!/templates/
The resource loader was obtained from an @Autowired constructor parameter.
Thanks in advance for any hints.
Edit
Just in case it isn't clear or is of importance, my integration tests for the module in question aren't aware of the other module. I have module1, module2 and a spring-boot module which has dependencies on module1 & module2. Essentially, when I run the integration tests for module2 the classpath isn't aware of module1, so I suspect this has something to do with why it works in the tests.
When you use the classpath*: prefix, resource resolution internally happens via a ClassLoader.getResources(...) call in Spring; the plain classpath: prefix uses ClassLoader.getResource(...), which returns only the first match.
The wildcard classpath relies on the getResources() method of the underlying classloader. As most application servers nowadays supply their own classloader implementation, the behavior might differ especially when dealing with jar files. A simple test to check if classpath* works is to use the classloader to load a file from within a jar on the classpath: getClass().getClassLoader().getResources("<someFileInsideTheJar>"). Try this test with files that have the same name but are placed inside two different locations. In case an inappropriate result is returned, check the application server documentation for settings that might affect the classloader behavior.
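A minimal sketch of that check, assuming the resources live under templates/ (adjust the path to the file you are looking for):

import java.io.IOException;
import java.net.URL;
import java.util.Enumeration;

public class ClasspathCheck {
    public static void main(String[] args) throws IOException {
        // Prints every classpath location that contains a templates/ entry,
        // which shows whether the classloader can see more than one match.
        Enumeration<URL> locations = ClasspathCheck.class.getClassLoader().getResources("templates/");
        while (locations.hasMoreElements()) {
            System.out.println(locations.nextElement());
        }
    }
}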
Do not use the classpath: form here, since you have templates/ in multiple classpath locations; only one of them will be found.
Refer to the Spring reference documentation section on classpath wildcards (resources-classpath-wildcards).
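For completeness, here is a sketch of the wildcard lookup using Spring's resolver (the *.html pattern is only an example):

import java.io.IOException;
import org.springframework.core.io.Resource;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;
import org.springframework.core.io.support.ResourcePatternResolver;

public class TemplateLookup {
    public static void main(String[] args) throws IOException {
        // classpath*: collects matches from every jar/classpath root,
        // not just the first one the classloader happens to find.
        ResourcePatternResolver resolver = new PathMatchingResourcePatternResolver();
        Resource[] templates = resolver.getResources("classpath*:templates/**/*.html");
        for (Resource template : templates) {
            System.out.println(template.getURL());
        }
    }
}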

Spring Boot application.properties in Maven multi-module projects

We are using Spring Boot in a multi-module project.
We have a domain access module which contains the common domain object classes and repositories, together with configuration for the data source, JPA, Hibernate, etc. These are configured using an application.properties. We put all this configuration into the common module to avoid duplicating it in the higher-level modules.
This all works fine when building the domain module, so the configurations are loaded correctly in the unit tests.
However, the problems start when we try to use the domain module in the higher-layer modules: they have their own application.properties, which means Spring loads those and not the domain module's application.properties, so the data source is not configured because only the higher module's application.properties is loaded.
What we would like is for both the domain module's and the higher-level modules' application properties to be loaded by Spring. But we can't see any easy way to do this.
I'm thinking this must be a common problem, and wonder if there any recommended solutions for this problem?
As we are using Spring Boot, the solution should ideally use annotations instead of applicationContext.xml.
Maybe you should only use application.properties in the top-level aggregator project?
You can always use @PropertySource in the child projects to configure them with a name that is specific to their use case.
Or you can use different names for each project and glue them together in the top-level project using spring.config.location (comma-separated).
I agree with @Dave Syer. The idea of splitting an application into multiple modules is that each of them is an independent unit, in this case a jar file. Theoretically you could split each of those jar files into its own source repository and then use them across multiple projects. Say you want to reuse these domain classes in both a web and a batch application: if all the application-level configuration is stored within each of the individual modules, it severely reduces their reusability.
IMO only the aggregating module should contain all of the configuration necessary to run as an application; everything else is simply a dependency that can be remixed and reused as necessary.
Maybe another approach could be to define specific profiles for each module and use the application.properties file just to specify which profiles are active
using the spring.profiles.include property.
domain-module
- application.properties
- application-domain.properties
app-module
- application.properties
- application-app.properties
and in the application.properties file of app-module:
spring.profiles.include=domain,app
Another thing you can do (besides only using application.properties at the top-level as Dave Syer mentions) is to name the properties file of the domain module something like domainConfig.properties.
That way you avoid the name clash with application.properties.
domainConfig.properties would contain all the data needed for the domain module to be tested on its own. The integration with the rest of the code can easily be done either by using multiple @PropertySource annotations (one for domainConfig.properties and one for application.properties) or by configuring a PropertySourcesPlaceholderConfigurer bean in your Java config (check out this tutorial) that refers to all the needed property files.
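For illustration, the multiple-@PropertySource variant could look something like this (a sketch using the file names from above, not a drop-in solution):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.PropertySource;
import org.springframework.context.support.PropertySourcesPlaceholderConfigurer;

// Loads the domain module's properties alongside the application's own.
@Configuration
@PropertySource({"classpath:domainConfig.properties", "classpath:application.properties"})
public class PropertiesConfig {

    // Static bean so ${...} placeholders are resolved against the sources above.
    @Bean
    public static PropertySourcesPlaceholderConfigurer propertyPlaceholderConfigurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }
}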
Spring Boot 2.4 and later support spring.config.import.
For example:
application.name=myapp
spring.config.import=developer.properties
# import from other module
spring.config.import=classpath:application-common.properties
or with spring.config.activate.on-profile
spring.config.activate.on-profile=prod
spring.config.import=prod.properties
ref: https://spring.io/blog/2020/08/14/config-file-processing-in-spring-boot-2-4

Maven 2 replace class implementation depending on profile?

I have MailTransport.java and two classes extending it: LiveMailTransport.java and TestMailTransport.java.
LiveMailTransport will really send emails while TestMailTransport will only write them to the log, for testing purposes.
Somewhere I do new MailTransport(); and I would like to replace every usage of MailTransport in my server-side code either with Live- or with TestMailTransport, depending on the profile used for compiling (local, production, etc.).
(Similar to gwts "replace-with" on client side...)
How could I do that with maven?
Thanks!
What you want is a factory which accepts a system property. If the system property isn't set, create an instance of LiveMailTransport. If the property is there, create an instance of TestMailTransport.
Proposed name of property: com.pany.app.enableTestMails
Boolean.getBoolean(String) is your friend.
Now configure the surefire plugin to set the property and you're done.
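A rough sketch of such a factory (the class name is made up, and it assumes LiveMailTransport and TestMailTransport have no-arg constructors):

// Chooses the implementation at runtime based on a system property,
// e.g. -Dcom.pany.app.enableTestMails=true set by the Surefire plugin.
public final class MailTransportFactory {

    private MailTransportFactory() {
    }

    public static MailTransport create() {
        return Boolean.getBoolean("com.pany.app.enableTestMails")
                ? new TestMailTransport()
                : new LiveMailTransport();
    }
}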
That sounds like a misuse of Maven, because this looks more like a dependency injection task (Guice, for example) and has no real relationship with Maven.
If you're using Spring or some other dependency injection framework, you could change which dependency gets injected based on which configuration is included.
But if you want to do it with a plain bare-bones Java application, you could create multiple factories that create the corresponding instances of your MailTransport and place these factories into different source folders. Then use build-helper-maven-plugin to add the corresponding source folder based on the active profiles.

Runtime dependency (e.g. connection pooling) and classpath?

I have a Maven 3 project that uses Hibernate 3. In the Hibernate properties file, there is an entry for hibernate.connection.provider_class with the class corresponding to the C3P0 connection provider (org.hibernate.connection.C3P0ConnectionProvider). Obviously, this class is only used at runtime, so I don't need to add the corresponding dependency to my POM with the compile scope. Now, I want to allow any desired connection pooling framework to be used, so I also don't add a runtime dependency to the POM.
What is the best practice?
I thought about adding an entry to the classpath corresponding to the runtime dependency (in this case, hibernate-c3p0) when the application is run (for example, via the command line), but I don't know if that's possible.
This is almost (maybe exactly) the same problem as with SLF4J. I don't know if Hibernate also uses the facade pattern for connection pooling.
Thanks
Since your code doesn't depend on the connection pooling (neither the main code nor the tests need it), there is no point in mentioning the dependency anywhere.
If anyone should mention it, then that would be Hibernate because Hibernate offers this feature in its config.
But you can add it to your POM with <optional>true</optional> to indicate:
I support this feature
If you use it, then I recommend this framework and this version
That will make life slightly simpler for consumers of your project.
But overall, you should not mention features provided/needed by other projects unless they have some impact on your code (like when you offer a simpler way to configure connection pooling for Hibernate).
[EDIT] Your main concern is probably how to configure the project for QA. The technical term for this newer movement is "DevOps": instead of producing a dumb WAR which the customer (QA) has to configure painstakingly, configuration is part of the development process just like everything else. What you pass on is a completely configured, ready-to-run setup.
To implement this, create another Maven module called "project-qa" which depends on your project and on everything else you need to turn the dead code into a running application (so it will depend on DBCP, and it will contain all the necessary config files).
Maven supports overlaid WARs, which will allow you to implement this painlessly.
You can mark your dependency as optional. In that case it will not be packaged into archives, but you have to ensure that your container provides the required library.
You could use a different profile for each connection provider. In each profile you put the runtime dependency that corresponds to the connection provider you want to use and change the hibernate.connection.provider_class property accordingly.
For more details about how to configure dependencies in profiles, see Different dependencies for different build profiles in maven.
To see how to change the value of the hibernate.connection.provider_class property, see How can I change a .properties file in maven depending on my profile?
