In my spring + maven app, I have created some tests for the Data Access Layer that I would like now to run against multiple datasources. I have something like:
#ContextConfiguration(locations={"file:src/test/resources/testAppConfigMysql.xml"})
public class TestFooDao extends AbstractTransactionalJUnit38SpringContextTests {
public void testFoo(){
...
}
}
It has currently the config location hardcoded, so it can be used only against one datasource.
What is the best way to invoke the test twice and pass two different configs (say testAppConfigMysql.xml and testMyConfigHsqlDb.xml)?
I've seen suggestions to do this via system properties. How can I tell maven to invoke the tests twice, with different values of a system property?
I don't know whether there is a fancy solution for this that is also simple. I would just implement a base class with all the testing logic and then inherit it into two classes with different annotation-based configuration, like this:
#ContextConfiguration(locations={"firstDs.xml"})
public class TestFooDaoUsingFirstDs extends TestFooDao {
}
#ContextConfiguration(locations={"secondDs.xml"})
public class TestFooDaoUsingSecondDs extends TestFooDao {
}
Unless you have to handle a really large number of different datasources this way, that is OK for me.
Rather than file:..., you can use classpath:... (remove the src/test/resources prefix; it's implicit when you use classpath). Then you can have a single master context with the line:
<import resource="dao-${datasource}.xml" />
If you run the Maven build with the option -Ddatasource=foo, it will replace the ${datasource} in the master context with whatever you specify. So you can have dao-foo.xml, dao-bar.xml, etc. for your different configurations.
(You need to enable Maven resource filtering in the POM for this to work).
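For reference, enabling filtering of the test resources might look roughly like this in the POM (a sketch, assuming the default src/test/resources layout):

<build>
  <testResources>
    <testResource>
      <directory>src/test/resources</directory>
      <!-- replaces ${...} placeholders such as ${datasource} when resources are copied -->
      <filtering>true</filtering>
    </testResource>
  </testResources>
</build>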
Alternatively, check out the new stuff in Spring 3.1: http://www.baeldung.com/2012/03/12/project-configuration-with-spring/
Edit: A third option would be to have all the test classes extend some superclass, and use
JUnit's @Parameterized, where the parameters are the different Spring contexts. You couldn't use @ContextConfiguration in that case, but you can always create the Spring context manually, then autowire the test class using org.springframework.beans.factory.config.AutowireCapableBeanFactory.autowireBean().
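A rough sketch of that third option (the DAO type FooDao is a placeholder for whatever you are testing; note that you lose the transactional rollback support of the Spring test base classes this way):

import java.util.Arrays;
import java.util.Collection;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.support.ClassPathXmlApplicationContext;

@RunWith(Parameterized.class)
public class TestFooDaoAllDatasources {

    @Parameters
    public static Collection<Object[]> configs() {
        // one run of the test class per config file
        return Arrays.asList(new Object[][] {
                { "testAppConfigMysql.xml" },
                { "testMyConfigHsqlDb.xml" }
        });
    }

    private final String configLocation;
    private ClassPathXmlApplicationContext context;

    @Autowired
    private FooDao fooDao; // hypothetical DAO under test

    public TestFooDaoAllDatasources(String configLocation) {
        this.configLocation = configLocation;
    }

    @Before
    public void setUp() {
        context = new ClassPathXmlApplicationContext(configLocation);
        // injects this test instance's @Autowired fields from the chosen context
        context.getAutowireCapableBeanFactory().autowireBean(this);
    }

    @After
    public void tearDown() {
        context.close();
    }

    @Test
    public void testFoo() {
        // exercise fooDao against the current datasource
    }
}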
Check out the Maven Invoker Plugin. It supports profiles as well.
Related
I have multiple Spring projects as part of a single umbrella project. Two of them are AuthServer and BackendApplication. AuthServer, as the name suggests, is used only for auth purposes, and the rest is handled by BackendApplication. Now I am trying to write tests inside BackendApplication that also need to use auth-related functionality. For that I have added AuthServer as a test dependency to BackendApplication. The problem is that both projects have beans named Utility, because of which I get a DuplicateBeanException when I include both contexts in my test. But I can't disable either of them, as both are necessary. Is there a way around it?
Could you name your beans, for example:
@Bean(name = "my-utility-1")
public Utility createUtility1() {
    return new Utility();
}
// or
@Component(value = "my-utility-2")
public class Utility {
    ...
}
and refer to them with @Qualifier:
@Autowired @Qualifier("my-utility-1")
private Utility myUtility;
Not related to your question, but I think you can mock AuthServer when testing BackendApplication.
We are using Spring Boot 2.0.0. We have three environments: dev, staging, and production. Our current config structure:
dev
application-dev.yml
application-dev.properties
Likewise, we have a yml and a properties file for each environment. After a year of development, the single yml file for a profile has become a large monolithic config.
Is it possible to have multiple config files for a profile, like below?
application-dev.yml
application-dev-sqs.yml
application-dev-redis.yml
I think there are 2 ways you can achieve this requirement.
spring.profiles.active accepts a comma-separated list of active profiles, so you can always provide dev,dev-sqs,dev-redis as the value.
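For example (a sketch, assuming the topic files are named after those profiles), starting the app with

java -jar app.jar --spring.profiles.active=dev,dev-sqs,dev-redis

would pick up application-dev.yml, application-dev-sqs.yml and application-dev-redis.yml in addition to application.yml.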
Another approach is making use of @PropertySource and a custom PropertySourceFactory to achieve this requirement. You can find an implementation that takes the value from spring.profiles.active to load one corresponding YAML file in the article below. It should be super easy to adapt the implementation to load multiple files by looking for the profile id in the names of the YAML files.
[How-to] Read profile-based YAML configurations with @PropertySource
I was dealing with a similar problem and I'd recommend using yaml configuration.
Let's start with the .properties approach:
Initial approach
One can use it like this:
@Component
@PropertySources({
    @PropertySource("classpath:application.properties"),
    @PropertySource("classpath:application-${spring.profiles.active}.properties")
})
public class AppProperties {
}
This is very easy to configure. The limitation is that you cannot combine profiles. I mean that when you want to use a profile list such as dev,local, where local just alters some config properties on top of the dev profile, Spring will try to load the file application-dev,local.properties, which is very likely not what you want.
By the way, this is what Spring will do for you automatically, and it is useful for topic-based files as you described.
There is no way to configure it per profile (rather than for the whole list). Another possibility would be to specify the list in spring.config.name, which is not supported at the moment.
Better approach
In short, use:
#Profile("dev")
#Configuration
#PropertySources({
#PropertySource("classpath:topic1-dev.properties"),
#PropertySource("classpath:topic2-dev.properties")
})
public class AppPropertiesDev {
}
The disadvantage is that you have to have several such config classes (dev, staging), but now you have the topics. You can also use multiple profiles, which are (as of my testing) loaded in the order you specified. That way, your developers can easily use the dev configuration and alter just what's needed for their testing.
Yaml approach
You can see the approach with yaml in a question I asked earlier - Property resolving for multiple Spring profiles (yaml configuration). The benefit is a smaller number of files - yaml has all the profiles in one file, which may or may not be what you want.
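For illustration, a minimal sketch of such a single multi-profile yaml as of Spring Boot 2.0 (the property names are made up):

# application.yml - one file, one document per profile, separated by ---
spring:
  profiles: dev
some:
  property: dev-value
---
spring:
  profiles: staging
some:
  property: staging-value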
Yes, it's possible. spring.config.location is used to externalize the config file location in Spring boot applications. This can be used to provide a location of the file in the filesystem or even in the classpath. Based on how you want to provide your application access to the files, you can choose the URI.
Doing it programmatically:
@SpringBootApplication
public class Application {
    public static void main(String[] args) {
        ConfigurableApplicationContext applicationContext = new SpringApplicationBuilder(Application.class)
                .properties("spring.config.location:classpath:/application-dev.yml,classpath:/application-dev-sqs.yml,classpath:/application-dev-redis.yml")
                .build()
                .run(args);
    }
}
Doing it via environment variables:
set SPRING_CONFIG_LOCATION=classpath:/application-dev.yml,classpath:/application-dev-sqs.yml,classpath:/application-dev-redis.yml
So, you can provide your files as comma-separated values.
I've used classpath here, it can also be a location in the file system:
/home/springboot-app/properties/application-dev.yml,/home/springboot-app/properties/application-sqs.yml,/home/springboot-app/properties/application-redis.yml
Have you tried including profiles yet?
Example with profile default, you want to load additional properties for redis and db. Within application.properties file, add:
spring.profiles.include=redis, db
This will load the files application-redis.properties and application-db.properties respectively.
We have multiple test classes in our spring boot application. Some of the classes contain integration tests, some contain unit tests.
This means that if I (e.g. with Maven) let all tests be executed, it will run all tests in all classes.
What I like to achieve is that the integration tests are executed only, if a specific spring profile is set, e.g. via application.yml.
I like e.g. to annotate the whole test class to define that the tests in this class are only executed if the specified spring profile is set.
If it is not set, these tests shall be ignored.
The topic How can I use @IfProfileValue to test if a Profile is active? goes in exactly this direction. @IfProfileValue looks at first glance exactly like what I need.
But as is pointed out there, it is not. I could use it if I set a specific system property. But I need to use a real Spring profile (and not the system property spring.profiles.active - that would ignore a profile set via application.yml).
@Profile also looks like what I need, but as the topic Use @Profile to decide to execute test class shows, we should not use it.
So what can be done to achieve this?
Note that there are a lot of questions about tests and Spring profiles on Stack Overflow. But most of them point out how to set configurations in tests specific to Spring profiles. That is not what I am looking for.
I would like to execute or ignore the tests.
I don't know exactly how you want to achieve it, but here is a way, if you are using JUnit, to conditionally ignore some tests at runtime simply by using a configuration property:
application.properties:
test.enabled=true
then in your test code you can use org.junit.Assume and a property like the following:
#Value("${test.enabled}")
private Boolean testEnabled;
#Test
public void test {
org.junit.Assume.assumeTrue(testEnabled);
// your test code
}
Now if you set the property test.enabled to true, the test will run; otherwise it will be ignored.
Source: Conditionally ignoring tests in JUnit 4
Using JUnit 5, you can use an @Autowired Environment to check if a profile is active @BeforeEach test is run:
Assumptions.assumeTrue(Arrays.asList(this.environment.getActiveProfiles()).contains("integration"));
This checks for a profile named "integration" and works regardless of how the profile was set (system property, environment variable, application.yml, etc.).
If the profile is not active, the test will be ignored, which is similar to using the @Disabled annotation.
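A minimal sketch of how that could look in a test class (the class name and profile name are illustrative):

import java.util.Arrays;

import org.junit.jupiter.api.Assumptions;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.core.env.Environment;

@SpringBootTest
class OrderIntegrationTest {

    @Autowired
    private Environment environment;

    @BeforeEach
    void requireIntegrationProfile() {
        // skips every test in this class unless the "integration" profile is active
        Assumptions.assumeTrue(
                Arrays.asList(environment.getActiveProfiles()).contains("integration"));
    }

    @Test
    void someIntegrationTest() {
        // ...
    }
}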
It is very easy. My solution in Kotlin:
Create annotation
import org.springframework.test.context.junit.jupiter.EnabledIf
import kotlin.annotation.AnnotationRetention.RUNTIME
import kotlin.annotation.AnnotationTarget.CLASS
import kotlin.annotation.AnnotationTarget.FUNCTION
@Target(CLASS, FUNCTION)
@Retention(RUNTIME)
@EnabledIf(
    expression = "#{environment.acceptsProfiles('integration')}",
    reason = "🏋🏻 Because spring.profiles.active = integration",
    loadContext = true)
annotation class Integration
Use it:
import by.package.Integration
@Integration
internal class IntegrationTest {

    @Test
    // @Integration
    fun test() {
        assertEquals(4, 2 + 2)
    }
}
The @DisabledIf annotation has the opposite logic.
I am trying to run a spring JUnit test case using -
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration({ "classpath:some.xml" })
The XML has a bean defined, along with in-memory DB details:
<bean id="orderService" class="com.example.OrderServiceImpl">
<!-- set properties, etc. -->
</bean>
I am doing @Value injection inside the bean class OrderServiceImpl, but it does not happen while executing the test case, although the same runs fine when I run the application. Can you please help?
You need to add a PropertySourcesPlaceholderConfigurer or PropertyPlaceholderConfigurer to your test context. This SO question may give you a hint: Populating Spring #Value during Unit Test.
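For an XML-based test context, one way to do that (a sketch, with some.properties standing in for your actual properties file) is to declare the configurer alongside your beans:

<!-- resolves ${...} placeholders such as the ones used by @Value -->
<bean class="org.springframework.context.support.PropertySourcesPlaceholderConfigurer">
    <property name="location" value="classpath:some.properties"/>
</bean>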
Check whether a some.xml exists in both the main and test trees. If it exists in both, the one in the test tree should override the one in the main tree.
Make sure the some.xml you are actually loading has a property-placeholder, such as
<context:property-placeholder location="classpath:some.properties"/>
I realize that there are more modern ways to manage properties, but this is simple and easy to specify for unit tests.
I find that multiple tests become very awkward with shared config files on the classpath, so I like to take advantage of a feature of @ContextConfiguration that lets me create a dedicated, minimal config for each test. The way it works is that for each test class, by convention, it can look for a config file in the same relative directory path as your test class package, named after your test case. That way you can completely control the config and properties for each test case. You might try it--it can eliminate the confusion caused by shared config files.
To do it, remove the value in the @ContextConfiguration. Then, say you have a test case com.myCompany.SomeTest located in src/test/java/com/myCompany/. Create a file called SomeTest-context.xml in the directory src/test/resources/com/myCompany and put the minimal config you need for that unit in the file. @ContextConfiguration will, by convention, find the config file of that name in that location and use it for your test.
Although not part of the conventions I just spoke of, I put a properties file for each test in the same directory, with just the properties I need for that test, named after the test case as well (e.g. SomeTest.properties). In your test case-specific context, add a property-placeholder line like this to get your test-specific properties:
<context:property-placeholder location="classpath:com/myCompany/SomeTest.properties"/>
At the top of your test case, you would put
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration // no parameters
public class FileEncryptionUtilsTest { ...
If you do that, you'll be able to inject beans and values to your heart's content, without worrying about side-effects of things being added to a shared context or dealing with errors arising from multiple context files with the same name.
The key advantage is that you are testing a unit here, not the integration represented by an application context file. If you share an application context file in your unit tests, you're testing the application context along with your bean, and that's more of an integration test goal, not a unit test need.
I am currently deploying my custom controls as OSGi plugins and I wanted to do the same thing with my beans. I have tried putting them into the OSGi plugin and it works fine but the only problem I have is the faces-config.
It seems it has to be called faces-config in the OSGi plugin to work, but that means I can't use beans in the NSF anymore, because it seems to ignore the local faces-config.
Is there a way to change the name of the faces-config in the OSGi plugin?
Something like FEATURE-faces-config.xml?
In the class in your plugin that extends AbstractXspLibrary, you can override "getFacesConfigFiles", which should return an array of strings representing paths within the plugin to additional files of any name to load as faces-config additions. For example:
@Override
public String[] getFacesConfigFiles() {
    return new String[] {
        "com/example/config/beans.xml"
    };
}
Then you can put the config file in that path within your Java source folder (or another folder that is included in build.properties) and it will be loaded in addition to your app's normal faces-config, beans and all.
The NSFs are running as separate, distinct Java applications. The OSGi plugin is running in the OSGi layer, above all those distinct Java applications, as a single code base. Consequently, the faces-config is only at that level.
It's possible to load them dynamically, by using an ImplicitObjectFactory, loaded from an XspContributor. That's what is done in OpenNTF Domino API for e.g. userScope (which is a bean stored in applicationScope of an NSF). See org.openntf.domino.xsp.helpers.OpenntfDominoImplicitObjectFactory, which is referenced in OpenntfDominoXspContributor, loaded via the extension point of type "com.ibm.xsp.library.Contributor".
A few caveats:
You have no control over what happens if you try to register your bean with a name the developer also uses for a different variable in that scope.
Unless you add code to check if the library is enabled, as we do, you'll be adding the bean to every database on the server.
You still need to add the library to the NSF. Unless you also provide a component that those databases will all use, there's no way you can programmatically add it, as far as I know.
It might be easier to skip the bean approach and just add an instance of the Java class in beforePageLoad, page controller class, or however you're managing the backing to the relevant XPage (if viewScope) or application (if sessionScope / applicationScope).