I'm using a mapper generated with MapStruct:
@Mapper
public interface CustomerMapper {
    Customer mapBankCustomerToCustomer(BankCustomerData bankCustomer);
}
The default component model is spring (set in pom.xml)
<compilerArg>-Amapstruct.defaultComponentModel=spring</compilerArg>
I have a service into which I inject the customer mapper, and it works fine when I run the application:
@Autowired
private CustomerMapper customerMapper;
But when I run unit tests that involve @SpringBootTest:
@SpringBootTest
@AutoConfigureMockMvc
@RunWith(SpringRunner.class)
public class SomeControllerTest {

    @Mock
    private SomeDependency someDependency;

    @InjectMocks
    private SomeController someController;

    @Test
    public void shouldDoSomething() {
        ...
    }
}
I get an org.springframework.beans.factory.UnsatisfiedDependencyException
Unsatisfied dependency expressed through field 'customerMapper'
I followed this answer and my problem was solved as soon as I pasted the proposed lines into my build.gradle file.
As you are running your tests via the IDE, there are two possibilities:
Eclipse or IntelliJ is picking up the annotation processors, but they need to be set up correctly.
Eclipse or IntelliJ does not pick up the compiler options from the Maven compiler plugin.
To rule out the possibilities, do the following for each:
Make sure the IDE is configured to run annotation processing. Have a look here at how you can set it up. Run a build from the IDE and check whether mapper implementations are generated.
If there are, they are most probably generated with the default component model. To solve this you have two options:
Use @Mapper(componentModel = "spring"). I personally prefer this option as it is IDE independent. You can also define a @MapperConfig that you can apply (see the sketch after this list).
Configure the IDE with the annotation processor options. For IntelliJ, add the compiler argument under Settings -> Build, Execution, Deployment -> Compiler -> Annotation Processors; in the Annotation Processor Options section, add mapstruct.defaultComponentModel as the option name and spring as the value. I am not sure how to do it for Eclipse.
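A minimal sketch of the first option, reusing the mapper from the question (the CentralMapperConfig name in the @MapperConfig variant is made up for illustration):

import org.mapstruct.Mapper;

@Mapper(componentModel = "spring")
public interface CustomerMapper {
    Customer mapBankCustomerToCustomer(BankCustomerData bankCustomer);
}

Or, with a shared configuration that other mappers can reference:

import org.mapstruct.Mapper;
import org.mapstruct.MapperConfig;

@MapperConfig(componentModel = "spring")
public interface CentralMapperConfig {
}

@Mapper(config = CentralMapperConfig.class)
public interface CustomerMapper {
    Customer mapBankCustomerToCustomer(BankCustomerData bankCustomer);
}

Either way the component model no longer depends on the -Amapstruct.defaultComponentModel compiler argument, so Maven builds and IDE builds behave the same.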
Related
I am using the Prometheus library to collect metrics for my Spring Boot application (a REST API). I am using the library io.prometheus.simpleclient:0.4.0 and including it in my Maven pom.xml. I am using the Counter and @Autowired-injecting it (I've tried both field and constructor injection) into one of my own classes, like so:
MyCustomMetricsClass.java
@Component
public class MyCustomMetricsClass {

    @Autowired
    private Counter counterBean;

    public void myOwnMetricsMethod() {
        counterBean.inc();
        // do some stuff
    }
}
Then I am @Autowiring this MyCustomMetricsClass into my service class, MyServiceClass.java, where it seems to run fine when I run my API locally using the Spring Boot embedded Tomcat on port 8080 (localhost:8080). I can hit endpoints and the metrics are reported correctly at the actuator endpoint (localhost:8080/actuator/metrics). e.g.
MyServiceClass.java
public class MyServiceClass {

    @Autowired
    private MyCustomMetricsClass myMetrics;

    public void genericServiceMethod() {
        myMetrics.myOwnMetricsMethod(); // NULL POINTER EXCEPTION ONLY DURING TEST SCOPE (GROOVY)
    }
}
The problem is, when I run mvn install, which triggers the local Groovy unit tests I have written, I keep getting a NullPointerException. With the debugger, I can step through the Groovy unit tests and see that in my service class the myMetrics field is null. I don't understand why it works fine at runtime; also, I have annotated MyCustomMetricsClass with @Component, so it should be a bean picked up by Spring's component scan.
This is a multi-module project, with the structure below:
my-project (root, contains the root pom.xml)
- my-api (module, contains the RestController, has its own pom.xml)
- my-service (module, contains the service classes, has its own pom.xml)
- my-model (module, contains all POJO/DTO model classes, has its own pom.xml)
Am I missing some dependency on my classpath? Why does it work at runtime but not during tests? (all my dependencies should have default scope) Is the autowiring broken?
Can you share the code from your unit test?
At a guess, you're using a mocking framework, maybe Mockito?
If this assumption is true, remember that your unit test won't be starting the full Spring context, so no autowiring will take place. You will need to inject mocks for the autowired components.
e.g.:
@RunWith(MockitoJUnitRunner.class)
public class MyServiceClassTest {

    @Mock
    private MyCustomMetricsClass myCustomMetricsClass;

    @InjectMocks
    private MyServiceClass myServiceClass;

    @Test
    public void shouldDoTesting() {
        myServiceClass.genericServiceMethod();
        verify(myCustomMetricsClass).myOwnMetricsMethod();
    }
}
I have a Spring Boot application which runs fine via Maven's mvn spring-boot:run command. However, when I try to run it through the IDE, which is IntelliJ IDEA 2017.2.1 in my case, it fails because it could not @Autowire a data source.
***************************
APPLICATION FAILED TO START
***************************
Description:
Parameter 0 of constructor in com.myApp.Application required a bean of type 'javax.sql.DataSource' that could not be found.
Action:
Consider defining a bean of type 'javax.sql.DataSource' in your configuration.
The original authors of this code base have the main class, which starts the application, accepting constructor arguments for the data source, an approach I am unfamiliar with as I am used to just doing it through the application.properties file and letting Spring Boot wire up its own DataSource.
@EnableTransactionManagement
@SpringBootApplication
@EnableCaching
public class Application extends JpaBaseConfiguration {

    protected Application(DataSource dataSource, JpaProperties properties,
            ObjectProvider<JtaTransactionManager> jtaTransactionManagerProvider,
            ObjectProvider<TransactionManagerCustomizers> transactionManagerCustomizers) {
        super(dataSource, properties, jtaTransactionManagerProvider, transactionManagerCustomizers);
    }
In IDEA, I've noticed that the dataSource and properties arguments to this constructor are underlined in red. For dataSource, the IDE complains that two beans exist, from XADataSourceAutoConfiguration.class and DataSourceConfiguration.class, and it doesn't know which to autowire. As for the other constructor argument that is underlined in red, properties, the IDE complains that no bean of type JpaProperties is found. Here are some other methods which are overridden in the main application starter class:
    @Override
    protected AbstractJpaVendorAdapter createJpaVendorAdapter() {
        return new HibernateJpaVendorAdapter();
    }

    @Override
    protected Map<String, Object> getVendorProperties() {
        Map<String, Object> vendorProperties = new LinkedHashMap<>();
        vendorProperties.putAll(getProperties().getHibernateProperties(getDataSource()));
        return vendorProperties;
    }

    public static void main(final String[] args) {
        SpringApplication.run(Application.class, args);
    }
Unfortunately, because I am not familiar with this approach of using the constructor to configure/auto-configure the application in Spring Boot, I am unsure of a few things. My exact question is: why does the application run fine with Maven but not in IntelliJ IDEA? Moreover, since I don't have access to the original authors of this proprietary code base, I'd love to know why, if anyone can even give me a hint, they configured the constructor this way as opposed to relying on the default auto-configuration. I also have an integration test which I wrote that I am trying to run, but this test, whether run through the IDE or via Maven's failsafe plugin, also results in the same error with the DataSource not being @Autowired. So this is another question: why won't this test run through Maven when the main application will? Here's my integration test:
@RunWith(SpringJUnit4ClassRunner.class)
@WebMvcTest(value = TransactionController.class, secure = false)
public class TransactionControllerIT {

    @Autowired
    MockMvc mockMvc;

    @Test
    public void shouldInitiateTransfer() {
        String transferTransaction =
            "some json string I can't show here on stack overflow";
        RequestBuilder requestBuilder = MockMvcRequestBuilders
            .post("/begin-transfer")
            .accept(MediaType.APPLICATION_JSON).content(transferTransaction)
            .contentType(MediaType.APPLICATION_JSON);
        MvcResult result = null;
        try {
            result = mockMvc.perform(requestBuilder).andReturn();
        } catch (Exception e) {
            fail("Exception in integration test!");
        }
        MockHttpServletResponse response = result.getResponse();
        assertEquals(HttpStatus.CREATED.value(), response.getStatus());
    }
}
Thank you for reading my question.
You can easily run any Spring Boot app from IDEA by doing the following:
In the Maven panel, go to Plugins, unfold spring-boot and right-click on "spring-boot:run". Then click on "Create your-project..." as shown in the image.
This way you can start the application in a comfortable way from IDEA, straight from the main toolbar.
You are still using the Maven way, but integrated into IDEA. I don't really know why you are having those problems; I also experience some problems when trying to execute the Spring Boot app directly.
Your test is failing because it's using a slice test, @WebMvcTest. These slice tests (@DataJpaTest, @JsonTest, etc.) only load a small part of the overall application context, rather than everything the application loads on startup the way @SpringBootTest would.
When using a slice test, it will pick up any annotations, and require any beans, defined on the @SpringBootApplication class.
E.g. because you have constructor-injected beans and two additional annotations on that class, any slice test will enable caching and transaction management and will always require those constructor dependencies to be present.
I would not make your main application class extend a configuration class in this way; it's overly complex and smells like an XY problem. You should externalize configuration (and the @Enable... annotations) into their own @Configuration classes and leave the @SpringBootApplication class as vanilla as possible to avoid this sort of error. A rough sketch of that split is shown below.
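This sketch just rearranges the code from the question; the PersistenceConfiguration name is made up for illustration, and the imports are the usual Spring Boot 1.x ones. In practice the two classes would live in separate files.

import java.util.LinkedHashMap;
import java.util.Map;

import javax.sql.DataSource;

import org.springframework.beans.factory.ObjectProvider;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.orm.jpa.JpaBaseConfiguration;
import org.springframework.boot.autoconfigure.orm.jpa.JpaProperties;
import org.springframework.boot.autoconfigure.transaction.TransactionManagerCustomizers;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.context.annotation.Configuration;
import org.springframework.orm.jpa.vendor.AbstractJpaVendorAdapter;
import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;
import org.springframework.transaction.annotation.EnableTransactionManagement;
import org.springframework.transaction.jta.JtaTransactionManager;

// The application class stays vanilla: no JPA, caching, or transaction concerns.
@SpringBootApplication
public class Application {

    public static void main(final String[] args) {
        SpringApplication.run(Application.class, args);
    }
}

// The JPA/caching/transaction setup lives in its own configuration class,
// so a slice test such as @WebMvcTest no longer drags in the DataSource.
@Configuration
@EnableCaching
@EnableTransactionManagement
class PersistenceConfiguration extends JpaBaseConfiguration {

    protected PersistenceConfiguration(DataSource dataSource, JpaProperties properties,
            ObjectProvider<JtaTransactionManager> jtaTransactionManagerProvider,
            ObjectProvider<TransactionManagerCustomizers> transactionManagerCustomizers) {
        super(dataSource, properties, jtaTransactionManagerProvider, transactionManagerCustomizers);
    }

    @Override
    protected AbstractJpaVendorAdapter createJpaVendorAdapter() {
        return new HibernateJpaVendorAdapter();
    }

    @Override
    protected Map<String, Object> getVendorProperties() {
        Map<String, Object> vendorProperties = new LinkedHashMap<>();
        vendorProperties.putAll(getProperties().getHibernateProperties(getDataSource()));
        return vendorProperties;
    }
}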
We've configured multiple Spring Boot profiles in our application, and the data source changes according to the activated profile.
We've a separate data source for JUnit tests. Now, we want to use this data source for JUnit tests irrespective of the activated profile.
We were able to achieve this using @TestPropertySource in every test class (roughly as in the snippet below), but we need this configuration in one place, in pom.xml. I'm aware of the maven-surefire-plugin, which is used to execute the JUnit tests, but I'm not sure how we can configure a particular data source there.
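For reference, the per-test-class approach looks roughly like this (the class name and properties file name are just examples):

import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.TestPropertySource;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest
@TestPropertySource("classpath:application-junit.properties")
public class SomeRepositoryTest {
    // the data source properties in application-junit.properties take precedence
    // over the ones contributed by whichever profile happens to be active
}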
Is there any way to achieve this?
Any help would be much appreciated.
One simple way of doing this is to create profile-dependent configurations as wrappers over normal configurations, like this:
Profile 1:
@Profile("ds1")
@Import(Ds1Configuration.class)
public class Ds1ProfileConfiguration {}
Profile 2:
@Profile("ds2")
@Import(Ds2Configuration.class)
public class Ds2ProfileConfiguration {}
where the imported configurations contain actual bean definitions:
public class Ds1Configuration {

    @Bean
    public DataSource dataSource() {...}
}
and
public class Ds2Configuration {

    @Bean
    public DataSource dataSource() {...}
}
This will separate your profiles and configurations, so you will be able to use these in your tests:
@SpringBootTest(classes = Ds2Configuration.class)
I'm having an issue with compilation ordering in a mixed Java/Groovy environment. We're using Gradle 2.1, JDK 7, and Groovy 2.3. The code compiles fine in STS (Spring Tool Suite), using the Gradle plugin and the same build.gradle files, but fails when the build is run on the command line. STS is configured to use the Groovy Eclipse plugin, which if I understand things correctly, uses its own compiler. So I think this problem stems from a compilation ordering problem when we use the Groovy compiler from Gradle's Groovy plugin. This is the Groovy class:
@Component
@ToString(includeNames = true, includePackage = false)
class ManagedCloseableHttpClientFactory implements ClientHttpRequestFactory {

    @Delegate
    HttpComponentsClientHttpRequestFactory factory
    ...
}
The ClientHttpRequestFactory is a Spring interface that is implemented by the Spring class HttpComponentsClientHttpRequestFactory. Somewhere else in the system, we have a Java class annotated with @Configuration, where the ManagedCloseableHttpClientFactory is injected using @Autowired. Like this:
@Configuration
public class FooConfiguration {

    @Autowired
    private ManagedCloseableHttpClientFactory httpClientFactory;
    ...
}
When the build is run from the command line, we get the following error message:
/Users/xyz/source/prj/common/build/tmp/compileGroovy/groovy-java-stubs/common/web/client/ManagedCloseableHttpClientFactory.java:10: error: ManagedCloseableHttpClientFactory is not abstract and does not override abstract method createRequest(URI,HttpMethod) in ClientHttpRequestFactory
If we move the field marked with @Autowired to a Groovy class that is annotated with @Configuration, everything works, but not when it's declared inside a Java class. I'm guessing that this is a compilation ordering issue. In our Gradle files, we're using the groovy plugin, and have modified the source directories as follows:
project.sourceSets.main.java.srcDirs = []
project.sourceSets.test.java.srcDirs = []
project.sourceSets.main.groovy.srcDirs = ["src/main/java", "src/main/groovy"]
project.sourceSets.main.resources.srcDirs += ["config"]
project.sourceSets.test.groovy.srcDirs += ["src/test/java","src/test/groovy"]
What's the best approach here? Thanks.
The Groovy compiler's stub generator has some limitations. My best guess is that you can't have Java call a Groovy method materialized by @Delegate. I'd try to get rid of this particular Java->Groovy dependency or of this particular usage of @Delegate (i.e. implement the delegation by hand, roughly as sketched below).
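A rough sketch of the hand-written delegation, assuming only the single abstract method named in the error has to be forwarded (any other methods that Java code calls would need the same treatment), and leaving out the @ToString and wiring details from the original class:

import java.io.IOException;
import java.net.URI;

import org.springframework.http.HttpMethod;
import org.springframework.http.client.ClientHttpRequest;
import org.springframework.http.client.ClientHttpRequestFactory;
import org.springframework.http.client.HttpComponentsClientHttpRequestFactory;
import org.springframework.stereotype.Component;

@Component
class ManagedCloseableHttpClientFactory implements ClientHttpRequestFactory {

    HttpComponentsClientHttpRequestFactory factory;

    @Override
    public ClientHttpRequest createRequest(URI uri, HttpMethod httpMethod) throws IOException {
        // Forward explicitly instead of relying on @Delegate, so the Java stub
        // generated during joint compilation also contains this method.
        return factory.createRequest(uri, httpMethod);
    }
}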
If possible, inject the interface instead of the concrete class. Since the injection happens at run time, the class will be fully created by then, and at compile time the compiler will recognize the interface as providing all the required methods.
@Configuration
public class FooConfiguration {

    @Autowired
    private ClientHttpRequestFactory httpClientFactory;
    ...
}
In Guice, I have a ProductionModule with my bindings. In my tests, I can load that PLUS a module that overrides a few of the production bindings with mock objects. How do I do such a thing in Spring?
For example, load production-spring.xml in the tests and then have the test also load test-spring.xml, which would ONLY override some of the bindings in production-spring.xml.
This tests the integration and makes sure changes in production-spring.xml don't break things. These are more automated integration tests than unit tests, and they work extremely well.
You can override beans by listing multiple XML files. Beans defined in later files override those loaded before.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(
    locations = {"classpath:prodDb.xml",
                 "classpath:applicationContext.xml",
                 "classpath:testDb.xml"})
public class SpringTest {

    @Autowired
    protected DataSource dataSource; // uses the DataSource from testDb.xml
}
So in this case testDb.xml overrides the DataSource configured in prodDb.xml. This applies even if you don't use the SpringJUnit4ClassRunner:
new ClassPathXmlApplicationContext(new String[]
{"classpath:prodDb.xml",
"classpath:testDb.xml"});
Using a tool like Constretto you can do the same with annotated beans:
@Service
public class FooService ...

@Service
@Environment("test")
public class FakeFooService ...
Now, if you run a test with the @Environment("test") annotation on the class, the FakeFooService will be used.
In the test environment, you can add your Spring context XML with the overridden beans before the other declarations. And there are also external properties you can change for each environment.