I am moving my application to Kotlin, and one of my files has an autowired map for the implementation of a strategy pattern. Spring fails to inject the beans when I convert this file to Kotlin.
I have already tried lateinit, @JvmField and others. I have been making changes and looking at the resulting decompiled Java to see if it's clear why there is an error. It looks like it's because the HashMap in the Java version does not show the type:
HashMap vs. HashMap<String, Object>
Java version before the change. This gathered all beans of type AudienceService and injected them into this map:
@Autowired
private Map<String, AudienceService> audienceServiceMap = new HashMap<>();
Kotlin version:
@Autowired
private lateinit var audienceServiceMap: HashMap<String, AudienceService>
Decompiled Java version of the above Kotlin code:
@Autowired
private HashMap audienceServiceMap;
Error from Spring:
org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'audienceContext': Unsatisfied dependency expressed through field 'audienceServiceMap'; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'java.util.HashMap<java.lang.String,
If you refer to audienceServiceMap as a Map<String, AudienceService> instead of HashMap<...>, Spring will have an easier time finding your bean and injecting it. Generally, it is a good idea to program to an interface and not an implementation.
Without seeing where you are declaring the audienceServiceMap bean, I'm only guessing, but I suspect Spring considers it a Map, not a (Java) HashMap, because you do something like this:
@Bean
fun audienceServiceMap() = mapOf(...)
By doing that (or something like it), Spring sees audienceServiceMap as a Map, not the more specific HashMap.
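The behaviour behind the original Java version works the same way: Spring injects every bean of type AudienceService into a map keyed by bean name, provided the injection target is declared using the Map interface. A minimal Java sketch of that pattern (the two implementations and their method are made up for illustration):

import java.util.Map;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import org.springframework.stereotype.Service;

interface AudienceService {
    String describeAudience();
}

@Service("emailAudienceService")
class EmailAudienceService implements AudienceService {
    public String describeAudience() { return "email"; }
}

@Service("smsAudienceService")
class SmsAudienceService implements AudienceService {
    public String describeAudience() { return "sms"; }
}

@Component
class AudienceContext {
    // Spring fills this with every AudienceService bean, keyed by bean name,
    // because the field is declared as the Map interface rather than HashMap.
    @Autowired
    private Map<String, AudienceService> audienceServiceMap;
}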
I have implemented an interface for using MapStruct:
@Mapper(componentModel = "spring")
public interface MapStructMapper {
    MapStructMapper INSTANCE = Mappers.getMapper(MapStructMapper.class);
    MyApiModel myInternalClassToMyApiModel(MyDocument.MyInternalClass myInternalClass);
    MyDocument.MyInternalClass myApiModelToMyInternalClass(MyApiModel myApiModel);
}
When running the Gradle build I get the following exception while the tests are executed:
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'MapStructMapper' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true)}
In my test class I currently have only:
@Autowired
protected MapStructMapper mapper;
and in my build.gradle
implementation 'org.mapstruct:mapstruct:1.4.2.Final'
annotationProcessor 'org.mapstruct:mapstruct-processor:1.4.2.Final'
How can I solve this problem, and how do I invoke the mapping when MapStruct is used through an interface?
Based on the information you provided, it's hard to give a definitive answer.
Please check that you not only included the MapStruct dependency but also the annotation processor in your build, so that MapStructMapperImpl is actually generated.
If it is indeed generated, you must make sure that it is included in the application context of your test. If you use @SpringBootTest, you need to make sure that the interface is declared in a package that is scanned by the component scan. If you construct a dedicated context with @ContextConfiguration, you need to list MapStructMapperImpl.class in the classes parameter, like you would with other classes annotated with @Component.
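For instance, a minimal JUnit 5 test sketch along those lines (the test class and method names are made up; it assumes the generated MapStructMapperImpl is on the test classpath):

import static org.junit.jupiter.api.Assertions.assertNotNull;

import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit.jupiter.SpringExtension;

@ExtendWith(SpringExtension.class)
@ContextConfiguration(classes = MapStructMapperImpl.class)
class MapStructMapperTest {

    // The generated implementation is a Spring bean because of componentModel = "spring".
    @Autowired
    MapStructMapper mapper;

    @Test
    void mapperIsInjected() {
        assertNotNull(mapper);
    }
}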
I came across a tutorial which seemed to fit my use case and tried implementing it. I failed but wasn't sure why, so I tried to find another example with similar code and looked at the book "Spring in Action, Fourth Edition" by Craig Walls.
The book describes the same basic approach on page 300. Define a JdbcTemplate bean first:
@Bean
NamedParameterJdbcTemplate jdbcTemplate(DataSource dataSource) {
    return new NamedParameterJdbcTemplate(dataSource);
}
Then a repository implementing an interface:
@Repository
public class CustomRepositoryImpl implements CustomRepository {

    private final NamedParameterJdbcOperations jdbcOperations;
    private static final String TEST_STRING = "";

    @Autowired
    public CustomRepositoryImpl(NamedParameterJdbcOperations jdbcOperations) {
        this.jdbcOperations = jdbcOperations;
    }
}
So I did as the example in the book suggests and wrote a test, but got the error message:
Error creating bean with name 'de.myproject.config.SpringJPAPerformanceConfigTest': Unsatisfied dependency expressed through field 'abc'; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'de.myproject.CustomRepository' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true)}
To my understanding, as the book and the tutorial describe it, the repository should be recognized as a bean definition by the component scan.
To test this I created a context and asked for all registered beans:
AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext();
context.getBeanDefinitionNames();
As assumed, my repository wasn't among them. So, for test purposes only, I widened the scope of the scan in my project and set it to the base package. Every other bean was shown, except the repository.
As an alternative to component scanning and autowiring, the book describes the possibility of simply declaring the repository as a bean, which I did:
@Bean
public CustomRepository customRepository(NamedParameterJdbcOperations jdbcOperations) {
    return new CustomRepositoryImpl(jdbcOperations);
}
After that, Spring was able to wire the repository. I looked at the GitHub code for the book in the hope of a better understanding, but unfortunately only the bean solution, which runs, is implemented there.
So here are my questions:
1.) What possible reasons are there, in a scenario like this one, for a bean definition not to be recognized by the component scan?
2.) This project already uses Spring Data JPA repositories; are there any reasons not to use both approaches at the same time?
The problem is the naming of your classes. There are a few things to understand here.
You define a repository interface; @Repository is optional, provided it extends CrudRepository or one of the other repository interfaces provided by Spring Data. In this interface you can declare query methods (findBy...), and Spring Data will formulate the query based on the underlying database. You can also specify your own query using @Query.
Suppose you have a method which involves a complex query, or something Spring Data cannot do out of the box. In such a case you can use the underlying template class, for example JdbcTemplate or MongoTemplate.
The procedure for this is to create another interface and an Impl class. The interface must be named with a Custom suffix, the implementation class must be named with an Impl suffix, and all of them must be in the same package.
For example, if your repository is named AbcRepository, then your custom interface should be named AbcRepositoryCustom and the implementation should be named AbcRepositoryImpl. AbcRepository extends AbcRepositoryCustom (and also the other Spring Data repository interfaces), and AbcRepositoryImpl implements AbcRepositoryCustom.
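Put together, a rough sketch of that layout (the Abc domain class, its table, and the query are hypothetical, entity mapping annotations are omitted, and each type would live in its own file in the same package):

import java.util.List;
import java.util.Map;
import org.springframework.data.repository.CrudRepository;
import org.springframework.jdbc.core.namedparam.NamedParameterJdbcOperations;

// Hypothetical domain class used only for illustration.
class Abc {
    private final long id;
    Abc(long id) { this.id = id; }
    long getId() { return id; }
}

// The custom fragment: repository name plus the Custom suffix.
interface AbcRepositoryCustom {
    List<Abc> findByComplexCriteria(String criteria);
}

// The hand-written implementation: repository name plus the Impl suffix.
class AbcRepositoryImpl implements AbcRepositoryCustom {

    private final NamedParameterJdbcOperations jdbcOperations;

    AbcRepositoryImpl(NamedParameterJdbcOperations jdbcOperations) {
        this.jdbcOperations = jdbcOperations;
    }

    @Override
    public List<Abc> findByComplexCriteria(String criteria) {
        // Hand-written query using the underlying template.
        return jdbcOperations.query(
                "SELECT id FROM abc WHERE criteria = :criteria",
                Map.of("criteria", criteria),
                (rs, rowNum) -> new Abc(rs.getLong("id")));
    }
}

// Spring Data picks up AbcRepositoryImpl automatically because its name matches
// the repository interface name plus the Impl suffix.
interface AbcRepository extends CrudRepository<Abc, Long>, AbcRepositoryCustom {
    // Derived query methods (findBy...) would be declared here.
}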
I was able to "solve" the problem myself.
We also have a front-end class annotated with the same base package for the @ComponentScan:
@EnableWebMvc
@Configuration
@ComponentScan(basePackages = {"de.myproject.*"})
So there were actually two identical @ComponentScan annotations which I wasn't aware of, and this led to a conflict. It seems the order in which the whole application had to be loaded changed, but that's only me guessing.
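In other words, the clash was roughly this (class names are made up), with two configuration classes scanning the same base package:

@EnableWebMvc
@Configuration
@ComponentScan(basePackages = {"de.myproject.*"})
public class FrontendConfig {
}

@Configuration
@ComponentScan(basePackages = {"de.myproject.*"})
public class PersistenceConfig {
    // repository-related beans were declared here
}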
I simply moved my repository and its Impl to a subpackage and changed the second scan to
@ComponentScan(basePackages = {"de.myproject.subpackage.*"})
and now everything works fine, though the exact reason behind this behavior escapes me.
I am inspecting some auto-configuration classes from Spring Boot.
In LiquibaseAutoConfiguration I noticed that LiquibaseProperties is autowired and at the same time created using the new operator:
@Autowired
private LiquibaseProperties properties = new LiquibaseProperties();

@Autowired
private ResourceLoader resourceLoader = new DefaultResourceLoader();
This does not apply to all configuration classes; I also noticed it in JooqAutoConfiguration. Why is the new operator used here?
It's only really of any use with @Autowired(required = false). In that case the instance created by new would be used as a default value if no instance was available for injection.
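As an illustration of that optional-injection pattern, a small sketch (the surrounding class is made up; the field mirrors the snippet above):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.io.DefaultResourceLoader;
import org.springframework.core.io.ResourceLoader;
import org.springframework.stereotype.Component;

@Component
public class ResourceReadingComponent {

    // If no ResourceLoader bean is available, the DefaultResourceLoader created here
    // is kept as the fallback; if one is available, it replaces this default.
    @Autowired(required = false)
    private ResourceLoader resourceLoader = new DefaultResourceLoader();
}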
In the example you've shown, an injected instance is always required, so the instance created by new will either be replaced with the injected instance or a failure will occur if there was no instance to inject. In short, it's redundant and the code could have been written like this:
@Autowired
private LiquibaseProperties properties;

@Autowired
private ResourceLoader resourceLoader;
Spring Boot 1.4 has fixed this by moving to constructor injection. Support for constructor injection in configuration classes was introduced in Spring Framework 4.3. The code in question now declares the fields as final and assigns their values in the constructor.
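Roughly, that constructor-injection style looks like this (a sketch, not the actual Boot source; the class name is made up):

import org.springframework.boot.autoconfigure.liquibase.LiquibaseProperties;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ResourceLoader;

@Configuration
public class ExampleLiquibaseConfiguration {

    // Final fields, assigned once in the constructor; Spring supplies both
    // arguments from the application context.
    private final LiquibaseProperties properties;
    private final ResourceLoader resourceLoader;

    public ExampleLiquibaseConfiguration(LiquibaseProperties properties,
                                         ResourceLoader resourceLoader) {
        this.properties = properties;
        this.resourceLoader = resourceLoader;
    }
}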
If you think about the construction of the object, there is a period of time during which that field would otherwise be null if the new operator were not used. I tried to find a diagram that shows the lifecycle of a managed bean but only found the documentation around the callbacks (Spring Reference).
Spring will normally instantiate the bean with the default constructor and do the "normal" things to the object (initialize its fields). So in this case the fields get assigned new instances of the classes declared for those fields. Then Spring comes along and autowires the fields with the matching instances managed within the ApplicationContext.
It does look odd, but it may be due to some initialization within the class where default properties and resource loader objects have to exist before Spring can do its autowiring.
I am trying to do everything as described here:
https://spring.io/guides/gs/accessing-data-jpa/
But I get an error when I try to repeat this line:
ConfigurableApplicationContext context = SpringApplication.run(Application.class);
In my app it looks like this:
@Autowired
private ConfigurableApplicationContext appContext;
The error I get when calling the getBean function:
org.springframework.beans.factory.NoSuchBeanDefinitionException: No unique bean of type [ru.tcsbank.target.core.service.TestRepository] is defined: expected single bean but found 0:
What is the problem with this example?
There are a number of issues that could be causing this. Code from the TestRepository object would be helpful.
From what you've provided, my best guess is that some object, likely the TestRepository, is being wired in, but is missing the annotation to tell Spring it's a bean. Check that the proper objects have @Entity, @Service, @Component, and @Repository.
You have to add a @Repository annotation to your TestRepository.
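For instance, a sketch of what that could look like if TestRepository is a Spring Data repository (the TestEntity type and its Long id are assumptions):

import javax.persistence.Entity; // jakarta.persistence on newer Spring Boot versions
import javax.persistence.Id;
import org.springframework.data.repository.CrudRepository;
import org.springframework.stereotype.Repository;

// Hypothetical entity, included only to make the example complete.
@Entity
class TestEntity {
    @Id
    private Long id;
}

@Repository
public interface TestRepository extends CrudRepository<TestEntity, Long> {
    // Derived query methods (findBy...) can be declared here.
}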
I have a little trouble in Spring with two components of a service.
I have this component:
@Component
public class SmartCardWrapper
and this one:
@Component
public class DummySmartCardWrapper extends SmartCardWrapper
The service autowires both, but Spring fails due to this exception:
org.springframework.beans.factory.NoSuchBeanDefinitionException: No unique bean of type [com.cinebot.smartcard.SmartCardWrapper] is defined: expected single matching bean but found 2: [dummySmartCardWrapper, smartCardWrapper]
Why doesn't it use the class names?
That's one of the most basic concepts of Spring - Inversion of Control.
You don't need to declare your dependencies using their implementation types (to avoid coupling with implementation). You can declare them using interfaces or superclasses instead, and make Spring find the proper implementation class in the context.
In other words, beans are not distinguished by their implementation classes, because you may want to change the implementation class of a bean without changing the beans that depend on it. If you want to distinguish between different beans of the same type, use logical bean names instead:
@Autowired @Qualifier("smartCardWrapper")
private SmartCardWrapper smartCardWrapper;

@Autowired @Qualifier("dummySmartCardWrapper")
private SmartCardWrapper dummySmartCardWrapper;