Annotation-specified bean name conflicts with existing, non-compatible bean def - spring

I'm having a problem with some Spring bean definitions. I have a couple of context XML files that are loaded by my main() method, and both of them contain almost exclusively a <context:component-scan> tag. When my main method starts up, I get this error from Spring:
Caused by: org.springframework.context.annotation.ConflictingBeanDefinitionException: Annotation-specified bean name 'converterDAO' for bean class [my.package.InMemoryConverterDaoImpl] conflicts with existing, non-compatible bean definition of same name and class [my.other.package.StaticConverterDAOImpl]
Both DAO classes are annotated this way:
@Repository("converterDAO")
public class StaticConverterDAOImpl implements ConverterDAO {
...
}
The in-memory DAO also has the @Repository("converterDAO") annotation. The DAO is referenced in other classes like this:
...
private @Autowired @Qualifier("converterDAO") ConverterDAO converterDAO;
...
I want one DAO to override the definition of the other one, which as I always understood it was one of the principal reasons to use a DI framework in the first place. I've been doing this with xml definitions for years and never had any problems. But not so with component scans and annotated bean definitions? And what does Spring mean when it says they are not "compatible"? They implement the same interface, and they are autowired into fields that are of that interface type. Why the heck are they not compatible?
Can someone provide me with a way for one annotated, component-scanned bean to override another?

I had a similar issue with Spring 4.x using @RestController. Two different packages had a class with the same name...
package com.x.catalog;
@RestController
public class TextureController {
...
package com.x.cms;
@RestController
public class TextureController {
...
The fix was easy...
package com.x.catalog;
@RestController("CatalogTextureController")
public class TextureController {
...
package com.x.cms;
@RestController("CMSTextureController")
public class TextureController {
...
The problem seems to be that the bean name is derived from the class name by default, so two classes with the same simple name collide. Giving each one an explicit name in the @RestController annotation allows you to keep the class names.

In an XML file, there is a sequence of declarations, and you may override a previous definition with a newer one. When you use annotations, there is no notion of before or after. All the beans are at the same level. You defined two beans with the same name, and Spring doesn't know which one it should choose.
Give them different names (staticConverterDAO and inMemoryConverterDAO, for example), create an alias in the Spring XML file (theConverterDAO, for example), and use this alias when injecting the converter:
@Autowired @Qualifier("theConverterDAO")
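For completeness, a minimal sketch of the alias itself (the bean and alias names are only illustrative; point the alias at whichever implementation should win):
<alias name="inMemoryConverterDAO" alias="theConverterDAO"/>
<!-- or point it at staticConverterDAO instead; the injection point above stays unchanged -->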

I had a similar problem, with two jar libraries (app1 and app2) in one project. The bean "BeanName" is defined in app1; in app2 the class is extended and the bean is redefined with the same name.
In app1:
package com.foo.app1.pkg1;
#Component("BeanName")
public class Class1 { ... }
In app2:
package com.foo.app2.pkg2;
#Component("BeanName")
public class Class2 extends Class1 { ... }
This causes a ConflictingBeanDefinitionException while loading the applicationContext, because both components register under the same bean name.
To solve this problem, in the Spring configuration file applicationContext.xml:
<context:component-scan base-package="com.foo.app2.pkg2"/>
<context:component-scan base-package="com.foo.app1.pkg1">
<context:exclude-filter type="assignable" expression="com.foo.app1.pkg1.Class1"/>
</context:component-scan>
So Class1 is excluded from component scanning and no bean is created for it, which avoids the name conflict.

I had a similar problem, and it was because one of my beans had been moved to another directory recently. I needed to do a "build clean" by deleting the build/classes/java directory and the problem went away. (The error message had the two different file paths conflicting with each other, although I knew one should not actually exist anymore.)

Sometimes the problem occurs if you have moved your classes around and the build still refers to the old classes, even though they no longer exist.
In this case, just do this:
mvn eclipse:clean
mvn eclipse:eclipse
This worked well for me.

I had the same issue. I solved it with the following steps (IDE: IntelliJ):
View -> Tool Windows -> Maven Project. This opens your projects in a sub-window.
Click on the arrow next to your project.
Click on Lifecycle.
Click on clean.

I also had a similar problem. I built the project again and the issue was resolved.
The reason is that the annotation-specified bean names are already recorded in the previously compiled output. When you change a bean name and run the application, Spring sees both the old and the new definition and cannot tell which one to pick, so it reports this error.
In my case, I removed the previous bean class from the project and gave the same bean name to a new bean class. Spring still had the definition for the removed class in the old build output, and that conflicted with the newly added class. If you do a 'build clean', the stale definitions for the removed bean classes are deleted and the build will succeed.

If none of the other answers fix your problem, it started occurring after you changed some configuration directly or indirectly (via git pull / merge / rebase), and your project is a Maven project, run:
mvn clean

Explanation of what happens internally with this error
You are getting this error because the container tries to register both classes under the same bean name: the default name is derived from the class name, regardless of the package. That is why the error says "non-compatible bean definition of same name".
Roughly, this is how it works internally:
package test1;
...
@RestController
class Test {}

package test2;
...
@RestController
class Test {}
First, the container finds class Test, and @RestController tells it to create a bean; by default it is registered under the name "test" (derived from the class name), and the container will not register that name twice.
When it then reaches the second Test class and tries to register another bean under the same name, it fails with "non-compatible bean definition of same name".
Solution:
Assign an explicit bean name to both REST controllers, so the container does not fall back to the default name and registers the two classes separately even though their class names are the same.
For example:
package test1;
...
@RestController("test1")
class Test {}

package test2;
...
@RestController("test2")
class Test {}
Note: the same works with @Controller, @Service, @Repository, etc.
Note: if you declare a reference variable of this type at class level, you can also annotate it with @Qualifier("specific bean name"), for example:
@Autowired @Qualifier("test1")
Test test;

I had the same issue in IntelliJ after moving an existing file to a new package; the error appeared when trying to run with Maven. I managed to solve it by cleaning the cache:
cache:clean

Using Eclipse, I had moved classes into new packages, and was getting this error. What worked for me was doing:
Project > Clean
and also cleaning my Tomcat server by right-clicking on it and selecting Clean

Scenario:
I am working on a multi-module Gradle project.
Modules are:
- core
- service
- geo
- report
- util
- some other modules
Originally, we created a component [locationRecommendHttpClientBuilder] in the geo module.
Java Code:
import org.springframework.stereotype.Component
#Component("locationRecommendHttpClientBuilder")
class LocationRecommendHttpClientBuilder extends PanaromaHttpClientBuilder {
#Override
PanaromaHttpClient buildFromConfiguration() {
this.setURL(PanaromaConf.getInstance().getString("locationrecommend.url"))
this.setMethod(PanaromaConf.getInstance().getString("locationrecommend.method"))
this.setProxyHost(PanaromaConf.getInstance().getString("locationrecommend.proxy.host"))
this.setProxyPort(PanaromaConf.getInstance().getInt("locationrecommend.proxy.port", 0))
return super.build()
}
}
application-context.xml
<bean id="locationRecommendHttpClient"
class="au.co.google.panaroma.platform.logic.impl.PanaromaHttpClient"
scope="singleton" factory-bean="locationRecommendHttpClientBuilder"
factory-method="buildFromConfiguration" />
Then it was decided to add this component to the core module.
One engineer still had the old code for the geo module: he took the latest core module but forgot to take the latest geo module.
So the component [locationRecommendHttpClientBuilder] existed twice in his project, and he got the following error.
Caused by:
org.springframework.context.annotation.ConflictingBeanDefinitionException:
Annotation-specified bean name 'LocationRecommendHttpClientBuilder'
for bean class
[au.co.google.app.locationrecommendation.builder.LocationRecommendHttpClientBuilder]
conflicts with existing, non-compatible bean definition of same name
and class
[au.co.google.panaroma.platform.logic.impl.locationRecommendHttpClientBuilder]
Solution:
After removing the component from the geo module, [locationRecommendHttpClientBuilder] exists only in the core module, so there is no conflict any more. That resolved the issue.

I faced this issue when I imported two projects into the workspace. Somehow a duplicate jar got created, so delete the jars and the class files and build the project again to get the dependencies right.

In my case, the issue was with pom.xml.
I had dependencies in my application's pom.xml on two different artifacts that contained the same class name.
Check your pom.xml, or your annotations, for anything that could pull in the same class twice.

If you build the server as a jar file with mvn clean install and then change branches with git, you have to run mvn clean again; otherwise it throws the exception described in this question.
Key command: mvn clean

Refreshing the Gradle project in Eclipse solved this problem for me.

Related

Spring not recognising some @Configuration and @Component classes

I have - or rather had - a working Spring application, running within IntelliJ. It contains several classes annotated with @Configuration, and several @Component beans. At some point, the following happened:
IntelliJ started showing errors in the code editor stating "Could not autowire. No bean of 'xxx' type found", even though such beans exist and are annotated with @Component.
Breakpoints in the constructors of specific @Component beans are not reached, but that is not true for all @Component beans.
When running in debug mode, breakpoints in certain @Configuration files are not reached, even though the debugger was stopping there before. The application fails if it is autowired with one of these @Component beans.
The application starts without errors, but obviously without several of the beans configured in the @Configuration classes being created.
The class containing the main method which runs the Spring Boot application is annotated with @SpringBootApplication. @Component classes which live in the same package as this class are recognised and can be autowired, even into classes in other packages.
I am not aware of anything in the code or project which would have changed.
Under File -> Project Settings -> Modules, under Spring Application Context, I have now selected all @Configuration files. However, this makes no difference.
I have also tried Build -> Rebuild Project.
The packages in which the classes reside have not changed. Has anyone seen anything like this before?
Thanks
If a few @Component classes are not getting recognised, it could be that those classes are not under the scanned package. You probably noticed that the classes in the same package as the main class annotated with @SpringBootApplication were recognised as @Component, because @SpringBootApplication implies an automatic @ComponentScan on that package.
Other classes defined in some other package are not recognised because there is no @ComponentScan covering their package.
You can do the following to get those classes recognised (add the other packages which are not directly under the hierarchy of the @SpringBootApplication class):
@ComponentScan({"com.example.springboot.anything", "com.example.springboot.somethingelse"})
@SpringBootApplication
public class AnySpringBootApplication { ... }
I am sure it will not be a common case, but for me the problem was that my class had a relatively generic name. Although it was located in the package mentioned in the @ComponentScan, on the same level as other classes that were all found and used, the ApplicationContext kept failing to load. After I renamed the class it worked; it turned out that two other classes in org.springframework had the same name.

Spring Boot Bean 'Required bean of type that could not be found'

I have been working on learning how to use Spring Data and I have created a very simple project to test it. The folder structure and applicationContext.xml are shown here: [screenshot: applicationContext.xml and folder structure]
The error I am getting is shown here: [screenshot: console error output]
I have the applicationContext.xml on my classpath and have a bean of that class declared. Any idea as to what my problem could be? Thank you.
EDIT:
I have updated my post to show the main class and the DAO class, as well as my pom.xml contents (at this point, I am wondering if I need to include another dependency . . . ):
[screenshot: main class]
[screenshot: dao (repository)]
I am missing the following line in your application log:
... o.s.b.f.xml.XmlBeanDefinitionReader : Loading XML bean definitions from class path resource [applicationContext.xml]
So I assume your applicationContext.xml file is not loaded at all.
Either add
@ImportResource("classpath:applicationContext.xml")
to your application class or add the
@Repository
annotation to your UserRepository class.
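A minimal sketch of the first option (the class name Application is only a placeholder for your Spring Boot application class):
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.ImportResource;

@SpringBootApplication
@ImportResource("classpath:applicationContext.xml") // load the XML bean definitions as well
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}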
In my opinion you should avoid mixing Java and XML Spring configuration if possible.

@Qualifier and @Resource don't work when running a test case under the Spring test framework

I have a test case which has a dependency on 'ticketDao', like below:
import javax.annotation.Resource;
import org.springframework.beans.factory.annotation.Qualifier;
public class LfnSaleCancellationIntegrationTest extends BaseIntegrationTest {
//@Resource(name = "baseTicketDao")
private BaseTicketDao ticketDao;
....
public void setTicketDao(@Qualifier("baseTicketDao") BaseTicketDao ticketDao) {
this.ticketDao = ticketDao;
}
}
and BaseIntegrationTest extends the Spring test framework's AbstractJpaTests; Spring is v3.0.5.
When I run this test case, I get a similar exception:
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException:
No unique bean of type [com.mpos.lottery.te.gamespec.sale.dao.BaseTicketDao]
is defined: expected single matching bean but found 2:
[baseTicketDao, extraballTicketDao]
My project has evolved over a long time; in fact, when I first encountered this exception, @Qualifier solved it. The project has changed a lot since then, but I really have no idea why @Qualifier and @Resource no longer work.
If I remove the 'ticketDao' dependency, the test case passes. I am wondering whether some change in the Spring configuration causes this exception, or ... I have googled a lot, but it seems no one else has faced this problem. Please give your comments, thanks very much!
You are using AbstractJpaTests, which is part of the old Spring test framework and an (indirect) subclass of AbstractDependencyInjectionSpringContextTests. By default the injection there is not annotation based; it discovers setters and fields and attempts injection by type. It would be recommended to switch to the newer annotation-based tests; refer to the Spring documentation for details.
As a workaround, try changing the autowire mode: call this.setAutowireMode(AutowireCapableBeanFactory.AUTOWIRE_BY_NAME) in the test constructor, rename your field to baseTicketDao, and remove the setter.
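A minimal sketch of that workaround, following the advice above (class names are taken from the question; not verified against Spring 3.0.5):
import org.springframework.beans.factory.config.AutowireCapableBeanFactory;

public class LfnSaleCancellationIntegrationTest extends BaseIntegrationTest {

    // renamed so it matches the bean name "baseTicketDao"
    private BaseTicketDao baseTicketDao;

    public LfnSaleCancellationIntegrationTest() {
        // switch the old test framework from by-type to by-name injection
        this.setAutowireMode(AutowireCapableBeanFactory.AUTOWIRE_BY_NAME);
    }
}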
I found the reason. In my new project there is a <context:component-scan> statement in the Spring configuration file, which registers 4 BeanPostProcessors by default:
AutowiredAnnotationBeanPostProcessor (@Autowired)
RequiredAnnotationBeanPostProcessor (@Required)
CommonAnnotationBeanPostProcessor (JSR-250 annotations: @Resource, @PostConstruct, etc., and @WebServiceRef)
PersistenceAnnotationBeanPostProcessor (@PersistenceUnit and @PersistenceContext)
In my old project, only the default BeanPostProcessor (internalAutoProxyCreator) had been registered. My understanding is that AutowiredAnnotationBeanPostProcessor always wires by type. Anyway, if I remove <context:component-scan>, my test case passes now.
In fact, I have migrated all my test cases to the Spring TestContext framework now, and <context:component-scan> must be declared; otherwise @Autowired, @Resource, etc. will be ignored and you will get a great many NullPointerExceptions from those automatically injected dependencies.
NOTE: <context:annotation-config/> will register those 4 BeanPostProcessors too.

conflicts with existing, non-compatible bean definition of same name and class after proguard obfuscation

After ProGuard obfuscation I get the following error:
Unexpected exception parsing XML document from ServletContext resource
[/WEB-INF/applicationContext.xml]; nested exception is
java.lang.IllegalStateException: Annotation-specified bean name 'a'
for bean class [com.company.project.b.a.a.a] conflicts with existing,
non-compatible bean definition of same name and class
[com.company.project.a.a]
I'm using annotation-based Spring configuration. How can I avoid having two obfuscated classes with the same name when using ProGuard, since Spring doesn't allow two beans to have the same name?
I'm not sure if this is what you want, but you can specify the bean name in the @Component value (and in the stereotypes @Repository, @Service and @Controller):
@Component("myBeanName")
public class MyBean {
}
I had the same problem and nothing else was helping. Sometimes the problem occurs if you have moved your classes around and the build still refers to the old classes, even though they no longer exist.
In this case, just do this:
mvn eclipse:clean
mvn eclipse:eclipse
This worked well for me.
Another cause: you may have different versions of Spring on your classpath, for example Spring 2.x alongside Spring 3.x. In that case, beans seem to be loaded twice. If you use Maven, check whether a module imports an old version of Spring (mvn dependency:tree) and remove it by excluding the involved Spring artifact (<exclusions>).
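A minimal sketch of such an exclusion (the dependency shown is a placeholder; exclude whatever old Spring artifact mvn dependency:tree actually reports):
<dependency>
    <groupId>com.example</groupId>              <!-- hypothetical module that pulls in old Spring -->
    <artifactId>some-legacy-module</artifactId>
    <version>1.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.springframework</groupId>
            <artifactId>spring</artifactId>      <!-- e.g. the old 2.x all-in-one artifact -->
        </exclusion>
    </exclusions>
</dependency>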

Getting Spring Error "Bean named 'x' must be of type [y], but was actually of type [$Proxy]" in Jenkins

I have been debugging this for a while now, and I'm hoping someone can shed some light here.
I have a Maven project that is added into Jenkins, using JDK 1.6. I'm using AOP in this project to handle database transactions.
When I run the build in Jenkins, my test case fails with the following exceptions:
Caused by: org.springframework.beans.factory.BeanCreationException:
Error creating bean with name 'dataHandlerClassificationImpl':
Injection of resource dependencies failed; nested exception is
org.springframework.beans.factory.BeanNotOfRequiredTypeException:
Bean named 'writerDataLocationImpl' must be of type [xxx.script.WriterData],
but was actually of type [$Proxy17]
...
...
Caused by: org.springframework.beans.factory.BeanNotOfRequiredTypeException:
Bean named 'writerDataLocationImpl' must be of type [xxx.script.WriterData],
but was actually of type [$Proxy17]
...
...
The DataHandlerClassificationImpl class looks something like this:
@Service
public class DataHandlerClassificationImpl extends DataHandler {
@Resource(name="writerDataLocationImpl")
private WriterData writerData;
...
}
WriterData is an interface with multiple implementations.
I am able to execute the code without problems from the IDE. To determine whether it is a Maven problem or a Jenkins problem, I navigated to the Jenkins project job folder using the command line, and I am able to run mvn test without any errors.
I know the proxy error has something to do with AOP, and that I can only autowire to an interface instead of a concrete class... but that's not the case here since I'm able to run my code fine outside Jenkins.
Any ideas? Thanks.
Excerpt from question comments above:
Are you running Cobertura, Sonar or other code-instrumenting tool on Jenkins? Note that mvn site might also be configured to include Cobertura report in generated site.
The problem with Cobertura is that it performs pretty heavy byte-code instrumentation including the addition of some custom interfaces. When Spring starts up it generates proxies for beans. If bean has at least one interface, it uses standard Java proxy. Otherwise it tries to create class-based proxy.
I guess in your case the CGLIB class proxy was used, but after Cobertura instrumentation Spring fell back to Java proxies. This caused the startup error, because the dependency injection expected the class (or a CGLIB subclass).
To cut a long story short, force CGLIB class proxies and you'll be fine:
<aop:config proxy-target-class="true"/>
Got the same problem using AspectJ.
There was a bean defined like this:
@Configuration
public class MyConfig {
    @Value("classpath:some.properties")
    private Resource theResource;

    @Bean
    public SomeResource getSomeResource() {
        return SomeResource.getOne(theResource);
    }
}

/******/

@Component
public class SomeResource {
    public SomeResource(Resource r) { ... }
    public static SomeResource getOne(Resource r) { return new SomeResource(r); }
}
This works fine until AOP/AspectJ is enabled. The injection validates that the SomeResource bean is of class SomeResource, but since it is a proxy, it crashes.
Solution: use a CGLIB proxy for that bean instead of the AspectJ proxies.
@EnableAspectJAutoProxy(proxyTargetClass=false)
public class SomeResource { ... }
It makes no sense, but now I got a clearer message:
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.springframework.cglib.core.ReflectUtils
(file:/path/spring-core/5.2.10.RELEASE/spring-core-5.2.10.RELEASE.jar) to method
java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain)
WARNING: Please consider reporting this to the maintainers of org.springframework.cglib.core.ReflectUtils
This means Java prevents reflective access to this method. Either Spring or Java needs to fix that.
