Surefire class loaders - maven

In most class loaders,
ClassLoader classLoader = Thread.currentThread().getContextClassLoader();
classLoader.getResources("");
will return an enumeration that includes the directories containing the class files as well as any jars on the class path. However, when this code executes within a Maven Surefire run, the only items returned are the classes and test-classes directories. It seems to make no difference whether I use the useSystemClassLoader or useManifestOnlyJar properties to adjust the class loader. In addition, the class loader (as seen via an attached debugger) appears to have a number of jars attached.
To make things stranger, the debugger shows that the Surefire class loader is an instance of the same type as when not running inside Surefire.
Does anyone have any pointers for diagnosing the differences or ideas as to how to fix the problem?
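Not an answer, but a diagnostic sketch (the class name is mine) that can be run both inside and outside Surefire to diff what the loader reports:

import java.net.URL;
import java.net.URLClassLoader;
import java.util.Enumeration;

public class ClassPathDump {
    public static void main(String[] args) throws Exception {
        ClassLoader cl = Thread.currentThread().getContextClassLoader();

        // What getResources("") reports -- often only directory roots,
        // since jar files generally have no "" (root) entry to match.
        Enumeration<URL> roots = cl.getResources("");
        while (roots.hasMoreElements()) {
            System.out.println("resource root: " + roots.nextElement());
        }

        // If the loader is URL-based, dump its full class path, jars included,
        // to compare against the list above.
        if (cl instanceof URLClassLoader) {
            for (URL url : ((URLClassLoader) cl).getURLs()) {
                System.out.println("loader url: " + url);
            }
        }

        // Shows what was actually put on the launched JVM's command line.
        System.out.println("java.class.path = " + System.getProperty("java.class.path"));
    }
}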

Related

Unit test in multi-module Spring Boot project

I have a multi-module project with a structure similar to the one below:
server (includes the application context configuration, among other configuration)
shared (utility classes used by other modules)
service (module with various repositories and services)
transaction (module which handles transactions)
I need to write tests for the project but I cannot change the project structure. I created a test in my transaction module.
First I got
Unable to find a @SpringBootConfiguration, you need to use @ContextConfiguration or @SpringBootTest(classes=...) with your test
I solved it by creating a @Configuration class in the test folder, like so:
@Configuration
@ComponentScan("com.mohen")
public class TestConfig {
}
And then I used it in @SpringBootTest(classes = TestConfig.class). I was able to autowire, and the IDE did not show any sign of error. But when I run my tests I get a NoSuchBeanDefinitionException from a different class that is trying to autowire a dependency from the service module.
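For reference, the test header presumably ended up looking something like this (the test class name and the JUnit 4 runner are my assumptions):

import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

// Test class name is hypothetical; TestConfig is the @Configuration class above.
@RunWith(SpringRunner.class)
@SpringBootTest(classes = TestConfig.class)
public class TransactionServiceTest {
    // @Autowired fields here are resolved against TestConfig's component scan
}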
How do I solve these issues?
The main configuration file of the application looks like:
@SpringBootApplication(scanBasePackages = "com.mohen")
@EnableScheduling
@EnableAsync
@Import(value = {SSIpFilter.class, MainConfig.class})
public class Application extends SpringBootServletInitializer {...}
MainConfig.class contains @ComponentScan and @Import annotations.
If I try to import MainConfig.class in my test I get a suggestion to add a dependency on the server module, which I would rather not do.
Also, the entire application uses a single property file (yml). Where should I keep my property file for the tests?
EDIT
I managed to run the tests, a @DataJpaTest and an integration test, but they load the entire application context.
Now the problem is that the tests that normally pass fail when I build my project with ./gradlew clean build.
I get
java.lang.NoClassDefFoundError
in some classes and
Caused by: javassist.NotFoundException
in others.
I have tried adding the javassist library but it doesn't work.
Any ideas?
I found the solution to my question. Because the project is multi-module, the classes and packages were not being recognized by the other modules.
I made a few changes in the build.gradle files of the modules.
testRuntime project(':shared')
I added the above to the dependencies block and also added
jar {
    enabled = true
}
bootRepackage {
    enabled = false
}
The jar block enables building a plain, non-executable jar, while bootRepackage { enabled = false } disables the Spring Boot plugin's default behavior of repackaging the jar into an executable one.
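Putting it together, a sketch of how those pieces plausibly land (module names are from the question; which build.gradle gets which block is my reading of the answer): the jar/bootRepackage blocks belong in the library modules such as shared, so they produce plain jars, while the testRuntime line goes in the module whose tests need those classes, e.g. transaction:

// shared/build.gradle -- publish a plain, non-executable jar
jar {
    enabled = true
}
bootRepackage {
    enabled = false
}

// transaction/build.gradle -- pull the shared classes onto the test class path
dependencies {
    testRuntime project(':shared')
}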

Bean scope is generating java.lang.UnsatisfiedLinkError

I have a multi-module Maven project with one common module that is used by the other modules. In the common module I have a few beans (with singleton scope) that are used by the other modules' application contexts. The problem appeared after using those beans in one module (m1) that was not previously using them. Importing common.xml (where the beans are defined) into the application context of m1 generated a lot of issues (the beans cannot be found), so I decided to define those beans directly in the application context of m1. If I keep the scope of the beans as singleton, I get java.lang.UnsatisfiedLinkError: no jzmq in java.library.path. The issue is solved by using prototype scope.
Any ideas about this issue?
1. Using native methods makes your Java application code platform dependent.
2. The System.loadLibrary method is equivalent to calling Runtime.getRuntime().loadLibrary.
3. System.loadLibrary should be called from a static initializer block, so the library is loaded only once, when the JVM loads the class for the first time. A minimal sketch of this follows.
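Assuming the native library is the jzmq one from the error message:

public class JzmqLoader {
    static {
        // Executed exactly once, when the JVM loads this class.
        // A repeated loadLibrary call from the same class loader is a no-op,
        // but loading the same native library from a second class loader
        // fails with UnsatisfiedLinkError.
        System.loadLibrary("jzmq");
    }
}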

Running TestNG with Spring context in IntelliJ IDEA

When I run tests in my Maven module from the command line, I see that the Spring context is available to all my tests, even when they don't extend AbstractTestNGSpringContextTests and are not annotated with @ContextConfiguration.
However, when I run all the tests in the test dir from IDEA, some of the tests fail with NPEs because @Autowired fields are not initialized. Most confusing is that, as I said, some tests pass and others don't, even though none of them extend AbstractTestNGSpringContextTests or are annotated with @ContextConfiguration, but all require Spring-injected fields in some of the classes. When I run tests separately in IDEA, they always fail with NPEs because there is no Spring injection. I'm new to TestNG and can't understand how the suites are created and run with the Spring context.
By the way, we tried it on an Ubuntu machine, and the behaviour is not the same: separate tests failed, but running the package succeeded without injection-related NPEs.
Has anyone encountered anything similar?
The problem seems to revolve around IntelliJ/TestNG not processing 'groups' correctly. If a class "A" declares an autowired object, the @BeforeGroups method in that class cannot use that object. But if a different class "B" extends class "A" and has a @Test that uses the autowired object, then "B" can use the object (which is declared in "A", where it doesn't work!). The problem does not occur in Eclipse or with a Maven goal where Surefire controls TestNG.
The workaround for IntelliJ is to run an entire suite instead of just a method or class. If we create a new suite file and strip out all the other classes, then we can run just one class. (A baseline sketch of explicit Spring/TestNG wiring follows the links below.)
See these issues in YouTrack
https://youtrack.jetbrains.com/issue/IDEA-135384
https://youtrack.jetbrains.com/issue/IDEA-125775
https://youtrack.jetbrains.com/issue/IDEA-110703
https://youtrack.jetbrains.com/issue/IDEA-111084
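For a baseline, the documented way to get Spring injection in a TestNG test is to extend the Spring base class and declare the context explicitly; a minimal sketch (AppConfig and MyService are placeholders):

import static org.testng.Assert.assertNotNull;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.testng.AbstractTestNGSpringContextTests;
import org.testng.annotations.Test;

// AppConfig and MyService are hypothetical placeholders.
@ContextConfiguration(classes = AppConfig.class)
public class MyServiceTest extends AbstractTestNGSpringContextTests {

    @Autowired
    private MyService myService;

    @Test
    public void contextInjectsService() {
        // Fails with an assertion (not an NPE) if injection did not happen.
        assertNotNull(myService);
    }
}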

Does ComponentScan order matter?

I'm setting up a very small Spring/REST/JPA project with Boot, using annotations.
I'm getting some "bean not found" errors in my REST controller class (which has an autowired repository field) when I move my JPA repository class out to a different package and point a component scan at that package. Everything was working fine when all my files (5 total) were in the same package.
So I was wondering, however unlikely, whether component scan order matters? For example, if a class autowires beans from a package that has not been component-scanned yet, will that cause a "bean not found" error?
No. Spring loads all configuration information, from files, annotations, and the environment where appropriate. It then creates beans (instances of classes) according to a dependency tree that it calculates in memory. To do this it has to have a good idea of the entire configuration at startup. The whole model derived from all the aggregated configuration information is called the application context.
In modern versions of Spring the application context is flexible at runtime, so it's not quite the case that all the configuration is necessarily known up front, but the configuration that is flexible is limited in scope and must be planned for carefully.
Maybe you need to share some code. When you move that stuff, you also need to tell Spring where it went. My guess would be that you haven't defined @EntityScan and @EnableJpaRepositories (which default to the location of @EnableAutoConfiguration).
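A sketch of making those locations explicit (package names are placeholders):

import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.domain.EntityScan;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;

// Placeholder packages -- point these at wherever the entities
// and repositories actually moved.
@SpringBootApplication
@EntityScan("com.example.domain")
@EnableJpaRepositories("com.example.repositories")
public class Application {
}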
There could be several problems:
You moved your class out of the package named in a no-argument @ComponentScan. That basically means components are scanned only in that package and its children, so the moved class is not scanned and there is no bean to wire.
A wrong package name in the @ComponentScan arguments.
The order doesn't matter at all. There is an @Order annotation, but its purpose is more about loading multiple implementations of something in a particular order.
First, bean definitions are created, and they have nothing to do with wiring. Then, via bean post processors, autowired beans are injected. If there was no bean definition, there is nothing to inject.
In a well structured program it doesn't, because each bean first gets instantiated, then autowired, and only then do you actually use it.
However, there can be situations where the order does matter, and I had trouble figuring out what was going on. This is an example where it would matter:
You have some repository that you want to fill with data initially; call it the SetupData component.
You use @PostConstruct to save the default objects.
The repository depends on some component that isn't managed by Spring, for example a JPA @Converter.
And that @Converter depends on some other component that you inject statically.
In this case the @PostConstruct methods can execute before the dependencies of your @Converter get injected, which results in an exception; the sketch below compresses this.
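A compressed sketch of that failure mode (every name below is made up, and each class would normally live in its own file):

import javax.annotation.PostConstruct;
import javax.persistence.AttributeConverter;
import javax.persistence.Converter;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

// Seeds the repository as soon as this bean is constructed and autowired.
@Component
class SetupData {
    @Autowired
    private PriceRepository repository; // hypothetical Spring Data repository

    @PostConstruct
    void seed() {
        // Persisting triggers PriceConverter below -- which may not have
        // received its static dependency yet, depending on bean creation order.
        repository.save(new Price("10.00 EUR"));
    }
}

// A JPA converter: instantiated by the JPA provider, not managed by Spring.
@Converter(autoApply = true)
class PriceConverter implements AttributeConverter<Price, String> {
    static CurrencyService currencyService; // filled in by ConverterInjector

    @Override
    public String convertToDatabaseColumn(Price price) {
        return currencyService.format(price); // NPE if injection hasn't run yet
    }

    @Override
    public Price convertToEntityAttribute(String column) {
        return currencyService.parse(column);
    }
}

// Bridges the Spring context into the non-managed converter.
@Component
class ConverterInjector {
    @Autowired
    void inject(CurrencyService currencyService) {
        PriceConverter.currencyService = currencyService;
    }
}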
Relying on component scan order is a bad habit, because it's not intuitive, especially when you are working with multiple people who may not know about it, and there may be dependencies such that you can't fix the code by changing the scan order.
The best solution in this case was using a task executor service that takes care of running the initialization functions.

Annotation-specified bean name conflicts with existing, non-compatible bean def

I'm having a problem with some Spring bean definitions. I have a couple of context XML files that are loaded by my main() method, and both of them contain almost exclusively a context:component-scan tag. When my main method starts up, I get this error from Spring:
Caused by: org.springframework.context.annotation.ConflictingBeanDefinitionException: Annotation-specified bean name 'converterDAO' for bean class [my.package.InMemoryConverterDaoImpl] conflicts with existing, non-compatible bean definition of same name and class [my.other.package.StaticConverterDAOImpl]
Both DAO classes are annotated this way:
@Repository("converterDAO")
public class StaticConverterDAOImpl implements ConverterDAO {
...
}
The in-memory dao also has the @Repository("converterDAO") annotation. The dao is referenced in other classes like this:
...
private @Autowired @Qualifier("converterDAO") ConverterDAO converterDAO;
...
I want one DAO to override the definition of the other, which as I always understood it was one of the principal reasons to use a DI framework in the first place. I've been doing this with XML definitions for years and never had any problems, but not so with component scans and annotated bean definitions. And what does Spring mean when it says they are not "compatible"? They implement the same interface, and they are autowired into fields of that interface type. Why the heck are they not compatible?
Can someone provide me with a way for one annotated, component-scanned bean to override another?
I had a similar issue with Spring 4.x using @RestController. Two different packages had a class with the same name...
package com.x.catalog;
@RestController
public class TextureController {
...
package com.x.cms;
@RestController
public class TextureController {
...
The fix was easy...
package com.x.catalog;
@RestController("CatalogTextureController")
public class TextureController {
...
package com.x.cms;
@RestController("CMSTextureController")
public class TextureController {
...
The problem seems to be that the bean name is derived from the class name by default, so the two controllers collide. Giving each bean an explicit name in the @RestController annotation allows you to keep the class names.
In an XML file, there is a sequence of declarations, and you may override a previous definition with a newer one. When you use annotations, there is no notion of before or after: all the beans are at the same level. You defined two beans with the same name, and Spring doesn't know which one it should choose.
Give them different names (staticConverterDAO and inMemoryConverterDAO, for example), create an alias in the Spring XML file (theConverterDAO, for example), and use this alias when injecting the converter:
@Autowired @Qualifier("theConverterDAO")
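A minimal sketch of the alias part, assuming the two beans were renamed as suggested:

<!-- In the Spring XML context file: both beans keep their own names;
     the alias decides which one "theConverterDAO" resolves to. -->
<alias name="staticConverterDAO" alias="theConverterDAO"/>

Switching implementations then means editing only the alias, while every @Qualifier("theConverterDAO") injection point stays unchanged.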
I had a similar problem with two jar libraries (app1 and app2) in one project. The bean "BeanName" is defined in app1; in app2 the class is extended and the bean redefined with the same name.
In app1:
package com.foo.app1.pkg1;
@Component("BeanName")
public class Class1 { ... }
In app2:
package com.foo.app2.pkg2;
@Component("BeanName")
public class Class2 extends Class1 { ... }
This causes a ConflictingBeanDefinitionException when the applicationContext loads, due to the identical component bean names.
To solve this problem, in the Spring configuration file applicationContext.xml:
<context:component-scan base-package="com.foo.app2.pkg2"/>
<context:component-scan base-package="com.foo.app1.pkg1">
    <context:exclude-filter type="assignable" expression="com.foo.app1.pkg1.Class1"/>
</context:component-scan>
This excludes Class1 from component scanning, so no bean is created for it and the name conflict is avoided.
I had a similar problem, and it was because one of my beans had recently been moved to another directory. I needed to do a "build clean" by deleting the build/classes/java directory, and the problem went away. (The error message showed two different file paths conflicting with each other, although I knew one of them should no longer exist.)
Sometimes the problem occurs if you have moved your classes around and the build still refers to old class files, even though they no longer exist.
In this case, just do this :
mvn eclipse:clean
mvn eclipse:eclipse
This worked well for me.
I had the same issue. I solved it with the following steps (editor: IntelliJ):
1. View -> Tool Windows -> Maven Projects. This opens your projects in a tool window.
2. Click the arrow next to your project.
3. Click Lifecycle.
4. Click clean.
I also had a similar problem. I built the project again and the issue was resolved.
The reason is that previously compiled class files still carry the old annotation-specified bean names. When you change a bean name and try to run the application, Spring finds both the old and the new definition and cannot identify which one to pick; that is why it shows this error.
In my case, I removed the previous bean class from the project and gave the same bean name to a new bean class. Spring still had the definition for the removed bean class in a stale class file, and that conflicted with the newly added class. If you do a 'build clean', the stale class files are removed and the build succeeds.
If none of the other answers fix your problem, it started occurring after a direct or indirect configuration change (via git pull / merge / rebase), and your project is a Maven project, try:
mvn clean
Explanation of the internal workings behind this error:
You are getting this error because the container derives the same default bean name for both classes, since the simple class name is the same regardless of the package; that is why the error says "non-compatible bean definition of same name".
How it works internally:
package test1;

@RestController
class Test {}

package test2;

@RestController
class Test {}
First the container sees class Test, and @RestController tells it to instantiate it, effectively test = new Test(); it won't register the same name twice.
When it derives the same reference name, test (taken from the class name), for the second class, it reports a non-compatible bean definition of the same name.
Solution:
Give an explicit reference name to each REST controller so that the container doesn't fall back on the default name and registers the two classes separately despite their identical names.
For example:
package test1;

@RestController("test1")
class Test {}

package test2;

@RestController("test2")
class Test {}
Note: the same works with @Controller, @Service, @Repository, etc.
Note: if you declare a field of this type, you can also annotate it with @Qualifier("specific reference name"), for example:
@Autowired @Qualifier("test1")
Test test;
I had the same issue in IntelliJ after moving an existing file to a new package; I got the error when trying to run with Maven. I managed to solve it by cleaning the cache:
cache:clean
Using Eclipse, I had moved classes into new packages and was getting this error. What worked for me was doing:
Project > Clean
and also cleaning my Tomcat server by right-clicking on it and selecting Clean.
Scenario:
I am working on a multi-module Gradle project.
The modules are:
- core,
- service,
- geo,
- report,
- util and
- some other modules.
We had originally prepared a component [locationRecommendHttpClientBuilder] in the geo module.
Java Code:
import org.springframework.stereotype.Component

@Component("locationRecommendHttpClientBuilder")
class LocationRecommendHttpClientBuilder extends PanaromaHttpClientBuilder {
    @Override
    PanaromaHttpClient buildFromConfiguration() {
        this.setURL(PanaromaConf.getInstance().getString("locationrecommend.url"))
        this.setMethod(PanaromaConf.getInstance().getString("locationrecommend.method"))
        this.setProxyHost(PanaromaConf.getInstance().getString("locationrecommend.proxy.host"))
        this.setProxyPort(PanaromaConf.getInstance().getInt("locationrecommend.proxy.port", 0))
        return super.build()
    }
}
application-context.xml:
<bean id="locationRecommendHttpClient"
      class="au.co.google.panaroma.platform.logic.impl.PanaromaHttpClient"
      scope="singleton" factory-bean="locationRecommendHttpClientBuilder"
      factory-method="buildFromConfiguration" />
Then it was decided to add this component to the core module.
One engineer had old code for the geo module; he took the latest core module but forgot to take the latest geo module.
So the component [locationRecommendHttpClientBuilder] existed twice in his project, and he got the following error:
Caused by: org.springframework.context.annotation.ConflictingBeanDefinitionException: Annotation-specified bean name 'LocationRecommendHttpClientBuilder' for bean class [au.co.google.app.locationrecommendation.builder.LocationRecommendHttpClientBuilder] conflicts with existing, non-compatible bean definition of same name and class [au.co.google.panaroma.platform.logic.impl.locationRecommendHttpClientBuilder]
Solution procedure:
After removing the component from the geo module, [locationRecommendHttpClientBuilder] is available only in the core module, so there is no conflicting situation. The issue was solved this way.
I faced this issue when I imported two projects into the workspace. Somehow a stale jar had been created, so deleting the jars and the class files and building the project again got the dependencies right.
In my case, the issue was with pom.xml.
My application's pom.xml had dependencies on two different artifacts that contained classes with the same name.
Check your pom.xml, or the annotations that could be the injection point for the same class.
If you build the server as a jar file with mvn clean install and then change branches with git, you have to run mvn clean again; otherwise it throws an exception like the one in this question.
Key command: mvn clean
Refreshing the Gradle project in Eclipse solved this problem for me.
