org.springframework.beans.factory.BeanDefinitionStoreException: Failed to read candidate component class - spring-boot

I've read similar posts but still cannot resolve this problem. I am using JDK 8 and Spring 5, so it's not a version issue.
org.springframework.beans.factory.BeanDefinitionStoreException: Failed to read candidate component class: file [G:\githome\product-aggregation\target\classes\com\roger\product\ipc\viewobject\product\MultiUnitVO.class]; nested exception is java.lang.ArrayIndexOutOfBoundsException: 90
at org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider.scanCandidateComponents(ClassPathScanningCandidateComponentProvider.java:454)
at org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider.findCandidateComponents(ClassPathScanningCandidateComponentProvider.java:316)
at org.springframework.context.annotation.ClassPathBeanDefinitionScanner.doScan(ClassPathBeanDefinitionScanner.java:275)
at org.springframework.context.annotation.ComponentScanAnnotationParser.parse(ComponentScanAnnotationParser.java:132)
at org.springframework.context.annotation.ConfigurationClassParser.doProcessConfigurationClass(ConfigurationClassParser.java:288)
Caused by: java.lang.ArrayIndexOutOfBoundsException: 90
at org.springframework.asm.ClassReader.readUTF(ClassReader.java:2646)
at org.springframework.asm.ClassReader.readUTF8(ClassReader.java:2618)
at org.springframework.asm.ClassReader.readMethod(ClassReader.java:1110)
at org.springframework.asm.ClassReader.accept(ClassReader.java:729)
at org.springframework.asm.ClassReader.accept(ClassReader.java:527)
at org.springframework.core.type.classreading.SimpleMetadataReader.<init>(SimpleMetadataReader.java:65)
at org.springframework.core.type.classreading.SimpleMetadataReaderFactory.getMetadataReader(SimpleMetadataReaderFactory.java:103)
at org.springframework.core.type.classreading.CachingMetadataReaderFactory.getMetadataReader(CachingMetadataReaderFactory.java:123)
at org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider.scanCandidateComponents(ClassPathScanningCandidateComponentProvider.java:430)
... 20 common frames omitted
Here is the situation: I developed a Maven plugin and included it in my project. The plugin generates fields for specific classes during the process-classes phase; to achieve this, I use ASM (version 5.0) to modify the compiled classes. The plugin's Maven dependencies look like this:
<dependencies>
<dependency>
<groupId>org.apache.maven</groupId>
<artifactId>maven-plugin-api</artifactId>
<version>2.0</version>
</dependency>
<dependency>
<groupId>com.roger</groupId>
<artifactId>permission</artifactId>
<version>1.0-SNAPSHOT</version>
</dependency>
<!-- https://mvnrepository.com/artifact/commons-io/commons-io -->
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
<version>2.6</version>
</dependency>
<dependency>
<groupId>javassist</groupId>
<artifactId>javassist</artifactId>
<version>3.12.1.GA</version>
</dependency>
<dependency>
<groupId>org.apache.maven.plugin-tools</groupId>
<artifactId>maven-plugin-annotations</artifactId>
<version>3.2</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.codehaus.plexus</groupId>
<artifactId>plexus-utils</artifactId>
<version>3.0.8</version>
</dependency>
<dependency>
<groupId>org.ow2.asm</groupId>
<artifactId>asm</artifactId>
<version>5.0</version>
</dependency>
<dependency>
<groupId>org.ow2.asm</groupId>
<artifactId>asm-tree</artifactId>
<version>5.0</version>
</dependency>
</dependencies>
Then I included the plugin in my project's build:
<plugin>
<groupId>com.roger</groupId>
<artifactId>authorization-maven-plugin</artifactId>
<version>1.0-SNAPSHOT</version>
<executions>
<execution>
<id>execution1</id>
<phase>process-classes</phase>
<goals>
<goal>addFieldPermission</goal>
</goals>
</execution>
</executions>
</plugin>
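For reference, the field-injection step of such a plugin boils down to an ASM transform along these lines. This is a hypothetical sketch, not the actual plugin code, and the injected field name is made up:
import java.nio.file.Files;
import java.nio.file.Path;
import org.objectweb.asm.ClassReader;
import org.objectweb.asm.ClassVisitor;
import org.objectweb.asm.ClassWriter;
import org.objectweb.asm.FieldVisitor;
import org.objectweb.asm.Opcodes;

public class AddFieldTransformer {
    // Reads a compiled class from target/classes, adds a field, writes it back.
    public static void transform(Path classFile) throws Exception {
        ClassReader reader = new ClassReader(Files.readAllBytes(classFile));
        ClassWriter writer = new ClassWriter(reader, 0);
        reader.accept(new ClassVisitor(Opcodes.ASM5, writer) {
            @Override
            public void visitEnd() {
                // Inject: private String permission;  ("permission" is hypothetical)
                FieldVisitor fv = cv.visitField(Opcodes.ACC_PRIVATE,
                        "permission", "Ljava/lang/String;", null, null);
                if (fv != null) {
                    fv.visitEnd();
                }
                super.visitEnd();
            }
        }, 0);
        Files.write(classFile, writer.toByteArray());
    }
}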
My project also has an indirect dependency on ASM (version 5.0.4). I even upgraded the plugin's ASM to 5.0.4 (the same as the project), but it still fails.
I suspect an ASM version conflict but have no idea how to fix it. Any ideas?
EDIT: It seems the problem is with the modified classes. I debugged ClassPathScanningCandidateComponentProvider (spring-context-5.0.8) and found that the exception is thrown from line 430 only for the modified resources (as I said, I added some fields to them via the Maven plugin). But when I decompiled the modified classes using javap, they looked valid. I guess there is a problem with the modified classes, but I don't know what it is.

As stated in the error log, this is clearly an index-out-of-bounds exception, thrown while Spring's repackaged ASM reads the class file:
Caused by: java.lang.ArrayIndexOutOfBoundsException: 90

Finally I found a temporary workaround: I made the fields public and removed the generated methods, and everything works.
The generated methods are the root cause. When I keep only the generated fields, Spring Boot starts up successfully. It seems Javassist 3.12.1.GA may not emit valid method bytecode for JDK 1.8 class files (that release long predates the Java 8 class-file format). I will investigate further.
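To pin this down without Spring, you can feed the modified .class file straight into ASM's ClassReader, which is what Spring's SimpleMetadataReader does under the hood (Spring just repackages ASM as org.springframework.asm). A minimal sketch, assuming org.ow2.asm:asm is on the classpath and using the path from the error log above; as the EDIT shows, javap accepting a class is no guarantee that ASM will:
import java.nio.file.Files;
import java.nio.file.Paths;
import org.objectweb.asm.ClassReader;
import org.objectweb.asm.ClassVisitor;
import org.objectweb.asm.Opcodes;

public class VerifyModifiedClass {
    public static void main(String[] args) throws Exception {
        byte[] bytes = Files.readAllBytes(Paths.get(
                "target/classes/com/roger/product/ipc/viewobject/product/MultiUnitVO.class"));
        // Mirrors Spring's metadata reading: a corrupt constant pool makes
        // this throw the same ArrayIndexOutOfBoundsException as at startup.
        ClassReader reader = new ClassReader(bytes);
        reader.accept(new ClassVisitor(Opcodes.ASM5) { }, ClassReader.SKIP_DEBUG);
        System.out.println("Parsed OK: " + reader.getClassName());
    }
}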

Related

Unable to get liquibase to read changes on my JPA entities

I'm trying to set up Liquibase on a Spring Boot project. I want to use my JPA entities as the reference so that when I run mvn liquibase:diff any changes are written into a changelog. I'm able to run mvn liquibase:generateChangeLog, and the file is written and read on application start. However, whenever I make a change to one of my entities, the diff result is always "No changes found, nothing to do".
This is my liquibase.properties
changeLogFile=src/main/resources/db/changelog/changelog-master.yaml
outputChangeLogFile=src/main/resources/db/changes/init.xml
diffChangeLogFile=src/main/resources/db/db.changelog-master.xml
username=postgres
password=postgres
url=jdbc:postgresql://localhost:5432/my_db
driver=org.postgresql.Driver
verbose=true
defaultSchemaName=public
referenceDriver=liquibase.ext.hibernate.database.connection.HibernateDriver
referenceUrl=hibernate:spring:com.domain.api.models\
?dialect=org.hibernate.dialect.PostgreSQL82Dialect\
&hibernate.physical_naming_strategy=org.springframework.boot.orm.jpa.hibernate.SpringPhysicalNamingStrategy\
&hibernate.implicit_naming_strategy=org.springframework.boot.orm.jpa.hibernate.SpringImplicitNamingStrategy
and my pom.xml
<plugin>
<groupId>org.liquibase</groupId>
<artifactId>liquibase-maven-plugin</artifactId>
<version>4.3.4</version>
<configuration>
<propertyFile>./src/main/resources/liquibase.properties</propertyFile>
</configuration>
<dependencies>
<dependency>
<groupId>org.liquibase.ext</groupId>
<artifactId>liquibase-hibernate5</artifactId>
<version>4.3.3</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
<version>${project.parent.version}</version>
</dependency>
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>2.0.1.Final</version>
</dependency>
</dependencies>
</plugin>
What am I missing here?
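One thing worth double-checking (an assumption on my part, not something visible in the question): each changelog property feeds a different Maven goal, so the diff output lands in diffChangeLogFile, not in the changelog your application reads. Annotated:
# read by liquibase:update
changeLogFile=src/main/resources/db/changelog/changelog-master.yaml
# written by liquibase:generateChangeLog
outputChangeLogFile=src/main/resources/db/changes/init.xml
# written by liquibase:diff
diffChangeLogFile=src/main/resources/db/db.changelog-master.xml
Also note that the hibernate:spring: reference URL scans compiled entity classes, so the entities in com.domain.api.models need to be compiled (e.g. mvn compile liquibase:diff) before the diff can see any changes.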

Transitive Dependency: Using Elasticsearch Rest High Client problem in AEM

I am trying to use the Java High Level REST Client in Adobe Experience Manager to finish a project comparing the Lucene, Solr, and Elasticsearch search engines.
I am having some problems with the Elasticsearch implementation.
Here is the code:
Dependency in the parent pom.xml (the same is defined in the core pom.xml):
<!-- Elasticseach dependencies -->
<dependency>
<groupId>org.elasticsearch.client</groupId>
<artifactId>elasticsearch-rest-high-level-client</artifactId>
<version>7.4.0</version>
</dependency>
The only line of my code that uses the dependency above:
try (RestHighLevelClient client = new RestHighLevelClient(
        RestClient.builder(new HttpHost(server, port, protocol),
                           new HttpHost(server, secondPort, protocol)))) {
}
catch (ElasticsearchException e) {
    LOG.error("Exception: " + e);
}
where protocol = "http", server = "localhost", port = 9200, and secondPort = 9201.
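As an aside, a cheap way to make the try block actually exercise the connection is the client's ping method; a sketch, assuming the same variables and these imports (ping is part of the 7.x high-level client API):
import java.io.IOException;
import org.apache.http.HttpHost;
import org.elasticsearch.ElasticsearchException;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;

try (RestHighLevelClient client = new RestHighLevelClient(
        RestClient.builder(new HttpHost(server, port, protocol),
                           new HttpHost(server, secondPort, protocol)))) {
    // Returns true when the cluster responds; a minimal connectivity check.
    boolean up = client.ping(RequestOptions.DEFAULT);
    LOG.info("Elasticsearch reachable: " + up);
} catch (ElasticsearchException | IOException e) {
    LOG.error("Exception: " + e);
}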
The error and the dependency view from IntelliJ were provided as screenshots (not reproduced here).
I know these problems are usually caused by mismatched dependency versions, but everything is 7.4.0 in this case. Elasticsearch 7.4.0 is also running locally on 3 nodes.
This project is built on the We.Retail sample project, so it is easy to replicate. All the code exhibiting this error is available here:
https://github.com/tadijam64/search-engines-comparison-on-we-retail/tree/elasticsearch-integration
AEM version 6.4.
Any info or idea is appreciated.
UPDATE
I tried adding the following to embed these dependencies, since they are not OSGi bundles:
<build>
<plugins>
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-scr-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<extensions>true</extensions>
<configuration>
<instructions>
<Embed-Dependency>org.apache.servicemix.bundles.solr-solrj, log4j, noggit, zookeeper,
elasticsearch-rest-high-level-client
</Embed-Dependency>
<Embed-Transitive>true</Embed-Transitive>
<Embed-Directory>OSGI-INF/lib</Embed-Directory>
<Export-Package>we.retail.core.model*</Export-Package>
<Import-Package>
*;resolution:=optional
</Import-Package>
<Private-Package>we.retail.core*</Private-Package>
<Sling-Model-Packages>
we.retail.core.model
</Sling-Model-Packages>
</instructions>
</configuration>
</plugin>
</plugins>
</build>
The error remains. I also tried adding it to <Export-Package>, but nothing helps.
And according to the Elasticsearch documentation, all I need to use Elasticsearch is:
<dependency>
<groupId>org.elasticsearch.client</groupId>
<artifactId>elasticsearch-rest-high-level-client</artifactId>
<version>7.4.0</version>
</dependency>
but then NoClassDefFoundError occurs. It seems like a problem with transitive dependencies; I'm not sure, but any idea is appreciated.
Some other suggestions can be found here: https://forums.adobe.com/thread/2653586
I have also tried adding its transitive dependencies, such as org.elasticsearch and org.elasticsearch.client, but it does not work; the same error appears, just for another class.
AEM version 6.4, Java version: jdk1.8.0_191.jdk
So my guess was right: the transitive dependencies were not being embedded, even though <Embed-Transitive>true</Embed-Transitive> was set.
The following is what it takes to run Elasticsearch as a search engine on AEM.
I added all the transitive dependencies in pom.xml (versions are defined in parent/pom.xml):
<!-- Elasticsearch -->
<dependency>
<groupId>org.elasticsearch.client</groupId>
<artifactId>elasticsearch-rest-high-level-client</artifactId>
</dependency>
<dependency>
<groupId>org.elasticsearch.client</groupId>
<artifactId>elasticsearch-rest-client</artifactId>
</dependency>
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch</artifactId>
</dependency>
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch-x-content</artifactId>
</dependency>
<dependency>
<groupId>org.elasticsearch.plugin</groupId>
<artifactId>rank-eval-client</artifactId>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-imaging</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.elasticsearch.plugin</groupId>
<artifactId>lang-mustache-client</artifactId>
</dependency>
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpasyncclient</artifactId>
</dependency>
It is important to add all third-party dependencies as <Embed-Dependency> entries inside the maven-bundle-plugin, like this:
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<extensions>true</extensions>
<configuration>
<instructions>
<Embed-Dependency>org.apache.servicemix.bundles.solr-solrj, noggit,
elasticsearch-rest-high-level-client,
elasticsearch,
elasticsearch-rest-client,
elasticsearch-x-content,
elasticsearch-core,
rank-eval-client,
lang-mustache-client,
httpasyncclient;
</Embed-Dependency>
<Embed-Transitive>true</Embed-Transitive>
<Embed-Directory>OSGI-INF/lib</Embed-Directory>
<Export-Package>we.retail.core.model*</Export-Package>
<Import-Package>
*;resolution:=optional
</Import-Package>
<Private-Package>
we.retail.core*
</Private-Package>
<Sling-Model-Packages>
we.retail.core.model
</Sling-Model-Packages>
<_fixupmessages>"Classes found in the wrong directory";is:=warning</_fixupmessages>
</instructions>
</configuration>
</plugin>
Important to notice:
All third-party dependencies (the ones outside of OSGi) must be included in <Embed-Dependency>.
<Embed-Transitive> must be set to true so that transitive dependencies are embedded as well.
<Import-Package> must include *;resolution:=optional so that imports that cannot be resolved do not prevent the bundle from starting.
For some reason there was a build-time error when the elasticsearch dependency was added ("Classes found in the wrong directory"), which is not important for this task, so I decided to downgrade it to a warning this way:
<_fixupmessages>"Classes found in the wrong directory";is:=warning</_fixupmessages>
Though challenging, I finally resolved it. There are many similar or identical problems out there, so I hope this helps someone. Thanks to everyone who tried to help.

java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.reloadExistingConfigurations()V

It looks like I am stuck again on running a packaged Spark app jar using spark-submit. Here is my pom file:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<parent>
<artifactId>oneview-forecaster</artifactId>
<groupId>com.dataxu.oneview.forecast</groupId>
<version>1.0.0-SNAPSHOT</version>
</parent>
<modelVersion>4.0.0</modelVersion>
<artifactId>forecaster</artifactId>
<dependencies>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.module</groupId>
<artifactId>jackson-module-scala_${scala.binary.version}</artifactId>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
<!--<scope>provided</scope>-->
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-hive -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.11</artifactId>
<version>2.2.0</version>
<!--<scope>provided</scope>-->
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-aws</artifactId>
<version>2.8.3</version>
<!--<scope>provided</scope>-->
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk</artifactId>
<version>1.10.60</version>
</dependency>
<!-- https://mvnrepository.com/artifact/joda-time/joda-time -->
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>2.9.9</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.8.0</version>
<!--<scope>provided</scope>-->
</dependency>
</dependencies>
<build>
<sourceDirectory>src/main/scala</sourceDirectory>
<testSourceDirectory>src/test/scala</testSourceDirectory>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>${scala-maven-plugin.version}</version>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<configuration>
<archive>
<manifest>
<mainClass>com.dataxu.oneview.forecaster.App</mainClass>
</manifest>
</archive>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
The following is a simple snippet of code that fetches data from an S3 location and prints it:
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

def getS3Data(path: String): Map[String, Any] = {
  println("spark session start.........")
  val spark = getSparkSession() // helper defined elsewhere in the project
  // Read the whole file from S3 and concatenate the lines
  val configTxt = spark.sparkContext.textFile(path)
    .collect().reduce(_ + _)
  val mapper = new ObjectMapper
  mapper.registerModule(DefaultScalaModule)
  mapper.readValue(configTxt, classOf[Map[String, String]])
}
When I run it from IntelliJ, everything works fine; the log is clean and looks good. However, when I package it using mvn package and run it with spark-submit, I get the following error at the .collect().reduce(_ + _) call:
"main" java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.reloadExistingConfigurations()V
at org.apache.hadoop.fs.s3a.S3AFileSystem.addDeprecatedKeys(S3AFileSystem.java:181)
at org.apache.hadoop.fs.s3a.S3AFileSystem.<clinit>(S3AFileSystem.java:185)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
...
I don't understand which dependency was not packaged or what else might be the issue, as I did set the versions expecting hadoop-aws to pull in all of them consistently.
Any help will be appreciated.
The dependencies between Hadoop and the AWS SDK are very sensitive, and you should stick to the versions your Hadoop dependency was built with.
The first problem to solve is to pick one version of Hadoop: you are currently mixing 2.8.3 and 2.8.0.
Looking at the dependency tree for org.apache.hadoop:hadoop-aws:2.8.0, it is built against version 1.10.6 of the AWS SDK (the same holds for hadoop-aws:2.8.3).
Mixing incompatible versions is probably what causes the mismatch. So:
Choose the version of Hadoop you want to use.
Include hadoop-aws with the version matching your Hadoop.
Remove the other AWS dependencies, or include them only in versions compatible with your Hadoop version (see the sketch below).
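A sketch of the aligned POM entries, assuming you settle on the 2.8.3 line (the choice of line is yours; the version alignment is the point):
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.8.3</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-aws</artifactId>
<version>2.8.3</version>
</dependency>
<!-- Drop the explicit aws-java-sdk entry and let hadoop-aws pull in the SDK
     line it was built against (1.10.6, as noted above). -->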
In case anybody else is still stumbling on this error... it took me a while to find out, but check whether your project has a dependency (direct or transitive) on org.apache.avro:avro-tools.
In my case it was brought in by a transitive dependency.
Its problem is that it ships with a copy of org.apache.hadoop.conf.Configuration that is much older than all current versions of Hadoop, so it may end up being the one picked up on the classpath.
In my Scala project (built with sbt) I just had to exclude it with
ExclusionRule("org.apache.avro", "avro-tools")
and the error (finally!) disappeared.
I am sure the avro-tools developers had some good reason to include a copy of a class that belongs to another artifact (hadoop-common), but I was really surprised to find it there, and it wasted an entire day for me.
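For anyone on Maven rather than sbt, the equivalent exclusion looks like this; the enclosing dependency is a placeholder for whichever artifact drags avro-tools in:
<dependency>
<groupId>some.group</groupId>
<artifactId>artifact-that-pulls-in-avro-tools</artifactId>
<version>x.y.z</version>
<exclusions>
<exclusion>
<groupId>org.apache.avro</groupId>
<artifactId>avro-tools</artifactId>
</exclusion>
</exclusions>
</dependency>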
In my case, I was running a local Spark installation on a Cloudera edge node and was hitting this conflict (even though I made sure to download Spark with the correct Hadoop binaries precompiled). I just went into my Spark home and moved the hadoop-common jar aside so it wouldn't be loaded:
mv ~/spark-2.4.4-bin-hadoop2.6/jars/hadoop-common-2.6.5.jar ~/spark-2.4.4-bin-hadoop2.6/jars/hadoop-common-2.6.5.jar.XXXXXX
After that, it ran... in local mode anyway.

ClassNotFoundException on spring beans inside jboss fuse camel application

I'm trying to create a Spring-based Fuse integration that makes external SOAP calls.
The code works fine in a standalone Java app, but when I import it into my Fuse Integration project, I get the following error:
...
Caused by: java.lang.ClassNotFoundException: org.springframework.beans.factory.InitializingBean not found by org.apache.servicemix.bundles.spring-ws-core [439]
I don't know where to start debugging this.
Here are the relevant parts of my pom:
<dependencies>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-core</artifactId>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-spring</artifactId>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-spring-ws</artifactId>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<version>${version.maven-bundle-plugin}</version>
<extensions>true</extensions>
<configuration>
<instructions>
<Bundle-SymbolicName>myBundle</Bundle-SymbolicName>
<Bundle-Name>Empty Camel Spring Example [myBundle]</Bundle-Name>
<DynamicImport-Package>*</DynamicImport-Package>
<Import-Package>*,org.springframework.beans.factory</Import-Package>
</instructions>
</configuration>
</plugin>
</plugins>
</build>
I also double-checked that the Spring components are loaded in Fuse (screenshot omitted).
At this point, I just don't know what to do to get this working!
Many thanks for your help!
edit:
Checking my bundle's imports (screenshot omitted) tends to confirm that
org.springframework.beans.factory.InitializingBean
is loaded!
In fact it's this bug: https://issues.jboss.org/browse/ENTESB-6856, which is already fixed and will be made available in 6.3.0 R4 (due to be released any day now).
Since you are using Fuse 6.3 build 187, I'd highly recommend following the patch cycle and applying updates regularly (the schedule is here: https://access.redhat.com/articles/2939351).
If you are brave, you can also play around with the internal builds (http://repository.jboss.org/nexus/content/groups/ea/org/jboss/fuse/jboss-fuse-karaf/6.3.0.redhat-280/), although these versions won't be supported unless they are part of an official patch release.

Is there any impact if a dependency is duplicated in pom.xml

In my pom.xml file, I found that the dependency below was apparently repeated. Is there any bad impact if a dependency is defined multiple times? Thanks.
<dependency>
<groupId>org.apache.xmlgraphics</groupId>
<artifactId>batik-all</artifactId>
<version>1.7</version>
</dependency>
<dependency>
<groupId>batik</groupId>
<artifactId>batik-ext</artifactId>
<version>1.7</version>
</dependency>
That is not a repeat; look at the <artifactId> values (batik-all vs. batik-ext). If you really do have duplicated dependencies, it does no harm beyond a Maven warning, but clean it up.
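For comparison, a true duplicate repeats the same groupId/artifactId pair, as in this contrived sketch; Maven treats that coordinate as the dependency's identity, warns that it must be unique, and effectively keeps a single declaration:
<dependency>
<groupId>org.apache.xmlgraphics</groupId>
<artifactId>batik-all</artifactId>
<version>1.7</version>
</dependency>
<!-- same coordinates again: this is what an actual duplicate looks like -->
<dependency>
<groupId>org.apache.xmlgraphics</groupId>
<artifactId>batik-all</artifactId>
<version>1.7</version>
</dependency>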
