Spring + Hibernate + Tomcat dependency problems

When I run Tomcat and the war is deployed, I get:
NoClassDefFoundError: org/apache/commons/collections/map/LRUMap
Invocation of init method failed; nested exception is
java.lang.NoClassDefFoundError:
org/apache/commons/collections/map/LRUMap
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:527)
~[spring-beans-3.1.0.RELEASE.jar:3.1.0.RELEASE]
What is strange is that I have commons-collections-2.1.jar (I even tried 3.1) in my WEB-INF/lib folder.
Edit:
I copied commons-collections from WEB-INF/lib to the Tomcat lib folder and it seems to work. However, I won't be able to do that on the production server. Why isn't it picking up my WEB-INF/lib version?

OK, so I put in version 3.2.1 of commons-collections and the error disappeared. Unfortunately, I still don't know which library depends on this version. Even mvn dependency:tree didn't help ...
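For reference, running mvn dependency:tree -Dverbose -Dincludes=commons-collections:commons-collections usually shows which artifacts pull commons-collections in, including conflict-omitted duplicates. If the goal is simply to make the working version win everywhere, one option (a minimal sketch, assuming 3.2.1 is compatible with whatever actually needs it) is to pin it in dependencyManagement:
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>commons-collections</groupId>
      <artifactId>commons-collections</artifactId>
      <version>3.2.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>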

I had this exception when I had xdoclet in my dependencies.
If you have that dependency, just exclude it (a sketch follows below).
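A sketch of that exclusion, assuming the xdoclet:xdoclet coordinates; the artifact and version are placeholders, so adjust them to whichever xdoclet dependency your build actually declares:
<dependency>
  <groupId>xdoclet</groupId>
  <artifactId>xdoclet</artifactId>
  <!-- placeholder version, use your own -->
  <version>1.2.3</version>
  <exclusions>
    <exclusion>
      <groupId>commons-collections</groupId>
      <artifactId>commons-collections</artifactId>
    </exclusion>
  </exclusions>
</dependency>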

I had the same problem; maybe it's too late to accept the answer, but it's still beneficial for people who will have this problem in the future.
I excluded commons-collections from net.sf.jasperreports, and after that Tomcat runs perfectly without any problem.
<dependency>
<groupId>net.sf.jasperreports</groupId>
<artifactId>jasperreports</artifactId>
<version>4.1.1</version>
<type>jar</type>
<scope>compile</scope>
<exclusions>
<exclusion>
<artifactId>commons-collections</artifactId>
<groupId>commons-collections</groupId>
</exclusion>
</exclusions>
</dependency>

Related

Databricks local test fails with java.lang.NoSuchMethodError: org.apache.hadoop.security.HadoopKerberosName.setRuleMechanism

I have a unit test for Databricks code, and I want to run it locally on Windows. Unfortunately, when I run pytest from PyCharm, it throws the following exception:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.security.HadoopKerberosName.setRuleMechanism(Ljava/lang/String;)V
at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:84)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:315)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:300)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:575)
at org.apache.spark.util.Utils$.$anonfun$getCurrentUserName$1(Utils.scala:2747)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2747)
at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:79)
at org.apache.spark.deploy.SparkSubmit.secMgr$lzycompute$1(SparkSubmit.scala:368)
at org.apache.spark.deploy.SparkSubmit.secMgr$1(SparkSubmit.scala:368)
at org.apache.spark.deploy.SparkSubmit.$anonfun$prepareSubmitEnvironment$8(SparkSubmit.scala:376)
at scala.Option.map(Option.scala:230)
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:376)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
From the source code, it comes from this initialization:
spark = SparkSession.builder \
.master("local[2]") \
.appName("Helper Functions Unit Testing") \
.getOrCreate()
I searched for the above error and most of the results are about Maven configuration, adding a dependency on hadoop-auth. However, for pyspark I don't know how to deal with it. Does anyone have experience with or insight into this error?
My workaround here was to move to Python 3.7 and change the pyspark version to 3.0, and then it seems OK. So it is related to an inconsistent environment and dependencies.
This is limited to my case; from my search on the web, most fixes involve adding a hadoop-auth.jar dependency in Maven for the Hadoop configuration.
Encountered this error for a Maven project written in Scala, not Python. What did it for me was adding not only the hadoop-auth dependency, as the OP mentioned, but also the hadoop-common dependency in my pom file, like so:
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>3.1.2</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-auth</artifactId>
<version>3.1.2</version>
</dependency>
Replace 3.1.2 with whatever version you're using. However, I also found that I had to track down other dependencies that conflicted with hadoop-common and hadoop-auth and add exclusions to them, like so:
<exclusions>
<exclusion>
<artifactId>hadoop-common</artifactId>
<groupId>org.apache.hadoop</groupId>
</exclusion>
<exclusion>
<artifactId>hadoop-auth</artifactId>
<groupId>org.apache.hadoop</groupId>
</exclusion>
</exclusions>
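For illustration only, this is how those exclusions look when attached to a dependency that ships its own Hadoop jars; spark-core is used here purely as a stand-in, so apply the same pattern to whichever artifacts actually conflict in your tree:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.12</artifactId>
  <!-- example coordinates only -->
  <version>3.0.1</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-auth</artifactId>
    </exclusion>
  </exclusions>
</dependency>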

Servlet 500 error ClassNotFound exception

I'm building a web app using Vaadin and it needs to communicate with several REST APIs. I've set it up in IntelliJ with Maven. I was thinking that for the REST client I would use GSON to parse the JSON objects I'd be receiving from the open APIs; however, the application crashes due to a servlet exception error.
Caused by:
java.lang.ClassNotFoundException: com.google.gwt.json.client.JSONObject
I've added the GSON dependency to the pom.xml:
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.3.1</version>
</dependency>
I have also tried changing the module settings from Provided to Compile to Runtime, but with no change.
I'm just stumped as to why the GSON jar appears in the project dependencies within IntelliJ using Maven but fails at runtime. I've seen references to Eclipse and including the jar in the classpath but, again, I'm using IntelliJ/Maven to build my Vaadin project and satisfy dependencies.
Any help is greatly appreciated!
The missing class com.google.gwt.json.client.JSONObject comes from GWT, not GSON, so add the dependency below to your classpath:
<dependency>
<groupId>com.google.gwt</groupId>
<artifactId>gwt-user</artifactId>
<version>2.3.0</version>
</dependency>

Spring boot and apache spark - container conflict

I am trying to use Spring Boot 1.1.5 and Apache Spark 1.0.2 together in a project. It looks like Apache Spark uses a Jetty container internally, and I have configured Spring Boot to use a Tomcat container. However, application startup fails with a SecurityException at the root cause. From the full stack trace it looks like Spring Boot is trying to initialize a "jettyEmbeddedServletContainerFactory", which it shouldn't in the first place. It probably picks it up from the classpath due to Jetty's presence via Spark. If I exclude Jetty from Spark and run again, I don't see the same error, but then SparkContext initialization fails because it can't find Jetty. How do I tell the Spring Boot runtime to look for "TomcatEmbeddedServletContainerFactory" instead of the Jetty one?
I got "java.lang.SecurityException: class "javax.servlet.http.HttpSessionIdListener"'s signer information does not match signer information of other classes in the same package"
To fix this issue I was need to remove all javax.servlet dependencies.
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.3.1</version>
<exclusions>
<exclusion>
<groupId>javax.servlet</groupId>
<artifactId>javax.servlet-api</artifactId>
</exclusion>
<exclusion>
<groupId>org.glassfish</groupId>
<artifactId>javax.servlet</artifactId>
</exclusion>
<exclusion>
<groupId>org.eclipse.jetty.orbit</groupId>
<artifactId>javax.servlet</artifactId>
</exclusion>
</exclusions>
</dependency>
@Joakim Erdfelt, thanks.
I was just waiting to see if someone was familiar with this situation and whether it's just a small configuration change. As it turns out, it is!
@Configuration
@EnableAutoConfiguration(exclude={EmbeddedServletContainerFactory.class})
public class MyConfiguration { }
I defined my own "EmbeddedServletContainerFactory" bean as a "org.springframework.boot.context.embedded.tomcat.TomcatEmbeddedServletContainerFactory" and it started working as I expected.

Glassfish incremental deployment fails when including Selenium

I have a Java EE project which is meant to run on Glassfish 4.1. I want to use Selenium to collect information from some web pages, i.e. I need to include Selenium in the deployment (not just for tests).
I am using the Eclipse IDE and have previously used the incremental deployment function in Eclipse to automatically deploy all saved changes to the project. But when I included (with Maven) the dependencies for Selenium, incremental deployment stopped working. The project can still be deployed to Glassfish, but I have to restart Glassfish between every change. I get the following error in Eclipse:
Exception while loading the app : java.lang.IllegalStateException: ContainerBase.addChild: start: org.apache.catalina.LifecycleException: java.lang.RuntimeException: com.sun.faces.config.ConfigurationException: java.util.concurrent.ExecutionException: com.sun.faces.config.ConfigurationException: Unable to parse document 'bundle://136.0:1/com/sun/faces/jsf-ri-runtime.xml': DTD factory class org.apache.xerces.impl.dv.dtd.DTDDVFactoryImpl does not extend from DTDDVFactory.. Please see server.log for more details.
org.apache.xerces.impl.dv.dtd.DTDDVFactoryImpl is included with Selenium as a transitive dependency (xerces:xercesImpl:2.11.0).
Here are my Maven dependencies:
<dependency>
<groupId>org.jboss.arquillian.selenium</groupId>
<artifactId>selenium-bom</artifactId>
<version>2.44.0</version>
<type>pom</type>
<scope>import</scope>
</dependency>
<dependency>
<groupId>org.seleniumhq.selenium</groupId>
<artifactId>selenium-htmlunit-driver</artifactId>
</dependency>
I hope there is a solution to this, but after reading Jens Schauder's response in Dealing with "Xerces hell" in Java/Maven? I'm afraid there might not be. Anyone?
I currently can't reproduce the issue with a simple project. Did you make sure that you don't have any other dependencies that pull in another version of xercesImpl?
You can try placing xercesImpl-2.11.0.jar and its transitive dependency xml-apis-1.4.01.jar in the lib folder of your Glassfish domain and excluding them from your dependencies like this:
<dependency>
<groupId>org.seleniumhq.selenium</groupId>
<artifactId>selenium-htmlunit-driver</artifactId>
<version>2.44.0</version>
<exclusions>
<exclusion>
<artifactId>xercesImpl</artifactId>
<groupId>xerces</groupId>
</exclusion>
</exclusions>
</dependency>
See also:
org.apache.xerces.impl.dv.DVFactoryException: DTD factory class org.apache.xerces.impl.dv.dtd.DTDDVFactoryImpl does not extend from DTDDVFactory
Xerces error: org.apache.xerces.impl.dv.dtd.DTDDVFactoryImpl

jetty-maven-plugin and loadTimeWeaver

I can't seem to get my Spring webapp working with the jetty-maven-plugin.
I always get:
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'loadTimeWeaver': Initialization of bean failed; nested exception is java.lang.IllegalStateException: ClassLoader [org.eclipse.jetty.webapp.WebAppClassLoader] does NOT provide an 'addTransformer(ClassFileTransformer)' method. Specify a custom LoadTimeWeaver or start your Java virtual machine with Spring's agent: -javaagent:org.springframework.instrument.jar
though I have:
set MAVEN_OPTS to javaagent:/Users/blabla/.m2/repository/org/springframework/spring-instrument/3.1.3.RELEASE/spring-instrument-3.1.3.RELEASE.jar
set JAVA_OPTIONS to the same thing
added dependencies on spring-instrument and spring-aspects
added jvmArgs with -javaagent:.... to jetty-maven-plugin configuration
You are probably missing a few jars: aspectjweaver, aspectjrt, spring-instrument.
Additionally, you may want to try explicitly defining the loadTimeWeaver bean in your applicationContext.xml file:
<property name="loadTimeWeaver">
<bean id="instrumentationLoadTimeWeaver" class="org.springframework.instrument.classloading.InstrumentationLoadTimeWeaver"/>
</property>
When launching Jetty from Maven (using mvn jetty:run), Jetty will run in the same JVM as maven does, so you'll need to pass any options using MAVEN_OPTS.
(Be sure to include the minus sign before javaagent, as I didn't see that in your snippet).
export MAVEN_OPTS=-javaagent:org.springframework.instrument-3.0.5.RELEASE.jar
A complete example of load time weaving in jetty using Maven can be found on Github.
https://github.com/zzantozz/testbed/tree/master/spring-aspectj-load-time-weaving-in-jetty
Without more details about your pom.xml file... it's not easy. But one common issue with the jetty plugin is its dependencies.
One rule that has always worked for me is to put all dependencies of your war that have scope provided as direct dependencies of the maven-jetty-plugin.
I suggest you put spring-instrument and spring-aspects as direct dependencies of the maven-jetty-plugin too (see the sketch at the end of this answer).
According to my understanding:
set MAVEN_OPTS to javaagent:/Users/blabla/.m2/repository/org/springframework/spring-instrument/3.1.3.RELEASE/spring-instrument-3.1.3.RELEASE.jar
is the correct way to pass JVM args to the Jetty JVM (since Jetty runs in the same JVM as Maven).
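A minimal sketch of that plugin configuration, assuming the org.eclipse.jetty:jetty-maven-plugin coordinates and the Spring 3.1.3 version mentioned in the question; the plugin version below is a placeholder:
<plugin>
  <groupId>org.eclipse.jetty</groupId>
  <artifactId>jetty-maven-plugin</artifactId>
  <!-- placeholder version, use the one that matches your Jetty line -->
  <version>9.2.10.v20150310</version>
  <dependencies>
    <!-- provided-scope webapp dependencies repeated here so the plugin classpath sees them -->
    <dependency>
      <groupId>org.springframework</groupId>
      <artifactId>spring-instrument</artifactId>
      <version>3.1.3.RELEASE</version>
    </dependency>
    <dependency>
      <groupId>org.springframework</groupId>
      <artifactId>spring-aspects</artifactId>
      <version>3.1.3.RELEASE</version>
    </dependency>
  </dependencies>
</plugin>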
I've had the same issue (I use load-time weaving in Spring: How can i set class loader in jetty?). I resolved it by adding "-Xbootclasspath/a:[path to jar]" to the JVM args.
Now it looks like this:
<extraJvmArgs>-Xmx4g -XX:MaxPermSize=512m -javaagent:C:\Users\auldanov\.m2\repository\org\springframework\spring-instrument\3.1.4.RELEASE\spring-instrument-3.1.4.RELEASE.jar -Xbootclasspath/a:C:\Users\auldanov\.m2\repository\org\springframework\spring-instrument\3.1.4.RELEASE\spring-instrument-3.1.4.RELEASE.jar</extraJvmArgs>
I finally made it work following the example from ddewaele. So, apart from setting MAVEN_OPTS to
javaagent:/Users/blabla/.m2/repository/org/springframework/spring-instrument/3.1.3.RELEASE/spring-instrument-3.1.3.RELEASE.jar
you also have to check the dependencies you have added. I was missing spring-tx. You need to have these dependencies:
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
<version>3.0.5.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
<version>3.0.5.RELEASE</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjweaver</artifactId>
<version>1.6.10</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-web</artifactId>
<version>3.0.5.RELEASE</version>
</dependency>
<!-- Following dependencies are required because of spring-aspects -->
<dependency>
<groupId>org.hibernate.javax.persistence</groupId>
<artifactId>hibernate-jpa-2.0-api</artifactId>
<version>1.0.0.Final</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
<version>3.0.5.RELEASE</version>
</dependency>
Otherwise you get this non-intuitive error:
Specify a custom LoadTimeWeaver or start your Java virtual machine with Spring's agent: -javaagent:org.springframework.instrument.jar
NOTE: You can use whichever Spring version you want. I am using 3.2.5 for everything.
I want to warn everybody: this problem is very obscure.
Don't include this dependency in your project, or include it with provided scope (a sketch follows after the snippet below).
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-instrument</artifactId>
<version>${thirdparty.spring.version}</version>
</dependency>
Otherwise the agent class InstrumentationSavingAgent will be loaded twice (first when the agent is loaded, then again when linking libraries), and Spring will use the second Class instance, which does not have the Instrumentation injected.
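A sketch of the provided-scope variant mentioned above; the agent jar is still passed via -javaagent, and provided scope simply keeps a second copy of the classes off the webapp classpath:
<dependency>
  <groupId>org.springframework</groupId>
  <artifactId>spring-instrument</artifactId>
  <version>${thirdparty.spring.version}</version>
  <scope>provided</scope>
</dependency>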
