Are Maven dependency exclusions necessary when using Spring (MVC) + Hadoop + Hive?

I have a web app working great. I tried connecting to Hadoop using Hive. Tests work fine, but I can't run the web app: transitive Maven dependencies of hadoop-core bring in J2EE jars that override Tomcat's own and break the web app at startup (specifically when loading the context).
Foolishly, I thought that if I just used the Spring Data build for CDH5 they would have covered all that. No such luck. I was following their docs here: https://github.com/spring-projects/spring-hadoop/wiki/Build-with-Cloudera-CDH5
Here's my current POM:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-hadoop</artifactId>
<version>2.0.4.RELEASE-cdh5</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-jdbc</artifactId>
<version>${hive.version}</version>
<scope>runtime</scope>
</dependency>
Here is the error:
SEVERE: Servlet.service() for servlet [jsp] in context with path [] threw exception [java.lang.AbstractMethodError: javax.servlet.jsp.JspFactory.getJspApplicationContext(Ljavax/servlet/ServletContext;)Ljavax/servlet/jsp/JspApplicationContext;] with root cause
java.lang.AbstractMethodError: javax.servlet.jsp.JspFactory.getJspApplicationContext(Ljavax/servlet/ServletContext;)Ljavax/servlet/jsp/JspApplicationContext;
at org.apache.jasper.compiler.Validator$ValidateVisitor.<init>(Validator.java:515)
at org.apache.jasper.compiler.Validator.validateExDirectives(Validator.java:1817)
at org.apache.jasper.compiler.Compiler.generateJava(Compiler.java:217)
at org.apache.jasper.compiler.Compiler.compile(Compiler.java:373)
I also got this error when building directly from Cloudera's repos.
I could start stuffing exclusions in there, but that feels hacky, and I'm paranoid about other transitive dependency errors cropping up that I may not know about.
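To be concrete, the kind of exclusions I mean would look something like this (just a sketch; I'm assuming the clashing jars are the javax.servlet / javax.servlet.jsp artifacts that hadoop-core drags in, which is what the JspFactory error suggests, and ${hadoop.version} is a placeholder):
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>${hadoop.version}</version>
<exclusions>
<!-- assumed culprits: the servlet/jsp APIs that override Tomcat's own -->
<exclusion>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
</exclusion>
<exclusion>
<groupId>javax.servlet.jsp</groupId>
<artifactId>jsp-api</artifactId>
</exclusion>
</exclusions>
</dependency>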
I've pored over the docs and the sample code and pom files here: https://github.com/spring-projects/spring-hadoop-samples/blob/master/hive/pom.xml
They don't seem to have exclusions in their POM files. However, I've seen other people do it, such as here: Spring + Maven + Hadoop
Is that the accepted way to work with these technologies? Is it canonical to simply add exclusions? This is my first time, so I'm seeking some confirmation here. Perhaps I'm missing something?

Related

Dependency listed in pom file not found in deployed project

I asked a question here, and I think I may have found the root of it. I have a Spring Boot app using a datasource, net.sourceforge.jtds.jdbc.Driver, that is supposed to be included transitively by Spring Boot 2.0.2 with spring-boot-starter-jpa. However, when I run
jar tf my.jar | grep jtds
the driver class isn't found (we don't have a Maven executable on the server to list the classpath). Everything I do to inspect the classpath shows that the jar isn't there.
I've done this in 2 scenarios: 1) When I didn't explicitly add the jar to my pom, I got the error reported in my previous post. 2) When I do add it explicitly to the pom, I get this error:
java.lang.IllegalStateException: Cannot load driver class: net.sourceforge.jtds.jdbc.Driver
Can someone tell me what's going on?? I am confounded as to why this class can't be found and loaded.
Please note that in the Spring Boot parent POM the jtds dependency is only included in test scope.
If you want to use classes from this dependency in your production code as well, change the Maven scope to compile.
Ok, the problem was solved by adding the dependency with a runtime scope.
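That is, something like this in the module that gets packaged (omitting the version, since it's managed in a parent POM as in the answer below):
<dependency>
<groupId>net.sourceforge.jtds</groupId>
<artifactId>jtds</artifactId>
<scope>runtime</scope>
</dependency>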
In the child POM where the jar is packaged, you should have the spring-boot-maven-plugin and the dependency as below:
<dependency>
<groupId>net.sourceforge.jtds</groupId>
<artifactId>jtds</artifactId>
</dependency>
In the parent POM:
<dependency>
<groupId>net.sourceforge.jtds</groupId>
<artifactId>jtds</artifactId>
<version>${jtds.version}</version>
</dependency>
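For completeness, the repackaging plugin mentioned above is just the standard one (a sketch; the version is managed by the Spring Boot parent):
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>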

javax validation can't find Hibernate Validator in Karaf

I've implemented some code using javax.validation and Hibernate Validator. The unit tests using validation are working fine. The build produces OSGi bundles and features, and runs in Karaf.
When I run my Pax Exam integration test, I get "Unable to create a Configuration, because no Bean Validation provider could be found. Add a provider like Hibernate Validator (RI) to your classpath." As far as I can tell, I AM adding it to my classpath. I have a features.xml file that I've been incrementally adding dependencies to. It finally got past Karaf bundle resolution, but now it's failing with an exception with this message:
"Unable to create a Configuration, because no Bean Validation provider could be found. Add a provider like Hibernate Validator (RI) to your classpath."
I would include the stack trace, but it doesn't look useful to me. Most of it is in my code, JUnit, and Pax Exam.
I'm attempting to use version 5.4.1.Final of HV. Note again that the unit tests doing validation are working fine. It took a while to get the dependencies correct to get that far (like using version "3.0.1-b08" of "javax.el").
I've seen some mentions of a "hibernate-validator-osgi-karaf-features" artifact, but I'm not sure if that's relevant. I'm attempting to use both the hibernate-validator artifact and the hibernate-validator-annotation-processor artifact.
I don't know that it's going to matter, but here's an excerpt of my POM dependencies:
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>1.1.0.Final</version>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId>
<version>5.4.1.Final</version>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator-annotation-processor</artifactId>
<version>5.4.1.Final</version>
</dependency>
<dependency>
<groupId>org.glassfish</groupId>
<artifactId>javax.el</artifactId>
<version>3.0.1-b08</version>
</dependency>
The following is an excerpt of a "features.xml" file from a bundle that is a dependency (by feature) of the bundle containing the test class:
<bundle start-level="100">wrap:mvn:javax.validation/validation-api/1.1.0.Final$Bundle-Name=javax.validation&Bundle-SymbolicName=javax.validation&Bundle-Version=1.1.0.Final</bundle>
<bundle>mvn:org.hibernate/hibernate-validator/5.4.1.Final</bundle>
<bundle start-level="100">wrap:mvn:org.hibernate/hibernate-validator-annotation-processor/5.4.1.Final$Bundle-Name=hibernate-validator-annotation-processor&Bundle-SymbolicName=hibernate-validator-annotation-processor&Bundle-Version=5.4.1.Final</bundle>
What else can I do at this point?
Update:
I've made some changes according to the answer that refers to the "hibernate-validator-osgi-karaf-features" artifact, but I'm now getting a different unexpected error.
In the pom dependencies I added the following:
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator-osgi-karaf-features</artifactId>
<version>5.4.1.Final</version>
<type>pom</type>
</dependency>
In the base features.xml file that is referred to by the features.xml file in my module, I removed the annotation-processor, and added this:
<bundle>wrap:mvn:org.hibernate/hibernate-validator-osgi-karaf-features/5.4.1.Final$Bundle-Name=hibernate-validator-osgi-karaf-features&amp;Bundle-SymbolicName=hibernate-validator-osgi-karaf-features&amp;Bundle-Version=5.4.1.Final</bundle>
I tried not having the "wrap:" and everything after the "$", but the result was the same.
When I ran my test, I saw this:
Caused by: shaded.org.eclipse.aether.transfer.ArtifactNotFoundException: Could not find artifact org.hibernate:hibernate-validator-osgi-karaf-features:jar:5.4.1.Final in central (http://repo1.maven.org/maven2/)
at shaded.org.eclipse.aether.connector.basic.ArtifactTransportListener.transferFailed(ArtifactTransportListener.java:39)[7:org.ops4j.pax.url.mvn:2.4.7]
at shaded.org.eclipse.aether.connector.basic.BasicRepositoryConnector$TaskRunner.run(BasicRepositoryConnector.java:355)[7:org.ops4j.pax.url.mvn:2.4.7]
at shaded.org.eclipse.aether.util.concurrency.RunnableErrorForwarder$1.run(RunnableErrorForwarder.java:67)[7:org.ops4j.pax.url.mvn:2.4.7]
at shaded.org.eclipse.aether.connector.basic.BasicRepositoryConnector$DirectExecutor.execute(BasicRepositoryConnector.java:581)[7:org.ops4j.pax.url.mvn:2.4.7]
at shaded.org.eclipse.aether.connector.basic.BasicRepositoryConnector.get(BasicRepositoryConnector.java:249)[7:org.ops4j.pax.url.mvn:2.4.7]
at shaded.org.eclipse.aether.internal.impl.DefaultArtifactResolver.performDownloads(DefaultArtifactResolver.java:520)[7:org.ops4j.pax.url.mvn:2.4.7]
at shaded.org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:421)[7:org.ops4j.pax.url.mvn:2.4.7]
... 16 more
It's curious that it says it can't find it in central. I can verify the artifact is in my local Maven cache, because my build likely copied it there after I added the Maven dependency as described above.
Update:
I'm guessing that part of my problem is that this artifact is a POM artifact, not a JAR artifact, but I don't understand how I need to reference it.
Update:
Someone on karaf-user pointed out that I need to reference it as a feature, not a bundle, so I now replaced my bundle references with the following:
<feature>hibernate-validator-osgi-karaf-features</feature>
Along with the following repository definition next to two other repository definitions:
<repository>mvn:org.hibernate/hibernate-validator-osgi-karaf-features/5.4.1.Final/xml/features</repository>
However, after installing the features file and then rerunning my test, it fails with this:
org.osgi.service.resolver.ResolutionException: Unable to resolve root: missing requirement [root] osgi.identity; osgi.identity=usl-fraudcheck; type=karaf.feature; version="[2.5.0.SNAPSHOT,2.5.0.SNAPSHOT]"; filter:="(&(osgi.identity=usl-fraudcheck)(type=karaf.feature)(version>=2.5.0.SNAPSHOT)(version<=2.5.0.SNAPSHOT))" [caused by: Unable to resolve usl-fraudcheck/2.5.0.SNAPSHOT: missing requirement [usl-fraudcheck/2.5.0.SNAPSHOT] osgi.identity; osgi.identity=usl-base; type=karaf.feature [caused by: Unable to resolve usl-base/2.5.0.SNAPSHOT: missing requirement [usl-base/2.5.0.SNAPSHOT] osgi.identity; osgi.identity=hibernate-validator-osgi-karaf-features; type=karaf.feature]]
As I'm used to by now, the last identity referenced here is the thing it can't find: unsurprisingly, the new feature I'm referencing.
I then verified that the following file exists:
~/.m2/repository/org/hibernate/hibernate-validator-osgi-karaf-features/5.4.1.Final/hibernate-validator-osgi-karaf-features-5.4.1.Final-features.xml
However, I found it curious that it begins with the following content:
<?xml version="1.0" encoding="UTF-8"?>
<features xmlns="http://karaf.apache.org/xmlns/features/v1.4.0"
name="hibernate-validator-osgi-features"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://karaf.apache.org/xmlns/features/v1.4.0">
The "name" attribute of the top-level features element is "hibernate-validator-osgi-features", not "hibernate-validator-osgi-karaf-features". Is that a problem?
Update:
I now understand that my features file has to reference a feature named "hibernate-validator", which is defined in that "hibernate-validator-osgi-karaf-features" artifact. That appears to have resolved my Karaf package resolution problems. However, that simply puts me back to my original problem of:
Unable to create a Configuration, because no Bean Validation provider could be found. Add a provider like Hibernate Validator (RI) to your classpath.
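For reference, the feature wiring that finally satisfied Karaf is just this pair of lines in my features.xml (the repository line is the same one shown above):
<repository>mvn:org.hibernate/hibernate-validator-osgi-karaf-features/5.4.1.Final/xml/features</repository>
<feature>hibernate-validator</feature>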
I tried changing the "LogLevel" in my Pax Exam config from WARNING to DEBUG. This gave me a lot more karaf debug output, but it didn't give me any significant info about why HV is not found in the classpath.
Is there some other debugging I can configure, or configuration in Pax Exam, that can help here?
Just take a look at our Karaf integration tests for the 5.4 branch:
https://github.com/hibernate/hibernate-validator/tree/5.4/osgi/integrationtest
We also use Pax Exam. You should be able to make it work if you follow what we did.
BTW, I see you mention the annotation processor in your dependencies for the 2nd time: it's not something you need at runtime. The annotation processor is used to check at compile time that the annotations you use make sense. You should only enable it when you build your project.
See the Maven example here: https://docs.jboss.org/hibernate/stable/validator/reference/en-US/html_single/#validator-annotationprocessor-commandline .
This solved it for me:
https://access.redhat.com/documentation/en-us/red_hat_jboss_fuse/6.2/html/apache_cxf_development_guide/Validation#Validation-Intro-Resolver
Read from "Configuring the validation provider explicitly in OSGi"
The problem is that in OSGi, CXF has trouble finding the provider automatically, so you have to resolve it manually by passing the Hibernate validator as a constructor argument to the CXF bean validation provider.
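Roughly, the wiring from that guide looks like this (a sketch only; HibernateValidationProviderResolver is the small custom ValidationProviderResolver class shown in that document, the package name here is illustrative, and the CXF class names should be verified against your CXF version):
<!-- custom resolver class from the guide; it simply returns the Hibernate Validator provider -->
<bean id="hibernateValidationProviderResolver" class="com.example.HibernateValidationProviderResolver"/>
<bean id="validationProvider" class="org.apache.cxf.validation.BeanValidationProvider">
<constructor-arg ref="hibernateValidationProviderResolver"/>
</bean>
<bean id="validationInInterceptor" class="org.apache.cxf.jaxrs.validation.JAXRSBeanValidationInInterceptor">
<property name="provider" ref="validationProvider"/>
</bean>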
Hope this helps :)

spark application has thrown java.lang.NoSuchMethodError: javax.ws.rs.core.Response.readEntity(Ljava/lang/Class;)Ljava/lang/Object

I have an application in Java that uses Spark and HBase. We need to hit a URL deployed in Tomcat (Jersey), so we have used the RESTEasy client to do that.
When I execute standalone Java code that hits the URL using the RESTEasy client, it works fine.
However, when I use the same code in my other application that uses Spark for some processing, it throws the error shown in the title.
I am using Maven as the build tool in Eclipse. After building, I create a runnable jar and select the option "extract required libraries into generated jar". To execute the application I use the command:
nohup spark-submit --master yarn-client myWork.jar myProperties 0 &
The dependency for rest-easy client code:
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.resteasy</groupId>
<artifactId>resteasy-client</artifactId>
<version>3.0.11.Final</version>
</dependency>
</dependencies>
I am unable to figure out why it does not throw any error at compile time, yet at runtime, although the jar has each and every library packed in (including those of Spark and HBase), it throws an error saying no such method. Please help.
I have tried changing the version of resteasy-client, but it didn't help. At compile time I can see the class; how come it is missing at runtime?
Possible reasons could be:
1) If you are using Maven, the scope might be provided, so that your jar won't be copied to your distribution. This is ruled out by the configuration you have mentioned above.
2) You are not pointing to the correct location from your execution script (maybe a shell script).
3) You are not passing this jar with the --jars option, --driver-class-path, spark.executor.extraClassPath, etc.
I suspect the issue is due to the second or third reason.
Also have a look at https://spark.apache.org/docs/1.4.1/submitting-applications.html
EDIT:
Question: I tried
spark-submit --conf spark.driver.extraClassPath=surfer/javax.ws.rs-api-2.0.1.jar:surfer/jersey-client-2.25.jar:surfer/jersey-common-2.25.jar:surfer/hk2-api-2.5.0-b30.jar:surfer/jersey-guava-2.25.jar:surfer/hk2-utils-2.5.0-b30.jar:surfer/hk2-locator-2.5.0-b30.jar:surfer/javax.annotation-api-1.2.jar artifact.jar againHere.csv
and now it throws a different exception:
Exception in thread "main" java.lang.AbstractMethodError: javax.ws.rs.core.UriBuilder.uri(Ljava/lang/String;)Ljavax/ws/rs/core/UriBuilder;
I have also tried searching for the class Response$Status$Family somewhere in the classpath other than what I am supplying. I used the command grep Response$Status$Family.class /opt/mapr/spark/spark-1.4.1/lib/*.jar and found that Spark also has this class. Maybe this is the issue, but how do I forcefully tell the JVM to use the class supplied by me at runtime and not Spark's? I don't know! Can you help?
Since you provided the external jar on the classpath, you can use the options below to tell the framework that it has to use the external jar provided by you. This can be done in two ways: through spark-submit, or through conf.set(...) in your code.
Since you are using 1.4.1, see these configuration options:
spark.executor.userClassPathFirst (default: false, experimental): same functionality as spark.driver.userClassPathFirst, but applied to executor instances.
spark.driver.userClassPathFirst (default: false, experimental): whether to give user-added jars precedence over Spark's own jars when loading classes in the driver. This feature can be used to mitigate conflicts between Spark's dependencies and user dependencies. It is currently an experimental feature and is used in cluster mode only.
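For example, something along these lines (a sketch based on the original spark-submit command above; both properties are experimental in 1.4.1):
spark-submit --master yarn-client --conf spark.driver.userClassPathFirst=true --conf spark.executor.userClassPathFirst=true myWork.jar myProperties 0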

Setting javax.xml.ws.Service from JDK, instead of javaee-api with maven

I'm facing this problem:
The method getPort(QName, Class<T>) in the type Service is not applicable for the arguments (QName, Class<AcessoDadosGeolocalizacao>, WebServiceFeature[])
I used wsimport to generate my clients, but now my maven application is using the class javax.xml.ws.Service from
<dependency>
<groupId>javaee</groupId>
<artifactId>javaee-api</artifactId>
<version>5</version>
<scope>provided</scope>
</dependency>
How can I use the javax.xml.ws.Service from the JDK 6?
I've added the webservices-api to my pom.xml and the problem is gone.
<dependency>
<groupId>javax.xml</groupId>
<artifactId>webservices-api</artifactId>
<version>2.1-b14</version>
</dependency>
If I add this entry (webservices-api), it gives a runtime error while accessing the JAXB API. I found that JDK 6 should come first in the classpath order, before the Maven libraries. I moved JDK 6 above the Maven library; then it worked.
I ran into a similar issue with Eclipse and a Dynamic Web Application. It's not Maven related; however, googling for that error gets you all of about 7 results as of today's date, with three or more of them being relistings of the same Stack Exchange question at other websites, so I thought I'd add what helped me in case others have a similar issue. The WAR was set to use JBoss AS5, and the VM was set to use Java 6. Because it's Eclipse and I had already consumed the web service, the error was not occurring on import, as the stubs had already been created. I ensured the Java facet was set to use 1.6 (it had been 1.5), cleaned and built, but the error persisted. I then noticed I had a reference on my build path to Java EE 1.5. I removed this, cleaned and built, and the error went away.
Hope this helps anyone else faced with the same issue!

JavaFX can't load caspian.css

I have a Maven project in which I use a JavaFX WebEngine. I first included JavaFX by installing e(fx)clipse, and I was able to use it normally.
Now I want to compile my program to a big .jar file with all dependencies included. I first used <scope>system</scope> and linked to the jfxrt.jar in my JDK (1.7.0_45). I compile my program using mvn package and it works well for the build part.
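What I had was roughly this (a sketch; the coordinates and the systemPath are just how I pointed at my local jfxrt.jar, so adjust them to your install):
<dependency>
<groupId>com.oracle</groupId>
<artifactId>javafx</artifactId>
<version>2.2.45</version>
<scope>system</scope>
<!-- assumption: jfxrt.jar sits in the lib directory of the running JDK/JRE -->
<systemPath>${java.home}/lib/jfxrt.jar</systemPath>
</dependency>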
Then, I installed the jfxrt.jar into Maven thanks to this SO answer, so I have a javafx-2.2.45.jar in Maven.
However, I'm stuck with this error when I run my program and when it comes to instantiate that particular WebEngine:
INFO: com.sun.javafx.css.StyleManager loadStylesheetUnPrivileged Could not find stylesheet: jar:file:/target/project-name-0.1.one-jar.jar!/lib/javafx-2.2.45.jar!/com/sun/javafx/scene/control/skin/caspian/caspian.css
SEVERE: javafx.scene.control.Control impl_processCSS The -fx-skin property has not been defined in CSS for ScrollBarThemeImpl$ScrollBarWidget#5919e0a8[styleClass=scroll-bar]
java.lang.NullPointerException
at com.sun.webpane.sg.theme.ScrollBarThemeImpl.initializeThickness(ScrollBarThemeImpl.java:341)
at com.sun.webpane.sg.theme.ScrollBarThemeImpl.access$100(ScrollBarThemeImpl.java:27)
at com.sun.webpane.sg.theme.ScrollBarThemeImpl$ScrollBarWidget.impl_updatePG(ScrollBarThemeImpl.java:50)
at javafx.scene.Node.impl_syncPGNode(Node.java:425)
at javafx.scene.Scene$ScenePulseListener.syncAll(Scene.java:2106)
at javafx.scene.Scene$ScenePulseListener.syncAll(Scene.java:2115)
at javafx.scene.Scene$ScenePulseListener.syncAll(Scene.java:2115)
at javafx.scene.Scene$ScenePulseListener.synchronizeSceneNodes(Scene.java:2082)
at javafx.scene.Scene$ScenePulseListener.pulse(Scene.java:2193)
at com.sun.javafx.tk.Toolkit.firePulse(Toolkit.java:363)
at com.sun.javafx.tk.quantum.QuantumToolkit.pulse(QuantumToolkit.java:463)
at com.sun.javafx.tk.quantum.QuantumToolkit$9.run(QuantumToolkit.java:332)
at com.sun.glass.ui.InvokeLaterDispatcher$Future.run(InvokeLaterDispatcher.java:76)
I opened this jar and was able to find the caspian.css file where it's supposed to be.
What should I do to be able to use my WebEngine in my "big compiled jar"? I don't care if the solution is a quick, dirty fix such as copy/paste of this css file somewhere else (I already tried that but I might have missed something..)
It's not a good idea to package JavaFX into your jar. Have you tried using http://www.zenjava.com/2013/07/01/javafx-maven-plugin-2-0-released/ ?
You might need to declare your dependency as below so as not to include JavaFX in the uber jar. You could also exclude the javafx packages when creating this jar, but I don't think that's recommended, as in theory resources might not be placed under the javafx package.
<dependency>
<groupId>com.oracle</groupId>
<artifactId>javafx</artifactId>
<version>2.2.3</version>
<scope>provided</scope>
</dependency>
or
<dependency>
<groupId>com.oracle</groupId>
<artifactId>javafx</artifactId>
<version>2.2.3</version>
<optional>true</optional>
</dependency>
Update: there is also a JavaFX Maven plugin, as @tomsontom mentioned.
