Gradle with Apache Jena 3.11.0

I am trying to create a shadowJar that bundles the latest Apache Jena 3.11 together with additional Java code into an "uber" package, using the Gradle build system. To do so I found information here: https://jena.apache.org/documentation/notes/jena-repack.html, but I am having difficulty translating this to a Gradle setup.
Does anyone know how to achieve this?
The build succeeds, but running the resulting jar fails:
5 actionable tasks: 1 executed, 4 up-to-date
Creating memory store
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" java.lang.ExceptionInInitializerError
at nl.wur.ssb.RDFSimpleCon.RDFSimpleCon.createEmptyStore(RDFSimpleCon.java:150)
at nl.wur.ssb.RDFSimpleCon.RDFSimpleCon.<init>(RDFSimpleCon.java:62)
at nl.wur.ssb.RDFSimpleCon.RDFSimpleCon.<init>(RDFSimpleCon.java:159)
at nl.wur.ssb.RDFSimpleCon.Test.main(Test.java:7)
Caused by: java.lang.NullPointerException
at org.apache.jena.tdb.sys.EnvTDB.processGlobalSystemProperties(EnvTDB.java:33)
at org.apache.jena.tdb.TDB.init(TDB.java:252)
at org.apache.jena.tdb.sys.InitTDB.start(InitTDB.java:29)
at org.apache.jena.sys.JenaSystem.lambda$init$2(JenaSystem.java:116)
at java.util.ArrayList.forEach(ArrayList.java:1257)
at org.apache.jena.sys.JenaSystem.forEach(JenaSystem.java:191)
at org.apache.jena.sys.JenaSystem.forEach(JenaSystem.java:168)
at org.apache.jena.sys.JenaSystem.init(JenaSystem.java:114)
at org.apache.jena.tdb.TDBFactory.<clinit>(TDBFactory.java:40)
... 4 more

Java ServiceLoader files need to be merged when creating a repackaged jar file.
For Gradle, this is done by calling mergeServiceFiles() in the shadowJar task.
https://jena.apache.org/documentation/notes/jena-repack.html has the instructions for the Maven shade plugin.
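For example, a minimal build.gradle sketch with the shadow plugin could look like the following (the plugin version and the apache-jena-libs coordinate are assumptions, adjust them to your project):

plugins {
    id 'java'
    id 'com.github.johnrengelman.shadow' version '5.0.0'
}
repositories {
    mavenCentral()
}
dependencies {
    // apache-jena-libs pulls in the core Jena modules, including TDB
    implementation 'org.apache.jena:apache-jena-libs:3.11.0'
}
shadowJar {
    // merge META-INF/services/* entries so Jena's ServiceLoader-based
    // initialization (JenaSystem) can find all module initializers
    mergeServiceFiles()
}

With the service files merged into the uber jar, Jena's initialization can run its module initializers again, which should address the ExceptionInInitializerError shown above.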

Related

Getting "Caused by: java.lang.IllegalStateException: HTTP Server cannot be loaded: No implementation of HttpServerFacade found on classpath." error

I am trying out the distributed random forest implementation of H2O using Sparkling Water, but I am facing the following error when I run the spark-submit command.
Exception in thread "H2O Launcher thread" java.lang.ExceptionInInitializerError
at water.init.NetworkInit.initializeNetworkSockets(NetworkInit.java:77)
at water.H2O.startLocalNode(H2O.java:1621)
at water.H2O.main(H2O.java:2081)
at water.H2OStarter.start(H2OStarter.java:22)
at water.H2OStarter.start(H2OStarter.java:47)
at org.apache.spark.h2o.backends.internal.InternalBackendUtils$$anonfun$6$$anon$1.run(InternalBackendUtils.scala:173)
Caused by: java.lang.IllegalStateException: HTTP Server cannot be loaded: No implementation of HttpServerFacade found on classpath. Please refer to https://0xdata.atlassian.net/browse/TN-13 for details.
at water.webserver.iface.HttpServerLoader.<clinit>(HttpServerLoader.java:16)
... 6 more
I have tried out the solution mentioned at https://0xdata.atlassian.net/browse/TN-13,
but for some reason it still isn't able to find ai.h2o:h2o-jetty-8 on the classpath.
I resolved the issue by adding the following Maven coordinates to the --packages option of spark-submit:
--packages ai.h2o:h2o-jetty-8:3.24.0.3
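For reference, a sketch of the full invocation (the main class and application jar names below are placeholders, not from the original post):

spark-submit \
  --packages ai.h2o:h2o-jetty-8:3.24.0.3 \
  --class com.example.SparklingWaterJob \
  sparkling-water-app.jar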

Apache Beam run with Flink throws NoSuchMethodError

I build a Beam program jar with Maven, and I want to run it with Flink locally.
When I run it like this, it works:
mvn exec:java -Dexec.mainClass=GroupbyTest -Dexec.args="--runner=FlinkRunner
--flinkMaster=localhost:6123
--filesToStage=target/beamTest-1.0-SNAPSHOT.jar"
But when I use flink run, something goes wrong with protobuf:
./bin/flink run /home/maqy/Documents/beam_samples/beamTest/target/beamTest-1.0-SNAPSHOT.jar --runner=FlinkRunner
Here are the logs:
Using the result of 'hadoop classpath' to augment the Hadoop classpath: /usr/local/hadoop-2.7.5/etc/hadoop:/usr/local/hadoop-2.7.5/share/hadoop/common/lib/*:/usr/local/hadoop-2.7.5/share/hadoop/common/*:/usr/local/hadoop-2.7.5/share/hadoop/hdfs:/usr/local/hadoop-2.7.5/share/hadoop/hdfs/lib/*:/usr/local/hadoop-2.7.5/share/hadoop/hdfs/*:/usr/local/hadoop-2.7.5/share/hadoop/yarn/lib/*:/usr/local/hadoop-2.7.5/share/hadoop/yarn/*:/usr/local/hadoop-2.7.5/share/hadoop/mapreduce/lib/*:/usr/local/hadoop-2.7.5/share/hadoop/mapreduce/*:/usr/local/hadoop-2.7.5/contrib/capacity-scheduler/*.jar
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/maqy/%e4%b8%8b%e8%bd%bd/flink-1.4.0/lib/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.7.5/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Cluster configuration: Standalone cluster with JobManager at localhost/127.0.0.1:6123
Using address localhost:6123 to connect to JobManager.
JobManager web interface address http://localhost:8081
Starting execution of program
------------------------------------------------------------
The program finished with the following exception:
java.lang.NoSuchMethodError: com.google.protobuf.Descriptors$Descriptor.getOneofs()Ljava/util/List;
at com.google.protobuf.GeneratedMessageV3$FieldAccessorTable.<init>(GeneratedMessageV3.java:1707)
at com.google.protobuf.AnyProto.<clinit>(AnyProto.java:52)
at org.apache.beam.model.pipeline.v1.RunnerApi.<clinit>(RunnerApi.java:53271)
at org.apache.beam.model.pipeline.v1.RunnerApi$Components$TransformsDefaultEntryHolder.<clinit>(RunnerApi.java:448)
at org.apache.beam.model.pipeline.v1.RunnerApi$Components$Builder.internalGetTransforms(RunnerApi.java:1339)
at org.apache.beam.model.pipeline.v1.RunnerApi$Components$Builder.getTransformsOrDefault(RunnerApi.java:1404)
at org.apache.beam.runners.core.construction.SdkComponents.registerPTransform(SdkComponents.java:81)
at org.apache.beam.runners.core.construction.PipelineTranslation$1.visitPrimitiveTransform(PipelineTranslation.java:87)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:670)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:662)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:662)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600(TransformHierarchy.java:311)
at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:245)
at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:458)
at org.apache.beam.runners.core.construction.PipelineTranslation.toProto(PipelineTranslation.java:59)
at org.apache.beam.runners.core.construction.PipelineTranslation.toProto(PipelineTranslation.java:53)
at org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.translate(FlinkPipelineExecutionEnvironment.java:91)
at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:110)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:297)
at GroupbyTest.main(GroupbyTest.java:100)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:525)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:417)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:396)
at org.apache.flink.client.CliFrontend.executeProgram(CliFrontend.java:802)
at org.apache.flink.client.CliFrontend.run(CliFrontend.java:282)
at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1054)
at org.apache.flink.client.CliFrontend$1.call(CliFrontend.java:1101)
at org.apache.flink.client.CliFrontend$1.call(CliFrontend.java:1098)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1098)
I want to know how to fix it. Thanks.
A NoSuchMethodError typically indicates a version problem. A class with the right package and name is found in the classpath (otherwise you'd see a ClassNotFoundException) but it lacks a certain method.
This usually happens if the classpath contains the wrong version of a dependency, in your case that's probably Google ProtoBuf.
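As a hedged suggestion for diagnosing it (assuming a Maven build), you can list which protobuf versions your project pulls in and compare them with what Flink already ships in its lib directory:

mvn dependency:tree -Dincludes=com.google.protobuf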
I solved this problem by using Beam 2.6.

PropertiesLauncher working differently with slf4j bindings in newer version of spring boot

I have a spring boot project that uses the PropertiesLauncher to load a bunch of hadoop and hive jars at startup to provide connectivity to hadoop and hive. I am using slf4j with logback in my project and when I load the hive-jdbc jars they bring along log4j classes which cause a conflict. This is not an issue as long as I am using springBootVersion = '1.2.3.RELEASE' in my build.gradle.
I have configured the PropertiesLauncher in my build.gradle file:
springBoot {
    layout = 'ZIP'
}
bootRepackage {
    mainClass = 'com....Application'
    enabled = true
}
And I start the application using this command:
java -Dloader.path=file:///etc/hadoop/conf,file:///etc/hive/conf,jars,byod-ui-1.0.0.SNAPSHOT.jar -jar byod-ui-1.0.0.SNAPSHOT.jar
When the project starts up, the output looks like this:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/byod/byod-ui-1.0.0.SNAPSHOT.jar!/lib/logback-classic-1.1.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/byod/jars/hive-jdbc-0.14.0.2.2.8.0-3150-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/byod/jars/hive-jdbc.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
Notice that my application jar is detected first, followed by the hive-jdbc jars. Because my jar is detected first, the final line indicates that the selected binding is ch.qos.logback.classic.util.ContextSelectorStaticBinder, which is Logback, so everything works great.
If I change only the Spring Boot version (and nothing else in code, configuration, or jars/classpath setup) to springBootVersion = '1.3.2.RELEASE', the output now looks like this:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/byod/jars/hive-jdbc.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/byod/jars/hive-jdbc-0.14.0.2.2.8.0-3150-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/byod/byod-ui-1.0.0.SNAPSHOT.jar!/lib/logback-classic-1.1.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:53)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalArgumentException: LoggerFactory is not a Logback LoggerContext
but Logback is on the classpath. Either remove Logback or the competing implementation
(class org.slf4j.impl.Log4jLoggerFactory loaded from jar:file:/home/aq728y/byod/jars/hive-jdbc.jar!/).
If you are using WebLogic you will need to add 'org.slf4j' to prefer-application-packages in WEB-INF/weblogic.xml Object
of class [org.slf4j.impl.Log4jLoggerFactory] must be an instance of class ch.qos.logback.classic.LoggerContext
Now the order of detected bindings is different. This time the bindings are detected from the hive-jdbc jars first instead of my application jar, which leads to log4j becoming the "actual binding" at the end. This results in an error and the application startup fails.
I wanted to provide these details and post this question to see if there were changes in the recent version of Spring Boot that would explain this behavior and possibly help with a resolution. If possible, I would like to continue to use Logback and not have to switch to log4j.
In Spring Boot 1.3.x the handling of classpath order has changed.
In 1.2.x, the order was reversed, so specifying
-Dloader.path=file:///etc/hadoop/conf,file:///etc/hive/conf,jars,byod-ui-1.0.0.SNAPSHOT.jar
produced the following classpath order:
byod-ui-1.0.0.SNAPSHOT.jar
jars
file:///etc/hive/conf
file:///etc/hadoop/conf
In 1.3.x the classpath order isn't reversed anymore, so the same command-line option leads to the following classpath order:
file:///etc/hadoop/conf
file:///etc/hive/conf
jars
byod-ui-1.0.0.SNAPSHOT.jar
and this causes slf4j to pick up the bindings in the hive jars first.
So the solution is simply to reverse the order on the command line:
-Dloader.path=byod-ui-1.0.0.SNAPSHOT.jar,jars,file:///etc/hive/conf,file:///etc/hadoop/conf
See this commit for further information:
https://github.com/spring-projects/spring-boot/commit/bfa816f2a30dbc188ca563da8f28c22417d907e5

Spring web project deploy failed in tomcat but works in jetty

While using Tomcat to deploy my Spring web project, it gives me the following exception:
org.apache.catalina.startup.ContextConfig processAnnotationsJar
SEVERE: Unable to process Jar entry [com/ibm/icu/impl/data/LocaleElements_zh__PINYIN.class] from Jar [jar:file:/D:/myproject/WEB-INF/lib/icu4j-2.6.1.jar!/] for annotations
org.apache.tomcat.util.bcel.classfile.ClassFormatException: Invalid byte tag in constant pool: 60
at org.apache.tomcat.util.bcel.classfile.Constant.readConstant(Constant.java:133)
at org.apache.tomcat.util.bcel.classfile.ConstantPool.<init>(ConstantPool.java:60)
at org.apache.tomcat.util.bcel.classfile.ClassParser.readConstantPool(ClassParser.java:209)
at org.apache.tomcat.util.bcel.classfile.ClassParser.parse(ClassParser.java:119)
at org.apache.catalina.startup.ContextConfig.processAnnotationsStream(ContextConfig.java:2104)
at org.apache.catalina.startup.ContextConfig.processAnnotationsJar(ContextConfig.java:1980)
at org.apache.catalina.startup.ContextConfig.processAnnotationsUrl(ContextConfig.java:1946)
at org.apache.catalina.startup.ContextConfig.processAnnotations(ContextConfig.java:1931)
at org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1325)
at org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:878)
at org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:369)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:119)
at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5173)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:633)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:657)
at org.apache.catalina.startup.HostConfig$DeployDescriptor.run(HostConfig.java:1637)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
at java.lang.Thread.run(Thread.java:722)
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
But it works in Jetty. Can anybody tell me what the reason is?
Thanks.
This seems to be the same issue as this one: Tomcat 7 - Servlet 3.0: Invalid byte tag in constant pool (an old version of com.ibm.icu:icu4j).
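Besides upgrading icu4j, one workaround (an assumption based on common Tomcat practice, not taken from the linked answer) is to exclude the offending jar from Tomcat's annotation scanning in conf/catalina.properties:

# conf/catalina.properties (Tomcat 7 / 8.0); in practice, append icu4j*.jar to the
# default jarsToSkip list rather than overwriting it, to keep the standard exclusions
tomcat.util.scan.DefaultJarScanner.jarsToSkip=icu4j*.jar
# On Tomcat 8.5+ the property is tomcat.util.scan.StandardJarScanFilter.jarsToSkip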

java.lang.ClassNotFoundException: org.apache.hadoop.util.ProgramDriver

I am trying to run Mahout on my local system, and when I run "./bin/mahout" I get the error below. All I am trying to do is run Mahout without Hadoop and try out the 20Newsgroup example.
I did "mvn compile" and "mvn install -Dmaven.test.skip=true" in the core, distribution, and example directories. Not sure what else I am missing. I know that you can run Mahout without Hadoop running on your system.
I would appreciate it if someone could help.
hadoop binary is not in PATH,HADOOP_HOME/bin,HADOOP_PREFIX/bin, running locally
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/Aanchal/mahout-distribution-0.7/mahout-examples-0.7-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/Aanchal/mahout-distribution-0.7/lib/slf4j-jcl-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/Aanchal/mahout-distribution-0.7/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/ProgramDriver
at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:96)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.ProgramDriver
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Add this line
CLASSPATH=${CLASSPATH}:$MAHOUT_HOME/lib/hadoop/hadoop-core-0.20.204.0.jar;
to the end of this section in the mahout.sh/bat file:
# add release dependencies to CLASSPATH
for f in $MAHOUT_HOME/lib/*.jar; do
  CLASSPATH=${CLASSPATH}:$f;
done
This exception indicates that the class was not found on the classpath, i.e. we are trying to load the class definition and the class/jar containing it does not exist on the classpath.
Please check your PATH and HADOOP_HOME configuration and update these variables accordingly.
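For example, something along these lines in your shell profile (the Hadoop install path is just an assumption for illustration):

export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin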
