PropertiesLauncher working differently with slf4j bindings in newer version of Spring Boot

I have a Spring Boot project that uses the PropertiesLauncher to load a bunch of Hadoop and Hive jars at startup to provide connectivity to Hadoop and Hive. I am using SLF4J with Logback in my project, and when I load the hive-jdbc jars they bring along log4j classes, which causes a conflict. This is not an issue as long as I am using springBootVersion = '1.2.3.RELEASE' in my build.gradle.
I have configured the PropertiesLauncher in my build.gradle file:
springBoot {
    layout = 'ZIP'
}
bootRepackage {
    mainClass = 'com....Application'
    enabled = true
}
And I start the application with this command:
java -Dloader.path=file:///etc/hadoop/conf,file:///etc/hive/conf,jars,byod-ui-1.0.0.SNAPSHOT.jar -jar byod-ui-1.0.0.SNAPSHOT.jar
When the project starts up, the output looks like this:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/byod/byod-ui-1.0.0.SNAPSHOT.jar!/lib/logback-classic-1.1.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/byod/jars/hive-jdbc-0.14.0.2.2.8.0-3150-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/byod/jars/hive-jdbc.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
Notice that my application jar is detected first, followed by the hive-jdbc jars. I assume that because my jar is detected first, the final line shows the selected binding as ch.qos.logback.classic.util.ContextSelectorStaticBinder, which is Logback, so everything works great.
If I change only the Spring Boot version to springBootVersion = '1.3.2.RELEASE' (and nothing else in the code, configuration, or jars/classpath setup), the output looks like this:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/byod/jars/hive-jdbc.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/byod/jars/hive-jdbc-0.14.0.2.2.8.0-3150-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/byod/byod-ui-1.0.0.SNAPSHOT.jar!/lib/logback-classic-1.1.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:53)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalArgumentException: LoggerFactory is not a Logback LoggerContext
but Logback is on the classpath. Either remove Logback or the competing implementation
(class org.slf4j.impl.Log4jLoggerFactory loaded from jar:file:/home/aq728y/byod/jars/hive-jdbc.jar!/).
If you are using WebLogic you will need to add 'org.slf4j' to prefer-application-packages in WEB-INF/weblogic.xml Object
of class [org.slf4j.impl.Log4jLoggerFactory] must be an instance of class ch.qos.logback.classic.LoggerContext
Now the order of the detected bindings is different. This time the bindings are detected first in the hive-jdbc jars instead of my application jar, which leads to log4j becoming the "actual binding" at the end. This results in an error and the application startup fails.
I wanted to provide these details and post this question to see if there were changes in the newer version of Spring Boot that would explain this behavior and possibly help with a resolution. If possible, I would like to continue using Logback and not have to switch to log4j.

In Spring Boot 1.3.x the handling of classpath order has changed.
In 1.2.x the order was reversed, so specifying
-Dloader.path=file:///etc/hadoop/conf,file:///etc/hive/conf,jars,byod-ui-1.0.0.SNAPSHOT.jar
produced the following classpath order:
byod-ui-1.0.0.SNAPSHOT.jar
jars
file:///etc/hive/conf
file:///etc/hadoop/conf
In 1.3.x the classpath is no longer reversed, so the same command-line options lead to the following classpath order:
file:///etc/hadoop/conf
file:///etc/hive/conf
jars
byod-ui-1.0.0.SNAPSHOT.jar
and this causes SLF4J to pick up the bindings in the Hive jars first.
So the solution is to simply reverse the order on the command line:
-Dloader.path=byod-ui-1.0.0.SNAPSHOT.jar,jars,file:///etc/hive/conf,file:///etc/hadoop/conf
See this commit for further information:
https://github.com/spring-projects/spring-boot/commit/bfa816f2a30dbc188ca563da8f28c22417d907e5
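If you prefer not to encode the ordering in the launch command, PropertiesLauncher can also read loader.path from a loader.properties file in loader.home (which defaults to the working directory) or from wherever -Dloader.config.location points. A minimal sketch using the paths from the question:
# loader.properties, placed next to the application jar
loader.path=byod-ui-1.0.0.SNAPSHOT.jar,jars,file:///etc/hive/conf,file:///etc/hadoop/conf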

Related

Gradle with Apache Jena 3.11.0

I am trying to create a shadow jar including the latest Apache Jena 3.11, using the Gradle build system, plus additional Java code, to create an "uber" package. I found information here: https://jena.apache.org/documentation/notes/jena-repack.html, but I am having difficulty translating this to a Gradle setup.
Does anyone know how to achieve this?
When I run the resulting jar, this is what I get:
5 actionable tasks: 1 executed, 4 up-to-date
Creating memory store
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" java.lang.ExceptionInInitializerError
at nl.wur.ssb.RDFSimpleCon.RDFSimpleCon.createEmptyStore(RDFSimpleCon.java:150)
at nl.wur.ssb.RDFSimpleCon.RDFSimpleCon.<init>(RDFSimpleCon.java:62)
at nl.wur.ssb.RDFSimpleCon.RDFSimpleCon.<init>(RDFSimpleCon.java:159)
at nl.wur.ssb.RDFSimpleCon.Test.main(Test.java:7)
Caused by: java.lang.NullPointerException
at org.apache.jena.tdb.sys.EnvTDB.processGlobalSystemProperties(EnvTDB.java:33)
at org.apache.jena.tdb.TDB.init(TDB.java:252)
at org.apache.jena.tdb.sys.InitTDB.start(InitTDB.java:29)
at org.apache.jena.sys.JenaSystem.lambda$init$2(JenaSystem.java:116)
at java.util.ArrayList.forEach(ArrayList.java:1257)
at org.apache.jena.sys.JenaSystem.forEach(JenaSystem.java:191)
at org.apache.jena.sys.JenaSystem.forEach(JenaSystem.java:168)
at org.apache.jena.sys.JenaSystem.init(JenaSystem.java:114)
at org.apache.jena.tdb.TDBFactory.<clinit>(TDBFactory.java:40)
... 4 more
Java ServiceLoader files need to be merged when creating a repackaged jar file.
For Gradle, this is done with mergeServiceFiles() when using the Shadow plugin's shadowJar task.
https://jena.apache.org/documentation/notes/jena-repack.html has the instructions for the Maven Shade plugin.
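A minimal build.gradle sketch of this setup (the Shadow plugin version and the Jena artifact coordinates here are assumptions; pick whatever matches your build):
plugins {
    id 'java'
    // Shadow plugin version is an assumption; use one compatible with your Gradle.
    id 'com.github.johnrengelman.shadow' version '5.2.0'
}

repositories {
    mavenCentral()
}

dependencies {
    // apache-jena-libs pulls in the full Jena module set transitively.
    implementation 'org.apache.jena:apache-jena-libs:3.11.0'
}

shadowJar {
    // Merge META-INF/services/* entries so JenaSystem.init() can find
    // all of Jena's subsystem registrations in the repackaged jar.
    mergeServiceFiles()
}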

Apache Beam run with Flink throws NoSuchMethodError

I built a Beam program jar with Maven, and I want to run it with Flink locally.
When I run it like this, it works fine:
mvn exec:java -Dexec.mainClass=GroupbyTest -Dexec.args="--runner=FlinkRunner --flinkMaster=localhost:6123 --filesToStage=target/beamTest-1.0-SNAPSHOT.jar"
But when I use flink run, something goes wrong with protobuf:
./bin/flink run /home/maqy/Documents/beam_samples/beamTest/target/beamTest-1.0-SNAPSHOT.jar --runner=FlinkRunner
and these are the logs:
Using the result of 'hadoop classpath' to augment the Hadoop classpath: /usr/local/hadoop-2.7.5/etc/hadoop:/usr/local/hadoop-2.7.5/share/hadoop/common/lib/*:/usr/local/hadoop-2.7.5/share/hadoop/common/*:/usr/local/hadoop-2.7.5/share/hadoop/hdfs:/usr/local/hadoop-2.7.5/share/hadoop/hdfs/lib/*:/usr/local/hadoop-2.7.5/share/hadoop/hdfs/*:/usr/local/hadoop-2.7.5/share/hadoop/yarn/lib/*:/usr/local/hadoop-2.7.5/share/hadoop/yarn/*:/usr/local/hadoop-2.7.5/share/hadoop/mapreduce/lib/*:/usr/local/hadoop-2.7.5/share/hadoop/mapreduce/*:/usr/local/hadoop-2.7.5/contrib/capacity-scheduler/*.jar
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/maqy/%e4%b8%8b%e8%bd%bd/flink-1.4.0/lib/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.7.5/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Cluster configuration: Standalone cluster with JobManager at localhost/127.0.0.1:6123
Using address localhost:6123 to connect to JobManager.
JobManager web interface address http://localhost:8081
Starting execution of program
------------------------------------------------------------
The program finished with the following exception:
java.lang.NoSuchMethodError: com.google.protobuf.Descriptors$Descriptor.getOneofs()Ljava/util/List;
at com.google.protobuf.GeneratedMessageV3$FieldAccessorTable.<init>(GeneratedMessageV3.java:1707)
at com.google.protobuf.AnyProto.<clinit>(AnyProto.java:52)
at org.apache.beam.model.pipeline.v1.RunnerApi.<clinit>(RunnerApi.java:53271)
at org.apache.beam.model.pipeline.v1.RunnerApi$Components$TransformsDefaultEntryHolder.<clinit>(RunnerApi.java:448)
at org.apache.beam.model.pipeline.v1.RunnerApi$Components$Builder.internalGetTransforms(RunnerApi.java:1339)
at org.apache.beam.model.pipeline.v1.RunnerApi$Components$Builder.getTransformsOrDefault(RunnerApi.java:1404)
at org.apache.beam.runners.core.construction.SdkComponents.registerPTransform(SdkComponents.java:81)
at org.apache.beam.runners.core.construction.PipelineTranslation$1.visitPrimitiveTransform(PipelineTranslation.java:87)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:670)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:662)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:662)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600(TransformHierarchy.java:311)
at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:245)
at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:458)
at org.apache.beam.runners.core.construction.PipelineTranslation.toProto(PipelineTranslation.java:59)
at org.apache.beam.runners.core.construction.PipelineTranslation.toProto(PipelineTranslation.java:53)
at org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.translate(FlinkPipelineExecutionEnvironment.java:91)
at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:110)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:297)
at GroupbyTest.main(GroupbyTest.java:100)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:525)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:417)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:396)
at org.apache.flink.client.CliFrontend.executeProgram(CliFrontend.java:802)
at org.apache.flink.client.CliFrontend.run(CliFrontend.java:282)
at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1054)
at org.apache.flink.client.CliFrontend$1.call(CliFrontend.java:1101)
at org.apache.flink.client.CliFrontend$1.call(CliFrontend.java:1098)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1098)
I want to know how to fix it. Thanks.
A NoSuchMethodError typically indicates a version problem. A class with the right package and name is found in the classpath (otherwise you'd see a ClassNotFoundException) but it lacks a certain method.
This usually happens if the classpath contains the wrong version of a dependency; in your case, that's probably Google Protobuf.
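To see which protobuf versions actually end up on the classpath, Maven's dependency tree is a quick check (standard Maven, no extra plugins needed):
mvn dependency:tree -Dincludes=com.google.protobuf
From there you can exclude or pin the conflicting version; keep in mind that the Flink distribution's own lib directory also contributes jars when you use flink run.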
I have solved this problem by using Beam 2.6.

slf4j multiple bindings with Mahout on Amazon EMR

I'm running a Mahout job on Amazon EMR and getting the following exception:
ArrayUtil.oversize(II)I
attempt_201311181700_0002_m_000000_0: SLF4J: Class path contains multiple SLF4J bindings.
attempt_201311181700_0002_m_000000_0: SLF4J: Found binding in [jar:file:/home/hadoop/lib/slf4j-log4j12-1.7.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
attempt_201311181700_0002_m_000000_0: SLF4J: Found binding in [jar:file:/mnt/var/lib/hadoop/mapred/taskTracker/hadoop/jobcache/job_201311181700_0002/jars/job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
attempt_201311181700_0002_m_000000_0: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
attempt_201311181700_0002_m_000000_0: SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Error: org.apache.lucene.util.ArrayUtil.oversize(II)I
attempt_201311181700_0002_m_000000_1: SLF4J: Class path contains multiple SLF4J bindings.
attempt_201311181700_0002_m_000000_1: SLF4J: Found binding in [jar:file:/home/hadoop/lib/slf4j-log4j12-1.7.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
attempt_201311181700_0002_m_000000_1: SLF4J: Found binding in [jar:file:/mnt/var/lib/hadoop/mapred/taskTracker/hadoop/jobcache/job_201311181700_0002/jars/job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
attempt_201311181700_0002_m_000000_1: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
attempt_201311181700_0002_m_000000_1: SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Error: org.apache.lucene.util.ArrayUtil.oversize(II)I
Exception in thread "main" java.lang.IllegalStateException: Job failed!
at org.apache.mahout.vectorizer.collocations.llr.CollocDriver.generateCollocations(CollocDriver.java:238)
at org.apache.mahout.vectorizer.collocations.llr.CollocDriver.generateAllGrams(CollocDriver.java:187)
at org.apache.mahout.vectorizer.DictionaryVectorizer.createTermFrequencyVectors(DictionaryVectorizer.java:184)
at clustering.AmazonClusteringDriver.main(AmazonClusteringDriver.java:122)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:187)
I excluded the slf4j dependency from the Mahout dependency; however, it doesn't solve the problem. So where is the problem?
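For reference, an exclusion of this kind would look roughly like the following in the pom (the Mahout artifact and version here are illustrative, not taken from the question):
<dependency>
    <groupId>org.apache.mahout</groupId>
    <artifactId>mahout-core</artifactId>
    <version>0.8</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
    </exclusions>
</dependency>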
This is the wrong place to ask this.
You should be asking on the Mahout developer mailing list:
https://cwiki.apache.org/confluence/display/MAHOUT/Mailing+Lists,+IRC+and+Archives#MailingLists%2CIRCandArchives-MahoutUserList

java.lang.ClassNotFoundException: org.apache.hadoop.util.ProgramDriver

I am trying to run Mahout on my local system, and when I run ./bin/mahout I get the error below. All I am trying to do is run Mahout without Hadoop and try out the 20Newsgroups example.
I did mvn compile and mvn install -Dmaven.test.skip=true in the core, distribution, and examples directories. I am not sure what else I am missing. I know that you can run Mahout without Hadoop running on your system.
I would appreciate it if someone could help.
hadoop binary is not in PATH,HADOOP_HOME/bin,HADOOP_PREFIX/bin, running locally
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/Aanchal/mahout-distribution-0.7/mahout-examples-0.7-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/Aanchal/mahout-distribution-0.7/lib/slf4j-jcl-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/Aanchal/mahout-distribution-0.7/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/ProgramDriver
at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:96)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.ProgramDriver
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Add this line
CLASSPATH=${CLASSPATH}:$MAHOUT_HOME/lib/hadoop/hadoop-core-0.20.204.0.jar;
to the end of this section in the mahout.sh/bat file:
# add release dependencies to CLASSPATH
for f in $MAHOUT_HOME/lib/*.jar; do
  CLASSPATH=${CLASSPATH}:$f;
done
This exception indicates that the class was not found on the classpath, i.e. we are trying to load the class definition and the class/jar containing it does not exist on the classpath.
Please check your PATH and HADOOP_HOME configuration and update these variables accordingly.
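A minimal sketch of what that environment setup usually looks like (the install path below is an assumption; point it at your actual Hadoop directory):
# e.g. in ~/.bashrc or the shell that launches Mahout
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin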

SLF4J: java.lang.IllegalStateException: org.slf4j.LoggerFactory could not be successfully initialized

I have the following Maven dependency in my pom file:
<!-- depends on slf4j-api, log4j -->
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.6.4</version>
</dependency>
When I deploy the project to Tomcat, I get this error message:
SEVERE: Exception sending context destroyed event to listener instance of class org.springframework.web.context.ContextLoaderListener
java.lang.ExceptionInInitializerError
at org.springframework.web.context.ContextLoaderListener.contextDestroyed(ContextLoaderListener.java:80)
at org.apache.catalina.core.StandardContext.listenerStop(StandardContext.java:4819)
at org.apache.catalina.core.StandardContext.stopInternal(StandardContext.java:5466)
at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:232)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:160)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:895)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:871)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:615)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:958)
at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1599)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
at java.lang.Thread.run(Thread.java:722)
Caused by: java.lang.IllegalStateException: org.slf4j.LoggerFactory could not be successfully initialized. See also http://www.slf4j.org/codes.html#unsuccessfulInit
at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:288)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:252)
at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:155)
at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:131)
at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:685)
at org.springframework.web.context.ContextCleanupListener.<clinit>(ContextCleanupListener.java:43)
... 16 more
When I look inside the deployed war file, I see the following jars (among others):
slf4j-api-1.6.4.jar
slf4j-log4j12-1.6.4.jar
The strange thing is that I don't see any log4j.jar there (even though it's a dependency of slf4j-log4j12-1.6.4.jar).
Questions:
Why was log4j.jar not packed in the war file?
What does the error message mean and how to solve it?
I had this issue due to a bad jar file in the repository. Deleting the entire log4j directory in the Maven repository fixed it: once I did Maven > Update Dependencies, it re-downloaded them.
The simple answer is that you didn't declare it as a dependency in your pom. SLF4J is a logging facade, which means you have to supply the real implementation alongside it. The error message gives you a hint pointing to the explanation of the cause of this error.
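A sketch of declaring the implementation explicitly (the log4j version here is an assumption; use one compatible with slf4j-log4j12 1.6.4):
<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.17</version>
</dependency>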
I got this error too, and I actually had the log4j.jar in my war file.
But it turned out to be a classloader issue in my case: I had a jar in shared/lib that tried to log with SLF4J, but that classloader did not have log4j available.
It seems as if you need to downgrade your project to slf4j 1.4.2 to work with the log4j available in your Tomcat; they are binary incompatible. You would then also mark those dependencies as provided, so you don't include them twice.
The alternative is to get the other project to include the libraries themselves and remove them from Tomcat's common directory. I know of no other exclusion mechanism on Tomcat.
For reference, and probably not helpful here: I currently deploy on WebLogic, and it has a deployment descriptor that allows one to exclude server common classes and use bundled classes instead, such as the snippet below.
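A minimal sketch of such a weblogic.xml descriptor, along the lines of the prefer-application-packages hint quoted in the first question's error message (the package list is illustrative):
<!-- WEB-INF/weblogic.xml -->
<weblogic-web-app xmlns="http://xmlns.oracle.com/weblogic/weblogic-web-app">
    <container-descriptor>
        <prefer-application-packages>
            <package-name>org.slf4j</package-name>
        </prefer-application-packages>
    </container-descriptor>
</weblogic-web-app>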
