com.interwoven.cssdk.common.CSException: (java.net.SocketException: Connection reset)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.interwoven.cssdk.client.axis.common.AxisExceptionTranslator.translateException(AxisExceptionTranslator.java:72)
at com.interwoven.cssdk.client.axis.access.AccessServiceAdapterImpl.beginSessionUsingPassword(AccessServiceAdapterImpl.java:228)
at com.interwoven.cssdk.client.axis.common.AxisFactory.getClient(AxisFactory.java:273)
Root cause:
AxisFault
faultCode: {http://schemas.xmlsoap.org/soap/envelope/}Server.userException
faultSubcode:
faultString: java.net.SocketException: Connection reset
faultActor:
faultNode:
faultDetail:
{http://xml.apache.org/axis/}stackTrace:java.net.SocketException: Connection reset
at java.net.SocketInputStream.read(SocketInputStream.java:210)
at java.net.SocketInputStream.read(SocketInputStream.java:141)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
I have failed to convert a Spring Boot fat jar containing the H2 driver into a GraalVM native image. The exception message follows:
Excluding 0 auto-configurations from spring.factories file
Processing existing META-INF/spring.components files...
Registered 15 entries
Configuring initialization time for specific types and packages:
#87 buildtime-init-classes #23 buildtime-init-packages #33 runtime-init-classes #1 runtime-init-packages
Error: Incompatible change of initialization policy for org.springframework.boot.validation.MessageInterpolatorFactory: trying to change RUN_TIME from the command line to BUILD_TIME from feature org.springframework.graalvm.support.InitializationHandler.register
com.oracle.svm.core.util.UserError$UserException: Incompatible change of initialization policy for org.springframework.boot.validation.MessageInterpolatorFactory: trying to change RUN_TIME from the command line to BUILD_TIME from feature org.springframework.graalvm.support.InitializationHandler.register
at com.oracle.svm.core.util.UserError.abort(UserError.java:68)
at com.oracle.svm.hosted.classinitialization.ClassInitializationConfiguration.insertRec(ClassInitializationConfiguration.java:98)
at com.oracle.svm.hosted.classinitialization.ClassInitializationConfiguration.insertRec(ClassInitializationConfiguration.java:112)
at com.oracle.svm.hosted.classinitialization.ClassInitializationConfiguration.insertRec(ClassInitializationConfiguration.java:112)
at com.oracle.svm.hosted.classinitialization.ClassInitializationConfiguration.insertRec(ClassInitializationConfiguration.java:112)
at com.oracle.svm.hosted.classinitialization.ClassInitializationConfiguration.insertRec(ClassInitializationConfiguration.java:112)
at com.oracle.svm.hosted.classinitialization.ClassInitializationConfiguration.insertRec(ClassInitializationConfiguration.java:112)
at com.oracle.svm.hosted.classinitialization.ClassInitializationConfiguration.insert(ClassInitializationConfiguration.java:63)
at com.oracle.svm.hosted.classinitialization.ConfigurableClassInitialization.initializeAtBuildTime(ConfigurableClassInitialization.java:392)
at org.graalvm.nativeimage.hosted.RuntimeClassInitialization.initializeAtBuildTime(RuntimeClassInitialization.java:118)
at org.springframework.graalvm.support.InitializationHandler.register(InitializationHandler.java:52)
at org.springframework.graalvm.support.SpringFeature.beforeAnalysis(SpringFeature.java:79)
at com.oracle.svm.hosted.NativeImageGenerator.lambda$runPointsToAnalysis$7(NativeImageGenerator.java:679)
at com.oracle.svm.hosted.FeatureHandler.forEachFeature(FeatureHandler.java:70)
at com.oracle.svm.hosted.NativeImageGenerator.runPointsToAnalysis(NativeImageGenerator.java:679)
at com.oracle.svm.hosted.NativeImageGenerator.doRun(NativeImageGenerator.java:538)
at com.oracle.svm.hosted.NativeImageGenerator.lambda$run$0(NativeImageGenerator.java:451)
at java.util.concurrent.ForkJoinTask$AdaptedRunnableAction.exec(ForkJoinTask.java:1386)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
Error: Image build request failed with exit status 1
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 24.097 s
[INFO] Finished at: 2020-06-27T11:41:49+08:00
[INFO] Final Memory: 53M/1388M
[INFO] ------------------------------------------------------------------------
I did run the application with the agent first to obtain the config files, and I include "-H:ConfigurationResourceRoots=nativeimage" in the native-image-maven-plugin configuration.
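For reference, a minimal sketch of how that flag is wired into the plugin (not my exact pom; the nativeimage directory holding the agent-generated JSON files is assumed to be on the image classpath, and the files themselves came from a run with -agentlib:native-image-agent=config-output-dir=...):

    <plugin>
        <groupId>org.graalvm.nativeimage</groupId>
        <artifactId>native-image-maven-plugin</artifactId>
        <version>20.1.0</version>
        <configuration>
            <!-- Point the image builder at the agent-generated reflect/resource/proxy
                 JSON files, assumed to be packaged under a "nativeimage" directory
                 on the classpath. -->
            <buildArgs>-H:ConfigurationResourceRoots=nativeimage</buildArgs>
        </configuration>
        <executions>
            <execution>
                <goals>
                    <goal>native-image</goal>
                </goals>
                <phase>package</phase>
            </execution>
        </executions>
    </plugin>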
The module versions used in my build environment:
native-image-maven-plugin: 20.1.0
spring-graalvm-native: 0.7.0
graal-sdk: 20.1.0
spring-boot-starter-parent: 2.3.0.RELEASE
spring-context-indexer
Can this be solved through configuration, or by some other means?
EDIT # 2020/07/01 --
I got the following error with
spring-boot-starter-parent:2.4.0-SNAPSHOT
spring-graalvm-native:0.8.0-SNAPSHOT
and
<buildArgs>-H:+ReportExceptionStackTraces -H:+TraceClassInitialization --allow-incomplete-classpath -H:+RemoveSaturatedTypeFlows -R:MaxHeapSize=16g -J-Xmx16G</buildArgs>
Warning: class initialization of class org.springframework.boot.validation.MessageInterpolatorFactory failed with exception java.lang.NoClassDefFoundError: javax/validation/ValidationException. This class will be initialized at run time because option --allow-incomplete-classpath is used for image building. Use the option --initialize-at-run-time=org.springframework.boot.validation.MessageInterpolatorFactory to explicitly request delayed initialization of this class.
Number of types dynamically registered for reflective access: #2826
[com.xxx.xxx.xxx.xxx:4160331] analysis: 68,477.47 ms, 5.20 GB
Error: Classes that should be initialized at run time got initialized during image building:
com.alibaba.druid.mock.MockDriver was unintentionally initialized at build time. org.h2.Driver caused initialization of this class with the following trace:
at com.alibaba.druid.mock.MockDriver.<clinit>(MockDriver.java)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at java.sql.DriverManager$2.run(DriverManager.java:603)
at java.sql.DriverManager$2.run(DriverManager.java:583)
at java.security.AccessController.doPrivileged(Native Method)
at java.sql.DriverManager.loadInitialDrivers(DriverManager.java:583)
at java.sql.DriverManager.<clinit>(DriverManager.java:101)
at org.h2.Driver.load(Driver.java:155)
at org.h2.Driver.<clinit>(Driver.java:41)
com.mysql.jdbc.NonRegisteringDriver was unintentionally initialized at build time. org.h2.Driver caused initialization of this class with the following trace:
at com.mysql.jdbc.NonRegisteringDriver.<clinit>(NonRegisteringDriver.java)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at java.sql.DriverManager$2.run(DriverManager.java:603)
at java.sql.DriverManager$2.run(DriverManager.java:583)
at java.security.AccessController.doPrivileged(Native Method)
at java.sql.DriverManager.loadInitialDrivers(DriverManager.java:583)
at java.sql.DriverManager.<clinit>(DriverManager.java:101)
at org.h2.Driver.load(Driver.java:155)
at org.h2.Driver.<clinit>(Driver.java:41)
com.mysql.fabric.jdbc.FabricMySQLDriver was unintentionally initialized at build time. org.h2.Driver caused initialization of this class with the following trace:
at com.mysql.fabric.jdbc.FabricMySQLDriver.<clinit>(FabricMySQLDriver.java)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at java.sql.DriverManager$2.run(DriverManager.java:603)
at java.sql.DriverManager$2.run(DriverManager.java:583)
at java.security.AccessController.doPrivileged(Native Method)
at java.sql.DriverManager.loadInitialDrivers(DriverManager.java:583)
at java.sql.DriverManager.<clinit>(DriverManager.java:101)
at org.h2.Driver.load(Driver.java:155)
at org.h2.Driver.<clinit>(Driver.java:41)
com.alibaba.druid.proxy.DruidDriver was unintentionally initialized at build time. org.h2.Driver caused initialization of this class with the following trace:
at com.alibaba.druid.proxy.DruidDriver.<clinit>(DruidDriver.java)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at java.sql.DriverManager$2.run(DriverManager.java:603)
at java.sql.DriverManager$2.run(DriverManager.java:583)
at java.security.AccessController.doPrivileged(Native Method)
at java.sql.DriverManager.loadInitialDrivers(DriverManager.java:583)
at java.sql.DriverManager.<clinit>(DriverManager.java:101)
at org.h2.Driver.load(Driver.java:155)
at org.h2.Driver.<clinit>(Driver.java:41)
com.mysql.jdbc.Driver was unintentionally initialized at build time. org.h2.Driver caused initialization of this class with the following trace:
at com.mysql.jdbc.Driver.<clinit>(Driver.java)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at java.sql.DriverManager$2.run(DriverManager.java:603)
at java.sql.DriverManager$2.run(DriverManager.java:583)
at java.security.AccessController.doPrivileged(Native Method)
at java.sql.DriverManager.loadInitialDrivers(DriverManager.java:583)
at java.sql.DriverManager.<clinit>(DriverManager.java:101)
at org.h2.Driver.load(Driver.java:155)
at org.h2.Driver.<clinit>(Driver.java:41)
com.oracle.svm.core.util.UserError$UserException: Classes that should be initialized at run time got initialized during image building:
at com.oracle.svm.core.util.UserError.abort(UserError.java:68)
at com.oracle.svm.hosted.classinitialization.ConfigurableClassInitialization.checkDelayedInitialization(ConfigurableClassInitialization.java:518)
at com.oracle.svm.hosted.classinitialization.ClassInitializationFeature.duringAnalysis(ClassInitializationFeature.java:187)
at com.oracle.svm.hosted.NativeImageGenerator.lambda$runPointsToAnalysis$8(NativeImageGenerator.java:720)
at com.oracle.svm.hosted.FeatureHandler.forEachFeature(FeatureHandler.java:70)
at com.oracle.svm.hosted.NativeImageGenerator.runPointsToAnalysis(NativeImageGenerator.java:720)
at com.oracle.svm.hosted.NativeImageGenerator.doRun(NativeImageGenerator.java:538)
at com.oracle.svm.hosted.NativeImageGenerator.lambda$run$0(NativeImageGenerator.java:451)
at java.util.concurrent.ForkJoinTask$AdaptedRunnableAction.exec(ForkJoinTask.java:1386)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
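For reference, explicitly forcing the drivers named in the error above to build-time initialization would presumably mean adding something along these lines to the buildArgs (a sketch built from the class names in the error, not the exact configuration I used):

    --initialize-at-build-time=com.alibaba.druid.mock.MockDriver,com.alibaba.druid.proxy.DruidDriver,com.mysql.jdbc.Driver,com.mysql.jdbc.NonRegisteringDriver,com.mysql.fabric.jdbc.FabricMySQLDriver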
EDIT # 2020/07/02 --
I got the following error when I tried to initialize the "unintentionally initialized" classes at build time.
Warning: class initialization of class org.springframework.boot.validation.MessageInterpolatorFactory failed with exception java.lang.NoClassDefFoundError: javax/validation/ValidationException. This class will be initialized at run time because option --allow-incomplete-classpath is used for image building. Use the option --initialize-at-run-time=org.springframework.boot.validation.MessageInterpolatorFactory to explicitly request delayed initialization of this class.
Number of types dynamically registered for reflective access: #2824
[com.xxx.xxx.xxx.xxx:1276846] (clinit): 1,639.21 ms, 6.48 GB
[com.xxx.xxx.xxx.xxx:1276846] (typeflow): 41,244.42 ms, 6.48 GB
[com.xxx.xxx.xxx.xxx:1276846] (objects): 34,338.05 ms, 6.48 GB
[com.xxx.xxx.xxx.xxx:1276846] (features): 9,918.59 ms, 6.48 GB
[com.xxx.xxx.xxx.xxx:1276846] analysis: 89,221.06 ms, 6.48 GB
Error: Unsupported features in 2 methods
Detailed message:
Error: type is not available in this platform: org.graalvm.compiler.hotspot.management.SVMMBean
Trace: Object was reached by
reading field com.sun.jmx.mbeanserver.NamedObject.object of
constant com.sun.jmx.mbeanserver.NamedObject#4e61fbc2 reached by
reading field java.util.HashMap$Node.value of
constant java.util.HashMap$Node#41a5eec0 reached by
reading field java.util.HashMap$Node.next of
constant java.util.HashMap$Node#4a321ac0 reached by
indexing into array
constant java.util.HashMap$Node[]#5ec2c4b0 reached by
reading field java.util.HashMap.table of
constant java.util.HashMap#441dc7c0 reached by
reading field java.util.HashMap$Node.value of
constant java.util.HashMap$Node#8f15e1fa reached by
indexing into array
constant java.util.HashMap$Node[]#2a17237f reached by
reading field java.util.HashMap.table of
constant java.util.HashMap#5eace3ce reached by
reading field com.sun.jmx.mbeanserver.Repository.domainTb of
constant com.sun.jmx.mbeanserver.Repository#1397c899 reached by
reading field com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.repository of
constant com.sun.jmx.interceptor.DefaultMBeanServerInterceptor#1a8e83d1 reached by
reading field com.sun.jmx.mbeanserver.JmxMBeanServer.mbsInterceptor of
constant com.sun.jmx.mbeanserver.JmxMBeanServer#1d840f77 reached by
reading field java.lang.ref.Reference.referent of
constant com.sun.jmx.mbeanserver.WeakIdentityHashMap$IdentityWeakReference#1d840f77 reached by
reading field java.util.HashMap$Node.key of
constant java.util.HashMap$Node#61056459 reached by
indexing into array
constant java.util.HashMap$Node[]#1db861a7 reached by
reading field java.util.HashMap.table of
constant java.util.HashMap#61056459 reached by
reading field com.sun.jmx.mbeanserver.WeakIdentityHashMap.map of
constant com.sun.jmx.mbeanserver.WeakIdentityHashMap#53bb4e30 reached by
scanning method com.sun.jmx.mbeanserver.MXBeanLookup.lookupFor(MXBeanLookup.java:93)
Call path from entry point to com.sun.jmx.mbeanserver.MXBeanLookup.lookupFor(MBeanServerConnection):
at com.sun.jmx.mbeanserver.MXBeanLookup.lookupFor(MXBeanLookup.java:93)
at com.sun.jmx.mbeanserver.MXBeanProxy.invoke(MXBeanProxy.java:162)
at javax.management.MBeanServerInvocationHandler.invoke(MBeanServerInvocationHandler.java:258)
at com.sun.proxy.$Proxy282.duplicate(Unknown Source)
at com.mysql.jdbc.StatementImpl$CancelTask$1.run(StatementImpl.java:124)
at com.oracle.svm.core.thread.JavaThreads.threadStartRoutine(JavaThreads.java:517)
at com.oracle.svm.core.posix.thread.PosixJavaThreads.pthreadStartRoutine(PosixJavaThreads.java:193)
at com.oracle.svm.core.code.IsolateEnterStub.PosixJavaThreads_pthreadStartRoutine_e1f4a8c0039f8337338252cd8734f63a79b5e3df(generated:0)
This repeats six times with different call paths.
After upgrading to Spring Boot 2.0, my previously working queries are throwing the following exception.
Please help me here.
Caused by: java.lang.IllegalArgumentException: org.hibernate.hql.internal.ast.QuerySyntaxException: Invalid path: 'com.wtp.core.constant.RECORDSTATUS.ACTIVE' [SELECT l FROM com.wtp.core.entity.ListingComments l WHERE l.listingID = :listingID AND l.recordStatus=com.wtp.core.constant.RECORDSTATUS.ACTIVE AND l.listingStatus =com.wtp.core.constant.LISTINGSTATUS.ACTIVE ORDER BY l.createdOn asc]
at org.hibernate.internal.ExceptionConverterImpl.convert(ExceptionConverterImpl.java:133) ~[hibernate-core-5.2.14.Final.jar:5.2.14.Final]
at org.hibernate.internal.ExceptionConverterImpl.convert(ExceptionConverterImpl.java:157) ~[hibernate-core-5.2.14.Final.jar:5.2.14.Final]
at org.hibernate.internal.ExceptionConverterImpl.convert(ExceptionConverterImpl.java:164) ~[hibernate-core-5.2.14.Final.jar:5.2.14.Final]
at org.hibernate.internal.AbstractSharedSessionContract.createQuery(AbstractSharedSessionContract.java:670) ~[hibernate-core-5.2.14.Final.jar:5.2.14.Final]
at org.hibernate.internal.AbstractSessionImpl.createQuery(AbstractSessionImpl.java:23) ~[hibernate-core-5.2.14.Final.jar:5.2.14.Final]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_131]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_131]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_131]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_131]
at org.springframework.orm.jpa.ExtendedEntityManagerCreator$ExtendedEntityManagerInvocationHandler.invoke(ExtendedEntityManagerCreator.java:350) ~[spring-orm-5.0.4.RELEASE.jar:5.0.4.RELEASE]
at com.sun.proxy.$Proxy120.createQuery(Unknown Source) ~[na:na]
at org.springframework.data.jpa.repository.query.SimpleJpaQuery.validateQuery(SimpleJpaQuery.java:87) ~[spring-data-jpa-2.0.5.RELEASE.jar:2.0.5.RELEASE]
... 76 common frames omitted
Caused by: org.hibernate.hql.internal.ast.QuerySyntaxException: Invalid path: 'com.wtp.core.constant.RECORDSTATUS.ACTIVE' [SELECT l FROM com.wtp.core.entity.ListingComments l WHERE l.listingID = :listingID AND l.recordStatus=com.wtp.core.constant.RECORDSTATUS.ACTIVE AND l.listingStatus =com.wtp.core.constant.LISTINGSTATUS.ACTIVE ORDER BY l.createdOn asc]
at org.hibernate.hql.internal.ast.QuerySyntaxException.convert(QuerySyntaxException.java:74) ~[hibernate-core-5.2.14.Final.jar:5.2.14.Final]
at org.hibernate.hql.internal.ast.ErrorCounter.throwQueryException(ErrorCounter.java:91) ~[hibernate-core-5.2.14.Final.jar:5.2.14.Final]
at org.hibernate.hql.internal.ast.QueryTranslatorImpl.analyze(QueryTranslatorImpl.java:272) ~[hibernate-core-5.2.14.Final.jar:5.2.14.Final]
at org.hibernate.hql.internal.ast.QueryTranslatorImpl.doCompile(QueryTranslatorImpl.java:189) ~[hibernate-core-5.2.14.Final.jar:5.2.14.Final]
at org.hibernate.hql.internal.ast.QueryTranslatorImpl.compile(QueryTranslatorImpl.java:141) ~[hibernate-core-5.2.14.Final.jar:5.2.14.Final]
at org.hibernate.engine.query.spi.HQLQueryPlan.<init>(HQLQueryPlan.java:115) ~[hibernate-core-5.2.14.Final.jar:5.2.14.Final]
at org.hibernate.engine.query.spi.HQLQueryPlan.<init>(HQLQueryPlan.java:77) ~[hibernate-core-5.2.14.Final.jar:5.2.14.Final]
at org.hibernate.engine.query.spi.QueryPlanCache.getHQLQueryPlan(QueryPlanCache.java:153) ~[hibernate-core-5.2.14.Final.jar:5.2.14.Final]
at org.hibernate.internal.AbstractSharedSessionContract.getQueryPlan(AbstractSharedSessionContract.java:553) ~[hibernate-core-5.2.14.Final.jar:5.2.14.Final]
at org.hibernate.internal.AbstractSharedSessionContract.createQuery(AbstractSharedSessionContract.java:662) ~[hibernate-core-5.2.14.Final.jar:5.2.14.Final]
... 84 common frames omitted
I believe this is a minor bug in Hibernate 5.2. I ran into it as well. After some investigation I worked out that it is caused by the enum class names being written in all capitals.
Just rename your enums to RecordStatus and ListingStatus and it should work (LISTINGSTATUS is not standard Java naming convention anyway).
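For example, with both enums renamed (assuming they stay in the same package), the query from the stack trace would reference them as:

    SELECT l FROM com.wtp.core.entity.ListingComments l
    WHERE l.listingID = :listingID
    AND l.recordStatus = com.wtp.core.constant.RecordStatus.ACTIVE
    AND l.listingStatus = com.wtp.core.constant.ListingStatus.ACTIVE
    ORDER BY l.createdOn asc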
Bug is reported here: https://hibernate.atlassian.net/browse/HHH-12429
Update:
I asked for this issue to be closed since it is intended behaviour as explained here: https://vladmihalcea.com/the-performance-penalty-of-class-forname-when-parsing-jpql-and-criteria-queries
If you want to go back to the old behaviour, set the property hibernate.query.conventional_java_constants to false.
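In a Spring Boot 2.x application, one way to do that (a sketch assuming properties-based configuration, relying on spring.jpa.properties.* being passed through to Hibernate) is to add the following to application.properties:

    # Pass the Hibernate setting through Spring Boot's JPA auto-configuration
    spring.jpa.properties.hibernate.query.conventional_java_constants=false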
I'm following this blog post that partitions S3 Access Logs by date using Hive and EMR. I was able to run this script against a small bucket of access logs okay, but table creation on top of a large bucket (~ 1.5 TB) fails with the following error:
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.thrift.transport.TTransportException
I looked through the Hive logs under /mnt/var/log/hive, but nothing stands out. I'm not sure what the problem is, as this error is pretty generic. I'm pretty much following the blog post verbatim, and the script errors out on the following statement after 10 or 15 minutes:
CREATE EXTERNAL TABLE IF NOT EXISTS Accesslogs(....
Update: I found more log info and also ran Hive in debug mode. EMR gets intermittent connection failures to the metastore and then finally fails:
.........
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2183) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1839) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1526) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233) [hive-cli-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184) [hive-cli-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403) [hive-cli-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) [hive-cli-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) [hive-cli-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686) [hive-cli-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_151]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_151]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_151]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_151]
at org.apache.hadoop.util.RunJar.run(RunJar.java:221) [hadoop-common-2.7.3-amzn-5.jar:?]
at org.apache.hadoop.util.RunJar.main(RunJar.java:136) [hadoop-common-2.7.3-amzn-5.jar:?]
Caused by: java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method) ~[?:1.8.0_151]
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350) ~[?:1.8.0_151]
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206) ~[?:1.8.0_151]
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188) ~[?:1.8.0_151]
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392) ~[?:1.8.0_151]
at java.net.Socket.connect(Socket.java:589) ~[?:1.8.0_151]
at org.apache.thrift.transport.TSocket.open(TSocket.java:221) ~[hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
... 33 more
2017-12-10T15:18:18,718 INFO [e74af478-3227-4bf9-9fde-74d8babcf5f0 main([])]: hive.metastore (HiveMetaStoreClient.java:open(506)) - Waiting 1 seconds before next connection attempt.
2017-12-10T15:18:19,719 INFO [e74af478-3227-4bf9-9fde-74d8babcf5f0 main([])]: hive.metastore (HiveMetaStoreClient.java:open(392)) - Trying to connect to metastore with URI thrift://ip-172-50-31-107.ec2.internal:9083
2017-12-10T15:18:19,719 WARN [e74af478-3227-4bf9-9fde-74d8babcf5f0 main([])]: hive.metastore (HiveMetaStoreClient.java:open(472)) - Failed to connect to the MetaStore Server...
org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused (Connection refused)
at org.apache.thrift.transport.TSocket.open(TSocket.java:226) ~[hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:465) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.reconnect(HiveMetaStoreClient.java:335) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:163) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at com.sun.proxy.$Proxy37.createTable(Unknown Source) [?:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_151]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_151]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_151]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_151]
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2303) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at com.sun.proxy.$Proxy37.createTable(Unknown Source) [?:?]
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:854) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:869) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4356) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:354) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2183) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1839) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1526) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227) [hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233) [hive-cli-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184) [hive-cli-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403) [hive-cli-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) [hive-cli-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) [hive-cli-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686) [hive-cli-2.3.1-amzn-0.jar:2.3.1-amzn-0]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_151]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_151]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_151]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_151]
at org.apache.hadoop.util.RunJar.run(RunJar.java:221) [hadoop-common-2.7.3-amzn-5.jar:?]
at org.apache.hadoop.util.RunJar.main(RunJar.java:136) [hadoop-common-2.7.3-amzn-5.jar:?]
Caused by: java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method) ~[?:1.8.0_151]
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350) ~[?:1.8.0_151]
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206) ~[?:1.8.0_151]
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188) ~[?:1.8.0_151]
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392) ~[?:1.8.0_151]
at java.net.Socket.connect(Socket.java:589) ~[?:1.8.0_151]
at org.apache.thrift.transport.TSocket.open(TSocket.java:221) ~[hive-exec-2.3.1-amzn-0.jar:2.3.1-amzn-0]
... 33 more
2017-12-10T15:18:19,720 INFO [e74af478-3227-4bf9-9fde-74d8babcf5f0 main([])]: hive.metastore (HiveMetaStoreClient.java:open(506)) - Waiting 1 seconds before next connection attempt.
2017-12-10T15:18:20,721 INFO [e74af478-3227-4bf9-9fde-74d8babcf5f0 main([])]: hive.metastore (HiveMetaStoreClient.java:open(392)) - Trying to connect to metastore with URI thrift://ip-172-50-31-107.ec2.internal:9083
2017-12-10T15:18:20,721 INFO [e74af478-3227-4bf9-9fde-74d8babcf5f0 main([])]: hive.metastore (HiveMetaStoreClient.java:open(466)) - Opened a connection to metastore, current connections: 1
2017-12-10T15:18:20,795 INFO [e74af478-3227-4bf9-9fde-74d8babcf5f0 main([])]: hive.metastore (HiveMetaStoreClient.java:open(519)) - Connected to metastore.
2017-12-10T15:18:28,013 DEBUG [java-sdk-http-connection-reaper([])]: conn.PoolingHttpClientConnectionManager (PoolingHttpClientConnectionManager.java:closeIdleConnections(401)) - Closing connections idle longer than 60000 MILLISECONDS
2017-12-10T15:18:28,014 DEBUG [java-sdk-http-connection-reaper([])]: conn.PoolingHttpClientConnectionManager (PoolingHttpClientConnectionManager.java:closeIdleConnections(401)) - Closing connections idle longer than 60000 MILLISECONDS
2017-12-10T15:19:28,014 DEBUG [java-sdk-http-connection-reaper([])]: conn.PoolingHttpClientConnectionManager (PoolingHttpClientConnectionManager.java:closeIdleConnections(401)) - Closing connections idle longer than 60000 MILLISECONDS
2017-12-10T15:19:28,014 DEBUG [java-sdk-http-connection-reaper([])]: conn.PoolingHttpClientConnectionManager (PoolingHttpClientConnectionManager.java:closeIdleConnections(401)) - Closing connections idle longer than 60000 MILLISECONDS
2017-12-10T15:20:28,014 DEBUG [java-sdk-http-connection-reaper([])]: conn.PoolingHttpClientConnectionManager (PoolingHttpClientConnectionManager.java:closeIdleConnections(401)) - Closing connections idle longer than 60000 MILLISECONDS
2017-12-10T15:20:28,014 DEBUG [java-sdk-http-connection-reaper([])]: conn.PoolingHttpClientConnectionManager (PoolingHttpClientConnectionManager.java:closeIdleConnections(401)) - Closing connections idle longer than 60000 MILLISECONDS
2017-12-10T15:21:28,015 DEBUG [java-sdk-http-connection-reaper([])]: conn.PoolingHttpClientConnectionManager (PoolingHttpClientConnectionManager.java:closeIdleConnections(401)) - Closing connections idle longer than 60000 MILLISECONDS
2017-12-10T15:21:28,015 DEBUG [java-sdk-http-connection-reaper([])]: conn.PoolingHttpClientConnectionManager (PoolingHttpClientConnectionManager.java:closeIdleConnections(401)) - Closing connections idle longer than 60000 MILLISECONDS
2017-12-10T15:22:28,015 DEBUG [java-sdk-http-connection-reaper([])]: conn.PoolingHttpClientConnectionManager (PoolingHttpClientConnectionManager.java:closeIdleConnections(401)) - Closing connections idle longer than 60000 MILLISECONDS
2017-12-10T15:22:28,015 DEBUG [java-sdk-http-connection-reaper([])]: conn.PoolingHttpClientConnectionManager (PoolingHttpClientConnectionManager.java:closeIdleConnections(401)) - Closing connections idle longer than 60000 MILLISECONDS
2017-12-10T15:22:44,472 ERROR [e74af478-3227-4bf9-9fde-74d8babcf5f0 main([])]: exec.DDLTask (DDLTask.java:failed(639)) - org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.thrift.transport.TTransportException
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:864)
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:869)
at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4356)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:354)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2183)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1839)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1526)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.thrift.transport.TTransportException
at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_create_table_with_environment_context(ThriftHiveMetastore.java:1199)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.create_table_with_environment_context(ThriftHiveMetastore.java:1185)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.create_table_with_environment_context(HiveMetaStoreClient.java:2372)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.create_table_with_environment_context(SessionHiveMetaStoreClient.java:93)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:737)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:725)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:173)
at com.sun.proxy.$Proxy37.createTable(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2303)
at com.sun.proxy.$Proxy37.createTable(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:854)
... 22 more
2017-12-10T15:22:44,472 ERROR [e74af478-3227-4bf9-9fde-74d8babcf5f0 main([])]: ql.Driver (SessionState.java:printError(1126)) - FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.thrift.transport.TTransportException
2017-12-10T15:22:44,472 DEBUG [e74af478-3227-4bf9-9fde-74d8babcf5f0 main([])]: ql.Driver (DriverContext.java:shutdown(132)) - Shutting down query CREATE EXTERNAL TABLE IF NOT EXISTS Accesslogs(
BucketOwner string,
Bucket string,
RequestDateTime string,
RemoteIP string,
Requester string,
RequestID string,
Operation string,
Key string,
RequestURI_operation string,
RequestURI_key string,
RequestURI_httpProtoversion string,
HTTPstatus string,
ErrorCode string,
BytesSent string,
ObjectSize string,
TotalTime string,
TurnAroundTime string,
Referrer string,
UserAgent string,
VersionId string)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.RegexSerDe'
WITH SERDEPROPERTIES (
'serialization.format' = '1',
At a guess, Hive is trying to do something that is fast on a real filesystem (a recursive tree walk, a rename) but keels over when it gets to S3, where all these operations are faked in the client.
Consider filing a JIRA against Hive on this; include any server-side logs you can, and try files at a few different scales to see when things start to fail.
I have configured Hive (1.13.1) with Spark (1.4.0), and I am able to access all the databases and tables from Hive; my warehouse directory is hdfs://192.168.1.17:8020/user/hive/warehouse.
But when I try to save a DataFrame into Hive through spark-shell (using the master) with the df.saveAsTable("df") function, I get this error.
15/07/03 14:48:59 INFO audit: ugi=user ip=unknown-ip-addr cmd=get_database: default
15/07/03 14:48:59 INFO HiveMetaStore: 0: get_table : db=default tbl=df
15/07/03 14:48:59 INFO audit: ugi=user ip=unknown-ip-addr cmd=get_table : db=default tbl=df
java.net.ConnectException: Call From bdiuser-Vostro-3800/127.0.1.1 to 192.168.1.19:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
at org.apache.hadoop.ipc.Client.call(Client.java:1414)
at org.apache.hadoop.ipc.Client.call(Client.java:1363)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:699)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1762)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1124)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1398)
at org.apache.spark.sql.sources.InsertIntoHadoopFsRelation.run(commands.scala:78)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:68)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:88)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:88)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:87)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:939)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:939)
at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:332)
at org.apache.spark.sql.hive.execution.CreateMetastoreDataSourceAsSelect.run(commands.scala:239)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:68)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:88)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:88)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:87)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:939)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:939)
at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:211)
at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1517)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:22)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:27)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
at $iwC$$iwC$$iwC.<init>(<console>:35)
at $iwC$$iwC.<init>(<console>:37)
at $iwC.<init>(<console>:39)
at <init>(<console>:41)
at .<init>(<console>:45)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:744)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:604)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:699)
at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1462)
at org.apache.hadoop.ipc.Client.call(Client.java:1381)
... 86 more
When I went through this error, I found that the program tried a different host for the HDFS connection when saving the table.
I also tried spark-shell on different workers and got the same error.
Please find the example below:
val options = Map("path" -> hiveTablePath)
result.write.format("orc").partitionBy("partitiondate").options(options).mode(SaveMode.Append).saveAsTable(hiveTable)
I have explained this a little bit more in my blog.
With saveAsTable, the default location that Spark saves to is controlled by the Hive metastore (based on the docs). Another option would be to use saveAsParquetFile and specify the path, and then later register that path with your Hive metastore, OR use the new DataFrameWriter interface and specify the path option: write.format(source).mode(mode).options(options).saveAsTable(tableName).
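A rough sketch of that second approach with the DataFrameWriter interface (purely as an illustration, the explicit path here just appends the table name from the question to the warehouse directory mentioned above):

    // Write to an explicit location and register the table in the metastore.
    // The table name "df" and the path are illustrative; adjust to your setup.
    df.write
      .format("parquet")
      .mode("overwrite")
      .option("path", "hdfs://192.168.1.17:8020/user/hive/warehouse/df")
      .saveAsTable("df")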
You can write a Spark DataFrame to an existing Spark table.
Please find the example below:
df.write.mode("overwrite").saveAsTable("database.tableName")