org.hibernate.hql.internal.ast.QuerySyntaxException: Invalid path

After upgrading to Spring Boot 2.0, my previously working queries are throwing the following exception. Please help me here.
Caused by: java.lang.IllegalArgumentException:
org.hibernate.hql.internal.ast.QuerySyntaxException: Invalid path:
'com.wtp.core.constant.RECORDSTATUS.ACTIVE' [SELECT l FROM
com.wtp.core.entity.ListingComments l WHERE l.listingID = :listingID
AND l.recordStatus=com.wtp.core.constant.RECORDSTATUS.ACTIVE AND
l.listingStatus =com.wtp.core.constant.LISTINGSTATUS.ACTIVE ORDER BY
l.createdOn asc] at
org.hibernate.internal.ExceptionConverterImpl.convert(ExceptionConverterImpl.java:133)
~[hibernate-core-5.2.14.Final.jar:5.2.14.Final] at
org.hibernate.internal.ExceptionConverterImpl.convert(ExceptionConverterImpl.java:157)
~[hibernate-core-5.2.14.Final.jar:5.2.14.Final] at
org.hibernate.internal.ExceptionConverterImpl.convert(ExceptionConverterImpl.java:164)
~[hibernate-core-5.2.14.Final.jar:5.2.14.Final] at
org.hibernate.internal.AbstractSharedSessionContract.createQuery(AbstractSharedSessionContract.java:670)
~[hibernate-core-5.2.14.Final.jar:5.2.14.Final] at
org.hibernate.internal.AbstractSessionImpl.createQuery(AbstractSessionImpl.java:23)
~[hibernate-core-5.2.14.Final.jar:5.2.14.Final] at
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
~[na:1.8.0_131] at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
~[na:1.8.0_131] at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
~[na:1.8.0_131] at java.lang.reflect.Method.invoke(Method.java:498)
~[na:1.8.0_131] at
org.springframework.orm.jpa.ExtendedEntityManagerCreator$ExtendedEntityManagerInvocationHandler.invoke(ExtendedEntityManagerCreator.java:350)
~[spring-orm-5.0.4.RELEASE.jar:5.0.4.RELEASE] at
com.sun.proxy.$Proxy120.createQuery(Unknown Source) ~[na:na] at
org.springframework.data.jpa.repository.query.SimpleJpaQuery.validateQuery(SimpleJpaQuery.java:87)
~[spring-data-jpa-2.0.5.RELEASE.jar:2.0.5.RELEASE] ... 76 common
frames omitted Caused by:
org.hibernate.hql.internal.ast.QuerySyntaxException: Invalid path:
'com.wtp.core.constant.RECORDSTATUS.ACTIVE' [SELECT l FROM
com.wtp.core.entity.ListingComments l WHERE l.listingID = :listingID
AND l.recordStatus=com.wtp.core.constant.RECORDSTATUS.ACTIVE AND
l.listingStatus =com.wtp.core.constant.LISTINGSTATUS.ACTIVE ORDER BY
l.createdOn asc] at
org.hibernate.hql.internal.ast.QuerySyntaxException.convert(QuerySyntaxException.java:74)
~[hibernate-core-5.2.14.Final.jar:5.2.14.Final] at
org.hibernate.hql.internal.ast.ErrorCounter.throwQueryException(ErrorCounter.java:91)
~[hibernate-core-5.2.14.Final.jar:5.2.14.Final] at
org.hibernate.hql.internal.ast.QueryTranslatorImpl.analyze(QueryTranslatorImpl.java:272)
~[hibernate-core-5.2.14.Final.jar:5.2.14.Final] at
org.hibernate.hql.internal.ast.QueryTranslatorImpl.doCompile(QueryTranslatorImpl.java:189)
~[hibernate-core-5.2.14.Final.jar:5.2.14.Final] at
org.hibernate.hql.internal.ast.QueryTranslatorImpl.compile(QueryTranslatorImpl.java:141)
~[hibernate-core-5.2.14.Final.jar:5.2.14.Final] at
org.hibernate.engine.query.spi.HQLQueryPlan.<init>(HQLQueryPlan.java:115)
~[hibernate-core-5.2.14.Final.jar:5.2.14.Final] at
org.hibernate.engine.query.spi.HQLQueryPlan.<init>(HQLQueryPlan.java:77)
~[hibernate-core-5.2.14.Final.jar:5.2.14.Final] at
org.hibernate.engine.query.spi.QueryPlanCache.getHQLQueryPlan(QueryPlanCache.java:153)
~[hibernate-core-5.2.14.Final.jar:5.2.14.Final] at
org.hibernate.internal.AbstractSharedSessionContract.getQueryPlan(AbstractSharedSessionContract.java:553)
~[hibernate-core-5.2.14.Final.jar:5.2.14.Final] at
org.hibernate.internal.AbstractSharedSessionContract.createQuery(AbstractSharedSessionContract.java:662)
~[hibernate-core-5.2.14.Final.jar:5.2.14.Final] ... 84 common frames
omitted

I believe this is a minor bug in Hibernate 5.2. I ran into it as well. After some investigation I worked out that it is caused by the enum class name being written entirely in capitals.
Just rename your enums (e.g. LISTINGSTATUS to ListingStatus, and likewise RECORDSTATUS) and it should work (all-caps class names are not standard Java naming convention anyway).
The bug is reported here: https://hibernate.atlassian.net/browse/HHH-12429
Update:
I asked for this issue to be closed, since it is intended behaviour as explained here: https://vladmihalcea.com/the-performance-penalty-of-class-forname-when-parsing-jpql-and-criteria-queries
If you want to go back to the old behaviour, set the property hibernate.query.conventional_java_constants to false.
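For example, in a Spring Boot application the setting can be passed through to Hibernate via the JPA properties (a minimal sketch, assuming application.properties is used):

# application.properties: restore the old constant-resolution behaviour in JPQL
spring.jpa.properties.hibernate.query.conventional_java_constants=false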

Related

Can create Hive table using XML but getting an error when querying

I need to create a Hive table from a set of XML files in a Hadoop directory.
I have tried using the code below:
add jar hdfs:///user/hivexmlserde-1.0.5.3.jar;
CREATE EXTERNAL TABLE test
(sport STRING)
ROW FORMAT SERDE 'com.ibm.spss.hive.serde2.xml.XmlSerDe'
WITH SERDEPROPERTIES (
"column.xpath.sport"="/event_list/event/#Sport")
STORED AS
INPUTFORMAT 'com.ibm.spss.hive.serde2.xml.XmlInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.IgnoreKeyTextOutputFormat'
LOCATION '/user/2019-04-21'
TBLPROPERTIES (
"xmlinput.start"="<event_list>",
"xmlinput.end"="</event_list>");
It returned an error, so I tried adding another jar file:
add jar hdfs:///user/hive_serde.jar;
With this jar I can create the table.
Querying select * from test limit 10; returns a result, but an aggregate query like select sport, count(*) from test limit 10; fails with the error below:
ERROR : Status: Failed
ERROR : Vertex failed, vertexName=Map 1, vertexId=vertex_1555929930390_0107_1_00, diagnostics=[Vertex vertex_1555929930390_0107_1_00 [Map 1] killed/failed due to:INIT_FAILURE, Fail to create InputInitializerManager, org.apache.tez.dag.api.TezReflectionException: Unable to instantiate class with 1 arguments: org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator
at org.apache.tez.common.ReflectionUtils.getNewInstance(ReflectionUtils.java:71)
at org.apache.tez.common.ReflectionUtils.createClazzInstance(ReflectionUtils.java:89)
at org.apache.tez.dag.app.dag.RootInputInitializerManager$1.run(RootInputInitializerManager.java:152)
at org.apache.tez.dag.app.dag.RootInputInitializerManager$1.run(RootInputInitializerManager.java:148)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.tez.dag.app.dag.RootInputInitializerManager.createInitializer(RootInputInitializerManager.java:148)
at org.apache.tez.dag.app.dag.RootInputInitializerManager.runInputInitializers(RootInputInitializerManager.java:121)
at org.apache.tez.dag.app.dag.impl.VertexImpl.setupInputInitializerManager(VertexImpl.java:4122)
at org.apache.tez.dag.app.dag.impl.VertexImpl.access$3100(VertexImpl.java:207)
at org.apache.tez.dag.app.dag.impl.VertexImpl$InitTransition.handleInitEvent(VertexImpl.java:2932)
at org.apache.tez.dag.app.dag.impl.VertexImpl$InitTransition.transition(VertexImpl.java:2879)
at org.apache.tez.dag.app.dag.impl.VertexImpl$InitTransition.transition(VertexImpl.java:2861)
at org.apache.hadoop.yarn.state.StateMachineFactory$MultipleInternalArc.doTransition(StateMachineFactory.java:385)
at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
at org.apache.hadoop.yarn.state.StateMachineFactory.access$500(StateMachineFactory.java:46)
at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:487)
at org.apache.tez.state.StateMachineTez.doTransition(StateMachineTez.java:59)
at org.apache.tez.dag.app.dag.impl.VertexImpl.handle(VertexImpl.java:1957)
at org.apache.tez.dag.app.dag.impl.VertexImpl.handle(VertexImpl.java:206)
at org.apache.tez.dag.app.DAGAppMaster$VertexEventDispatcher.handle(DAGAppMaster.java:2317)
at org.apache.tez.dag.app.DAGAppMaster$VertexEventDispatcher.handle(DAGAppMaster.java:2303)
at org.apache.tez.common.AsyncDispatcher.dispatch(AsyncDispatcher.java:180)
at org.apache.tez.common.AsyncDispatcher$1.run(AsyncDispatcher.java:115)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.tez.common.ReflectionUtils.getNewInstance(ReflectionUtils.java:68)
... 25 more
Caused by: java.lang.RuntimeException: Failed to load plan: hdfs://sbx-hdp-mn01.ourbiworld.prod:8020/tmp/hive/hive/937f8627-235f-4adb-a06d-38a975990aaa/hive_2019-04-27_05-34-39_420_1560175914632249959-28/hive/_tez_scratch_dir/025ba737-1593-4ccc-b255-5959bc6d4b70/map.xml
at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:509)
at org.apache.hadoop.hive.ql.exec.Utilities.getMapWork(Utilities.java:342)
at org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.<init>(HiveSplitGenerator.java:137)
... 30 more
Caused by: org.apache.hive.com.esotericsoftware.kryo.KryoException: java.lang.IllegalArgumentException: Unable to create serializer "org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$EnumSerializer" for class: org.apache.hadoop.hive.ql.metadata.VirtualColumn
Serialization trace:
virtualCols (org.apache.hadoop.hive.ql.plan.TableScanDesc)
conf (org.apache.hadoop.hive.ql.exec.TableScanOperator)
aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:144)
at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:180)
at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:161)
at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:39)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:686)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:210)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializeObjectByKryo(SerializationUtilities.java:707)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:613)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:590)
at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:470)
... 32 more
Caused by: java.lang.IllegalArgumentException: Unable to create serializer "org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$EnumSerializer" for class: org.apache.hadoop.hive.ql.metadata.VirtualColumn
at org.apache.hive.com.esotericsoftware.kryo.factories.ReflectionSerializerFactory.makeSerializer(ReflectionSerializerFactory.java:67)
at org.apache.hive.com.esotericsoftware.kryo.factories.ReflectionSerializerFactory.makeSerializer(ReflectionSerializerFactory.java:45)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.getDefaultSerializer(Kryo.java:359)
at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.registerImplicit(DefaultClassResolver.java:74)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:490)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.getSerializer(Kryo.java:505)
at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:120)
at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:40)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
... 51 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hive.com.esotericsoftware.kryo.factories.ReflectionSerializerFactory.makeSerializer(ReflectionSerializerFactory.java:60)
... 61 more
Caused by: java.lang.NoSuchFieldError: stringTypeInfo
at org.apache.hadoop.hive.ql.metadata.VirtualColumn.<clinit>(VirtualColumn.java:47)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at java.lang.Class.getEnumConstantsShared(Class.java:3320)
at java.lang.Class.getEnumConstants(Class.java:3297)
at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$EnumSerializer.<init>(DefaultSerializers.java:392)
... 66 more
]
ERROR : Vertex killed, vertexName=Reducer 2, vertexId=vertex_1555929930390_0107_1_01, diagnostics=[Vertex received Kill in NEW state., Vertex vertex_1555929930390_0107_1_01 [Reducer 2] killed/failed due to:OTHER_VERTEX_FAILURE]
ERROR : DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:1
INFO : org.apache.tez.common.counters.DAGCounter:
INFO : AM_CPU_MILLISECONDS: 500
INFO : AM_GC_TIME_MILLIS: 0
ERROR : FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Map 1, vertexId=vertex_1555929930390_0107_1_00, diagnostics=[Vertex vertex_1555929930390_0107_1_00 [Map 1] killed/failed due to:INIT_FAILURE, Fail to create InputInitializerManager, org.apache.tez.dag.api.TezReflectionException: Unable to instantiate class with 1 arguments: org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator
... (same stack trace as above, omitted) ...
]Vertex killed, vertexName=Reducer 2, vertexId=vertex_1555929930390_0107_1_01, diagnostics=[Vertex received Kill in NEW state., Vertex vertex_1555929930390_0107_1_01 [Reducer 2] killed/failed due to:OTHER_VERTEX_FAILURE]DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:1
INFO : Completed executing command(queryId=hive_20190427053439_1914cb17-3ac9-43ce-9d1a-0fde3b0e117e); Time taken: 3.29 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Map 1, vertexId=vertex_1555929930390_0107_1_00, diagnostics=[Vertex vertex_1555929930390_0107_1_00 [Map 1] killed/failed due to:INIT_FAILURE, Fail to create InputInitializerManager, org.apache.tez.dag.api.TezReflectionException: Unable to instantiate class with 1 arguments: org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator
... (same stack trace as above, omitted) ...
The problem is that I can create the table but cannot query it.
Thanks for any help.
Your aggregate query does not contain a GROUP BY clause; omitting it is allowed only when every column in the SELECT is aggregated. Add group by sport:
select sport, count(*) cnt from test group by sport limit 10;
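For illustration, here is the rule that error enforces, sketched against the same table:

select count(*) from test;                                     -- OK: every selected expression is an aggregate
select sport, count(*) from test;                              -- fails: sport is neither aggregated nor grouped
select sport, count(*) cnt from test group by sport limit 10;  -- the fix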

Spark2 data load issue in instantiating HiveSessionState

I am getting the following issue during a data read using Spark 2 in cluster mode:
"java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':"
I am absolutely clueless about this issue after googling a lot. Please help.
The code I have run:
val spark = SparkSession.builder.getOrCreate()
val lines: Dataset[String] = spark.read.textFile("/data/sample/abc.csv")
The exception comes from the second line.
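For context, here is a minimal self-contained sketch of that setup (the appName and the enableHiveSupport() call are assumptions; HiveSessionState is also instantiated when the cluster defaults spark.sql.catalogImplementation to hive):

import org.apache.spark.sql.{Dataset, SparkSession}

object Learning {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("Learning")   // hypothetical application name
      .enableHiveSupport()   // this is what pulls in HiveSessionState
      .getOrCreate()
    // The session state is created lazily, so the failure surfaces here,
    // on the first operation that needs it.
    val lines: Dataset[String] = spark.read.textFile("/data/sample/abc.csv")
    println(lines.count())
    spark.stop()
  }
}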
Full exception stack trace:
ERROR yarn.ApplicationMaster: User class threw exception: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
at org.apache.spark.sql.DataFrameReader.<init>(DataFrameReader.scala:549)
at org.apache.spark.sql.SparkSession.read(SparkSession.scala:605)
at com.abcd.Learning$.main(Learning.scala:26)
at com.abcd.Learning.main(Learning.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:646)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
... 11 more
Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)
at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
... 16 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
... 24 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:353)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:257)
at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:66)
... 29 more
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class com.pepperdata.supervisor.agent.resource.LocalFileSystemWrapper not found
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:548)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:188)
... 37 more
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class com.pepperdata.supervisor.agent.resource.LocalFileSystemWrapper not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2199)
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2685)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2705)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:97)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2748)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2730)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:385)
at org.apache.hadoop.fs.FileSystem.getLocal(FileSystem.java:356)
at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:666)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:593)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:526)
... 38 more
Caused by: java.lang.ClassNotFoundException: Class com.pepperdata.supervisor.agent.resource.LocalFileSystemWrapper not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2105)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2197)
... 48 more
A solution similar to the one given here worked for me.
I did the following:
1. Zipped the Spark jars directory (/usr/local/Cellar/apache-spark/2.1.0/libexec/jars) and named it spark-jars.zip.
2. Copied spark-jars.zip to HDFS:
$ hdfs dfs -copyFromLocal /usr/local/Cellar/apache-spark/2.1.0/libexec/spark-jars.zip hdfs:/user/<username>/
3. Passed the spark-jars.zip location in the configuration while executing the Spark job:
$ HADOOP_CONF_DIR=/Users/<username>/hadoop_conf spark-submit --conf spark.yarn.archive=hdfs:/user/<username>/spark-jars.zip --conf spark.dynamicAllocation.enabled=true --conf spark.shuffle.service.enabled=true --class "com.<whatever>.<package>" --master yarn --deploy-mode cluster --queue online1 --driver-memory 3G --executor-memory 3G ./build/libs/<main class>.jar
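The key setting is spark.yarn.archive: it makes YARN ship one consistent set of Spark jars to the driver and every executor container, which (as far as I understand it) avoids the containers picking up mismatched or missing classes from whatever happens to be installed on each node.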

WARN web[o.s.s.n.NotificationService] Unable to deliver notification

I got this error when I upgraded from 5.0 to 5.1.
It was related to https://jira.codehaus.org/browse/SONAR-4778.
See http://sonarqube.15.x6.nabble.com/WARN-web-o-s-s-n-NotificationService-Unable-to-deliver-notification-td5035238.html for details.
It was a bug and it was supposed to be fixed in 5.1.1 with https://jira.codehaus.org/browse/SONAR-6566 as Simon Brandhof stated.
Now I've upgraded to 5.1.1 and I'm still having the same problem.
2015.06.09 09:52:48 WARN web[o.s.s.n.NotificationService] Unable to deliver notification Notification{type='issue-changes', fields={new.resolution=FIXED, old.status=OPEN, projectKey=xxxxxx, new.status=CLOSED, ruleName=Public types, methods and fields (API) should be documented with Javadoc, reporter=null, assignee=xxxxxx, message=Document this public constructor., old.resolution=null, projectName=xxxxxxxx, componentKey=null, key=d7571ab9-ac49-41c9-9973-35d4a1ee83cf}} for user xxxxx via EmailNotificationChannel
javax.persistence.PersistenceException: org.hibernate.exception.JDBCConnectionException: could not execute query
at org.hibernate.ejb.AbstractEntityManagerImpl.throwPersistenceException(AbstractEntityManagerImpl.java:614) ~[hibernate-entitymanager-3.4.0.GA.jar:3.4.0.GA]
at org.hibernate.ejb.QueryImpl.getResultList(QueryImpl.java:76) ~[hibernate-entitymanager-3.4.0.GA.jar:3.4.0.GA]
at org.sonar.jpa.session.JpaDatabaseSession.getSingleResult(JpaDatabaseSession.java:207) ~[sonar-core-5.1.1.jar:na]
at org.sonar.jpa.session.JpaDatabaseSession.getSingleResult(JpaDatabaseSession.java:238) ~[sonar-core-5.1.1.jar:na]
at org.sonar.core.user.HibernateUserFinder.findByLogin(HibernateUserFinder.java:47) ~[sonar-core-5.1.1.jar:na]
at org.sonar.plugins.emailnotifications.EmailNotificationChannel.deliver(EmailNotificationChannel.java:98) ~[na:na]
at org.sonar.server.notifications.NotificationService.dispatch(NotificationService.java:200) [sonar-server-5.1.1.jar:na]
at org.sonar.server.notifications.NotificationService.deliver(NotificationService.java:190) [sonar-server-5.1.1.jar:na]
at org.sonar.server.computation.step.SendIssueNotificationsStep.doExecute(SendIssueNotificationsStep.java:83) [sonar-server-5.1.1.jar:na]
at org.sonar.server.computation.step.SendIssueNotificationsStep.execute(SendIssueNotificationsStep.java:66) [sonar-server-5.1.1.jar:na]
at org.sonar.server.computation.ComputationService.process(ComputationService.java:89) [sonar-server-5.1.1.jar:na]
at org.sonar.server.computation.ComputationContainer.execute(ComputationContainer.java:47) [sonar-server-5.1.1.jar:na]
at org.sonar.server.computation.ComputationThread.run(ComputationThread.java:58) [sonar-server-5.1.1.jar:na]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_45]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_45]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_45]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [na:1.8.0_45]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_45]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_45]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_45]
Caused by: org.hibernate.exception.JDBCConnectionException: could not execute query
at org.hibernate.exception.SQLStateConverter.convert(SQLStateConverter.java:97) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.exception.JDBCExceptionHelper.convert(JDBCExceptionHelper.java:66) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.loader.Loader.doList(Loader.java:2235) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.loader.Loader.listIgnoreQueryCache(Loader.java:2129) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.loader.Loader.list(Loader.java:2124) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.loader.hql.QueryLoader.list(QueryLoader.java:401) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.hql.ast.QueryTranslatorImpl.list(QueryTranslatorImpl.java:363) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.engine.query.HQLQueryPlan.performList(HQLQueryPlan.java:196) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.impl.SessionImpl.list(SessionImpl.java:1149) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.impl.QueryImpl.list(QueryImpl.java:102) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.impl.SessionImpl.list(SessionImpl.java:1149) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.impl.QueryImpl.list(QueryImpl.java:102) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.ejb.QueryImpl.getResultList(QueryImpl.java:67) ~[hibernate-entitymanager-3.4.0.GA.jar:3.4.0.GA]
... 18 common frames omitted
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: No operations allowed after connection closed.
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_45]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_45]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_45]
at java.lang.reflect.Constructor.newInstance(Constructor.java:422) ~[na:1.8.0_45]
at com.mysql.jdbc.Util.handleNewInstance(Util.java:377) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.Util.getInstance(Util.java:360) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:956) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:935) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:924) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:870) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.ConnectionImpl.throwConnectionClosedException(ConnectionImpl.java:1232) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.ConnectionImpl.checkClosed(ConnectionImpl.java:1225) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.ConnectionImpl.prepareStatement(ConnectionImpl.java:4104) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.ConnectionImpl.prepareStatement(ConnectionImpl.java:4073) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at org.apache.commons.dbcp.DelegatingConnection.prepareStatement(DelegatingConnection.java:281) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.commons.dbcp.PoolingDataSource$PoolGuardConnectionWrapper.prepareStatement(PoolingDataSource.java:313) ~[commons-dbcp-1.4.jar:1.4]
at org.hibernate.jdbc.AbstractBatcher.getPreparedStatement(AbstractBatcher.java:534) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.jdbc.AbstractBatcher.getPreparedStatement(AbstractBatcher.java:452) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.jdbc.AbstractBatcher.prepareQueryStatement(AbstractBatcher.java:161) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.loader.Loader.prepareQueryStatement(Loader.java:1577) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.loader.Loader.doQuery(Loader.java:696) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.loader.Loader.doQueryAndInitializeNonLazyCollections(Loader.java:259) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.loader.Loader.doList(Loader.java:2232) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
... 26 common frames omitted
Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: The last packet successfully received from the server was 264 434 682 milliseconds ago. The last packet sent successfully to the server was ... You should consider either expiring and/or testing connection validity before use in your application, increasing the server configured values for client timeouts, or using the Connector/J connection property 'autoReconnect=true' to avoid this problem.
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_45]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_45]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_45]
at java.lang.reflect.Constructor.newInstance(Constructor.java:422) ~[na:1.8.0_45]
at com.mysql.jdbc.Util.handleNewInstance(Util.java:377) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:1036) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.MysqlIO.send(MysqlIO.java:3661) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2417) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2582) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2530) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1907) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:2030) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1907) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:2030) ~[mysql-connector-java-5.1.34.jar:5.1.34]
at org.apache.commons.dbcp.DelegatingPreparedStatement.executeQuery(DelegatingPreparedStatement.java:96) ~[commons-dbcp-1.4.jar:1.4]
at org.apache.commons.dbcp.DelegatingPreparedStatement.executeQuery(DelegatingPreparedStatement.java:96) ~[commons-dbcp-1.4.jar:1.4]
at org.hibernate.jdbc.AbstractBatcher.getResultSet(AbstractBatcher.java:208) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.loader.Loader.getResultSet(Loader.java:1812) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
at org.hibernate.loader.Loader.doQuery(Loader.java:697) ~[hibernate-core-3.3.2.GA.jar:3.3.2.GA]
... 28 common frames omitted
Caused by: java.net.SocketException: Relais brisé (pipe)
at java.net.SocketOutputStream.socketWrite0(Native Method) ~[na:1.8.0_45]
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:109) ~[na:1.8.0_45]
at java.net.SocketOutputStream.write(SocketOutputStream.java:153) ~[na:1.8.0_45]
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) ~[na:1.8.0_45]
at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) ~[na:1.8.0_45]
at com.mysql.jdbc.MysqlIO.send(MysqlIO.java:3643) ~[mysql-connector-java-5.1.34.jar:5.1.34]
... 38 common frames omitted
I don't have access to the Jira, so I cannot add a comment to the bug.
Can you please help me with this?
Here's the database configuration from the system information page:
Database
Database MySQL
Database Version 5.5.37-log
Username sonarqube@xxxxxxxxxxx
URL jdbc:mysql://xxxxxxxxxx/sonarqube?useUnicode=true&characterEncoding=utf8&rewriteBatchedStatements=true&useConfigs=maxPerformance
Driver MySQL Connector Java
Driver Version mysql-connector-java-5.1.34 ( Revision: jess.balint@oracle.com-20141014163213-wqbwpf1ok2kvo1om )
Version Status UP_TO_DATE
Pool Active Connections 2
Pool Max Connections 50
Pool Initial Size 0
Pool Idle Connections 3
Pool Min Idle Connections 2
Pool Max Idle Connections 5
Pool Max Wait (ms) 5000
Pool Remove Abandoned false
Pool Remove Abandoned Timeout (seconds) 300
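For what it's worth, the CommunicationsException above is the driver reporting that a pooled connection sat idle longer than the MySQL server's wait_timeout. One of the options that message itself names, shown purely as an illustration appended to the JDBC URL from the configuration above, would be:

URL jdbc:mysql://xxxxxxxxxx/sonarqube?useUnicode=true&characterEncoding=utf8&rewriteBatchedStatements=true&useConfigs=maxPerformance&autoReconnect=true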

SPARK: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

I am running Hadoop 2.7.0, Hive 1.1.0 and Spark 1.3.1, with my metastore DB in a MySQL database. I can create and view data from the Hive shell.
hive (dwhdb)> select * from dwhdb.test_sample;
OK
test_sample.emp_id test_sample.emp_name test_sample.emp_dept test_sample.emp_sal
Eid1 EName1 EDept1 100.0
Eid2 EName2 EDept1 102.0
Eid3 EName3 EDept1 101.0
Eid4 EName4 EDept2 110.0
Eid5 EName5 EDept2 121.0
Eid6 EName6 EDept3 99.0
Time taken: 0.135 seconds, Fetched: 6 row(s)
But when I try to select the same data from Spark, I get an error:
hduser#ubuntu:~$ spark-shell
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 1.3.1
/_/
Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_21)
Type in expressions to have them evaluated.
Type :help for more information.
Spark context available as sc.
SQL context available as sqlContext.
scala> val sqlHContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlHContext: org.apache.spark.sql.hive.HiveContext = org.apache.spark.sql.hive.HiveContext@5a8081b4
scala> sqlHContext.sql("SELECT emp_id, emp_name from dwhdb.test_sample").collect().foreach(println)
java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
at org.apache.spark.sql.hive.HiveContext.sessionState$lzycompute(HiveContext.scala:239)
at org.apache.spark.sql.hive.HiveContext.sessionState(HiveContext.scala:235)
at org.apache.spark.sql.hive.HiveContext.hiveconf$lzycompute(HiveContext.scala:251)
at org.apache.spark.sql.hive.HiveContext.hiveconf(HiveContext.scala:250)
at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:95)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:24)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
at $iwC$$iwC$$iwC.<init>(<console>:37)
at $iwC$$iwC.<init>(<console>:39)
at $iwC.<init>(<console>:41)
at <init>(<console>:43)
at .<init>(<console>:47)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
... 51 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
... 56 more
Caused by: java.lang.NumberFormatException: For input string: "600s"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Integer.parseInt(Integer.java:492)
at java.lang.Integer.parseInt(Integer.java:527)
at org.apache.hadoop.conf.Configuration.getInt(Configuration.java:1134)
at org.apache.hadoop.hive.conf.HiveConf.getIntVar(HiveConf.java:1211)
at org.apache.hadoop.hive.conf.HiveConf.getIntVar(HiveConf.java:1220)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:293)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:214)
... 61 more
Could you please let me know what the possible reason for this could be?
The log says Hive is trying to convert the string '600s' to an Integer. Can you check where you are providing this value? It looks like you have it in your hive-site.xml.
Please change it to '600'.
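Judging from the HiveMetaStoreClient.open frame in the trace, the offending entry is most likely the metastore client timeout. A minimal sketch of the hive-site.xml change, assuming that key:

<!-- hive-site.xml (the key name is an assumption inferred from the stack trace) -->
<property>
  <name>hive.metastore.client.socket.timeout</name>
  <value>600</value> <!-- was 600s; older clients parse this as a plain integer -->
</property>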

Issue in Hive-0.14.0 Integration in HBase-0.98.8-hadoop2

I have Hive 0.14.1, HBase 0.98.8 and Hadoop 2.5.0. I am trying to integrate Hive with HBase, and I copied zookeeper-3.4.6.jar and hbase-common-0.98.8-hadoop2.jar from HBase/lib to Hive/lib. The steps I followed are:
1. hive --auxpath $HIVE_HOME/lib/hive-hbase-handler-0.14.1.jar,$HIVE_HOME/lib/hbase-common-0.98.8-hadoop2.jar,$HIVE_HOME/lib/zookeeper-3.4.6.jar,$HIVE_HOME/lib/guava-11.0.2.jar -hiveconf hbase.master=masternode:60000
2. CREATE TABLE hbase_table_1(key int, value string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
TBLPROPERTIES ("hbase.table.name" = "xyz");
This command does not work for me. It gives me the error:
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.protobuf.generated.ClientProtos$Scan$Builder.setCaching(I)Lorg/apache/hadoop/hbase/protobuf/generated/ClientProtos$Scan$Builder;
at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:216)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:125)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:93)
at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:283)
at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:188)
at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:183)
at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:110)
at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:745)
at org.apache.hadoop.hbase.catalog.MetaReader.fullScan(MetaReader.java:542)
at org.apache.hadoop.hbase.catalog.MetaReader.tableExists(MetaReader.java:310)
at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:307)
at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:321)
at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:182)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:602)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:90)
at com.sun.proxy.$Proxy8.createTable(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:670)
at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3959)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:295)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1604)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1364)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1177)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1004)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:994)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:247)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:199)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:410)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:783)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:677)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:616)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.hbase.protobuf.generated.ClientProtos$Scan$Builder.setCaching(I)Lorg/apache/hadoop/hbase/protobuf/generated/ClientProtos$Scan$Builder;
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.toScan(ProtobufUtil.java:879)
at org.apache.hadoop.hbase.protobuf.RequestConverter.buildScanRequest(RequestConverter.java:488)
at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:303)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:164)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:59)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:117)
... 40 more
)
You may need to add the hbase-protocol JAR to the classpath. The symbol that could not be resolved in your stack trace appears to come from there, at least based on a cursory inspection.
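As an illustration only (the jar name is assumed from the HBase 0.98.8-hadoop2 layout), the --auxpath list from step 1 could be extended with that jar, after copying it from HBase/lib to Hive/lib like the others:

hive --auxpath $HIVE_HOME/lib/hive-hbase-handler-0.14.1.jar,$HIVE_HOME/lib/hbase-common-0.98.8-hadoop2.jar,$HIVE_HOME/lib/hbase-protocol-0.98.8-hadoop2.jar,$HIVE_HOME/lib/zookeeper-3.4.6.jar,$HIVE_HOME/lib/guava-11.0.2.jar -hiveconf hbase.master=masternode:60000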
