Spring Quartz: dynamically adding jobs

I am trying to add a job to the scheduler dynamically with
schedulerFactoryBean.getScheduler().scheduleJob(cronTrigger);
but this throws an error:
Caused by: org.quartz.JobPersistenceException: The job (JOB_GROUP.JOB_NAME) referenced by the trigger does not exist.
at org.quartz.simpl.RAMJobStore.storeTrigger(RAMJobStore.java:422)
at org.quartz.core.QuartzScheduler.scheduleJob(QuartzScheduler.java:932)
at org.quartz.impl.StdScheduler.scheduleJob(StdScheduler.java:258)
at com.abc.RescheduleJob.rescheduleJobs(RescheduleJob.java:32)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.util.MethodInvoker.invoke(MethodInvoker.java:269)
at org.springframework.scheduling.quartz.MethodInvokingJobDetailFactoryBean$MethodInvokingJob.executeInternal(MethodInvokingJobDetailFactoryBean.java:257)
Please help.
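For reference, the error means the trigger references a JobKey that was never registered with the job store. A minimal sketch of registering the JobDetail together with the trigger in one call (assuming the Quartz 2.x builder API; MyJob, the names, and the cron expression are placeholders):

import org.quartz.CronScheduleBuilder;
import org.quartz.JobBuilder;
import org.quartz.JobDetail;
import org.quartz.Scheduler;
import org.quartz.Trigger;
import org.quartz.TriggerBuilder;

// Build the JobDetail the trigger will reference
// (MyJob is a placeholder class implementing org.quartz.Job).
JobDetail job = JobBuilder.newJob(MyJob.class)
        .withIdentity("JOB_NAME", "JOB_GROUP")
        .build();

// Build a cron trigger (the cron expression here is an example).
Trigger trigger = TriggerBuilder.newTrigger()
        .withIdentity("TRIGGER_NAME", "JOB_GROUP")
        .withSchedule(CronScheduleBuilder.cronSchedule("0 0/5 * * * ?"))
        .build();

// Registering job and trigger together ensures the job store never
// sees a trigger pointing at an unknown job (may throw SchedulerException).
Scheduler scheduler = schedulerFactoryBean.getScheduler();
scheduler.scheduleJob(job, trigger);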

Related

java.lang.ClassNotFoundException: Class org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe not found in Hive

I am trying to process a multi-character delimiter in Hive.
I already created a table with it successfully:
create external table showtmp3(doc_name STRING,doc_content STRING) row format SERDE 'org.apache.hadoop.hive.serde2.MultiDelimitSerDe' WITH SERDEPROPERTIES ('field.delim'='#a#') location '/unmesha/OUT';
Then I need to issue the query below:
INSERT OVERWRITE DIRECTORY '/unmesha/OUT_tmpShowData' SELECT * FROM showtmp3 LIMIT 10;
But it gives me the error below:
Error: java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:449)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
... 9 more
Caused by: java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:38)
... 14 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
... 17 more
Caused by: java.lang.RuntimeException: Map operator initialization failed
at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.configure(ExecMapper.java:153)
... 22 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassNotFoundException: Class org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe not found
at org.apache.hadoop.hive.ql.exec.MapOperator.getConvertedOI(MapOperator.java:323)
at org.apache.hadoop.hive.ql.exec.MapOperator.setChildren(MapOperator.java:333)
at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.configure(ExecMapper.java:122)
... 22 more
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2018)
at org.apache.hadoop.hive.ql.plan.PartitionDesc.getDeserializer(PartitionDesc.java:136)
at org.apache.hadoop.hive.ql.exec.MapOperator.getConvertedOI(MapOperator.java:297)
... 24 more
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
Should I download a jar and put it in some location?
Please suggest.
This is probably what you are looking for: http://grepcode.com/file/repo1.maven.org/maven2/org.apache.hive/hive-contrib/0.14.0/org/apache/hadoop/hive/contrib/serde2/MultiDelimitSerDe.java
You can download the original jar from Maven repo: https://mvnrepository.com/artifact/org.apache.hive/hive-contrib/0.14.0
Download this jar file and place it into the Hive /lib folder on all nodes of your cluster. This should resolve the issue.
You have to add the hive-contrib.jar file to Hive. Run one of the commands below in the Hive shell:
ADD JAR /opt/cloudera/parcels/CDH-<version>/jars/hive-contrib.jar;
(or)
ADD JAR /usr/lib/hive/lib/hive-contrib-<version>.jar;
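Putting the two together, a session sketch (the jar path and version are examples; adjust for your installation):

-- Register the contrib SerDe jar for this session (path/version are examples).
ADD JAR /usr/lib/hive/lib/hive-contrib-0.14.0.jar;
-- Re-run the failing export.
INSERT OVERWRITE DIRECTORY '/unmesha/OUT_tmpShowData' SELECT * FROM showtmp3 LIMIT 10;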

Failing Oozie launcher in yarn-cluster mode

I'm trying to run a Spark job in yarn-cluster mode (it succeeded in local mode and yarn-client mode), but I am running into a problem where the Oozie launcher fails. Below is the error message from stderr:
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.JavaMain], main() threw exception, java.lang.NoSuchMethodError: org.apache.spark.network.util.JavaUtils.byteStringAsBytes(Ljava/lang/String;)J
org.apache.oozie.action.hadoop.JavaMainException: java.lang.NoSuchMethodError: org.apache.spark.network.util.JavaUtils.byteStringAsBytes(Ljava/lang/String;)J
at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:60)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:46)
at org.apache.oozie.action.hadoop.JavaMain.main(JavaMain.java:38)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:228)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:370)
at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:295)
at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:181)
at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:224)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoSuchMethodError: org.apache.spark.network.util.JavaUtils.byteStringAsBytes(Ljava/lang/String;)J
at org.apache.spark.util.Utils$.memoryStringToMb(Utils.scala:993)
at org.apache.spark.util.MemoryParam$.unapply(MemoryParam.scala:27)
at org.apache.spark.deploy.yarn.ClientArguments.parseArgs(ClientArguments.scala:168)
at org.apache.spark.deploy.yarn.ClientArguments.<init>(ClientArguments.scala:58)
at org.apache.spark.deploy.yarn.Client$.main(Client.scala:966)
at org.apache.spark.deploy.yarn.Client.main(Client.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:57)
... 19 more
The job runs on Spark 1.5.2, so I downloaded the spark-assembly-1.5.2-hadoop2.6.0.jar file and put it on HDFS, set the spark.yarn.jar field in my Spark config file to point to the jar path, and set the oozie.libpath field in my job.properties file to point to the directory in which the jar resides.
I searched the classpath section of the stdout log for other possible versions of Spark and found two instances where spark-1.3.0-cdh5.4.5-yarn-shuffle.jar was being picked up (fortunately, spark-assembly-1.5.2-hadoop2.6.0.jar is also being picked up elsewhere, so I am setting the path correctly).
So the problem seems to be that Oozie, or the Oozie launcher, is defaulting to Spark 1.3 for some reason (which is installed on the system the job runs on). I tried setting the oozie.use.system.libpath field to false in the job.properties file, but it doesn't seem to have helped. Any ideas on what I can do to prevent Spark 1.3 from being picked up, or any other solutions to the NoSuchMethodError I am facing?
Any help would be greatly appreciated, thanks.
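For reference, a sketch of the configuration described above (every HDFS path here is a hypothetical placeholder for wherever the assembly was uploaded):

# job.properties -- point the launcher at the directory holding the 1.5.2 assembly
oozie.libpath=hdfs:///user/me/share/lib/spark-1.5.2
oozie.use.system.libpath=false

# Spark config file -- pin the assembly jar the YARN client should use
spark.yarn.jar=hdfs:///user/me/share/lib/spark-1.5.2/spark-assembly-1.5.2-hadoop2.6.0.jar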

Phoenix error while trying to create a table

When I either log in to sqlline.py in Phoenix or try to create a table in Phoenix through the APIs, I get an exception.
With my limited knowledge of Phoenix, I am unable to figure out why Phoenix checks for the SYSTEM.CATALOG table even before it creates it. Any help will be greatly appreciated.
Stack trace:
4/11/18 06:07:18 WARN client.HConnectionManager$HConnectionImplementation: Encountered problems when prefetch META table:
org.apache.hadoop.hbase.TableNotFoundException: Cannot find row in .META. for table: SYSTEM.CATALOG, row=SYSTEM.CATALOG,\x00SYSTEM\x00CATALOG,99999999999999
at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:151)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.prefetchRegionCache(HConnectionManager.java:1059)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1121)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1001)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:958)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:914)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1053)
at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1156)
at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:226)
at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:908)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1351)
at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
at sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
at sqlline.SqlLine.dispatch(SqlLine.java:817)
at sqlline.SqlLine.initArgs(SqlLine.java:633)
at sqlline.SqlLine.begin(SqlLine.java:680)
at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
at sqlline.SqlLine.main(SqlLine.java:424)
Error: SYSTEM.CATALOG (state=08000,code=101)
org.apache.phoenix.exception.PhoenixIOException: SYSTEM.CATALOG
at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:97)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:935)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1053)
at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1156)
at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:226)
at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:908)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1351)
at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4650)
at sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4701)
at sqlline.SqlLine$Commands.connect(SqlLine.java:3942)
at sqlline.SqlLine$Commands.connect(SqlLine.java:3851)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2810)
at sqlline.SqlLine.dispatch(SqlLine.java:817)
at sqlline.SqlLine.initArgs(SqlLine.java:633)
at sqlline.SqlLine.begin(SqlLine.java:680)
at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:441)
at sqlline.SqlLine.main(SqlLine.java:424)
Caused by: org.apache.hadoop.hbase.TableNotFoundException: SYSTEM.CATALOG
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1139)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1001)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:958)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:914)
... 23 more
Check whether SYSTEM.CATALOG exists under hdfs://.../hbase/data/default/.
If there is nothing there, try bin/hbase clean --cleanZk.
Before you use that command, you must stop the HBase master and region servers, but keep ZooKeeper alive.
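A sketch of those steps (paths assume a stock HBase layout and the default hbase.rootdir; adjust both to your setup):

# 1. Check whether the Phoenix system table directory exists on HDFS.
hdfs dfs -ls /hbase/data/default/ | grep SYSTEM.CATALOG
# 2. Stop the HBase master and region servers, but leave ZooKeeper running.
./bin/hbase-daemon.sh stop master
./bin/hbase-daemons.sh stop regionserver
# 3. Clear the stale table state from ZooKeeper.
./bin/hbase clean --cleanZk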

Cannot find row in .META. for Table

I am using HBase 0.94.1 with Hadoop 1.2.1 and the Thrift API to access tables stored in HBase from my C# application. I am able to connect to the server, but when I perform any operation from the client it gives the following error in the CLI log:
14/03/11 12:18:53 WARN client.HConnectionManager$HConnectionImplementation: Encountered problems when prefetch META table:
org.apache.hadoop.hbase.TableNotFoundException: Cannot find row in .META. for table: tblAssetsView, row=t\x00\x00\x00b\x00\x00\x00l\x00\x00\x00A\x00\x00\x00s\x00\x00\x00s\x00\x00\x00e\x00\x00\x00t\x00\x00\x00s\x00\x00\x00V\x00\x00\x00i\x00\x00\x00e\x00\x00\x00w\x00\x00\x00,,99999999999999
at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:151)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.prefetchRegionCache(HConnectionManager.java:1059)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1121)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1001)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:958)
at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:251)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:155)
at org.apache.hadoop.hbase.thrift.ThriftServerRunner$HBaseHandler.getTable(ThriftServerRunner.java:458)
at org.apache.hadoop.hbase.thrift.ThriftServerRunner$HBaseHandler.getTable(ThriftServerRunner.java:464)
at org.apache.hadoop.hbase.thrift.ThriftServerRunner$HBaseHandler.getRowWithColumnsTs(ThriftServerRunner.java:766)
at org.apache.hadoop.hbase.thrift.ThriftServerRunner$HBaseHandler.getRow(ThriftServerRunner.java:739)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hbase.thrift.HbaseHandlerMetricsProxy.invoke(HbaseHandlerMetricsProxy.java:65)
at com.sun.proxy.$Proxy6.getRow(Unknown Source)
at org.apache.hadoop.hbase.thrift.generated.Hbase$Processor$getRow.getResult(Hbase.java:3906)
at org.apache.hadoop.hbase.thrift.generated.Hbase$Processor$getRow.getResult(Hbase.java:3894)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:32)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
at org.apache.hadoop.hbase.thrift.TBoundedThreadPoolServer$ClientConnnection.run(TBoundedThreadPoolServer.java:287)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
14/03/11 12:18:53 WARN thrift.ThriftServerRunner$HBaseHandler: tblAssetsView
org.apache.hadoop.hbase.TableNotFoundException: tblAssetsView
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1139)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1001)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:958)
at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:251)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:155)
at org.apache.hadoop.hbase.thrift.ThriftServerRunner$HBaseHandler.getTable(ThriftServerRunner.java:458)
at org.apache.hadoop.hbase.thrift.ThriftServerRunner$HBaseHandler.getTable(ThriftServerRunner.java:464)
at org.apache.hadoop.hbase.thrift.ThriftServerRunner$HBaseHandler.getRowWithColumnsTs(ThriftServerRunner.java:766)
at org.apache.hadoop.hbase.thrift.ThriftServerRunner$HBaseHandler.getRow(ThriftServerRunner.java:739)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hbase.thrift.HbaseHandlerMetricsProxy.invoke(HbaseHandlerMetricsProxy.java:65)
at com.sun.proxy.$Proxy6.getRow(Unknown Source)
at org.apache.hadoop.hbase.thrift.generated.Hbase$Processor$getRow.getResult(Hbase.java:3906)
at org.apache.hadoop.hbase.thrift.generated.Hbase$Processor$getRow.getResult(Hbase.java:3894)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:32)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
at org.apache.hadoop.hbase.thrift.TBoundedThreadPoolServer$ClientConnnection.run(TBoundedThreadPoolServer.java:287)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
But I can access my table from the HBase shell with all shell operations.
I am totally stuck here; please suggest some methods to overcome this issue.
I have encountered this error. It finally turned out I had used the wrong configuration files: I have two clusters, and I had used the other cluster's configuration files. Maybe your configuration has some problem.
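If the cause is the same here, the first setting to compare across the two clusters' files is the ZooKeeper quorum the Thrift server reads from hbase-site.xml (hostnames below are placeholders):

<!-- hbase-site.xml: must name the ZooKeeper quorum of the cluster
     that actually holds tblAssetsView (hostnames are placeholders). -->
<property>
  <name>hbase.zookeeper.quorum</name>
  <value>zk1.example.com,zk2.example.com,zk3.example.com</value>
</property>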

Error while executing Hive command line

I am getting the following error when I try to execute the Hive command-line interface. May I know which incompatible jar files are causing the problem and how to fix it?
Exception in thread "main" java.lang.NoSuchFieldError: ALLOW_UNQUOTED_CONTROL_CHARS
at org.apache.hadoop.hive.ql.udf.generic.GenericUDTFJSONTuple.<clinit>(GenericUDTFJSONTuple.java:59)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:113)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.registerGenericUDTF(FunctionRegistry.java:545)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.registerGenericUDTF(FunctionRegistry.java:539)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.<clinit>(FunctionRegistry.java:472)
at org.apache.hadoop.hive.ql.session.SessionState.<init>(SessionState.java:202)
at org.apache.hadoop.hive.cli.CliSessionState.<init>(CliSessionState.java:86)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:635)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
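ALLOW_UNQUOTED_CONTROL_CHARS is a Jackson JsonParser feature, so a likely cause is an older Jackson jar on the Hadoop classpath shadowing the one Hive expects. A hedged way to look for the conflict (HADOOP_HOME and HIVE_HOME are assumed to point at your installations):

# List every Jackson jar visible to Hadoop and Hive; mismatched
# jackson-core-asl / jackson-mapper-asl versions are the usual culprit.
find "$HADOOP_HOME" "$HIVE_HOME" -name 'jackson-*.jar'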
