I installed Hive 1.1.0 on Windows 7 32-bit. I can use the Hive console to create tables, run queries, etc., and I can see those written to hdfs://users/hive/warehouse.
But I'm not able to start HiveServer2. After entering the command it just hangs. The console output is below; please help, thanks!
C:\hive\bin>hive --service hiveserver2
File Not Found
File Not Found
File Not Found
File Not Found
File Not Found
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/hive/lib/hive-jdbc-1.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/hbase/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
hive.log
2015-06-02 15:38:46,920 WARN [main]: common.LogUtils (LogUtils.java:logConfigLocation(140)) - DEPRECATED: Ignoring hive-default.xml found on the CLASSPATH at /C:/hive/conf/hive-default.xml
2015-06-02 15:38:47,014 INFO [main]: server.HiveServer2 (HiveStringUtils.java:startupShutdownMessage(662)) - STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting HiveServer2
STARTUP_MSG: host = NWT1302004/10.192.37.175
STARTUP_MSG: args = []
STARTUP_MSG: version = 1.1.0
STARTUP_MSG: classpath = (removed by me; too long to fit in the question)
STARTUP_MSG: build = git://localhost.localdomain/Users/noland/workspaces/hive-apache/hive -r 3b87e226d9f2ff5d69385ed20704302cffefab21; compiled by 'noland' on Wed Feb 18 16:06:08 PST 2015
************************************************************/
2015-06-02 15:38:47,030 INFO [main]: server.HiveServer2 (HiveServer2.java:startHiveServer2(303)) - Starting HiveServer2
2015-06-02 15:38:49,409 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:newRawStore(575)) - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2015-06-02 15:38:49,613 INFO [main]: metastore.ObjectStore (ObjectStore.java:initialize(269)) - ObjectStore, initialize called
2015-06-02 15:39:05,190 INFO [main]: metastore.ObjectStore (ObjectStore.java:getPMF(350)) - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2015-06-02 15:39:12,603 INFO [main]: metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(132)) - Using direct SQL, underlying DB is DERBY
2015-06-02 15:39:12,603 INFO [main]: metastore.ObjectStore (ObjectStore.java:setConf(252)) - Initialized ObjectStore
2015-06-02 15:39:14,678 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:createDefaultRoles_core(649)) - Added admin role in metastore
2015-06-02 15:39:14,678 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:createDefaultRoles_core(658)) - Added public role in metastore
2015-06-02 15:39:14,958 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:addAdminUsers_core(698)) - No user is added in admin role, since config is empty
2015-06-02 15:39:16,767 INFO [main]: session.SessionState (SessionState.java:createPath(586)) - Created local directory: C:/Users/46172/AppData/Local/Temp/335c6071-cebd-40b8-b4c3-4d77d3e81d48_resources
2015-06-02 15:39:16,798 INFO [main]: session.SessionState (SessionState.java:createPath(586)) - Created HDFS directory: /tmp/hive/46172/335c6071-cebd-40b8-b4c3-4d77d3e81d48
2015-06-02 15:39:16,798 INFO [main]: session.SessionState (SessionState.java:createPath(586)) - Created local directory: C:/Users/46172/AppData/Local/Temp/46172/335c6071-cebd-40b8-b4c3-4d77d3e81d48
2015-06-02 15:39:16,798 INFO [main]: session.SessionState (SessionState.java:createPath(586)) - Created HDFS directory: /tmp/hive/46172/335c6071-cebd-40b8-b4c3-4d77d3e81d48/_tmp_space.db
2015-06-02 15:39:16,798 INFO [main]: session.SessionState (SessionState.java:start(488)) - No Tez session required at this point. hive.execution.engine=mr.
2015-06-02 15:39:21,262 INFO [main]: service.CompositeService (SessionManager.java:initOperationLogRootDir(132)) - Operation log root directory is created: C:\Users\46172\AppData\Local\Temp\46172\operation_logs
2015-06-02 15:39:21,329 INFO [main]: service.CompositeService (SessionManager.java:createBackgroundOperationPool(89)) - HiveServer2: Background operation thread pool size: 100
2015-06-02 15:39:21,329 INFO [main]: service.CompositeService (SessionManager.java:createBackgroundOperationPool(91)) - HiveServer2: Background operation thread wait queue size: 100
2015-06-02 15:39:21,330 INFO [main]: service.CompositeService (SessionManager.java:createBackgroundOperationPool(94)) - HiveServer2: Background operation thread keepalive time: 10 seconds
2015-06-02 15:39:21,433 INFO [main]: service.AbstractService (AbstractService.java:init(89)) - Service:OperationManager is inited.
2015-06-02 15:39:21,433 INFO [main]: service.AbstractService (AbstractService.java:init(89)) - Service:SessionManager is inited.
2015-06-02 15:39:21,433 INFO [main]: service.AbstractService (AbstractService.java:init(89)) - Service:CLIService is inited.
2015-06-02 15:39:21,433 INFO [main]: service.AbstractService (AbstractService.java:init(89)) - Service:ThriftBinaryCLIService is inited.
2015-06-02 15:39:21,433 INFO [main]: service.AbstractService (AbstractService.java:init(89)) - Service:HiveServer2 is inited.
2015-06-02 15:39:21,433 INFO [main]: service.AbstractService (AbstractService.java:start(104)) - Service:OperationManager is started.
2015-06-02 15:39:21,433 INFO [main]: service.AbstractService (AbstractService.java:start(104)) - Service:SessionManager is started.
2015-06-02 15:39:21,433 INFO [main]: service.AbstractService (AbstractService.java:start(104)) - Service:CLIService is started.
2015-06-02 15:39:21,433 INFO [main]: metastore.ObjectStore (ObjectStore.java:initialize(269)) - ObjectStore, initialize called
2015-06-02 15:39:21,433 INFO [main]: metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(132)) - Using direct SQL, underlying DB is DERBY
2015-06-02 15:39:21,433 INFO [main]: metastore.ObjectStore (ObjectStore.java:setConf(252)) - Initialized ObjectStore
2015-06-02 15:39:21,433 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(732)) - 0: get_databases: default
2015-06-02 15:39:21,433 INFO [main]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(358)) - ugi=46172 ip=unknown-ip-addr cmd=get_databases: default
2015-06-02 15:39:21,559 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(732)) - 0: Shutting down the object store...
2015-06-02 15:39:21,559 INFO [main]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(358)) - ugi=46172 ip=unknown-ip-addr cmd=Shutting down the object store...
2015-06-02 15:39:21,559 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(732)) - 0: Metastore shutdown complete.
2015-06-02 15:39:21,559 INFO [main]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(358)) - ugi=46172 ip=unknown-ip-addr cmd=Metastore shutdown complete.
2015-06-02 15:39:21,559 INFO [main]: service.AbstractService (AbstractService.java:start(104)) - Service:ThriftBinaryCLIService is started.
2015-06-02 15:39:21,559 INFO [main]: service.AbstractService (AbstractService.java:start(104)) - Service:HiveServer2 is started.
2015-06-02 15:50:25,794 WARN [main]: common.LogUtils (LogUtils.java:logConfigLocation(140)) - DEPRECATED: Ignoring hive-default.xml found on the CLASSPATH at /C:/hive/conf/hive-default.xml
2015-06-02 15:50:25,856 INFO [main]: SessionState (SessionState.java:printInfo(852)) -
Logging initialized using configuration in jar:file:/C:/hive/lib/hive-common-1.1.0.jar!/hive-log4j.properties
2015-06-02 15:50:26,152 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:newRawStore(575)) - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2015-06-02 15:50:26,199 INFO [main]: metastore.ObjectStore (ObjectStore.java:initialize(269)) - ObjectStore, initialize called
2015-06-02 15:50:29,569 INFO [main]: metastore.ObjectStore (ObjectStore.java:getPMF(350)) - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2015-06-02 15:50:34,252 INFO [main]: metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(132)) - Using direct SQL, underlying DB is DERBY
2015-06-02 15:50:34,252 INFO [main]: metastore.ObjectStore (ObjectStore.java:setConf(252)) - Initialized ObjectStore
2015-06-02 15:50:34,424 WARN [main]: metastore.ObjectStore (ObjectStore.java:checkSchema(6599)) - Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.1.0
2015-06-02 15:50:34,830 WARN [main]: metastore.ObjectStore (ObjectStore.java:getDatabase(548)) - Failed to get database default, returning NoSuchObjectException
2015-06-02 15:50:35,282 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:createDefaultRoles_core(649)) - Added admin role in metastore
2015-06-02 15:50:35,329 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:createDefaultRoles_core(658)) - Added public role in metastore
2015-06-02 15:50:35,454 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:addAdminUsers_core(698)) - No user is added in admin role, since config is empty
2015-06-02 15:50:35,750 INFO [main]: session.SessionState (SessionState.java:createPath(586)) - Created local directory: C:/Users/46172/AppData/Local/Temp/bc6dabf4-b50d-4a04-9677-774846caca49_resources
2015-06-02 15:50:35,750 INFO [main]: session.SessionState (SessionState.java:createPath(586)) - Created HDFS directory: /tmp/hive/46172/bc6dabf4-b50d-4a04-9677-774846caca49
2015-06-02 15:50:35,766 INFO [main]: session.SessionState (SessionState.java:createPath(586)) - Created local directory: C:/Users/46172/AppData/Local/Temp/46172/bc6dabf4-b50d-4a04-9677-774846caca49
2015-06-02 15:50:35,766 INFO [main]: session.SessionState (SessionState.java:createPath(586)) - Created HDFS directory: /tmp/hive/46172/bc6dabf4-b50d-4a04-9677-774846caca49/_tmp_space.db
2015-06-02 15:50:35,766 INFO [main]: session.SessionState (SessionState.java:start(488)) - No Tez session required at this point. hive.execution.engine=mr.
2015-06-02 15:50:37,934 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(732)) - 0: get_all_databases
2015-06-02 15:50:37,934 INFO [main]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(358)) - ugi=46172 ip=unknown-ip-addr cmd=get_all_databases
2015-06-02 15:50:37,950 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(732)) - 0: get_functions: db=default pat=*
2015-06-02 15:50:37,950 INFO [main]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(358)) - ugi=46172 ip=unknown-ip-addr cmd=get_functions: db=default pat=*
2015-06-02 15:50:44,623 INFO [main]: log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=Driver.run from=org.apache.hadoop.hive.ql.Driver>
2015-06-02 15:50:44,623 INFO [main]: log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=TimeToSubmit from=org.apache.hadoop.hive.ql.Driver>
2015-06-02 15:50:44,623 INFO [main]: log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=compile from=org.apache.hadoop.hive.ql.Driver>
2015-06-02 15:50:44,716 INFO [main]: log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
2015-06-02 15:50:44,732 INFO [main]: parse.ParseDriver (ParseDriver.java:parse(185)) - Parsing command: version
2015-06-02 15:50:44,982 ERROR [main]: ql.Driver (SessionState.java:printError(861)) - FAILED: ParseException line 1:0 cannot recognize input near 'version' '<EOF>' '<EOF>'
org.apache.hadoop.hive.ql.parse.ParseException: line 1:0 cannot recognize input near 'version' '<EOF>' '<EOF>'
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:202)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:393)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:307)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1112)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1160)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1039)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:207)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:159)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:370)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:754)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
2015-06-02 15:50:44,982 INFO [main]: log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=compile start=1433231444623 end=1433231444982 duration=359 from=org.apache.hadoop.hive.ql.Driver>
2015-06-02 15:50:44,982 INFO [main]: log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
2015-06-02 15:50:44,982 INFO [main]: log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=releaseLocks start=1433231444982 end=1433231444982 duration=0 from=org.apache.hadoop.hive.ql.Driver>
2015-06-02 15:50:44,982 INFO [main]: log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
2015-06-02 15:50:44,982 INFO [main]: log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=releaseLocks start=1433231444982 end=1433231444982 duration=0 from=org.apache.hadoop.hive.ql.Driver>
2015-06-02 15:51:03,720 WARN [main]: common.LogUtils (LogUtils.java:logConfigLocation(140)) - DEPRECATED: Ignoring hive-default.xml found on the CLASSPATH at /C:/hive/conf/hive-default.xml
2015-06-02 15:51:03,783 INFO [main]: SessionState (SessionState.java:printInfo(852)) -
Logging initialized using configuration in jar:file:/C:/hive/lib/hive-common-1.1.0.jar!/hive-log4j.properties
2015-06-02 15:51:04,487 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:newRawStore(575)) - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2015-06-02 15:51:04,518 INFO [main]: metastore.ObjectStore (ObjectStore.java:initialize(269)) - ObjectStore, initialize called
2015-06-02 15:51:07,888 INFO [main]: metastore.ObjectStore (ObjectStore.java:getPMF(350)) - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2015-06-02 15:51:09,666 INFO [main]: metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(132)) - Using direct SQL, underlying DB is DERBY
2015-06-02 15:51:09,666 INFO [main]: metastore.ObjectStore (ObjectStore.java:setConf(252)) - Initialized ObjectStore
2015-06-02 15:51:09,838 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:createDefaultRoles_core(649)) - Added admin role in metastore
2015-06-02 15:51:09,838 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:createDefaultRoles_core(658)) - Added public role in metastore
2015-06-02 15:51:09,869 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:addAdminUsers_core(698)) - No user is added in admin role, since config is empty
2015-06-02 15:51:10,481 INFO [main]: session.SessionState (SessionState.java:createPath(586)) - Created local directory: C:/Users/46172/AppData/Local/Temp/320de2e6-0f30-408b-b8f1-e65869a939ea_resources
2015-06-02 15:51:10,496 INFO [main]: session.SessionState (SessionState.java:createPath(586)) - Created HDFS directory: /tmp/hive/46172/320de2e6-0f30-408b-b8f1-e65869a939ea
2015-06-02 15:51:10,496 INFO [main]: session.SessionState (SessionState.java:createPath(586)) - Created local directory: C:/Users/46172/AppData/Local/Temp/46172/320de2e6-0f30-408b-b8f1-e65869a939ea
2015-06-02 15:51:10,496 INFO [main]: session.SessionState (SessionState.java:createPath(586)) - Created HDFS directory: /tmp/hive/46172/320de2e6-0f30-408b-b8f1-e65869a939ea/_tmp_space.db
2015-06-02 15:51:10,496 INFO [main]: session.SessionState (SessionState.java:start(488)) - No Tez session required at this point. hive.execution.engine=mr.
2015-06-02 15:51:11,027 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(732)) - 0: get_all_databases
2015-06-02 15:51:11,027 INFO [main]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(358)) - ugi=46172 ip=unknown-ip-addr cmd=get_all_databases
2015-06-02 15:51:11,042 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(732)) - 0: get_functions: db=default pat=*
2015-06-02 15:51:11,042 INFO [main]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(358)) - ugi=46172 ip=unknown-ip-addr cmd=get_functions: db=default pat=*
2015-06-02 16:12:31,212 INFO [Thread-6]: server.HiveServer2 (HiveServer2.java:stop(269)) - Shutting down HiveServer2
2015-06-02 16:12:31,228 INFO [Thread-2]: server.HiveServer2 (HiveStringUtils.java:run(680)) - SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down HiveServer2 at NWT1302004/10.192.37.175
************************************************************/
2015-06-02 16:12:31,228 INFO [Thread-7]: thrift.ThriftCLIService (ThriftBinaryCLIService.java:run(98)) - Started ThriftBinaryCLIService on port 10000 with 5...500 worker threads
2015-06-02 16:12:31,228 INFO [Thread-6]: thrift.ThriftCLIService (ThriftCLIService.java:stop(138)) - Thrift server has stopped
2015-06-02 16:12:31,228 INFO [Thread-6]: service.AbstractService (AbstractService.java:stop(125)) - Service:ThriftBinaryCLIService is stopped.
2015-06-02 16:12:31,228 INFO [Thread-6]: service.AbstractService (AbstractService.java:stop(125)) - Service:OperationManager is stopped.
2015-06-02 16:12:31,228 INFO [Thread-6]: service.AbstractService (AbstractService.java:stop(125)) - Service:SessionManager is stopped.
2015-06-02 16:12:31,228 INFO [Thread-6]: service.AbstractService (AbstractService.java:stop(125)) - Service:CLIService is stopped.
2015-06-02 16:12:31,228 INFO [Thread-6]: service.AbstractService (AbstractService.java:stop(125)) - Service:HiveServer2 is stopped.
I used MySQL as the metastore, and HiveServer2 is now able to start:
2015-06-03 10:37:36,863 INFO [main]: metastore.MetaStoreDirectSql (MetaStoreDirectSql.java:<init>(132)) - Using direct SQL, underlying DB is MYSQL
2015-06-03 10:37:36,863 INFO [main]: metastore.ObjectStore (ObjectStore.java:setConf(252)) - Initialized ObjectStore
2015-06-03 10:37:36,879 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(732)) - 0: get_databases: default
2015-06-03 10:37:36,879 INFO [main]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(358)) - ugi=46172 ip=unknown-ip-addr cmd=get_databases: default
2015-06-03 10:37:36,910 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(732)) - 0: Shutting down the object store...
2015-06-03 10:37:36,910 INFO [main]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(358)) - ugi=46172 ip=unknown-ip-addr cmd=Shutting down the object store...
2015-06-03 10:37:36,910 INFO [main]: metastore.HiveMetaStore (HiveMetaStore.java:logInfo(732)) - 0: Metastore shutdown complete.
2015-06-03 10:37:36,910 INFO [main]: HiveMetaStore.audit (HiveMetaStore.java:logAuditEvent(358)) - ugi=46172 ip=unknown-ip-addr cmd=Metastore shutdown complete.
2015-06-03 10:37:36,910 INFO [main]: service.AbstractService (AbstractService.java:start(104)) - Service:ThriftBinaryCLIService is started.
2015-06-03 10:37:36,926 INFO [main]: service.AbstractService (AbstractService.java:start(104)) - Service:HiveServer2 is started.
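For reference, the metastore-related hive-site.xml entries for a MySQL-backed metastore look roughly like this (a minimal sketch, not my exact values: the database name, user, and password are placeholders, and the MySQL Connector/J jar has to be on Hive's classpath):
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/metastore?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hiveuser</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hivepassword</value>
</property>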
Related
I've been facing this issue for quite some time now and have not been able to track down why it is happening.
Whenever we start HiveServer2 using the command
./hiveserver2 &
it starts and stays up for some time, but then shuts down. While HiveServer2 is up and running, the Hive logs do show the following error:
2018-03-12 04:44:57,029 ERROR [HiveServer2-Handler-Pool: Thread-33]: server.TThreadPoolServer (TThreadPoolServer.java:run(296)) - Error occurred during processing of message.
java.lang.RuntimeException: org.apache.thrift.transport.TSaslTransportException: No data or no sasl data in the stream
at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:268)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.thrift.transport.TSaslTransportException: No data or no sasl data in the stream
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:328)
at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
... 4 more
2018-03-12 04:45:55,361 INFO [main]: SessionState (SessionState.java:printInfo(951)) -
Logging initialized using configuration in file:/usr/local/hive/conf/hive-log4j.properties
But I'm not really sure that the HiveServer2 shutdown is due to the above error, as the server keeps running for hours before shutting down.
The following are the Hive logs from when HiveServer2 shuts down:
2018-03-12 04:46:25,285 INFO [main]: ql.Driver (SessionState.java:printInfo(951)) - Stage-Stage-1: Map: 4 Reduce: 1 Cumulative CPU: 18.09 sec HDFS Read: 763046 HDFS Write: 2217 SUCCESS
2018-03-12 04:46:25,286 INFO [main]: ql.Driver (SessionState.java:printInfo(951)) - Total MapReduce CPU Time Spent: 18 seconds 90 msec
2018-03-12 04:46:25,286 INFO [main]: ql.Driver (SessionState.java:printInfo(951)) - OK
2018-03-12 04:46:25,286 INFO [main]: log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
2018-03-12 04:46:25,295 INFO [main]: log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=releaseLocks start=1520829985286 end=1520829985295 duration=9 from=org.apache.hadoop.hive.ql.Driver>
2018-03-12 04:46:25,295 INFO [main]: log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=Driver.run start=1520829961477 end=1520829985295 duration=23818 from=org.apache.hadoop.hive.ql.Driver>
2018-03-12 04:46:25,304 INFO [main]: CliDriver (SessionState.java:printInfo(951)) - Time taken: 23.818 seconds
2018-03-12 04:46:25,304 INFO [main]: log.PerfLogger (PerfLogger.java:PerfLogBegin(121)) - <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
2018-03-12 04:46:25,305 INFO [main]: log.PerfLogger (PerfLogger.java:PerfLogEnd(148)) - </PERFLOG method=releaseLocks start=1520829985304 end=1520829985305 duration=1 from=org.apache.hadoop.hive.ql.Driver>
2018-03-12 04:46:36,351 INFO [Thread-9]: server.HiveServer2 (HiveServer2.java:stop(305)) - Shutting down HiveServer2
2018-03-12 04:46:36,351 INFO [Thread-9]: thrift.ThriftCLIService (ThriftCLIService.java:stop(201)) - Thrift server has stopped
2018-03-12 04:46:36,351 INFO [Thread-9]: service.AbstractService (AbstractService.java:stop(125)) - Service:ThriftBinaryCLIService is stopped.
2018-03-12 04:46:36,351 INFO [Thread-9]: service.AbstractService (AbstractService.java:stop(125)) - Service:OperationManager is stopped.
2018-03-12 04:46:36,351 INFO [Thread-9]: service.AbstractService (AbstractService.java:stop(125)) - Service:SessionManager is stopped.
2018-03-12 04:46:36,351 INFO [Thread-3]: server.HiveServer2 (HiveStringUtils.java:run(709)) - SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down HiveServer2 at SERVER-HOSTNAME/192.168.***.**
************************************************************/
2018-03-12 04:46:46,352 WARN [Thread-9]: service.CompositeService (SessionManager.java:cleanupLoggingRootDir(213)) - Failed to cleanup root dir of HS2 logging: /usr/local/hive/log
java.io.FileNotFoundException: File does not exist: /usr/local/hive/log
at org.apache.commons.io.FileUtils.forceDelete(FileUtils.java:2275)
at org.apache.hive.service.cli.session.SessionManager.cleanupLoggingRootDir(SessionManager.java:211)
at org.apache.hive.service.cli.session.SessionManager.stop(SessionManager.java:205)
at org.apache.hive.service.CompositeService.stop(CompositeService.java:102)
at org.apache.hive.service.CompositeService.stop(CompositeService.java:92)
at org.apache.hive.service.cli.CLIService.stop(CLIService.java:165)
at org.apache.hive.service.CompositeService.stop(CompositeService.java:102)
at org.apache.hive.service.CompositeService.stop(CompositeService.java:92)
at org.apache.hive.service.server.HiveServer2.stop(HiveServer2.java:307)
at org.apache.hive.service.server.HiveServer2$1.run(HiveServer2.java:107)
2018-03-12 04:46:46,353 INFO [Thread-9]: service.AbstractService (AbstractService.java:stop(125)) - Service:CLIService is stopped.
2018-03-12 04:46:46,353 INFO [Thread-9]: service.AbstractService (AbstractService.java:stop(125)) - Service:HiveServer2 is stopped.
2018-03-12 04:51:07,336 INFO [main]: SessionState (SessionState.java:printInfo(951)) -
Logging initialized using configuration in file:/usr/local/hive/conf/hive-log4j.properties
If the issue is actually because of...
ERROR [HiveServer2-Handler-Pool: Thread-33]: server.TThreadPoolServer (TThreadPoolServer.java:run(296)) - Error occurred during processing of message.
java.lang.RuntimeException: org.apache.thrift.transport.TSaslTransportException: No data or no sasl data in the stream
...then here are the hive-site.xml settings related to it, as mentioned in many other related posts:
<property>
  <name>hive.server2.authentication</name>
  <value>PAM</value>
</property>
<property>
  <name>hive.server2.authentication.pam.services</name>
  <value>sshd,sudo</value>
</property>
<property>
  <name>hive.server2.thrift.sasl.qop</name>
  <value>auth</value>
</property>
<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>false</value>
</property>
EDITS
I tried starting HiveServer2 after changing hive.server2.authentication from PAM to NONE, but again the server started with the following error:
ERROR [HiveServer2-Handler-Pool: Thread-31]: server.TThreadPoolServer (TThreadPoolServer.java:run(296)) - Error occurred during processing of message.
java.lang.RuntimeException: org.apache.thrift.transport.TSaslTransportException: No data or no sasl data in the stream
Also, when trying to connect with Beeline, it throws a connection exception, as expected:
bin$ ./beeline
Beeline version 1.2.2 by Apache Hive
beeline> !connect jdbc:hive2://192.168.XXX.XX:XXX7 myuser myp#sw0rd
Connecting to jdbc:hive2://192.168.XXX.XX:XXX7
Error: Could not open client transport with JDBC Uri: jdbc:hive2://192.168.203.XXX.XX:XXX7: java.net.ConnectException: Connection timed out (Connection timed out) (state=08S01,code=0)
0: jdbc:hive2://192.168.XXX.XX:XXX7 (closed)>
0: jdbc:hive2://192.168.XXX.XX:XXX7 (closed)>
...while ps -ef | grep hive shows that HiveServer2 is up:
ps -ef | grep hive
hduser 30902 30165 1 05:39 pts/1 00:00:15 /data/apps/jdk/bin/java -Xmx4000m -Djava.library.path=/usr/local/hadoop/lib -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/usr/local/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/local/hadoop -Dhadoop.id.str=hduser -Dhadoop.root.logger=INFO,console -Djava.library.path=/usr/local/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/local/hive/lib/hive-service-1.2.2.jar org.apache.hive.service.server.HiveServer2
The HiveServer2 documentation mentions that in PAM authentication mode, an expired user password will cause the server to go down. Please check whether that is the case. You can also try setting hive.server2.authentication to NONE and check whether that lets you connect to the server.
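For example, the change for that experiment might look like this in hive-site.xml (just a sketch; restart HiveServer2 after editing):
<property>
  <name>hive.server2.authentication</name>
  <value>NONE</value>
</property>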
A timeout on a connection may simply mean that nothing is listening on that port at all, or that the connection is not authorized.
Use netstat -na to check whether the port is listening, and check /etc/security/access.conf or iptables -L.
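As a rough check sequence (assuming the default HiveServer2 port 10000; substitute whatever port you actually configured):
# is anything listening on the HiveServer2 port?
netstat -na | grep 10000
# any host-based access rules or firewall rules in the way?
cat /etc/security/access.conf
iptables -L -n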
I have a problem: I am trying to execute a Pig command in local mode using pig -x local and running a dump statement, but it gives me an error. Please have a look and explain it to me. Another problem: when I run the same command, dump b;, in MapReduce mode, it runs fine and shows me the results, so I wonder why it does not run in local mode. Please have a look:
[dead@master ~]$ pig -x local
17/04/20 16:00:53 INFO pig.ExecTypeProvider: Trying ExecType : LOCAL
17/04/20 16:00:53 INFO pig.ExecTypeProvider: Picked LOCAL as the ExecType
2017-04-20 16:00:53,772 [main] INFO org.apache.pig.Main - Apache Pig version 0.16.0 (r1746530) compiled Jun 01 2016, 23:10:49
2017-04-20 16:00:53,772 [main] INFO org.apache.pig.Main - Logging error messages to: /opt/hadoop/pig_1492684253771.log
2017-04-20 16:00:53,826 [main] INFO org.apache.pig.impl.util.Utils - Default bootup file /opt/hadoop/.pigbootup not found
2017-04-20 16:00:54,150 [main] INFO org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: file:///
2017-04-20 16:00:54,322 [main] INFO org.apache.pig.PigServer - Pig Script ID for the session: PIG-default-f6e00222-79c4-4e12-a2f1-c9404e17c69c
2017-04-20 16:00:54,322 [main] WARN org.apache.pig.PigServer - ATS is disabled since yarn.timeline-service.enabled set to false
grunt> data = load '/books.csv' using PigStorage(',') as (booknum:int, author:chararray, title:chararray, published:int);
grunt> b = foreach data generate author,title;
grunt> dump b;
2017-04-20 16:01:06,219 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig features used in the script: UNKNOWN
2017-04-20 16:01:06,424 [main] INFO org.apache.pig.newplan.logical.optimizer.LogicalPlanOptimizer - {RULES_ENABLED=[AddForEach, ColumnMapKeyPrune, ConstantCalculator, GroupByConstParallelSetter, LimitOptimizer, LoadTypeCastInserter, MergeFilter, MergeForEach, PartitionFilterOptimizer, PredicatePushdownOptimizer, PushDownForEachFlatten, PushUpFilter, SplitFilter, StreamTypeCastInserter]}
2017-04-20 16:01:06,493 [main] INFO org.apache.pig.newplan.logical.rules.ColumnPruneVisitor - Columns pruned for data: $0, $3
2017-04-20 16:01:06,600 [main] INFO org.apache.pig.impl.util.SpillableMemoryManager - Selected heap (Tenured Gen) of size 699072512 to monitor. collectionUsageThreshold = 489350752, usageThreshold = 489350752
2017-04-20 16:01:06,758 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler - File concatenation threshold: 100 optimistic? false
2017-04-20 16:01:06,821 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size before optimization: 1
2017-04-20 16:01:06,821 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size after optimization: 1
2017-04-20 16:01:06,945 [main] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Initializing JVM Metrics with processName=JobTracker, sessionId=
2017-04-20 16:01:07,013 [main] INFO org.apache.pig.tools.pigstats.mapreduce.MRScriptState - Pig script settings are added to the job
2017-04-20 16:01:07,049 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2017-04-20 16:01:07,090 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Setting up single store job
2017-04-20 16:01:07,141 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Key [pig.schematuple] is false, will not generate code.
2017-04-20 16:01:07,141 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Starting process to move generated code to distributed cacche
2017-04-20 16:01:07,141 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Distributed cache not supported or needed in local mode. Setting key [pig.schematuple.local.dir] with code temp directory: /tmp/1492684267141-0
2017-04-20 16:01:07,242 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 1 map-reduce job(s) waiting for submission.
2017-04-20 16:01:07,315 [JobControl] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
2017-04-20 16:01:07,763 [JobControl] WARN org.apache.hadoop.mapreduce.JobResourceUploader - No job jar file set. User classes may not be found. See Job or Job#setJar(String).
2017-04-20 16:01:07,813 [JobControl] INFO org.apache.pig.builtin.PigStorage - Using PigTextInputFormat
2017-04-20 16:01:07,860 [JobControl] INFO org.apache.hadoop.mapreduce.JobSubmitter - Cleaning up the staging area file:/tmp/hadoop-dead/mapred/staging/dead1668871356/.staging/job_local1668871356_0001
2017-04-20 16:01:07,893 [JobControl] INFO org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob - PigLatin:DefaultJobName got an error while submitting
org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Input path does not exist: file:/books.csv
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:279)
at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:301)
at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:318)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:196)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:128)
at org.apache.pig.backend.hadoop23.PigJobControl.run(PigJobControl.java:194)
at java.lang.Thread.run(Thread.java:745)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:276)
Caused by: org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: file:/books.csv
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:323)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:265)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigTextInputFormat.listStatus(PigTextInputFormat.java:36)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:387)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:265)
... 18 more
2017-04-20 16:01:07,898 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - HadoopJobId: job_local1668871356_0001
2017-04-20 16:01:07,898 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Processing aliases b,data
2017-04-20 16:01:07,898 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - detailed locations: M: data[1,7],b[-1,-1] C: R:
2017-04-20 16:01:07,972 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 0% complete
2017-04-20 16:01:08,113 [main] WARN org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Ooops! Some job has failed! Specify -stop_on_failure if you want Pig to stop immediately on failure.
2017-04-20 16:01:08,114 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - job job_local1668871356_0001 has failed! Stop running all dependent jobs
2017-04-20 16:01:08,114 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 100% complete
2017-04-20 16:01:08,142 [main] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
2017-04-20 16:01:08,142 [main] ERROR org.apache.pig.tools.pigstats.PigStats - ERROR 0: java.lang.IllegalStateException: Job in state DEFINE instead of RUNNING
2017-04-20 16:01:08,142 [main] ERROR org.apache.pig.tools.pigstats.mapreduce.MRPigStatsUtil - 1 map reduce job(s) failed!
2017-04-20 16:01:08,143 [main] INFO org.apache.pig.tools.pigstats.mapreduce.SimplePigStats - Script Statistics:
HadoopVersion PigVersion UserId StartedAt FinishedAt Features
2.7.2 0.16.0 dead 2017-04-20 16:01:07 2017-04-20 16:01:08 UNKNOWN
Failed!
Failed Jobs:
JobId Alias Feature Message Outputs
job_local1668871356_0001 b,data MAP_ONLY Message: org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Input path does not exist: file:/books.csv
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:279)
at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:301)
at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:318)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:196)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:128)
at org.apache.pig.backend.hadoop23.PigJobControl.run(PigJobControl.java:194)
at java.lang.Thread.run(Thread.java:745)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:276)
Caused by: org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: file:/books.csv
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:323)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:265)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigTextInputFormat.listStatus(PigTextInputFormat.java:36)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:387)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:265)
... 18 more
file:/tmp/temp-1274993195/tmp-1105974594,
Input(s):
Failed to read data from "/books.csv"
Output(s):
Failed to produce result in "file:/tmp/temp-1274993195/tmp-1105974594"
Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0
Job DAG:
job_local1668871356_0001
2017-04-20 16:01:08,144 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
2017-04-20 16:01:08,147 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias b. Backend error : java.lang.IllegalStateException: Job in state DEFINE instead of RUNNING
Details at logfile: /opt/hadoop/pig_1492684253771.log
I started the Sentry service (without Kerberos, AD, or LDAP) and configured Hive and Impala to use Sentry.
Then I used Beeline to connect to HiveServer2 (beeline> !connect jdbc:hive2://)
and ran the command "create role test_role", but it threw an error.
What could cause this to happen?
The following is the log:
[root@cdh1 ~]# su - hive -s /bin/bash
[hive@cdh1 ~]$ beeline
Beeline version 0.13.1-cdh5.3.0 by Apache Hive
beeline> !connect jdbc:hive2://
scan complete in 3ms
Connecting to jdbc:hive2://
Enter username for jdbc:hive2://:
Enter password for jdbc:hive2://:
16/02/19 13:46:20 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
16/02/19 13:46:20 INFO hive.metastore: Trying to connect to metastore with URI thrift://cdh1:9083
16/02/19 13:46:20 INFO hive.metastore: Connected to metastore.
16/02/19 13:46:21 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
16/02/19 13:46:21 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
16/02/19 13:46:21 INFO service.CompositeService: HiveServer2: Background operation thread pool size: 100
16/02/19 13:46:21 INFO service.CompositeService: HiveServer2: Background operation thread wait queue size: 100
16/02/19 13:46:21 INFO service.CompositeService: HiveServer2: Background operation thread keepalive time: 10
16/02/19 13:46:21 INFO service.AbstractService: Service:OperationManager is inited.
16/02/19 13:46:21 INFO service.AbstractService: Service:LogManager is inited.
16/02/19 13:46:21 INFO service.AbstractService: Service:SessionManager is inited.
16/02/19 13:46:21 INFO service.AbstractService: Service:CLIService is inited.
16/02/19 13:46:21 INFO service.AbstractService: Service:OperationManager is started.
16/02/19 13:46:21 INFO service.AbstractService: Service:LogManager is started.
16/02/19 13:46:21 INFO service.AbstractService: Service:SessionManager is started.
16/02/19 13:46:21 INFO service.AbstractService: Service:CLIService is started.
16/02/19 13:46:21 INFO hive.metastore: Trying to connect to metastore with URI thrift://cdh1:9083
16/02/19 13:46:21 INFO hive.metastore: Connected to metastore.
16/02/19 13:46:21 INFO thrift.ThriftCLIService: Client protocol version: HIVE_CLI_SERVICE_PROTOCOL_V6
16/02/19 13:46:21 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
16/02/19 13:46:21 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
Connected to: Apache Hive (version 0.13.1-cdh5.3.0)
Driver: Hive JDBC (version 0.13.1-cdh5.3.0)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://>
0: jdbc:hive2://> create role test_role;
16/02/19 13:46:32 INFO log.LogManager: Operation log size: 131072
16/02/19 13:46:32 INFO log.PerfLogger: <PERFLOG method=compile from=org.apache.hadoop.hive.ql.Driver>
16/02/19 13:46:32 INFO log.PerfLogger: <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
16/02/19 13:46:32 INFO parse.ParseDriver: Parsing command: create role test_role
16/02/19 13:46:32 INFO parse.ParseDriver: Parse Completed
16/02/19 13:46:32 INFO log.PerfLogger: </PERFLOG method=parse start=1455860792301 end=1455860792688 duration=387 from=org.apache.hadoop.hive.ql.Driver>
16/02/19 13:46:32 INFO log.PerfLogger: <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
FAILED: SemanticException The current builtin authorization in Hive is incomplete and disabled.
16/02/19 13:46:32 ERROR ql.Driver: FAILED: SemanticException The current builtin authorization in Hive is incomplete and disabled.
org.apache.hadoop.hive.ql.parse.SemanticException: The current builtin authorization in Hive is incomplete and disabled.
at org.apache.hadoop.hive.ql.parse.authorization.RestrictedHiveAuthorizationTaskFactoryImpl.raiseAuthError(RestrictedHiveAuthorizationTaskFactoryImpl.java:140)
at org.apache.hadoop.hive.ql.parse.authorization.RestrictedHiveAuthorizationTaskFactoryImpl.createCreateRoleTask(RestrictedHiveAuthorizationTaskFactoryImpl.java:47)
at org.apache.hadoop.hive.ql.parse.DDLSemanticAnalyzer.analyzeCreateRole(DDLSemanticAnalyzer.java:559)
at org.apache.hadoop.hive.ql.parse.DDLSemanticAnalyzer.analyzeInternal(DDLSemanticAnalyzer.java:455)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:206)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:437)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:335)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1026)
at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1019)
at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:100)
at org.apache.hive.service.cli.operation.SQLOperation.run(SQLOperation.java:173)
at org.apache.hive.service.cli.session.HiveSessionImpl.runOperationWithLogCapture(HiveSessionImpl.java:715)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:370)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:357)
at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:237)
at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:392)
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:232)
at org.apache.hive.beeline.Commands.execute(Commands.java:736)
at org.apache.hive.beeline.Commands.sql(Commands.java:657)
at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:910)
at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:772)
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:734)
at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:469)
at org.apache.hive.beeline.BeeLine.main(BeeLine.java:452)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
16/02/19 13:46:32 INFO log.PerfLogger: </PERFLOG method=compile start=1455860792263 end=1455860792747 duration=484 from=org.apache.hadoop.hive.ql.Driver>
16/02/19 13:46:32 INFO log.PerfLogger: <PERFLOG method=releaseLocks from=org.apache.hadoop.hive.ql.Driver>
16/02/19 13:46:32 INFO log.PerfLogger: </PERFLOG method=releaseLocks start=1455860792747 end=1455860792747 duration=0 from=org.apache.hadoop.hive.ql.Driver>
16/02/19 13:46:32 WARN thrift.ThriftCLIService: Error executing statement:
org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException The current builtin authorization in Hive is incomplete and disabled.
at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:102)
at org.apache.hive.service.cli.operation.SQLOperation.run(SQLOperation.java:173)
at org.apache.hive.service.cli.session.HiveSessionImpl.runOperationWithLogCapture(HiveSessionImpl.java:715)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:370)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:357)
at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:237)
at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:392)
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:232)
at org.apache.hive.beeline.Commands.execute(Commands.java:736)
at org.apache.hive.beeline.Commands.sql(Commands.java:657)
at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:910)
at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:772)
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:734)
at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:469)
at org.apache.hive.beeline.BeeLine.main(BeeLine.java:452)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Error: Error while compiling statement: FAILED: SemanticException The current builtin authorization in Hive is incomplete and disabled. (state=42000,code=40000)
0: jdbc:hive2://>
"Enter username for jdbc:hive2://:" prompt is empty.
You need to provide the username of the sentry admin, one of the sentry.metastore.service.users values.
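For example (a sketch, assuming the hive user is one of the sentry.metastore.service.users values and belongs to a Sentry admin group):
beeline> !connect jdbc:hive2://
Enter username for jdbc:hive2://: hive
Enter password for jdbc:hive2://:
0: jdbc:hive2://> create role test_role;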
I am attempting to run a Hive action through an Oozie workflow that I've created in Hue, but the action prints "Heart beat" forever and never executes the Hive SQL.
I've read other posts about heart beating forever, but this one seems to be occurring at a different point, after the SQL statement has been parsed. I've checked memory on each node in the cluster, and I've verified that the task count parameters are reasonable.
Here is the hive-config.xml file:
<configuration>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:hive://10.1.10.250:10000/testdb</value>
<description>JDBC connect string</description>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>org.apache.hadoop.hive.jdbc.HiveDriver</value>
<description>JDBC driver</description>
</property>
</configuration>
I know that the Hive connection is working, because the action fails if provided with either a bad SQL statement, a bad URL, or a bad driver name.
Here is the action stdout log:
[...truncated]
=================================================================
>>> Invoking Hive command line now >>>
4283 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - <PERFLOG method=Driver.run from=org.apache.hadoop.hive.ql.Driver>
4284 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - <PERFLOG method=TimeToSubmit from=org.apache.hadoop.hive.ql.Driver>
4284 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - <PERFLOG method=compile from=org.apache.hadoop.hive.ql.Driver>
4339 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
4354 [main] INFO hive.ql.parse.ParseDriver - Parsing command: create table testdb.temp99 (col1 int)
4665 [main] INFO hive.ql.parse.ParseDriver - Parse Completed
4667 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - </PERFLOG method=parse start=1418968298270 end=1418968298598 duration=328 from=org.apache.hadoop.hive.ql.Driver>
4667 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
4733 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Starting Semantic Analysis
4735 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Creating table testdb.temp99 position=13
4760 [main] INFO org.apache.hadoop.hive.ql.Driver - Semantic Analysis Completed
4775 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - </PERFLOG method=semanticAnalyze start=1418968298598 end=1418968298706 duration=108 from=org.apache.hadoop.hive.ql.Driver>
4784 [main] INFO org.apache.hadoop.hive.ql.Driver - Returning Hive schema: Schema(fieldSchemas:null, properties:null)
4784 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - </PERFLOG method=compile start=1418968298215 end=1418968298715 duration=500 from=org.apache.hadoop.hive.ql.Driver>
4785 [main] INFO org.apache.hadoop.hive.ql.Driver - Concurrency mode is disabled, not creating a lock manager
4785 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - <PERFLOG method=Driver.execute from=org.apache.hadoop.hive.ql.Driver>
4785 [main] INFO org.apache.hadoop.hive.ql.Driver - Starting command: create table testdb.temp99 (col1 int)
4792 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - </PERFLOG method=TimeToSubmit start=1418968298215 end=1418968298723 duration=508 from=org.apache.hadoop.hive.ql.Driver>
4792 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - <PERFLOG method=runTasks from=org.apache.hadoop.hive.ql.Driver>
4792 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - <PERFLOG method=task.DDL.Stage-0 from=org.apache.hadoop.hive.ql.Driver>
4815 [main] INFO hive.ql.exec.DDLTask - Default to LazySimpleSerDe for table testdb.temp99
4935 [main] INFO org.apache.hadoop.hive.metastore.HiveMetaStore - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
4959 [main] INFO org.apache.hadoop.hive.metastore.ObjectStore - ObjectStore, initialize called
5261 [main] INFO DataNucleus.Persistence - Property datanucleus.cache.level2 unknown - will be ignored
Heart beat
Heart beat
[...forever...]
Why does the workflow heart beat at this point in the log rather than continuing?
ADDENDUM:
The Oozie workflow associated with this Hive action is:
<workflow-app name="Hive-copy" xmlns="uri:oozie:workflow:0.4">
<start to="Hive"/>
<action name="Hive">
<hive xmlns="uri:oozie:hive-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<job-xml>/user/test/hive-config.xml</job-xml>
<script>/user/test/test.sql</script>
<file>hive-config.xml#hive-config.xml</file>
</hive>
<ok to="end"/>
<error to="kill"/>
</action>
<kill name="kill">
<message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name="end"/>
</workflow-app>
This issue is related to the NodeManager.
Setting the following property in the mapred-site.xml file fixes the issue:
<property>
<name>mapred.tasktracker.map.tasks.maximum</name>
<value>50</value>
</property>
It was similar to this issue: https://groups.google.com/a/cloudera.org/forum/?fromgroups=#!topic/cdh-user/v0BHtQ0hlBg
I am working on Hortonworks Hive.
I have seen the same type of errors before, but the underlying MapReduce error seems to be different in this case: an application error with exitCode 1.
In Hive, the statement
Select * from SomeTable;
works fine, but
Select colName from SomeTable;
does not.
Application error log
2014-03-17 12:49:15,557 INFO org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl: application_1395039411618_0001 State change from ACCEPTED to FAILED
2014-03-17 12:49:15,558 INFO org.apache.hadoop.yarn.server.resourcemanager.scheduler.capacity.CapacityScheduler: Application appattempt_1395039411618_0001_000002 is done. finalState=FAILED
2014-03-17 12:49:15,559 INFO org.apache.hadoop.yarn.server.resourcemanager.scheduler.AppSchedulingInfo: Application application_1395039411618_0001 requests cleared
2014-03-17 12:49:15,559 INFO org.apache.hadoop.yarn.server.resourcemanager.scheduler.capacity.LeafQueue: Application removed - appId: application_1395039411618_0001 user: asande queue: default #user-pending-applications: 0 #user-active-applications: 0 #queue-pending-applications: 0 #queue-active-applications: 0
2014-03-17 12:49:15,559 INFO org.apache.hadoop.yarn.server.resourcemanager.scheduler.capacity.ParentQueue: Application removed - appId: application_1395039411618_0001 user: asande leaf-queue of parent: root #applications: 0
2014-03-17 12:49:15,559 WARN org.apache.hadoop.yarn.server.resourcemanager.RMAuditLogger: USER=asande OPERATION=Application Finished - Failed TARGET=RMAppManager RESULT=FAILURE DESCRIPTION=App failed with state: FAILED PERMISSIONS=Application application_1395039411618_0001 failed 2 times due to AM Container for appattempt_1395039411618_0001_000002 exited with exitCode: 1 due to: Exception from container-launch:
org.apache.hadoop.util.Shell$ExitCodeException:
Here's Hive.log. (But it seems there's nothing wrong in the log.)
2014-03-17 10:45:37,322 INFO server.HiveServer2 (HiveStringUtils.java:startupShutdownMessage(604)) - STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting HiveServer2
STARTUP_MSG: host = ASANDE1/16.155.82.203
STARTUP_MSG: args = [-hiveconf, hive.hadoop.classpath=c:\hdp\hive-0.12.0.2.0.6.0-0009\lib\*.............................................., -hiveconf, hive.querylog.location=c:\hadoop\logs\hive\history, -hiveconf, hive.log.dir=c:\hadoop\logs\hive]
STARTUP_MSG: version = 0.12.0.2.0.6.0-0009
STARTUP_MSG: classpath = c:\hdp\hadoop-2.2.0.2.0.6.0-0009\etc\hadoop;c:\hdp\hadoop-;;
STARTUP_MSG: build = git://sijenkins-vm3/cygdrive/d/w/bw/project/hive-monarch -r a7f54db5645b645500778b92e7fad8fab7738080; compiled by 'jenkins' on Fri Dec 20 18:29:58 PST 2013
************************************************************/
2014-03-17 10:45:39,622 INFO service.CompositeService (SessionManager.java:init(60)) - HiveServer2: Async execution thread pool size: 100
2014-03-17 10:45:39,622 INFO service.CompositeService (SessionManager.java:init(62)) - HiveServer2: Async execution wait queue size: 100
2014-03-17 10:45:39,623 INFO service.CompositeService (SessionManager.java:init(64)) - HiveServer2: Async execution thread keepalive time: 10
2014-03-17 10:45:39,628 INFO service.AbstractService (AbstractService.java:init(89)) - Service:OperationManager is inited.
2014-03-17 10:45:39,628 INFO service.AbstractService (AbstractService.java:init(89)) - Service:SessionManager is inited.
2014-03-17 10:45:39,628 INFO service.AbstractService (AbstractService.java:init(89)) - Service:CLIService is inited.
2014-03-17 10:45:39,628 INFO service.AbstractService (AbstractService.java:init(89)) - Service:ThriftBinaryCLIService is inited.
2014-03-17 10:45:39,628 INFO service.AbstractService (AbstractService.java:init(89)) - Service:HiveServer2 is inited.
2014-03-17 10:45:39,628 INFO service.AbstractService (AbstractService.java:start(104)) - Service:OperationManager is started.
2014-03-17 10:45:39,628 INFO service.AbstractService (AbstractService.java:start(104)) - Service:SessionManager is started.
2014-03-17 10:45:39,628 INFO service.AbstractService (AbstractService.java:start(104)) - Service:CLIService is started.
2014-03-17 10:45:39,925 INFO hive.metastore (HiveMetaStoreClient.java:open(244)) - Trying to connect to metastore with URI thrift://ASANDE1:9083
2014-03-17 10:46:02,239 WARN hive.metastore (HiveMetaStoreClient.java:open(307)) - set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
org.apache.thrift.transport.TTransportException: java.net.SocketTimeoutException: Read timed out
at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:129)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:378)
at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:297)
at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:204)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:2822)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:2808)
Caused by: java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.read(SocketInputStream.java:129)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:258)
at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:127)
... 14 more
2014-03-17 10:46:02,280 INFO hive.metastore (HiveMetaStoreClient.java:open(322)) - Waiting 1 seconds before next connection attempt.
2014-03-17 10:46:03,280 INFO hive.metastore (HiveMetaStoreClient.java:open(332)) - Connected to metastore.
2014-03-17 10:46:08,455 ERROR hive.log (MetaStoreUtils.java:logAndThrowMetaException(960)) - Got exception: org.apache.thrift.TApplicationException get_databases failed: out of sequence response
org.apache.thrift.TApplicationException: get_databases failed: out of sequence response
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:76)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_databases(ThriftHiveMetastore.java:500)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_databases(ThriftHiveMetastore.java:487)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabases(HiveMetaStoreClient.java:722)
at org.apache.hive.service.cli.CLIService.start(CLIService.java:83)
at org.apache.hive.service.CompositeService.start(CompositeService.java:70)
at org.apache.hive.service.server.HiveServer2.start(HiveServer2.java:73)
at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:103)
2014-03-17 10:46:08,455 ERROR hive.log (MetaStoreUtils.java:logAndThrowMetaException(961)) - Converting exception to MetaException
2014-03-17 10:46:08,621 ERROR service.CompositeService (CompositeService.java:start(74)) - Error starting services HiveServer2
org.apache.hive.service.ServiceException: Unable to connect to MetaStore!
at org.apache.hive.service.cli.CLIService.start(CLIService.java:85)
..........................
Caused by: MetaException(message:Got exception: org.apache.thrift.TApplicationException get_databases failed: out of sequence response)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.logAndThrowMetaException(MetaStoreUtils.java:962)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabases(HiveMetaStoreClient.java:724)
at org.apache.hive.service.cli.CLIService.start(CLIService.java:83)
... 3 more
2014-03-17 10:46:08,627 INFO service.AbstractService (AbstractService.java:stop(125)) - Service:OperationManager is stopped.
2014-03-17 10:46:08,627 INFO service.AbstractService (AbstractService.java:stop(125)) - Service:SessionManager is stopped.
2014-03-17 10:46:08,627 INFO service.AbstractService (AbstractService.java:stop(125)) - Service:CLIService is stopped.
2014-03-17 10:46:08,627 FATAL server.HiveServer2 (HiveServer2.java:main(105)) - Error starting HiveServer2
org.apache.hive.service.ServiceException: Failed to Start HiveServer2
at org.apache.hive.service.CompositeService.start(CompositeService.java:80)
at org.apache.hive.service.server.HiveServer2.start(HiveServer2.java:73)
at org.apache.hive.service.server.HiveServer2.main(HiveServer2.java:103)
Caused by: org.apache.hive.service.ServiceException: Unable to connect to MetaStore!
at org.apache.hive.service.cli.CLIService.start(CLIService.java:85)
at org.apache.hive.service.CompositeService.start(CompositeService.java:70)
... 2 more
Caused by: MetaException(message:Got exception: org.apache.thrift.TApplicationException get_databases failed: out of sequence response)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.logAndThrowMetaException(MetaStoreUtils.java:962)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabases(HiveMetaStoreClient.java:724)
at org.apache.hive.service.cli.CLIService.start(CLIService.java:83)
... 3 more
2014-03-17 10:46:08,791 INFO server.HiveServer2 (HiveStringUtils.java:run(622)) - SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down HiveServer2 at ASANDE1/16.155.82.203
************************************************************/
2014-03-17 10:46:22,805 INFO server.HiveServer2 (HiveStringUtils.java:startupShutdownMessage(604)) - STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting HiveServer2
STARTUP_MSG: host = ASANDE1/16.155.82.203
STARTUP_MSG: args = [-hiveconf, hive.hadoop.classpath=c:\hdp\hive-0.12.0.2.0.6.0-0009\lib\*, -hiveconf, hive.querylog.location=c:\hadoop\logs\hive\history, -hiveconf, hive.log.dir=c:\hadoop\logs\hive]
STARTUP_MSG: version = 0.12.0.2.0.6.0-0009
STARTUP_MSG: classpath = c:\hdp\hadoop-2.2.0.2.0.6.0-0009\etc\hadoop;c:\hdp\hadoop-2.2.0.2.0.6.0-0009\share\hadoop\common\lib\activation-1.1.jar;c:\hdp\hadoop-2.2.0.2.0.6.0-
...............................................................
STARTUP_MSG: build = git://sijenkins-vm3/cygdrive/d/w/bw/project/hive-monarch -r a7f54db5645b645500778b92e7fad8fab7738080; compiled by 'jenkins' on Fri Dec 20 18:29:58 PST 2013
************************************************************/
2014-03-17 10:46:23,677 INFO service.CompositeService (SessionManager.java:init(60)) - HiveServer2: Async execution thread pool size: 100
2014-03-17 10:46:23,677 INFO service.CompositeService (SessionManager.java:init(62)) - HiveServer2: Async execution wait queue size: 100
2014-03-17 10:46:23,678 INFO service.CompositeService (SessionManager.java:init(64)) - HiveServer2: Async execution thread keepalive time: 10
2014-03-17 10:46:23,682 INFO service.AbstractService (AbstractService.java:init(89)) - Service:OperationManager is inited.
2014-03-17 10:46:23,682 INFO service.AbstractService (AbstractService.java:init(89)) - Service:SessionManager is inited.
2014-03-17 10:46:23,682 INFO service.AbstractService (AbstractService.java:init(89)) - Service:CLIService is inited.
2014-03-17 10:46:23,683 INFO service.AbstractService (AbstractService.java:init(89)) - Service:ThriftBinaryCLIService is inited.
2014-03-17 10:46:23,683 INFO service.AbstractService (AbstractService.java:init(89)) - Service:HiveServer2 is inited.
2014-03-17 10:46:23,683 INFO service.AbstractService (AbstractService.java:start(104)) - Service:OperationManager is started.
2014-03-17 10:46:23,683 INFO service.AbstractService (AbstractService.java:start(104)) - Service:SessionManager is started.
2014-03-17 10:46:23,683 INFO service.AbstractService (AbstractService.java:start(104)) - Service:CLIService is started.
2014-03-17 10:46:23,694 INFO hive.metastore (HiveMetaStoreClient.java:open(244)) - Trying to connect to metastore with URI thrift://ASANDE1:9083
2014-03-17 10:46:24,093 INFO hive.metastore (HiveMetaStoreClient.java:open(322)) - Waiting 1 seconds before next connection attempt.
2014-03-17 10:46:25,093 INFO hive.metastore (HiveMetaStoreClient.java:open(332)) - Connected to metastore.
2014-03-17 10:46:25,118 INFO service.AbstractService (AbstractService.java:start(104)) - Service:ThriftBinaryCLIService is started.
2014-03-17 10:46:25,122 INFO service.AbstractService (AbstractService.java:start(104)) - Service:HiveServer2 is started.
2014-03-17 10:46:25,351 INFO thrift.ThriftCLIService (ThriftBinaryCLIService.java:run(76)) - ThriftBinaryCLIService listening on 0.0.0.0/0.0.0.0:10001
2014-03-17 12:27:04,409 INFO server.HiveServer2 (HiveStringUtils.java:startupShutdownMessage(604)) - STARTUP_MSG:
/************************************************************
The query that worked for you does not launch a MapReduce job, so it looks like there is a problem launching MapReduce jobs. Can you check the Hive logs (default location /tmp/hive/hive.log, assuming you are running hiveserver2 as user hive) to see if they contain the full error message?
One common error that you might find in the logs is a permission-denied error. The ODBC driver often logs in as user hadoop, which has no write access, and that causes the MapReduce job launch to fail. If you change this user to hdfs, your problem might be solved.
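As background for why the first query avoids MapReduce entirely: in Hive 0.12 the hive.fetch.task.conversion setting defaults to minimal, which lets an unfiltered Select * run as a plain fetch task inside HiveServer2, while a column projection such as Select colName still compiles to a MapReduce job and therefore hits whatever is breaking container launch. The hive-site.xml sketch below is only illustrative; changing the value hides the symptom for trivial projections rather than fixing the YARN failure:
<property>
    <!-- sketch: "minimal" (the 0.12 default) converts only SELECT *, partition-column
         filters and LIMIT into fetch tasks; "more" also covers simple column projections.
         Not a fix for the container-launch error shown above. -->
    <name>hive.fetch.task.conversion</name>
    <value>more</value>
</property>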
The difference between the Select * from SomeTable; and Select colName from SomeTable; queries you ran is that the latter, Select colName from SomeTable;, requires information from the Hive Metastore service. Since you are asking for a specific column, Hive must first check the schema definition, which is stored in the Hive Metastore.
Your issue is due to the fact that you cannot connect to the Hive Metastore, as indicated by the following message in your log file:
org.apache.thrift.transport.TTransportException: java.net.SocketTimeoutException: Read timed out
Try restarting your Hive Metastore service.
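Beyond restarting the service, it is worth confirming that the hive-site.xml used by HiveServer2 points at the metastore that is actually listening. This is a sketch only, reusing the thrift://ASANDE1:9083 URI that appears in your log; verify the host and port against your own deployment:
<property>
    <!-- must match the host/port the standalone metastore service is bound to -->
    <name>hive.metastore.uris</name>
    <value>thrift://ASANDE1:9083</value>
</property>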