This is my first post! I've been searching for a solution to this, on and off, for weeks with no luck...
I have a Java Hive UDF in which I want to run SQL against a Hive table to load data into an in-memory map for later use. The connection information is correct, but the HiveDriver class cannot be found. I have tried both adding the Hive JDBC driver jar in Hive (via add jar) and exporting it into a runnable jar.
The error is below, with the relevant code after it.
hive> add jar /tmp/xhUDFs.jar;
Added [/tmp/xhUDFs.jar] to class path
Added resources: [/tmp/xhUDFs.jar]
hive> CREATE TEMPORARY FUNCTION getRefdata as 'xhUDFs.GetRefdata';
OK
Time taken: 0.042 seconds
hive> select '06001',getRefdata('CBSAname','06001');
java.lang.ClassNotFoundException: org.apache.hadoop.hive.jdbc.HiveDriver
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at xhUDFs.GetRefdata.getRefmap(GetRefdata.java:53)
at xhUDFs.GetRefdata.evaluate(GetRefdata.java:35)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.invoke(FunctionRegistry.java:954)
at org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge.evaluate(GenericUDFBridge.java:182)
at org.apache.hadoop.hive.ql.udf.generic.GenericUDF.initializeAndFoldConstants(GenericUDF.java:168)
at org.apache.hadoop.hive.ql.plan.ExprNodeGenericFuncDesc.newInstance(ExprNodeGenericFuncDesc.java:233)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getXpathOrFuncExprNodeDesc(TypeCheckProcFactory.java:1093)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.process(TypeCheckProcFactory.java:1312)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:90)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatchAndReturn(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:79)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:133)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:110)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:209)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:153)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genAllExprNodeDesc(SemanticAnalyzer.java:10499)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:10455)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:3822)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:3601)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPostGroupByBodyPlan(SemanticAnalyzer.java:8943)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:8898)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9743)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9636)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genOPTree(SemanticAnalyzer.java:10109)
at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:329)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10120)
at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:211)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:227)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:454)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:314)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1164)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1212)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1101)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1091)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:216)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:168)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:379)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:739)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:624)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
java.lang.ClassNotFoundException: org.apache.hadoop.hive.jdbc.HiveDriver
Now the code snippet:
try {
    Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
    Connection wConn = DriverManager.getConnection("jdbc:hive2://172.20.2.40:10000/refdata", "", "");
    Statement sql = wConn.createStatement();
    ResultSet wResult = sql.executeQuery("select distinct zipcode5,cbsa_name,msa_name,pmsa_name,region,csaname,cbsa_div_name from refdata.zip_code_business_all_16_1_2016_orc where primaryrecord='P'");
    while (wResult.next()) {
        censusMap.put(wResult.getString("zipcode5"), wResult.getString("cbsa_name"));
    }
} catch (SQLException | ClassNotFoundException e) {
    // TODO Auto-generated catch block
    e.printStackTrace();
}
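For reference: the class in the stack trace, org.apache.hadoop.hive.jdbc.HiveDriver, is the old HiveServer1 driver, while a jdbc:hive2:// URL like the one in the snippet is served by the HiveServer2 driver class, org.apache.hive.jdbc.HiveDriver, which lives in the hive-jdbc jar. A minimal, self-contained sketch using that class name (the class Hive2Probe and its probe query are illustrative; the host and database are copied from the snippet above):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class Hive2Probe {
    public static void main(String[] args) throws Exception {
        // HiveServer2 driver class, matching the jdbc:hive2:// URL scheme.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://172.20.2.40:10000/refdata", "", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("select 1")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1));
            }
        }
    }
}

The hive-jdbc jar and its dependencies would still need to be on the UDF's classpath for Class.forName to succeed.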
I'm trying to learn Liquibase and have set up a simple Gradle project using Liquibase, the Liquibase Groovy DSL, and the Liquibase Gradle plugin.
When I run a task like update or updateSql, an Unrecognized root element changeSet error is thrown. What am I doing wrong?
The changeset file in src/main/db/changeset.groovy contains:
databaseChangeLog {
    changeSet(id: '1.0-test-changeset', author='aoife') {
        createTable(tableName: 'foo', remarks: 'Foo table for testing liquibase') {
            column(name: 'bar', type: 'varchar(60)', remarks: 'lots of bars')
            column(name: 'baz', type: 'varchar(15)', remarks: 'even more bazes')
        }
    }
}
build.gradle contains:
plugins {
    id 'groovy'
    id 'application'
    id 'org.liquibase.gradle' version '2.0.1'
}

version = "1.0"
group = "liquitest"
mainClassName = "liquitest.LiquiTest"

repositories {
    maven {
        url "${nexusUrl}/maven-public"
    }
}

dependencies {
    compile 'org.codehaus.groovy:groovy-all:2.5.2'
    compile 'com.oracle.jdbc:ojdbc8:12.2.0.1'
    compile 'ch.qos.logback:logback-classic:1.2.3'
    testCompile 'org.spockframework:spock-core:1.2-RC3-groovy-2.5'
    liquibaseRuntime 'org.liquibase:liquibase-core:3.6.1'
    liquibaseRuntime 'org.liquibase:liquibase-groovy-dsl:2.0.1'
    liquibaseRuntime 'com.oracle.jdbc:ojdbc8:12.2.0.1'
}

liquibase {
    activities {
        main {
            changeLogFile 'src/main/db/changeset.groovy'
            url project.ext.oracleInstance
            username project.ext.oracleUser
            password project.ext.oraclePassword
        }
    }
}
Output from ./gradlew updateSql:
> Task :updateSQL FAILED
liquibase-plugin: Running the 'main' activity...
Starting Liquibase at Thu, 27 Sep 2018 10:38:37 PDT (version 3.6.1 built at 2018-04-11 09:05:04)
Unexpected error running Liquibase: Unrecognized root element changeSet
liquibase.exception.ChangeLogParseException: Unrecognized root element changeSet
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:83)
at org.codehaus.groovy.reflection.CachedConstructor.doConstructorInvoke(CachedConstructor.java:77)
at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrap.callConstructor(ConstructorSite.java:84)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:60)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:235)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:247)
at liquibase.parser.ext.GroovyLiquibaseChangeLogParser$_getChangeLogMethodMissing_closure3.doCall(GroovyLiquibaseChangeLogParser.groovy:96)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaMethod.invoke(ClosureMetaMethod.java:84)
at groovy.lang.MetaClassImpl.invokeMissingMethod(MetaClassImpl.java:939)
at groovy.lang.MetaClassImpl.invokePropertyOrMissing(MetaClassImpl.java:1262)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1215)
at groovy.lang.ExpandoMetaClass.invokeMethod(ExpandoMetaClass.java:1125)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:810)
at groovy.lang.GroovyObjectSupport.invokeMethod(GroovyObjectSupport.java:46)
at groovy.lang.Script.invokeMethod(Script.java:80)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeOnDelegationObjects(ClosureMetaClass.java:430)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:369)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.callCurrent(PogoMetaClassSite.java:69)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:52)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:154)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:182)
at Script1$_run_closure1.doCall(Script1.groovy:2)
at Script1$_run_closure1.doCall(Script1.groovy)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:294)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:42)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:117)
at liquibase.parser.ext.GroovyLiquibaseChangeLogParser.processDatabaseChangeLogRootElement(GroovyLiquibaseChangeLogParser.groovy:136)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:384)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.callCurrent(PogoMetaClassSite.java:69)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:52)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:154)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:182)
at liquibase.parser.ext.GroovyLiquibaseChangeLogParser$_getChangeLogMethodMissing_closure3.doCall(GroovyLiquibaseChangeLogParser.groovy:93)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaMethod.invoke(ClosureMetaMethod.java:84)
at groovy.lang.MetaClassImpl.invokeMissingMethod(MetaClassImpl.java:939)
at groovy.lang.MetaClassImpl.invokePropertyOrMissing(MetaClassImpl.java:1262)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1215)
at groovy.lang.ExpandoMetaClass.invokeMethod(ExpandoMetaClass.java:1125)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.callCurrent(PogoMetaClassSite.java:69)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:52)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:154)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:166)
at Script1.run(Script1.groovy:1)
at Script1$run.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:117)
at liquibase.parser.ext.GroovyLiquibaseChangeLogParser.parse(GroovyLiquibaseChangeLogParser.groovy:64)
at liquibase.Liquibase.getDatabaseChangeLog(Liquibase.java:217)
at liquibase.Liquibase.update(Liquibase.java:190)
at liquibase.Liquibase.update(Liquibase.java:274)
at liquibase.Liquibase.update(Liquibase.java:251)
at liquibase.integration.commandline.Main.doMigration(Main.java:1239)
at liquibase.integration.commandline.Main.run(Main.java:191)
at liquibase.integration.commandline.Main.main(Main.java:129)
For more information, please use the --logLevel flag
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':updateSQL'.
> Process 'command '/Library/Java/JavaVirtualMachines/jdk1.8.0_171.jdk/Contents/Home/bin/java'' finished with non-zero exit value 255
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1s
The parameter author contains an equals sign, but it has to be a colon. This was hard to spot. :) Glad I could help.
changeSet(id: '1.0-test-changeset', author:'aoife') {
If the problem is not caused by the "equals" operator, it can also be caused by the Groovy version.
In my case, I got the same Unrecognized root element changeSet error with Groovy 4.0.4 and solved it by switching to Groovy 3.0.12.
Note that I am running Liquibase directly from the command line using the binary distribution, and I had copied the liquibase-groovy-dsl and groovy-x.y.z jar files into the lib directory of the Liquibase distribution (no database driver jar is needed there), as described at https://github.com/liquibase/liquibase-groovy-dsl.
OS: Windows Server 2012R2
When I tried reloading my Spark Streaming application from its checkpoint directory, I got the following exception:
java.lang.IllegalArgumentException: requirement failed: Checkpoint directory does not exist: maprfs:/mapr/cellos-mapr/user/mbazarganigilani/checkpoints/22237996-da79-4f13-b142-3ab112b7c374/rdd-1009
at scala.Predef$.require(Predef.scala:233)
at org.apache.spark.rdd.ReliableCheckpointRDD.<init>(ReliableCheckpointRDD.scala:46)
at org.apache.spark.SparkContext$$anonfun$checkpointFile$1.apply(SparkContext.scala:1226)
at org.apache.spark.SparkContext$$anonfun$checkpointFile$1.apply(SparkContext.scala:1226)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
at org.apache.spark.SparkContext.withScope(SparkContext.scala:714)
at org.apache.spark.SparkContext.checkpointFile(SparkContext.scala:1225)
at org.apache.spark.streaming.dstream.DStreamCheckpointData$$anonfun$restore$1.apply(DStreamCheckpointData.scala:112)
at org.apache.spark.streaming.dstream.DStreamCheckpointData$$anonfun$restore$1.apply(DStreamCheckpointData.scala:109)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
at org.apache.spark.streaming.dstream.DStreamCheckpointData.restore(DStreamCheckpointData.scala:109)
at org.apache.spark.streaming.dstream.DStream.restoreCheckpointData(DStream.scala:515)
at org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:516)
at org.apache.spark.streaming.dstream.DStream$$anonfun$restoreCheckpointData$2.apply(DStream.scala:516)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.apache.spark.streaming.dstream.DStream.restoreCheckpointData(DStream.scala:516)
at org.apache.spark.streaming.DStreamGraph$$anonfun$restoreCheckpointData$2.apply(DStreamGraph.scala:151)
at org.apache.spark.streaming.DStreamGraph$$anonfun$restoreCheckpointData$2.apply(DStreamGraph.scala:151)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.streaming.DStreamGraph.restoreCheckpointData(DStreamGraph.scala:151)
at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:158)
at org.apache.spark.streaming.StreamingContext$$anonfun$getOrCreate$1.apply(StreamingContext.scala:864)
at org.apache.spark.streaming.StreamingContext$$anonfun$getOrCreate$1.apply(StreamingContext.scala:864)
at scala.Option.map(Option.scala:145)
at org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:864)
at UnionStream$.main(UnionStreaming.scala:636)
at UnionStream.main(UnionStreaming.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
I would like to know if there is any way to handle this exception, for example by falling back to previous checkpoint data?
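One possibility, sketched here as an assumption rather than a guaranteed fix: wrap the restore in a try/catch and fall back to building a fresh context when the checkpoint cannot be restored. Any state that lived only in the checkpoint is lost in that case. In the Java API this could look roughly like the following (createFreshContext is a hypothetical factory standing in for the job's own setup code; the checkpoint path is taken from the error message above):

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class CheckpointRecovery {
    // Hypothetical factory that builds the DStream graph from scratch.
    static JavaStreamingContext createFreshContext(String checkpointDir) {
        SparkConf conf = new SparkConf().setAppName("UnionStream");
        JavaStreamingContext jssc =
                new JavaStreamingContext(conf, Durations.seconds(10));
        jssc.checkpoint(checkpointDir);
        // ... define the actual DStream graph here ...
        return jssc;
    }

    public static void main(String[] args) throws Exception {
        String dir = "maprfs:/mapr/cellos-mapr/user/mbazarganigilani/checkpoints";
        JavaStreamingContext jssc;
        try {
            // Restores from the checkpoint if it is complete and readable,
            // otherwise calls the factory.
            jssc = JavaStreamingContext.getOrCreate(dir, () -> createFreshContext(dir));
        } catch (Exception e) {
            // Partial or corrupt checkpoint (e.g. a missing rdd-* directory):
            // start from scratch instead of failing the whole job. Clearing
            // the stale checkpoint directory first may also be necessary.
            jssc = createFreshContext(dir);
        }
        jssc.start();
        jssc.awaitTermination();
    }
}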
I am quite new to Hive. I have a Java app which accesses Hive with Kerberos authentication, like below:
try
{
    System.setProperty("java.security.krb5.conf", "/haManage/krb5.conf");
    StringBuilder sBuilder = new StringBuilder();
    sBuilder.append("jdbc:hive2://ha-cluster/default");
    sBuilder.append(";zk.quorum=").append("x.x.x.x,x.x.x.x"); // IP list
    sBuilder.append(";zk.port=").append("24002");
    if (isSecureVer) {
        sBuilder.append(";user.principal=")
                .append("hadoop@HADOOP.COM")
                .append(";user.keytab=")
                .append("/home/hdclient/gyj/user.keytab")
                .append(";sasl.qop=auth-conf;auth=KERBEROS;principal=hive/" +
                        "hadoop.hadoop.com@HADOOP.COM;zk.principal=zookeeper/hadoop.hadoop.com");
    }
    url = sBuilder.toString();
    logger.info(url);
    Class.forName("org.apache.hive.jdbc.HiveDriver");
    connToHive = DriverManager.getConnection(url, "", "");
} catch (Exception e)
{
    logger.error("Error occurs", e);
}
But an exception happens, shown below:
Caused by: org.apache.thrift.transport.TTransportException: Cannot open without port.
at org.apache.thrift.transport.TSocket.open(TSocket.java:172) ~[hive-exec-0.14.0.jar:0.14.0]
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:248) ~[hive-exec-0.14.0.jar:0.14.0]
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) ~[hive-exec-0.14.0.jar:0.14.0]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) ~[hive-exec-0.14.0.jar:0.14.0]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) ~[hive-exec-0.14.0.jar:0.14.0]
at java.security.AccessController.doPrivileged(Native Method) ~[na:1.7.0_45]
at javax.security.auth.Subject.doAs(Subject.java:415) ~[na:1.7.0_45]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656) ~[hadoop-common-2.6.4.jar:na]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) ~[hive-exec-0.14.0.jar:0.14.0]
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:190) ~[hive-jdbc-1.1.0.jar:1.1.0]
... 6 common frames omitted
Any help will be appreciated.
While you have the ZooKeeper port specified as a query string parameter (needed for Kerberos auth), you also need to include the port for Hive after the hostname part of the URL. The normal port used by Hive is 10000, so your URL might start like this:
sBuilder.append("jdbc:hive2://ha-cluster:10000/default");
From hive.log:
Caused by: org.apache.derby.iapi.error.StandardException: Table/View 'DBS' does not exist.
at org.apache.derby.iapi.error.StandardException.newException(Unknown Source) ~[derby-10.10.2.0.jar:?]
at org.apache.derby.impl.sql.compile.FromBaseTable.bindTableDescriptor(Unknown Source) ~[derby-10.10.2.0.jar:?]
at org.apache.derby.impl.sql.compile.FromBaseTable.bindNonVTITables(Unknown Source) ~[derby-10.10.2.0.jar:?]
at org.apache.derby.impl.sql.compile.FromList.bindTables(Unknown Source) ~[derby-10.10.2.0.jar:?]
Required table missing : "DBS" in Catalog "" Schema "". DataNucleus requires this table to perform its persistence operations. Either your MetaData is incorrect, or you need to enable "datanucleus.schema.autoCreateTables"
at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:553) ~[datanucleus-api-jdo-4.2.1.jar:?]
at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:720) ~[datanucleus-api-jdo-4.2.1.jar:?]
at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:740) ~[datanucleus-api-jdo-4.2.1.jar:?]
at org.apache.hadoop.hive.metastore.ObjectStore.setMetaStoreSchemaVersion(ObjectStore.java:7294) ~[hive-exec-2.0.0.jar:2.0.0]
at org.apache.hadoop.hive.metastore.ObjectStore.checkSchema(ObjectStore.java:7188) ~[hive-exec-2.0.0.jar:2.0.0]
javax.jdo.JDODataStoreException: Error executing SQL query "select "DB_ID" from "DBS"".
at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:543) ~[datanucleus-api-jdo-4.2.1.jar:?]
at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:388) ~[datanucleus-api-jdo-4.2.1.jar:?]
at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:213) ~[datanucleus-api-jdo-4.2.1.jar:?]
I am trying to run Titan 1.0.0 with a Cassandra 3.0.1 backend and Elasticsearch 2.1.1 as the index backend.
After invoking the following in the Gremlin console:
g = TitanFactory.open('/opt/titan/conf/titan-cassandra-es.properties')
I get the following error message:
java.lang.IllegalArgumentException: Could not instantiate implementation: com.thinkaurelius.titan.diskstorage.es.ElasticSearchIndex
at com.thinkaurelius.titan.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:55)
at com.thinkaurelius.titan.diskstorage.Backend.getImplementationClass(Backend.java:473)
at com.thinkaurelius.titan.diskstorage.Backend.getIndexes(Backend.java:460)
at com.thinkaurelius.titan.diskstorage.Backend.<init>(Backend.java:147)
at com.thinkaurelius.titan.graphdb.configuration.GraphDatabaseConfiguration.getBackend(GraphDatabaseConfiguration.java:1805)
at com.thinkaurelius.titan.graphdb.database.StandardTitanGraph.<init>(StandardTitanGraph.java:123)
at com.thinkaurelius.titan.core.TitanFactory.open(TitanFactory.java:94)
at com.thinkaurelius.titan.core.TitanFactory.open(TitanFactory.java:62)
at com.thinkaurelius.titan.core.TitanFactory$open.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:45)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:110)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:122)
at groovysh_evaluate.run(groovysh_evaluate:3)
at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215)
at org.codehaus.groovy.tools.shell.Interpreter.evaluate(Interpreter.groovy:69)
at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215)
at org.codehaus.groovy.tools.shell.Groovysh.execute(Groovysh.groovy:185)
at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215)
at org.codehaus.groovy.tools.shell.Shell.leftShift(Shell.groovy:119)
at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215)
at org.codehaus.groovy.tools.shell.ShellRunner.work(ShellRunner.groovy:94)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$work(InteractiveShellRunner.groovy)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1207)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:130)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:150)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.work(InteractiveShellRunner.groovy:123)
at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215)
at org.codehaus.groovy.tools.shell.ShellRunner.run(ShellRunner.groovy:58)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$run(InteractiveShellRunner.groovy)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1207)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:130)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:150)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.run(InteractiveShellRunner.groovy:82)
at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215)
at org.apache.tinkerpop.gremlin.console.Console.<init>(Console.groovy:144)
at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215)
at org.apache.tinkerpop.gremlin.console.Console.main(Console.groovy:303)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at com.thinkaurelius.titan.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:44)
... 48 more
Caused by: org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []
at org.elasticsearch.client.transport.TransportClientNodesService.ensureNodesAreAvailable(TransportClientNodesService.java:279)
at org.elasticsearch.client.transport.TransportClientNodesService.execute(TransportClientNodesService.java:198)
at org.elasticsearch.client.transport.support.InternalTransportClusterAdminClient.execute(InternalTransportClusterAdminClient.java:86)
at org.elasticsearch.client.support.AbstractClusterAdminClient.health(AbstractClusterAdminClient.java:127)
at org.elasticsearch.action.admin.cluster.health.ClusterHealthRequestBuilder.doExecute(ClusterHealthRequestBuilder.java:92)
at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:91)
at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:65)
at com.thinkaurelius.titan.diskstorage.es.ElasticSearchIndex.<init>(ElasticSearchIndex.java:201)
... 53 more
The following are the configuration settings in titan-cassandra-es.properties:
storage.backend=cassandrathrift
storage.hostname=10.132.x.x
cache.db-cache = true
cache.db-cache-clean-wait = 20
cache.db-cache-time = 180000
cache.db-cache-size = 0.5
index.search.backend=elasticsearch
index.search.hostname=10.132.x.x
index.search.client-only=true
I have run telnet tests to both ports 9200 and 9300 with no problem. Do I need to specify my Elasticsearch cluster.name?
Please help, thanks.
I just figured out that Elasticsearch 2.1 is not compatible with Titan 1.0.0. The solution is to use Elasticsearch 1.5.1.