CannotGetMongoDbConnectionException: Failed to authenticate to database - spring

In the replica set's Mongo shell, use products; db.auth('worker', 'a*******6'); works fine, but in spring-data-mongodb I ran into the following problem:
Exception in thread "main" org.springframework.data.mongodb.CannotGetMongoDbConnectionException: Failed to authenticate to database [products], username = [worker], password = [a*******6]
at org.springframework.data.mongodb.core.ReflectiveDbInvoker.authenticate(ReflectiveDbInvoker.java:83)
at org.springframework.data.mongodb.core.MongoDbUtils.doGetDB(MongoDbUtils.java:127)
at org.springframework.data.mongodb.core.MongoDbUtils.getDB(MongoDbUtils.java:94)
at org.springframework.data.mongodb.core.SimpleMongoDbFactory.getDb(SimpleMongoDbFactory.java:197)
at org.springframework.data.mongodb.core.SimpleMongoDbFactory.getDb(SimpleMongoDbFactory.java:185)
at org.springframework.data.mongodb.core.MongoTemplate.getDb(MongoTemplate.java:1595)
at org.springframework.data.mongodb.core.MongoTemplate.execute(MongoTemplate.java:441)
at org.springframework.data.mongodb.core.MongoTemplate.doCreateCollection(MongoTemplate.java:1612)
at org.springframework.data.mongodb.core.MongoTemplate.createCollection(MongoTemplate.java:492)
at com.qixin.appmarket.service.test.TestMongodb.main(TestMongodb.java:24)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)

Try updating spring-data-mongodb to the latest version. I was getting that authentication error with version 1.7.2.RELEASE, but it stopped after updating to 1.8.0.RELEASE.
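If upgrading alone does not help, it is also worth checking how the credentials are handed to the driver. Below is a minimal sketch that authenticates through the MongoClient itself rather than per-database authentication; it assumes the 2.x Java driver that Spring Data MongoDB 1.8 targets, and the host name is a placeholder:
import java.util.Collections;
import com.mongodb.MongoClient;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

MongoCredential credential =
    MongoCredential.createCredential("worker", "products", "a*******6".toCharArray());
// placeholder host: point this at one of your replica-set members
MongoClient client = new MongoClient(
    new ServerAddress("replica-host", 27017),
    Collections.singletonList(credential));
// Hand the pre-authenticated client to Spring Data instead of a separate username/password pair.
MongoTemplate template = new MongoTemplate(new SimpleMongoDbFactory(client, "products"));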

In MongoDB, you can use:
use products
db.createUser(
  {
    user: "accountUser",
    pwd: "password",
    roles: [ "readWrite", "dbAdmin" ]
  }
)
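If you would rather create the user from Java than from the shell, a rough equivalent goes through the createUser command (again a sketch for the 2.x driver; MongoDB 2.6+ accepts plain role-name strings for roles in the user's own database):
import java.util.Arrays;
import com.mongodb.BasicDBObject;
import com.mongodb.CommandResult;
import com.mongodb.DB;

DB db = client.getDB("products"); // client is an already-authenticated MongoClient
CommandResult result = db.command(new BasicDBObject("createUser", "accountUser")
    .append("pwd", "password")
    .append("roles", Arrays.asList("readWrite", "dbAdmin")));
// result.ok() should be true if the user was created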

Related

The specified container does not exist mounting azure storage to databricks

I am using the code below to mount an Azure Blob Storage container (Gen2) in Databricks and I am getting the following error:
ExecutionError: An error occurred while calling o304.mount.
: java.io.FileNotFoundException: Operation failed: "The specified container does not exist.", 404, HEAD, https://<<>>?resource=filesystem&timeout=90
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.checkException(AzureBlobFileSystem.java:1290)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.getFileStatus(AzureBlobFileSystem.java:630)
at com.databricks.backend.daemon.dbutils.DBUtilsCore.verifyAzureFileSystem(DBUtilsCore.scala:772)
at com.databricks.backend.daemon.dbutils.DBUtilsCore.mount(DBUtilsCore.scala:720)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:380)
at py4j.Gateway.invoke(Gateway.java:295)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:251)
at java.lang.Thread.run(Thread.java:748)
Caused by: Operation failed: "The specified container does not exist.", 404, HEAD, https://<storage_name>/?resource=filesystem&timeout=90
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.execute(AbfsRestOperation.java:241)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsClient.getFilesystemProperties(AbfsClient.java:252)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.getFileStatus(AzureBlobFileSystemStore.java:834)
at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.getFileStatus(AzureBlobFileSystem.java:628)
... 13 more

Liquibase groovy dsl says Unrecognized root element changeSet

I'm trying to learn Liquibase and have set up a simple Gradle project using Liquibase, the Liquibase Groovy DSL, and the Liquibase Gradle plugin.
When I run a task like update or updateSql, an Unrecognized root element changeSet error is thrown. What am I doing wrong?
The changeset file in src/main/db/changeset.groovy contains:
databaseChangeLog {
  changeSet(id: '1.0-test-changeset', author='aoife') {
    createTable(tableName: 'foo', remarks: 'Foo table for testing liquibase') {
      column(name: 'bar', type: 'varchar(60)', remarks: 'lots of bars')
      column(name: 'baz', type: 'varchar(15)', remarks: 'even more bazes')
    }
  }
}
build.gradle contains:
plugins {
  id 'groovy'
  id 'application'
  id 'org.liquibase.gradle' version '2.0.1'
}
version = "1.0"
group = "liquitest"
mainClassName = "liquitest.LiquiTest"
repositories {
  maven {
    url "${nexusUrl}/maven-public"
  }
}
dependencies {
  compile 'org.codehaus.groovy:groovy-all:2.5.2'
  compile 'com.oracle.jdbc:ojdbc8:12.2.0.1'
  compile 'ch.qos.logback:logback-classic:1.2.3'
  testCompile 'org.spockframework:spock-core:1.2-RC3-groovy-2.5'
  liquibaseRuntime 'org.liquibase:liquibase-core:3.6.1'
  liquibaseRuntime 'org.liquibase:liquibase-groovy-dsl:2.0.1'
  liquibaseRuntime 'com.oracle.jdbc:ojdbc8:12.2.0.1'
}
liquibase {
  activities {
    main {
      changeLogFile 'src/main/db/changeset.groovy'
      url project.ext.oracleInstance
      username project.ext.oracleUser
      password project.ext.oraclePassword
    }
  }
}
Output from ./gradlew updateSql:
> Task :updateSQL FAILED
liquibase-plugin: Running the 'main' activity...
Starting Liquibase at Thu, 27 Sep 2018 10:38:37 PDT (version 3.6.1 built at 2018-04-11 09:05:04)
Unexpected error running Liquibase: Unrecognized root element changeSet
liquibase.exception.ChangeLogParseException: Unrecognized root element changeSet
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:83)
at org.codehaus.groovy.reflection.CachedConstructor.doConstructorInvoke(CachedConstructor.java:77)
at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrap.callConstructor(ConstructorSite.java:84)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:60)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:235)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:247)
at liquibase.parser.ext.GroovyLiquibaseChangeLogParser$_getChangeLogMethodMissing_closure3.doCall(GroovyLiquibaseChangeLogParser.groovy:96)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaMethod.invoke(ClosureMetaMethod.java:84)
at groovy.lang.MetaClassImpl.invokeMissingMethod(MetaClassImpl.java:939)
at groovy.lang.MetaClassImpl.invokePropertyOrMissing(MetaClassImpl.java:1262)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1215)
at groovy.lang.ExpandoMetaClass.invokeMethod(ExpandoMetaClass.java:1125)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:810)
at groovy.lang.GroovyObjectSupport.invokeMethod(GroovyObjectSupport.java:46)
at groovy.lang.Script.invokeMethod(Script.java:80)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeOnDelegationObjects(ClosureMetaClass.java:430)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:369)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.callCurrent(PogoMetaClassSite.java:69)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:52)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:154)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:182)
at Script1$_run_closure1.doCall(Script1.groovy:2)
at Script1$_run_closure1.doCall(Script1.groovy)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:294)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:42)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:117)
at liquibase.parser.ext.GroovyLiquibaseChangeLogParser.processDatabaseChangeLogRootElement(GroovyLiquibaseChangeLogParser.groovy:136)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:384)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.callCurrent(PogoMetaClassSite.java:69)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:52)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:154)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:182)
at liquibase.parser.ext.GroovyLiquibaseChangeLogParser$_getChangeLogMethodMissing_closure3.doCall(GroovyLiquibaseChangeLogParser.groovy:93)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
at org.codehaus.groovy.runtime.metaclass.ClosureMetaMethod.invoke(ClosureMetaMethod.java:84)
at groovy.lang.MetaClassImpl.invokeMissingMethod(MetaClassImpl.java:939)
at groovy.lang.MetaClassImpl.invokePropertyOrMissing(MetaClassImpl.java:1262)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1215)
at groovy.lang.ExpandoMetaClass.invokeMethod(ExpandoMetaClass.java:1125)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.callCurrent(PogoMetaClassSite.java:69)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:52)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:154)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:166)
at Script1.run(Script1.groovy:1)
at Script1$run.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:117)
at liquibase.parser.ext.GroovyLiquibaseChangeLogParser.parse(GroovyLiquibaseChangeLogParser.groovy:64)
at liquibase.Liquibase.getDatabaseChangeLog(Liquibase.java:217)
at liquibase.Liquibase.update(Liquibase.java:190)
at liquibase.Liquibase.update(Liquibase.java:274)
at liquibase.Liquibase.update(Liquibase.java:251)
at liquibase.integration.commandline.Main.doMigration(Main.java:1239)
at liquibase.integration.commandline.Main.run(Main.java:191)
at liquibase.integration.commandline.Main.main(Main.java:129)
For more information, please use the --logLevel flag
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':updateSQL'.
> Process 'command '/Library/Java/JavaVirtualMachines/jdk1.8.0_171.jdk/Contents/Home/bin/java'' finished with non-zero exit value 255
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1s
The parameter author contains an equals sign, but it has to be a colon. That was hard to spot. :) Glad I could help.
changeSet(id: '1.0-test-changeset', author:'aoife') {
If the problem is not caused by the equals sign, it could also be caused by the Groovy version.
In my case I got the same Unrecognized root element changeSet error with Groovy 4.0.4, and solved it by replacing it with Groovy 3.0.12.
Note that I am running Liquibase directly from the command line using the binary distribution, and I had copied the liquibase-groovy-dsl and groovy-x.y.z jar files (no need for the database driver jar, though) into the lib directory of the Liquibase distribution, as described at https://github.com/liquibase/liquibase-groovy-dsl.
OS: Windows Server 2012R2

Grails 2.4.4 : PooledConnection has already been closed

I have been trying to upgrade my project from Grails 2.1.2 to Grails 2.4.4. This project calls another module (upgraded to Java 8) which uses iBATIS for database connections. While the module works fine as a standalone, it gives me a "Connection Closed" exception when accessed from the Grails project.
This application had been working fine on Grails 2.1.2 with Java 6. The upgrade, however, seems to be breaking something.
Exception:
Caused by: java.sql.SQLException: PooledConnection has already been closed.
at org.apache.tomcat.jdbc.pool.DisposableConnectionFacade.invoke(DisposableConnectionFacade.java:86)
at com.sun.proxy.$Proxy35.prepareStatement(Unknown Source)
at sun.reflect.GeneratedMethodAccessor354.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.springsource.loaded.ri.ReflectiveInterceptor.jlrMethodInvoke(ReflectiveInterceptor.java:1270)
at org.springframework.jdbc.datasource.LazyConnectionDataSourceProxy$LazyConnectionInvocationHandler.invoke(LazyConnectionDataSourceProxy.java:376)
at com.sun.proxy.$Proxy37.prepareStatement(Unknown Source)
at sun.reflect.GeneratedMethodAccessor354.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.springsource.loaded.ri.ReflectiveInterceptor.jlrMethodInvoke(ReflectiveInterceptor.java:1270)
at org.springframework.jdbc.datasource.TransactionAwareDataSourceProxy$TransactionAwareInvocationHandler.invoke(TransactionAwareDataSourceProxy.java:240)
at com.sun.proxy.$Proxy37.prepareStatement(Unknown Source)
at org.apache.ibatis.executor.statement.PreparedStatementHandler.instantiateStatement(PreparedStatementHandler.java:87)
at org.apache.ibatis.executor.statement.BaseStatementHandler.prepare(BaseStatementHandler.java:88)
at org.apache.ibatis.executor.statement.RoutingStatementHandler.prepare(RoutingStatementHandler.java:59)
at org.apache.ibatis.executor.SimpleExecutor.prepareStatement(SimpleExecutor.java:85)
at org.apache.ibatis.executor.SimpleExecutor.doQuery(SimpleExecutor.java:62)
at org.apache.ibatis.executor.BaseExecutor.queryFromDatabase(BaseExecutor.java:324)
at org.apache.ibatis.executor.BaseExecutor.query(BaseExecutor.java:156)
at org.apache.ibatis.executor.BaseExecutor.query(BaseExecutor.java:136)
at org.apache.ibatis.session.defaults.DefaultSqlSession.selectList(DefaultSqlSession.java:148)
at org.apache.ibatis.session.defaults.DefaultSqlSession.selectList(DefaultSqlSession.java:141)
at org.apache.ibatis.session.defaults.DefaultSqlSession.selectOne(DefaultSqlSession.java:77)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.springsource.loaded.ri.ReflectiveInterceptor.jlrMethodInvoke(ReflectiveInterceptor.java:1270)
at org.mybatis.spring.SqlSessionTemplate$SqlSessionInterceptor.invoke(SqlSessionTemplate.java:434)
I have searched for other similar questions online, but I don't seem to be facing any of the issues mentioned there. I do not see any abandoned connections in my logs, and from the DB side I can see that there are two connections still alive (the minimum idle connection count is set to 2).
DataSource.groovy:
dataSource {
  pooled = true
  driverClassName = "oracle.jdbc.driver.OracleDriver"
  username = xxx
  password = yyy
  dialect = 'org.hibernate.dialect.Oracle10gDialect'
  dbCreate = "none"
  properties {
    maxActive = 15
    maxIdle = 5
    minIdle = 2
    initialSize = 8
    minEvictableIdleTimeMillis = 60000
    timeBetweenEvictionRunsMillis = 60000
    maxWait = 10000
    testOnBorrow = true
    validationQuery = "select 1 from dual"
  }
}
EDIT 1:
After some more debugging on my side, I see that the issue appeared after we upgraded our Tomcat plugin to version 7.0.55 (earlier we were using 2.1.2). The new plugin uses jdbc-pool, via a JdbcInterceptor, to create the DB connections. Such a connection is then handed to the second module (which uses iBATIS).
EDIT 2: SOLVED
We created a different datasource, wrapping the datasource config mentioned above but pointing it at c3p0, and placed it in resources.groovy. This was injected into the iBATIS module, and this time around there was no "Connection Pool closed" error. It seems like some issue with jdbc-pool, but we would like to know if there is some other workaround.
New config in resources.groovy:
dataSource_new(ComboPooledDataSource) { bean ->
  idleConnectionTestPeriod = 1 * 60 * 60
  testConnectionOnCheckin = true
  bean.destroyMethod = 'close'
  user = xxx
  password = yyy
  driverClass = <same as in datasource>
  jdbcUrl = zzz
}
BuildConfig.groovy: compile('com.mchange:c3p0:0.9.5.1')
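For reference, the injection itself is just a matter of pointing the session factory at the new bean. This is a hypothetical sketch, assuming the module builds its sessions with mybatis-spring (as the SqlSessionTemplate in the stack trace suggests); the method and parameter names are illustrative:
import javax.sql.DataSource;
import org.apache.ibatis.session.SqlSessionFactory;
import org.mybatis.spring.SqlSessionFactoryBean;

// dataSourceNew is the c3p0-backed dataSource_new bean from resources.groovy
public SqlSessionFactory buildSessionFactory(DataSource dataSourceNew) throws Exception {
    SqlSessionFactoryBean factoryBean = new SqlSessionFactoryBean();
    // this bypasses the Tomcat jdbc-pool datasource entirely
    factoryBean.setDataSource(dataSourceNew);
    return factoryBean.getObject();
}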

org.apache.thrift.transport.TTransportException: Cannot open without port?

Hi there, I am quite new to Hive, and I have a Java app which accesses Hive with Kerberos authentication, like below:
try {
    System.setProperty("java.security.krb5.conf", "/haManage/krb5.conf");
    StringBuilder sBuilder = new StringBuilder();
    sBuilder.append("jdbc:hive2://ha-cluster/default");
    sBuilder.append(";zk.quorum=").append("x.x.x.x,x.x.x.x"); // ip list
    sBuilder.append(";zk.port=").append("24002");
    if (isSecureVer) {
        sBuilder.append(";user.principal=")
                .append("hadoop@HADOOP.COM")
                .append(";user.keytab=")
                .append("/home/hdclient/gyj/user.keytab")
                .append(";sasl.qop=auth-conf;auth=KERBEROS;principal=hive/" +
                        "hadoop.hadoop.com@HADOOP.COM;zk.principal=zookeeper/hadoop.hadoop.com");
    }
    url = sBuilder.toString();
    logger.info(url);
    Class.forName("org.apache.hive.jdbc.HiveDriver");
    connToHive = DriverManager.getConnection(url, "", "");
} catch (Exception e) {
    logger.error("Error occurs", e);
}
But an exception happens, shown below:
Caused by: org.apache.thrift.transport.TTransportException: Cannot open without port.
at org.apache.thrift.transport.TSocket.open(TSocket.java:172) ~[hive-exec-0.14.0.jar:0.14.0]
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:248) ~[hive-exec-0.14.0.jar:0.14.0]
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37) ~[hive-exec-0.14.0.jar:0.14.0]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) ~[hive-exec-0.14.0.jar:0.14.0]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) ~[hive-exec-0.14.0.jar:0.14.0]
at java.security.AccessController.doPrivileged(Native Method) ~[na:1.7.0_45]
at javax.security.auth.Subject.doAs(Subject.java:415) ~[na:1.7.0_45]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656) ~[hadoop-common-2.6.4.jar:na]
at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) ~[hive-exec-0.14.0.jar:0.14.0]
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:190) ~[hive-jdbc-1.1.0.jar:1.1.0]
... 6 common frames omitted
Any help will be appreciated.
While you have the ZooKeeper port specified as a query-string parameter (needed for Kerberos auth), you also need the port for Hive after the hostname part of the URL. The default port used by HiveServer2 is 10000, so your URL might start like this:
sBuilder.append("jdbc:hive2://ha-cluster:10000/default");

Cannot execute SQL from inside HIVE UDF via jdbc driver

This is my first post! I've been searching for solutions to this, to no avail, for weeks (on and off).
I have a Java Hive UDF in which I want to run SQL against a Hive table to map data in memory for later use. I have the connection information correct, but it cannot find the HiveDriver. I have tried adding the Hive JDBC driver jar in Hive and exporting it into a runnable jar.
Error below, relevant code after.
hive> add jar /tmp/xhUDFs.jar;
Added [/tmp/xhUDFs.jar] to class path
Added resources: [/tmp/xhUDFs.jar]
hive> CREATE TEMPORARY FUNCTION getRefdata as 'xhUDFs.GetRefdata';
OK
Time taken: 0.042 seconds
hive> select '06001',getRefdata('CBSAname','06001');
java.lang.ClassNotFoundException:
org.apache.hadoop.hive.jdbc.HiveDriver
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at xhUDFs.GetRefdata.getRefmap(GetRefdata.java:53)
at xhUDFs.GetRefdata.evaluate(GetRefdata.java:35)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.invoke(FunctionRegistry.java:954)
at org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge.evaluate(GenericUDFBridge.java:182)
at org.apache.hadoop.hive.ql.udf.generic.GenericUDF.initializeAndFoldConstants(GenericUDF.java:168)
at org.apache.hadoop.hive.ql.plan.ExprNodeGenericFuncDesc.newInstance(ExprNodeGenericFuncDesc.java:233)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getXpathOrFuncExprNodeDesc(TypeCheckProcFactory.java:1093)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.process(TypeCheckProcFactory.java:1312)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:90)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatchAndReturn(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:79)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:133)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:110)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:209)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:153)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genAllExprNodeDesc(SemanticAnalyzer.java:10499)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:10455)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:3822)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:3601)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPostGroupByBodyPlan(SemanticAnalyzer.java:8943)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:8898)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9743)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9636)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genOPTree(SemanticAnalyzer.java:10109)
at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:329)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10120)
at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:211)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:227)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:454)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:314)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1164)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1212)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1101)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1091)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:216)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:168)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:379)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:739)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:624)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136) java.lang.ClassNotFoundException:
org.apache.hadoop.hive.jdbc.HiveDriver
Now the code snippet:
try {
    Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
    Connection wConn = DriverManager.getConnection("jdbc:hive2://172.20.2.40:10000/refdata", "", "");
    Statement sql = wConn.createStatement();
    ResultSet wResult = sql.executeQuery("select distinct zipcode5,cbsa_name,msa_name,pmsa_name,region,csaname,cbsa_div_name from refdata.zip_code_business_all_16_1_2016_orc where primaryrecord='P'");
    while (wResult.next()) {
        censusMap.put(wResult.getString("zipcode5"), wResult.getString("cbsa_name"));
    }
} catch (SQLException | ClassNotFoundException e) {
    // TODO Auto-generated catch block
    e.printStackTrace();
}
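One detail worth noting from the stack trace: the Class.forName call loads org.apache.hadoop.hive.jdbc.HiveDriver, the old HiveServer1 driver class, while the URL is a jdbc:hive2:// (HiveServer2) URL. A minimal sketch of the connection setup, under the assumption that HiveServer2 is the target and the hive-jdbc jar is on the UDF's classpath:
import java.sql.Connection;
import java.sql.DriverManager;

// the HiveServer2 driver class lives in the hive-jdbc jar
Class.forName("org.apache.hive.jdbc.HiveDriver");
Connection wConn = DriverManager.getConnection(
    "jdbc:hive2://172.20.2.40:10000/refdata", "", "");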
