SQL Error when querying any tables/views on a Databricks cluster via DBeaver - macOS

I am able to connect to the cluster, browse its Hive catalog, and see tables/views and columns/data types.
Running a simple SELECT statement against a view over a Parquet file produces this error and no other results:
SQL Error [500540] [HY000]: [Databricks][DatabricksJDBCDriver](500540) Error caught in BackgroundFetcher. Foreground thread ID: 180. Background thread ID: 223. Error caught: sun.misc.Unsafe or java.nio.DirectByteBuffer.<init>(long, int) not available.
Standard Databricks cluster:
Standard_DS3_v2
JDBC URL:
jdbc:databricks://<redacted>.1.azuredatabricks.net:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/<redacted>/<redacted>;AuthMech=3;UID=token;PWD=<redacted>
Advanced Options Spark Config:
spark.databricks.cluster.profile singleNode
spark.databricks.io.directoryCommit.createSuccessFile false
spark.master local[*, 4]
spark.driver.extraJavaOptions -Dio.netty.tryReflectionSetAccessible=true
spark.hadoop.fs.azure.account.key.<reducted>.blob.core.windows.net <reducted>
spark.executor.extraJavaOptions -Dio.netty.tryReflectionSetAccessible=true
parquet.enable.summary-metadata false
My local machine:
DBeaver version 22.1.2.202207091909
macOS version (M1 chip): Monterey 12.4
Java version:
java --version
openjdk 18.0.1 2022-04-19
OpenJDK Runtime Environment Homebrew (build 18.0.1+0)
OpenJDK 64-Bit Server VM Homebrew (build 18.0.1+0, mixed mode, sharing)
I am able to do the following with no errors (Databricks default test dataset):
CREATE TABLE diamonds USING CSV OPTIONS (path "/databricks-datasets/Rdatasets/data-001/csv/ggplot2/diamonds.csv", header "true");
When I run either select color from diamonds; or select * from diamonds;, I get this:
SQL Error [500618] [HY000]: [Databricks][DatabricksJDBCDriver](500618) Error occured while deserializing arrow data: sun.misc.Unsafe or java.nio.DirectByteBuffer.<init>(long, int) not available
Hence, any SELECT query on any object (a view over a Parquet file or anything else) causes the error described above.
What could be the problem? Any recommendations on how to resolve this error? And why am I able to connect and see the metadata of the schemas/tables/views/columns, but not query or view the data?
P.S. I followed this guide exactly: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/dbeaver#step-3-connect-dbeaver-to-your-azure-databricks-databases
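P.P.S. To separate DBeaver from the driver itself, a bare JDBC probe can be run against the same cluster. This is a minimal sketch, assuming the Databricks JDBC driver jar is on the classpath and reusing the URL placeholders from above; EnableArrow=0 is a driver connection property that reportedly disables Arrow result-set serialization (the code path named in both errors):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DatabricksProbe {
    public static void main(String[] args) throws Exception {
        // Same URL as in DBeaver, with EnableArrow=0 appended to bypass
        // the Arrow deserializer that trips over sun.misc.Unsafe.
        String url = "jdbc:databricks://<redacted>.1.azuredatabricks.net:443/default;"
                + "transportMode=http;ssl=1;httpPath=sql/protocolv1/o/<redacted>/<redacted>;"
                + "AuthMech=3;UID=token;PWD=<redacted>;EnableArrow=0";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("select color from diamonds limit 5")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}

If the probe works where DBeaver fails, the suspect is the local Java 18 runtime: Arrow's buffer handling needs reflective access that Java 16+ blocks by default (the same reason the cluster config above sets -Dio.netty.tryReflectionSetAccessible=true), so such flags would have to reach the JVM that runs DBeaver, not just the cluster.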

Related

Error when opening GeoMondrian schema file and using MDX query in GeoMondrian Workbench

I'm having trouble using the GeoMondrian Workbench on my Ubuntu 18.04 LTS system. I've followed the installation instructions and have installed the following:
Oracle Java 8
PostgreSQL 9.5
PostGIS 2.5
I've downloaded the GeoMondrian Workbench and the "simple_geofoodmart.sql" file, created a database, and passed the necessary parameters to the workbench.
However, when I try to open the "simple_geofoodmart.xml" schema file, I get the following error:
Error: Schema file /home/tarik/workbench/demo/simple_geofoodmart.xml is invalid. org/opengis/referencing/NoSuchAuthorityCodeException java.lang.NoClassDefFoundError: org/opengis/referencing/NoSuchAuthorityCodeException
Additionally, when I try to use an MDX query, I get the following error:
"Exception in thread 'AWT-EventQueue-0' java.lang.NoClassDefFoundError: Could not initialize class mondrian.olap.fun.GlobalFunTable at mondrian.rolap.RolapSchema$RolapSchemaFunctionTable.defineFunctions(RolapSchema.java:1643)"
I've tried to resolve all the dependencies and have made sure that I have the necessary JAR files in my classpath, but I'm still getting the same errors. Can anyone help me figure out what's going wrong and how to fix it? Thank you!
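The first error names org/opengis/referencing/NoSuchAuthorityCodeException, a class that ships in the GeoAPI jar that GeoTools (and therefore GeoMondrian) depends on, so the first thing to verify is that this jar is visible to the JVM launching the workbench. A minimal check, assuming it is compiled and run with the same classpath the workbench script builds:

public class GeoApiCheck {
    public static void main(String[] args) {
        try {
            // The exact class named in the workbench error; it is part of
            // the GeoAPI (org.opengis.*) jar, not of GeoMondrian itself.
            Class.forName("org.opengis.referencing.NoSuchAuthorityCodeException");
            System.out.println("GeoAPI is on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println("GeoAPI is missing: add the geoapi jar to the classpath");
        }
    }
}

The second error ("Could not initialize class mondrian.olap.fun.GlobalFunTable") is typically a follow-on failure after an earlier class-loading problem, so resolving the classpath issue usually has to come first.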

JDBC connection fails to SAP Hana Cloud: "RTEException: -708 Receive of connect failed."

Similar to https://answers.sap.com/questions/12675326/sap-dbtech-jdbc-708-receive-of-connect-fail.html
Connection to SAP Hana Cloud fails with the error "RTEException: -708 Receive of connect failed."
Steps to reproduce:
java -version
OpenJDK Runtime Environment (build 11.0.15+10-Ubuntu-0ubuntu0.20.04.1)
java -jar ngdbc.jar -V
package package com.sap.db.jdbc, Java Platform API Specification, version 1.4, SAP HANA JDBC Driver, SAP SE, 1.120.05-8c23c50e159e9883edab0e2ebdd4e02c5919cd08
java -jar ngdbc.jar -u DBADMIN,PASSWORD -n BIG-IDENTIFIER.hana.trial-us10.hanacloud.ondemand.com:443 -d test -o encrypt=true -o validatecertificate=false
(The certificate is imported, and the error occurs even without the encrypt and validatecertificate parameters.)
Contents of trace log:
ClassLoader: jdk.internal.loader.ClassLoaders$AppClassLoader#55054057
Process-ID: 320850
package package com.sap.db.jdbc, Java Platform API Specification, version 1.4, SAP HANA JDBC Driver, SAP SE, 1.120.05-8c23c50e159e9883edab0e2ebdd4e02c5919cd08 on Java 11.0.15
---- Thread 1eb44e46 main Timestamp: 2022-09-29 09:50:03.162
new Connection 'jdbc:sap://BIG-IDENTIFIER.hana.trial-us10.hanacloud.ondemand.com:443'
password=****
databaseName=test
host=BIG-IDENTIFIER....
options=
cmd=Select top 1 42 as "connect test" fro...
user=DBADMIN
HOSTLIST: [BIG-IDENTIFIER.hana.trial-us10.hanacloud.ondemand.com:443,]
new RTEException: -708 Receive of connect failed.
whereAmIjava.lang.Throwable
at com.sap.db.util.Tracer.whereAmI(Tracer.java:280)
at com.sap.db.rte.comm.RTEException.<init>(RTEException.java:51)
at com.sap.db.rte.comm.BasicSocketComm.receiveInfoRequest(BasicSocketComm.java:587)
at com.sap.db.rte.comm.BasicSocketComm.doInfoRequest(BasicSocketComm.java:84)
at com.sap.db.rte.comm.BasicSocketComm.connectDB(BasicSocketComm.java:187)
at com.sap.db.rte.comm.SocketComm$1.open(SocketComm.java:47)
at com.sap.db.jdbc.topology.Topology.getSession(Topology.java:88)
at com.sap.db.jdbc.Driver.openByURL(Driver.java:1216)
at com.sap.db.jdbc.Driver.connect(Driver.java:313)
at com.sap.db.jdbc.Driver.main(Driver.java:858)
using null
=> FAILED
Any thoughts on why this happens and how to solve it?
Since this is HANA Cloud Trial, I don't think you should use -d to specify a tenant database name. Please try removing this parameter, as you may be trying to connect to a non-existing tenant database. You can also refer to this tutorial.
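For example, the original call with only the -d option removed:
java -jar ngdbc.jar -u DBADMIN,PASSWORD -n BIG-IDENTIFIER.hana.trial-us10.hanacloud.ondemand.com:443 -o encrypt=true -o validatecertificate=false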
The second thing to consider is the allowlist for IP addresses. By default, connections are blocked if you have not added your client IP to the allowlist. More details can be found in this blog.
Last but not least, please check that you are using the latest version of ngdbc.jar. It can be obtained from this site.

How to create external table to DynamoDB on Hive 3.1

I'd like to know if it's possible to have an external table pointing to a DynamoDB table on AWS using Hive.
I'm not using AWS EMR; what I'm using is a Hadoop stack configured through Apache Ambari.
Hive version: Hive 3.1.0.3.1.4.0-315
What I did was:
Downloaded the EMR DynamoDB-Hive connector JARs directly from the Maven repository: https://mvnrepository.com/artifact/com.amazon.emr
I loaded all the JARs via hive.aux.jars.path:
emr-dynamodb-hadoop-4.12.0.jar
emr-dynamodb-hive-4.12.0.jar
emr-dynamodb-tools-4.12.0.jar
hive1.2-shims-4.12.0.jar
hive1-shims-4.12.0.jar
hive2-shims-4.12.0.jar
hive2-shims-4.15.0.jar
shims-common-4.12.0.jar
shims-loader-4.12.0.jar
But when I try to create the table with:
CREATE EXTERNAL TABLE dynamo_LabDynamoHive
(id double, nome string)
STORED BY 'org.apache.hadoop.hive.dynamodb.DynamoDBStorageHandler'
TBLPROPERTIES (
"dynamodb.table.name" = "LabDynamoHive",
"dynamodb.column.mapping" = "id:id,nome:nome"
);
I get the following error:
INFO : Starting task [Stage-0:DDL] in serial mode
ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: Shim class for Hive version 3.1.1000 does not exist
INFO : Completed executing command(queryId=hive_20200422142624_6ebabdc8-8942-4025-84a8-411505d20895); Time taken: 0.203 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: Shim class for Hive version 3.1.1000 does not exist (state=08S01,code=1)
I know I'm not loading a shims JAR for Hive 3, but I'd like to know if any of you have tried and succeeded in using an external table with DynamoDB on Hive 3 outside of EMR.
Any help or directions would be greatly appreciated!
The problem is apparently that the source code of this EMR connector is somewhat outdated and lacks the Hive 3.x support recently introduced by AWS for EMR 6.0.
However, you can find a working Hive 3.1 implementation here, forked from the official EMR connector: https://github.com/ramsesrm/emr-dynamodb-connector
Installation steps are as follows:
1. Compile the mentioned code (mvn clean package).
2. Install the 3 JARs in your hive.aux.jars.path, along with the aws-java-sdk-core and aws-java-sdk-dynamodb JARs from AWS (the shim JARs are not required), 5 in total.
That's it. Don't forget to specify the region in TBLPROPERTIES if you're not using the default US one, as shown below.
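For example, the DDL from the question with the region added; "dynamodb.region" is the property name used by the EMR connector, and the region value below is only an illustration:
CREATE EXTERNAL TABLE dynamo_LabDynamoHive
(id double, nome string)
STORED BY 'org.apache.hadoop.hive.dynamodb.DynamoDBStorageHandler'
TBLPROPERTIES (
"dynamodb.table.name" = "LabDynamoHive",
"dynamodb.column.mapping" = "id:id,nome:nome",
"dynamodb.region" = "sa-east-1"
);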

net.sf.jasperreports.engine.JRException: No deserializer defined

I am trying to connect HBase with jasperreports-server-cp-6.0.1. I have Hadoop 2.5.2 and HBase 1.0.1 installed on my system.
I have installed HBasePlugin-0.5.1.nbm plugin in iReport 5.6.0.
I have followed all the steps given in: http://community.jaspersoft.com/wiki/hadoop-hbase
When I write the following Query:
{ "tableName" : "blogposts", "deserializerClass" : "com.jaspersoft.hbase.deserialize.impl.ShellDeserializer" }
In iReport, I am getting the following error:
Message:
net.sf.jasperreports.engine.JRException: No deserializer defined
Level:
SEVERE
Stack Trace:
No deserializer defined
com.jaspersoft.hadoop.hbase.query.HBaseQueryWrapper.<init>(HBaseQueryWrapper.java:152)
com.jaspersoft.hadoop.hbase.HBaseFieldsProvider.getFields(HBaseFieldsProvider.java:50)
com.jaspersoft.ireport.hbase.designer.HBaseFieldsProvider.getFields(HBaseFieldsProvider.java:57)
com.jaspersoft.ireport.hbase.connection.HBaseConnection.readFields(HBaseConnection.java:185)
com.jaspersoft.ireport.designer.wizards.ConnectionSelectionWizardPanel.validate(ConnectionSelectionWizardPanel.java:146)
org.openide.WizardDescriptor$7.run(WizardDescriptor.java:1357)
org.openide.util.RequestProcessor$Task.run(RequestProcessor.java:572)
org.openide.util.RequestProcessor$Processor.run(RequestProcessor.java:997)
Could you please help me with this error? (I also tried iReport 4.0.2, but I received the same error.)
Both iReport and the HBase connector are outdated.
Try using the Apache Phoenix JDBC driver, which is compatible with the latest release (6.2) of the Jaspersoft products:
http://community.jaspersoft.com/wiki/how-use-apache-phoenix-jdbc-driver-run-reports-hbase
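A minimal connection sketch, assuming the Phoenix client jar is on the classpath and a ZooKeeper quorum at zk-host:2181 (both placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class PhoenixProbe {
    public static void main(String[] args) throws Exception {
        // Phoenix URL format: jdbc:phoenix:<zookeeper quorum>:<port>:<hbase znode>
        String url = "jdbc:phoenix:zk-host:2181:/hbase";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             // SYSTEM.CATALOG is Phoenix's own metadata table, so it is
             // a safe target for a smoke test on any Phoenix cluster.
             ResultSet rs = stmt.executeQuery(
                     "SELECT TABLE_NAME FROM SYSTEM.CATALOG LIMIT 10")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}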
Thanks!

Worklight 6.2 migration tool error

I am working on migrating data from WL 5.0.6.2 to 6.2. While doing that, I encountered problems running the data migration tool.
Background:
Approach:
We export the WRKLGHT table from the 5.0.6.2 DB to an intermediate DB for the data migration, so the running DB is not affected.
DBs:
1. schema: proj
It stores 5.0.6.2 runtime data; the upgrade-worklight-506-60-oracle.sql and upgrade-worklight-60-61-oracle.sql upgrade scripts have been executed.
2. schema: proj6
It is a blank DB at first; "create-worklightadmin-oracle.sql" is executed to prepare the WLADMIN tables.
Existing WL projects:
1. wlProj
It is the app running on the 5.0.6.2 WL server.
2. wlProj6
It is the target project running on 6.2 WL; data will be migrated to be used by this project.
Steps:
1. recreate proj6 schema
1.1 SQLPlus -> connect as SYSTEM -> "drop user proj6 cascade"
1.2. -> "create user proj6 identified by {password}" -> "grant all privileges to proj6"
2. execute "create-worklightadmin-oracle.sql" on proj6 schema
2.1 SQLDeveloper -> connect as user proj6
2.2 run @"{WL server install dir}/databases/create-worklightadmin-oracle.sql";
2.3 commit
3. Open cmd and run the following command
java -cp ojdbc6.jar;worklight-ant-deployer.jar com.ibm.worklight.config.dbmigration62.MigrationTool -p /wlProj -sourceurl jdbc:oracle:thin:@//192.168.0.**:1521/xe -sourcedriver oracle.jdbc.driver.OracleDriver -sourceuser proj -sourcepassword *** -targeturl jdbc:oracle:thin:@//192.168.0.***:1521/xe -targetdriver oracle.jdbc.driver.OracleDriver -targetuser proj6 -targetpassword *** 2>out.txt
'migrating wlProj-{ios,android}' and 'migrating access data record for wlProj-.....' messages are shown in cmd.
In out.txt, the following error is shown:
15 WorklightManagementPU-oracle INFO [main] openjpa.Runtime - Starting OpenJPA 1.2.2
15 WorklightManagementPU-oracle INFO [main] openjpa.jdbc.JDBC - Using dictionary class "org.apache.openjpa.jdbc.sql.OracleDictionary".
com.ibm.worklight.config.dbmigration62.exceptions.MigrationException: FWLSE3406E: The applications migration failed with error The field "description" of instance "ApplicationEntity[id=51, name=wlProj, displayName=, description=null, thumbnail=null, platformVersion=null]" contained a null value; the metadata for this field specifies that nulls are illegal..
at com.ibm.worklight.config.dbmigration62.MigrationTool.run(MigrationTool.java:195)
at com.ibm.worklight.config.dbmigration62.MigrationTool.main(MigrationTool.java:128)
Caused by: <openjpa-1.2.2-r422266:898935 fatal user error> org.apache.openjpa.persistence.InvalidStateException: The field "description" of instance "ApplicationEntity[id=51, name=wlProj, displayName=, description=null, thumbnail=null, platformVersion=null]" contained a null value; the metadata for this field specifies that nulls are illegal.
at org.apache.openjpa.kernel.SingleFieldManager.preFlush(SingleFieldManager.java:540)
at org.apache.openjpa.kernel.SingleFieldManager.preFlush(SingleFieldManager.java:478)
at org.apache.openjpa.kernel.StateManagerImpl.preFlush(StateManagerImpl.java:2829)
at org.apache.openjpa.kernel.PNewState.beforeFlush(PNewState.java:39)
at org.apache.openjpa.kernel.StateManagerImpl.beforeFlush(StateManagerImpl.java:960)
at org.apache.openjpa.kernel.BrokerImpl.flush(BrokerImpl.java:1967)
at org.apache.openjpa.kernel.BrokerImpl.flushSafe(BrokerImpl.java:1927)
at org.apache.openjpa.kernel.BrokerImpl.beforeCompletion(BrokerImpl.java:1845)
at org.apache.openjpa.kernel.LocalManagedRuntime.commit(LocalManagedRuntime.java:81)
at org.apache.openjpa.kernel.BrokerImpl.commit(BrokerImpl.java:1369)
at org.apache.openjpa.kernel.DelegatingBroker.commit(DelegatingBroker.java:877)
at org.apache.openjpa.persistence.EntityManagerImpl.commit(EntityManagerImpl.java:512)
at com.ibm.worklight.config.dbmigration62.ApplicationMigration.migrateApplication(ApplicationMigration.java:138)
at com.ibm.worklight.config.dbmigration62.ApplicationMigration.migrate(ApplicationMigration.java:63)
at com.ibm.worklight.config.dbmigration62.AbstractMigration.run(AbstractMigration.java:66)
at com.ibm.worklight.config.dbmigration62.MigrationTool.run(MigrationTool.java:183)
... 1 more
ERROR 20 : FWLSE3406E: The applications migration failed with error The field "description" of instance "ApplicationEntity[id=51, name=wlProj, displayName=, description=null, thumbnail=null, platformVersion=null]" contained a null value; the metadata for this field specifies that nulls are illegal..
This is recorded as IBM PMR #77138,999,738 and was identified as a defect that is currently being worked on.
There is no local fix other than to wait for a fix via the support ticket.
