I've installed the ODBC driver from http://hortonworks.com/hdp/addons/
and configured it to use HiveServer2 on my HDP installation.
I'm using MicroStrategy Analytics Desktop to run queries. It works fine until I try to use server-side properties.
I've configured ODBC Data Source Administrator / System DSN / Hortonworks Hive ODBC Driver Setup / Advanced Options / Server Side Properties as follows:
SSP_mapred.job.queue.name = pr
SSP_tez.queue.name = pr
But in the Applications view on HDP I can see that MSTR is using the 'default' queue instead of pr.
What am I doing wrong? In the installation guide for the Hortonworks driver (as well as for the Simba driver) the property is called SSP_mapred.queue.names=myQueue, but that doesn't work either.
Is there any place I can see the log of this connection and check whether the properties are sent to the server at all?
Regards,
Pawel
I had been using the Databricks JDBC driver version 2.6.22 and tried to upgrade to 2.6.27. However, after upgrading I get messages saying my JDBC URLs are invalid when trying to connect. These JDBC URLs work fine with the old version of the driver and I pull them directly from the Databricks SQL endpoint info, so I expect something else is going on.
Example JDBC URL:
jdbc:spark://[workspace domain]:443/default;transportMode=http;ssl=1;AuthMech=3;httpPath=/sql/1.0/endpoints/[identifier]
I noticed between versions the name went from SimbaSparkJDBC42-2.6.22.1040 to DatabricksJDBC42-2.6.27.1048 and the JAR class name went from com.simba.spark.jdbc.Driver to com.databricks.client.jdbc.Driver. Does dropping Simba mean there was a more major change? Do I need to correct my JDBC URLs somehow?
I'm downloading my driver from here
I'm using DBeaver as my SQL client if that makes a difference.
JDBC URLs for the new Databricks driver start with jdbc:databricks: instead of jdbc:spark:. As of now, the JDBC URL details in the UI still use the old format; just replace spark with databricks and they should work. Mentioned here.
Databricks has a different URL format; check the documentation here.
Basically, in the URL replace spark with databricks and add the PWD parameter.
jdbc:databricks://[workspace domain]:443/default;transportMode=http;ssl=1;AuthMech=3;httpPath=/sql/1.0/endpoints/[identifier];PWD=[your_access_token]
PWD is the personal access token. Instructions to get access token.
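For reference, a minimal sketch of connecting with the renamed driver from plain JDBC; the workspace domain, HTTP path and token are placeholders, and the UID=token convention is an assumption for AuthMech=3 with a personal access token:

import java.sql.Connection;
import java.sql.DriverManager;

public class DatabricksJdbcExample {
    public static void main(String[] args) throws Exception {
        // New driver class name (was com.simba.spark.jdbc.Driver in 2.6.22)
        Class.forName("com.databricks.client.jdbc.Driver");

        // Placeholder workspace domain, endpoint path and personal access token
        String url = "jdbc:databricks://my-workspace.cloud.databricks.com:443/default;"
                + "transportMode=http;ssl=1;AuthMech=3;"
                + "httpPath=/sql/1.0/endpoints/abc123;"
                + "UID=token;PWD=<personal-access-token>";

        try (Connection conn = DriverManager.getConnection(url)) {
            System.out.println("Connected to " + conn.getMetaData().getDatabaseProductName());
        }
    }
}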
I'm trying to figure out how to get Tableau to recognize my JDBC driver when creating a custom JDBC connector using the Tableau Connector SDK. Currently, when Tableau loads my custom connector I can tell the driver is not found because a link shows up in the connector that says:
"Download and install the drivers, and then connect."
Tableau online does not have the drivers for the SAS data I want to connect to, so I cannot download the drivers to install from Tableau.
Luckily, I already have the .jar files and have placed them at C:\Program Files\Tableau\Drivers on Windows.
In the Connector API Reference, "driver-resolver" states: "This is mainly used for ODBC connections but can be used for JDBC as well", but I do not see specific instructions for working with JDBC drivers explicitly, except for the Postgres JDBC example, which DOES NOT use a driver-resolver. I also have not seen a JDBC example in the Resolvers GitHub samples.
In the Tableau logs I can see that the .jar files containing my driver are recognized; searching for their names shows that they are listed in the logs.
Logs Environment Section Excerpt:
"environment","v":{...all my jar files listed here}
[Screenshot: the driver is not recognized and the connector shows the prompt to download the drivers]
Finally, here is what my .tdr file looks like with my current driver-resolver definition, which does not work:
<tdr class='sas_jdbc'>
  <connection-resolver>
    <connection-builder>
      <script file="connectionBuilder.js" />
    </connection-builder>
    <connection-properties>
      <script file="connectionProperties.js" />
    </connection-properties>
  </connection-resolver>
  <driver-resolver>
    <driver-match>
      <driver-name type='exact'>com.sas.rio.MVADriver</driver-name>
    </driver-match>
  </driver-resolver>
</tdr>
Can anyone shed some light on this for me? I feel like I'm close. An example using a JDBC driver-resolver in a .tdr file would be nice.
You don't actually need a .tdr file with JDBC, as all the driver resolution happens in the connectionBuilder.js file today. The URL of the connection there includes the driver name. I added a story to our backlog to make this more clear. I should also mention that hopefully you are using 2019.4 or higher for the best experience. Thanks for using the SDK!
The answer to this is that driver-resolver is not used in JDBC custom connector definitions. The problem I'm experiencing with getting the SAS JDBC driver to work is that the SAS JDBC driver is a Type 2 driver, and JDBC drivers need to be Type 4 to work with a Tableau custom connector built with the Tableau Connector SDK.
The resolution is to use a Type 4 JDBC driver, which I have not seen from SAS yet.
I am trying to set up a temporary unit-test database that has DB2-style syntax. I know Derby fills this role quite nicely in our Java applications.
I have done much searching, and I have seen that you can use the DB2 JDBC driver to connect to Derby, which is cool, but the same doesn't seem to hold for the DB2 ODBC (or OLE DB) driver connecting to Derby.
I also saw that Cloudscape had a version, but following the download instructions leads to a redirect page that states:
There is no replacement for the old Cloudscape ODBC driver in IBM's world. Does anyone know of another source where I can get it? Or another way of connecting to Derby from VB6 (or of creating an ADODB.Connection in VB.NET to Derby)?
I suppose I would settle for an old version of Derby that you can connect to with (an old) DB2 ODBC driver.
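As an aside, here is a minimal sketch of the kind of throwaway Derby unit-test database mentioned above, using the embedded driver with an in-memory database (the database and table names are purely illustrative):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class DerbyUnitTestDb {
    public static void main(String[] args) throws Exception {
        // create=true builds a fresh in-memory database that disappears when the JVM exits
        String url = "jdbc:derby:memory:testdb;create=true";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement()) {
            stmt.executeUpdate("CREATE TABLE customers (id INT PRIMARY KEY, name VARCHAR(50))");
            stmt.executeUpdate("INSERT INTO customers VALUES (1, 'example')");
        }
    }
}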
In Talend 5.x, I was able to use the generic ODBC connection to connect to an ODBC source (QuickBooks QODBC) and could read and extract data from QuickBooks fine.
I see that Talend 6 no longer has the ability to connect to a generic ODBC source. Can someone suggest an example, workaround, or alternative for connecting to a Windows ODBC source? I see the JDBC connection; is there an example somewhere so I can check whether it will do the same thing?
Thanks in advance,
HL
ODBC support was removed in Talend 6.0.
Presumably, you could roll back to Talend 5.x and Java 1.7, or look in the Talend Exchange third-party components for an ODBC component.
https://www.talendforge.org/forum/viewtopic.php?id=46670
In Talend 6.x you can use tJDBCConnection and the other components that start with tJDBC to make a connection to ODBC. It's a built-in Java driver for ODBC.
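For illustration, the underlying connection such a component would make to a Windows DSN looks roughly like this in plain Java, assuming it relies on the JDBC-ODBC bridge (sun.jdbc.odbc.JdbcOdbcDriver); note that the bridge was removed in Java 8, so this only runs on Java 7 or earlier, and the DSN and table names are hypothetical:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class OdbcBridgeExample {
    public static void main(String[] args) throws Exception {
        // JDBC-ODBC bridge driver; shipped with Java 7 and earlier only
        Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");

        // "QuickBooksDSN" is a hypothetical System DSN defined in the Windows
        // ODBC Data Source Administrator; the table name is illustrative
        try (Connection conn = DriverManager.getConnection("jdbc:odbc:QuickBooksDSN");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT * FROM Customer")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}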
I would like to understand the 'Hive ODBC Connector' concept, i.e., what is the use of the Hive ODBC Connector in the architecture?
Does it require setting up a DSN (Data Source Name)? Can we go for a DSN-less configuration?
Please explain in detail.
If you have one of the distributions from Cloudera, Hortonworks, MapR, Intel, Microsoft or DataStax, they already come with an ODBC driver in the distribution. The driver is created by Simba Technologies (http://www.simba.com/connectors/apache-hadoop-hive-odbc).
If you're using the Apache version of Hadoop, you can still trial the version of the ODBC driver at the above link for 30 days; however, you will need to pay for it to continue use.
I only mention the above because this ODBC driver is a more complete implementation of the ODBC specification than the open-source one, and it can also do SQL-to-HiveQL translation, which essentially means that you can plug it into Excel or Tableau or the like and have them issue standard SQL. As mc110 mentioned, you can make DSN or DSN-less connections, and there is also a Windows configuration dialog available should you wish to use that.
Also, in the interests of full disclosure, I work for Simba Technologies.
As explained at https://cwiki.apache.org/confluence/display/Hive/HiveODBC, the Hive ODBC connector implements the ODBC API for Hive, potentially allowing a lot of existing well-written ODBC applications to seamlessly use Hive as they would any other database. The link also explains what API calls are and are not supported.
SQLDriverConnect is supported, which implies you can make DSN-less connections. I suggest you read the linked page for more information. Also, http://www.cloudera.com/content/cloudera-content/cloudera-docs/Connectors/PDF/Cloudera-ODBC-Driver-for-Apache-Hive-Install-Guide.pdf has a section on configuring DSN-less authentication.