I have an Amazon Redshift database that supports connecting a PostgreSQL client via JDBC.
Google Apps Script supports connecting to a database with JDBC, but only with the MySQL, MS SQL Server, and Oracle protocols, not PostgreSQL. If I try, I unsurprisingly get the error:
'Connection URL uses an unsupported JDBC protocol.'
Looking at some Google forums, this has been an issue for several years with no response from Google.
Is there any workaround?
Thanks
I use Kloudio, a Google Sheets extension, to get this done. I can run and schedule my Redshift queries in Kloudio.
If you are using Amazon Redshift, you can connect to it through an Amazon Redshift client.
Here are the steps:
Write the SQL query in the Redshift client and save it as a report.
Use the generated API key and the report number in the embedded link.
Use the IMPORTDATA function in the Google spreadsheet to import the data automatically; by default it refreshes every hour. (See the formula sketch after these steps.)
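In the spreadsheet cell, that last step looks roughly like the line below. The exact URL is whatever embedded link the report gives you; the placeholder here is hypothetical:

=IMPORTDATA("<embedded report link containing the api key and report no.>")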
Thanks
Related
I'm trying to read Oracle Database data on the Azure Databricks platform.
Can someone share the step-by-step process for connecting Oracle data to Databricks? I've probably searched the whole internet and read the documentation, but I can't find a solution that actually works. Not sure if it's because I have the incorrect driver or what.
Here's my process:
Uploaded the ojdbc8.jar file to the cluster libraries (from the Instant Client 19)
Tried to connect to the data from a Databricks notebook, and it didn't work
Can anyone share their process?
Which jar should I upload to the library, and where can I find this file?
How do I connect? Sample code?
Any better way to do this?
To install the library, use:
pip install cx_Oracle
Then use the code snippet below to read data from an Oracle database:
CREATE TABLE oracle_table
USING org.apache.spark.sql.jdbc
OPTIONS (
  dbtable 'table_name',
  driver 'oracle.jdbc.driver.OracleDriver',
  user 'username',
  password 'password',
  url 'jdbc:oracle:thin:@//<hostname>:1521/<db>'
)
To read data from an Oracle database in PySpark, you can follow this article - Reading Data From Oracle Database With Apache Spark.
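As a minimal PySpark sketch of the same JDBC read (the hostname, service name, table, and credentials are placeholders, and spark is the session a Databricks notebook already provides):

# Read an Oracle table over JDBC into a Spark DataFrame.
# <hostname>, <db>, table_name, username, and password are placeholders.
df = (spark.read
      .format("jdbc")
      .option("url", "jdbc:oracle:thin:@//<hostname>:1521/<db>")
      .option("driver", "oracle.jdbc.driver.OracleDriver")
      .option("dbtable", "table_name")
      .option("user", "username")
      .option("password", "password")
      .load())
df.show()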
For more information, refer to Oracle | Databricks.
I am trying to get my Oracle DB content into GCP - BigQuery specifically - and then keep both in sync. I have not been able to find a standard way of doing this using GCP tools, without using third-party software. Has anyone tried this? Any recommendations?
I would like to know if it is possible in Power BI Web to retrieve data from an Oracle database hosted in Oracle Cloud, or from any other database/cloud combination other than Azure SQL Server.
Thanks!
Yes, you can definitely achieve this.
There are connectors available for connecting Power BI Desktop to an Oracle database.
Link
Link 2
I was wondering how I can deploy a JDBC connector in Google BigQuery / Google Cloud.
Or is it even possible?
So far, I haven't found any information on the internet or in the Google BigQuery documentation.
Feel free to share any links.
Thank you very much in advance!
Assuming you're really asking "how to connect to BigQuery using JDBC", you can download the official BigQuery JDBC drivers from Google's site:
https://cloud.google.com/bigquery/providers/simba-drivers/
You'll then need to deploy that driver wherever your application is running, e.g. on a GCP Compute Engine instance.
You cannot deploy anything "in" BigQuery itself, as it's a managed service.
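As an illustrative sketch only, here is how you might use that driver from Python via the jaydebeapi package. The driver class name, JDBC URL properties, and file paths below are assumptions modeled on the Simba driver's conventions; verify them against the driver's install guide:

# Hypothetical sketch: query BigQuery through the Simba JDBC driver from Python.
# Class name, URL properties, and paths are assumptions; check the driver docs.
import jaydebeapi

conn = jaydebeapi.connect(
    "com.simba.googlebigquery.jdbc42.Driver",  # assumed driver class name
    ("jdbc:bigquery://https://www.googleapis.com/bigquery/v2:443;"
     "ProjectId=my-project;"                   # assumed property names
     "OAuthType=0;"
     "OAuthServiceAcctEmail=svc@my-project.iam.gserviceaccount.com;"
     "OAuthPvtKeyPath=/path/to/key.json"),
    jars="/path/to/GoogleBigQueryJDBC42.jar")  # assumed jar file name
cursor = conn.cursor()
cursor.execute("SELECT 1")
print(cursor.fetchall())
conn.close()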
GAE doesn't support JDBC, but is there a way to use it anyway? Is there a way to connect to an external Oracle DB to store structured information using the RDBMS pattern? Is there a wrapper or a runtime lib that makes it possible to connect to an external RDBMS using JDBC?
I don't want to use GAE's in-house MySQL cloud hack.
You cannot open sockets on GAE; that's why you cannot open a JDBC connection to any outside RDBMS.
BTW, today they launched a trusted tester program for sockets on GAE:
http://googleappengine.blogspot.com.br/2012/09/app-engine-172-released.html
But I believe this is not the use case they are trying to address.