Oracle NoSQL Cloud Service - Is it possible to do a connection using instance-principal instead of creating config files? - oracle-nosql

I am using Oracle NoSQL Cloud Service on OCI and I want to write a program using the Oracle NoSQL Database Python SDK.
I did a test using the OCI SDK, where I used instance-principal IAM authentication instead of creating config files with the tenancy/user OCIDs and API private keys on the nodes that invoke the NoSQL API calls.
Is it possible to make the connection using an instance principal, instead of creating config files with the tenancy/user OCIDs and API private keys, with the Oracle NoSQL Database Python SDK?
I read the examples provided in the documentation at https://github.com/oracle/nosql-python-sdk but I cannot find information about instance-principal support.

The Oracle NoSQL Database Python SDK works with instance principals and resource principals. See the documentation: https://nosql-python-sdk.readthedocs.io/en/stable/api/borneo.iam.SignatureProvider.html
Here is an example using resource principals with Oracle Functions:
import os
import borneo
import borneo.iam

def get_handle():
    provider = borneo.iam.SignatureProvider.create_with_resource_principal()
    # The compartment OCID is available as a claim of the resource principal
    compartment_id = provider.get_resource_principal_claim(
        borneo.ResourcePrincipalClaimKeys.COMPARTMENT_ID_CLAIM_KEY)
    config = borneo.NoSQLHandleConfig(os.getenv('NOSQL_REGION'), provider) \
        .set_logger(None).set_default_compartment(compartment_id)
    return borneo.NoSQLHandle(config)
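The instance-principal case from the question works the same way, just with a different factory method. Here is a minimal sketch (not from the original answer), assuming the code runs on an OCI compute instance whose dynamic group has the required NoSQL policies; the NOSQL_REGION and NOSQL_COMPARTMENT_OCID environment variable names are illustrative, not part of the SDK:

import os
import borneo
import borneo.iam

def get_handle_with_instance_principal():
    # Credentials come from the instance principal; no config file,
    # user OCID or API private key is stored on the node.
    provider = borneo.iam.SignatureProvider.create_with_instance_principal()
    config = borneo.NoSQLHandleConfig(os.getenv('NOSQL_REGION'), provider) \
        .set_default_compartment(os.getenv('NOSQL_COMPARTMENT_OCID'))
    return borneo.NoSQLHandle(config)

Unlike the resource-principal example there is no compartment claim to read, so the default compartment is supplied explicitly here.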

Related

Synchronise Oracle database with Google Cloud

I am trying to get my Oracle DB content into GCP - BigQuery specifically - and then keep both in sync. I have not been able to find a standard way of doing this using GCP tools without using third-party software. Has anyone tried this? Any recommendations?

Look for best approach replicating oracle table in S3 bucket

My problem:
I need a data pipeline created from my organization’s Oracle DB (Oracle Cloud Infrastructure) to an AWS S3 bucket. Ideally, I would love for there to be some mechanism for Oracle to push new data to an S3 bucket as it is added to the database (in whatever format).
Question:
Is this possible with Oracle natively, specifically Oracle Cloud Infrastructure?
Or is there a better solution you have seen?
Note:
I have seen that AWS has the DataSync product; it seems like it could help with this problem, however I am not sure if it is suitable for this specific case.
An S3 bucket is object storage; it can only hold complete files. You cannot open and update an existing file like you would in a normal file system, even just to add new rows. You will need to construct your whole file outside of Oracle and then push it to S3 with some other mechanism.
You may want to consider the following steps:
Export your data from Oracle Cloud into Oracle Object Storage (similar to S3) using the Oracle Cloud's integration with their object storage. (https://blogs.oracle.com/datawarehousing/the-simplest-guide-to-exporting-data-from-autonomous-database-directly-to-object-storage)
THEN:
Let the customer access the Oracle Object Store as they normally would access S3, using Oracle's Amazon S3 Compatibility API. (https://docs.oracle.com/en-us/iaas/Content/Object/Tasks/s3compatibleapi.htm)
OR:
Use an externally driven script to download the data - either from Oracle Object Store or directly from the database - to a server, then push the file up to Amazon S3. The server could be local, or hosted in either Oracle OCI or in AWS, as long as it has access to both object stores; a sketch of this approach follows after these options. (https://blogs.oracle.com/linux/using-rclone-to-copy-data-in-and-out-of-oracle-cloud-object-storage)
OR:
You may be able to use AWS Data Sync to move data directly from Oracle Object Storage to S3, depending on networking configuration requirements. (https://aws.amazon.com/blogs/aws/aws-datasync-adds-support-for-on-premises-object-storage/)
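To make the script option above concrete, here is a minimal Python sketch (not part of the original answer) that copies an exported file from OCI Object Storage to an AWS S3 bucket with boto3, reaching OCI through its Amazon S3 Compatibility API. The endpoint, bucket names, keys and object name are placeholders to replace with your own values:

import boto3

# OCI Object Storage reached through the Amazon S3 Compatibility API.
# Fill in your namespace and region, and authenticate with a Customer
# Secret Key created in the OCI console.
oci_store = boto3.client(
    's3',
    endpoint_url='https://<namespace>.compat.objectstorage.<region>.oraclecloud.com',
    aws_access_key_id='<oci-customer-secret-key-id>',
    aws_secret_access_key='<oci-customer-secret-key>')

# Plain AWS S3 client, using the usual AWS credentials chain.
aws_s3 = boto3.client('s3')

def copy_object(object_name):
    # Download the exported file from the OCI bucket...
    body = oci_store.get_object(Bucket='oracle-export-bucket', Key=object_name)['Body'].read()
    # ...and upload it to the target AWS bucket.
    aws_s3.put_object(Bucket='aws-target-bucket', Key=object_name, Body=body)

copy_object('orders_2020_01.csv')

For large exports you would stream the object or use a multipart upload instead of reading the whole file into memory.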

Create external data source in Azure Synapse Analytics (Azure SQL Data warehouse) to Oracle

I am trying to create external data source in Azure Synapse Analytics (Azure SQL Data warehouse) to Oracle external database. I am using the following code in SSMS to do that:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'myPassword';
CREATE DATABASE SCOPED CREDENTIAL MyCred WITH IDENTITY = 'myUserName', Secret = 'Mypassword';
CREATE EXTERNAL DATA SOURCE MyEXTSource
WITH (
LOCATION = 'oracle://<myIPAddress>:1521',
CREDENTIAL = MyCred
)
I am getting the following error:
CREATE EXTERNAL DATA SOURCE statement failed because the 'TYPE' option is not specified. Specify a value for the 'TYPE' option and try again.
I understand from the documentation below that TYPE is not a required option for Oracle databases.
https://learn.microsoft.com/en-us/sql/t-sql/statements/create-external-data-source-transact-sql?view=azure-sqldw-latest
I am not sure what the problem is here. Is this feature still not supported in Azure Synapse Analytics (Azure DW) when it is already available in SQL Server 2019? Any ideas are welcome.
PolyBase has different versions across the different products, with different capabilities, most of which are described in the documentation linked below.
The ability to connect to Oracle is only present in the SQL Server versions, currently SQL Server 2019. The documentation is quite clear that this only applies to SQL Server and not to Azure Synapse Analytics (formerly Azure SQL Data Warehouse):
https://learn.microsoft.com/en-us/sql/relational-databases/polybase/polybase-configure-oracle?view=sql-server-ver15
In summary, Azure Synapse Analytics and its version of PolyBase do not currently support access to external Oracle tables.

Deploying multi-tenant on Amazon RDS (individual DB for each tenant)

I am currently conceptualizing a cloud application (SaaS) and I am looking to create a DB for every user (tenant) that registers an account.
May I know if it is programmatically possible to create a new DB on Amazon RDS through an API or something similar?
Databases are created on RDS using standard SQL DDL statements -- not the API.
For example:
To create additional databases, connect to the DB instance and use the SQL command CREATE DATABASE.
http://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_CreateInstance.html
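As an illustration only (not from the original answer), the per-tenant database could be created from application code by connecting to the RDS endpoint and issuing that DDL. The host, credentials and driver (PyMySQL, assuming a MySQL-engine instance) below are placeholder choices:

import pymysql

# Connect to the RDS instance endpoint with a privileged (e.g. the master) user.
conn = pymysql.connect(host='mydb.abc123.us-east-1.rds.amazonaws.com',
                       user='admin',
                       password='<master-password>',
                       autocommit=True)
with conn.cursor() as cur:
    # One database per tenant, created with ordinary SQL DDL rather than the RDS API.
    cur.execute("CREATE DATABASE tenant_42")
conn.close()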

Using JDBC on Google App Engine (GAE)

GAE doesn't support JDBC, but is there a way to use it anyway? Is there a way to connect to an external Oracle DB to store structured information using the RDBMS pattern? Is there a wrapper or a runtime lib that makes it possible to connect to an external RDBMS using JDBC?
I don't want to use GAE's in-house MySQL cloud hack.
You cannot open sockets on GAE, which is why you cannot open a JDBC connection to any outside RDBMS.
BTW, today they launched a trusted tester program for sockets on GAE:
http://googleappengine.blogspot.com.br/2012/09/app-engine-172-released.html
But I believe this is not the use case they are trying to address.
