Load Oracle .dmp file from local machine to RDS Oracle

I have been searching around and haven't found an answer to this question.
Is there any way to import an Oracle .dmp file stored on a local machine into RDS Oracle?
If yes, how do I do it?
If not, why isn't it possible, given that other databases offer more than one way to do this kind of import?

You can't do that. When you import data with Oracle Data Pump, you must transfer the dump file that contains the data from the source database to the target database. You can transfer the dump file using an Amazon S3 bucket or by using a database link between the two databases.
If your local machine contains a database and you have a network connection between your on-premises database and your RDS Oracle instance, then you can use NETWORK_LINK, although I don't recommend it. It is much better to transfer the file using an S3 bucket.
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Oracle.Procedural.Importing.html
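For the S3 route, a minimal sketch of the transfer step, assuming the S3 integration option is enabled on the RDS instance's option group and the dump has already been uploaded to a bucket (the bucket name and file names below are placeholders):

-- Pull the dump file from S3 into the RDS instance's DATA_PUMP_DIR.
SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
         p_bucket_name    => 'my-bucket',
         p_directory_name => 'DATA_PUMP_DIR')
  AS task_id
  FROM dual;

-- Check the transfer log for the returned task id (replace <task_id>).
SELECT text
  FROM TABLE(rdsadmin.rds_file_util.read_text_file(
         'BDUMP', 'dbtask-<task_id>.log'));

Once the file is in DATA_PUMP_DIR, the import itself can be run with impdp from a client machine or with the DBMS_DATAPUMP package.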

Related

Bulk transfer data from Oracle to PostgreSQL

I have an instance of Oracle on a PROD VPC (in AWS) and a PostgreSQL instance on a DEV VPC (in AWS). I want to copy all tables from Oracle to PostgreSQL. As they are on different VPCs, I am currently using the process below:
Using Data Pump, export the tables from Oracle PROD and import them into Oracle DEV (on the DEV VPC).
Use Python to extract data from Oracle DEV in CSV format and load it into PostgreSQL using COPY.
Is there any more efficient way to do this copy? I am trying to copy all work schemas (DDL and data) from source to destination.
Thanks in advance!
You could use VPC peering to get rid of the intermediate copy. See What is VPC peering.
Next, the Oracle foreign data wrapper (oracle_fdw) is a marvel for this kind of work. It makes your Oracle tables accessible from the PostgreSQL database and makes copying data a piece of cake. A nice little Python script can always help if there are many tables to handle.
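A minimal sketch of that approach, assuming oracle_fdw is installed on the PostgreSQL side; the host, service, credentials and schema names below are placeholders:

-- On the PostgreSQL (DEV) database: expose the Oracle tables via oracle_fdw.
CREATE EXTENSION oracle_fdw;

CREATE SERVER oracle_prod FOREIGN DATA WRAPPER oracle_fdw
  OPTIONS (dbserver '//oracle-prod-host:1521/PRODSVC');

CREATE USER MAPPING FOR CURRENT_USER SERVER oracle_prod
  OPTIONS (user 'app_reader', password 'secret');

-- Map a whole Oracle schema as foreign tables in a local staging schema.
CREATE SCHEMA ora_stage;
IMPORT FOREIGN SCHEMA "APPSCHEMA" FROM SERVER oracle_prod INTO ora_stage;

-- Copy one table's data; repeat (or script) this per table.
CREATE TABLE customers AS SELECT * FROM ora_stage.customers;

The DDL still has to be created on the PostgreSQL side (or generated by a conversion tool), but the data copy itself becomes plain SQL.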

How to connect to an Oracle database from Snowflake?

I have to pull some data from Oracle and update the data in Snowflake. And of course, the data is 5 GB in size.
Is there any procedure to connect to an Oracle database from Snowflake? Or
do I need to connect them using a programming language such as Python?
You'll need to unload the data from Oracle and load it into Snowflake, as there are no "direct connect" options I've ever heard of.
I'd unload the tables to flat files (for example with a SQL*Plus/SQLcl spool script; SQL*Loader itself only loads data into Oracle), push the files to AWS S3 (or your cloud vendor's storage), and issue Snowflake COPY INTO TABLE commands. It should be fairly straightforward.
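A rough sketch of the Snowflake side of that, assuming the CSV files have already been pushed to an S3 bucket; the bucket path, credentials, stage and table names are placeholders:

-- Point an external stage at the bucket holding the unloaded CSV files.
CREATE OR REPLACE STAGE oracle_extract_stage
  URL = 's3://my-bucket/oracle-extracts/'
  CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...')
  FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1);

-- Load everything currently in the stage into the target table.
COPY INTO my_schema.my_table
  FROM @oracle_extract_stage
  ON_ERROR = 'ABORT_STATEMENT';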
There is no equivalent to Oracle database links in Snowflake. You would need an external process to move the data from Oracle to S3. Then you can configure a Snowpipe task to load from S3 into Snowflake. See Loading Continuously Using Snowpipe for more information.
I would suggest using Python to extract the data from Oracle and load it into Snowflake. Since your Oracle table is updated daily, write a Python program that dynamically generates a MERGE statement (an example is sketched below) to load your incremental data from Oracle into Snowflake.
Snowflake supports JavaScript-based stored procedures, so you can use a stored procedure to generate the MERGE statement dynamically by passing the table name as a parameter, and call it via Python.
The initial load from Oracle to Snowflake may take some time, as you have 5 GB of data in your source system.
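The generated statement would essentially be a Snowflake MERGE like the sketch below; the table, key and column names are made up for illustration:

MERGE INTO analytics.orders AS t
USING staging.orders_delta AS s
  ON t.order_id = s.order_id
WHEN MATCHED THEN UPDATE SET
  t.status     = s.status,
  t.updated_at = s.updated_at
WHEN NOT MATCHED THEN INSERT (order_id, status, updated_at)
  VALUES (s.order_id, s.status, s.updated_at);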

How to export using DATA_PUMP to S3 bucket?

We have an RDS (Oracle) instance, and I need to export a specific schema into a dump file. The export works and copies the dump file into DATA_PUMP_DIR. The issue is that RDS does not provide file system access to that directory.
I need the exported DMP file either on S3 or copied to another EC2 instance.
The article: LINK talks about copying a data dump file between two RDS instances, but not to S3 or EC2.
A third option, which I am using:
Take a look at the alexandria-plsql-utils project, and especially at the amazon_aws_auth_pkg, amazon_aws_s3_pkg and ftp_util_pkg packages.
Install the required packages and dependencies.
Do your dump, then with the example code below you can copy the file from Amazon RDS Oracle into an S3 bucket.
declare
  b_blob blob;
begin
  b_blob := file_util_pkg.get_blob_from_file('DATA_PUMP_DIR', 'my_dump.dmp');
  amazon_aws_auth_pkg.init('aws_key_id', 'aws_secret', p_gmt_offset => 0);
  amazon_aws_s3_pkg.new_object('my-bucket-name', 'my_dump.dmp', b_blob, 'application/octet-stream');
end;
There are several ways to solve this problem.
First option.
1. Install the free Oracle XE database on an EC2 instance (it is very easy and fast).
2. Export the schema from the RDS instance to the DATA_PUMP_DIR directory. Use the DBMS_DATAPUMP package, or run expdp user/pass@rds on the EC2 instance, to create the dump file.
3. Create a database link on the RDS instance between the RDS DB and the Oracle XE DB. If you are creating a database link between two DB instances inside the same VPC or peered VPCs, the two DB instances must have a valid route between them. See Adjusting Database Links for Use with DB Instances in a VPC.
4. Copy the dump files from the RDS instance to the Oracle XE DB on EC2 using DBMS_FILE_TRANSFER.PUT_FILE via the database link (a sketch follows this list).
5. Copy the files from the DATA_PUMP_DIR directory of Oracle XE on the EC2 instance to S3.
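Step 4 might look roughly like the following, run while connected to the RDS instance; the directory names, file name and database link name are placeholders:

-- Push the dump file from the RDS DATA_PUMP_DIR to the Oracle XE
-- database on EC2 over the database link TO_XE.
BEGIN
  DBMS_FILE_TRANSFER.PUT_FILE(
    source_directory_object      => 'DATA_PUMP_DIR',
    source_file_name             => 'schema_export.dmp',
    destination_directory_object => 'DATA_PUMP_DIR',
    destination_file_name        => 'schema_export.dmp',
    destination_database         => 'TO_XE');
END;
/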
Second option.
1. Use the obsolete exp utility to export. It has restrictions on the export of certain data types and is slower.
2. Run exp user/password@rds on the EC2 instance.
3. Copy the files from the Oracle XE directory on the EC2 instance to S3.
Original Export is desupported for general use as of Oracle Database 11g. The only supported use of original Export in 11g is backward migration of XMLType data to a database of version 10g release 2 (10.2) or earlier. Therefore, Oracle recommends that you use the new Data Pump Export and Import utilities, except in the situations that require original Export and Import; see Original Export and Import.
It's now possible to access an S3 bucket directly from an Oracle RDS database. Please have a look at the following documentation: https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/oracle-s3-integration.html
And here is the official announcement that this is supported: https://aws.amazon.com/about-aws/whats-new/2019/02/Amazon-RDS-for-Oracle-Now-Supports-Amazon-S3-Integration/?nc1=h_ls
It seems that the first post was written a little too early to catch this news. But anyway, that post lists further good solutions like the database link.
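With that S3 integration enabled on the instance's option group, the export can be pushed straight from DATA_PUMP_DIR to a bucket. A minimal sketch, with the bucket, prefix and file prefix as placeholders:

-- Upload files from DATA_PUMP_DIR (optionally only those matching p_prefix)
-- to the bucket; requires the S3_INTEGRATION option and an attached IAM role.
SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
         p_bucket_name    => 'my-bucket',
         p_s3_prefix      => 'exports/',
         p_prefix         => 'schema_export',
         p_directory_name => 'DATA_PUMP_DIR')
  AS task_id
  FROM dual;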

Oracle Data Pump Transfer Between Databases

I have a specific need for data pump and I am having a hard time searching for a solution.
Currently, I have an exp/imp program that exports tables (selectively, based on queries) from one database and imports that same data into another database. This program and the dump files reside on a common server that can access both the source and destination databases. This is a totally automated process. It works well, albeit slowly.
Due to various reasons, I must migrate this program to use data pump. The biggest change now is the location of the dmp files. I also have very limited access to the database servers themselves, but I can run data pump.
The process will be run from the same common server, but the exported files will now reside on the database server for the source database. No issue there. I can create dmp files using expdp.
My issue is how to get that same data into the destination database. When I run impdp, it is expecting a data_pump_dir in the destination area (not source area). Again, this is automated, and I don't have the luxury of being able to transfer dmp files using scp or ftp or anything like that.
What can I use to overcome this problem with Data Pump?
No reason you cannot configure an external directory on BOTH databases:
CREATE DIRECTORY mydumpdir AS '/whatever/the/path/is';
Then impdp and expdp will take DIRECTORY=mydumpdir as an argument.
Make sure you grant the Oracle schemas/users read/write on the directory, and the Oracle OS account must also have operating-system rights to read and write that location. The server running expdp should also have write access, as it may need to write log files/reports to that location, or you might use it for file cleanup.
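In SQL terms, that setup on each database would be along these lines; the path and user name are examples:

-- Run on BOTH the source and the destination database.
CREATE OR REPLACE DIRECTORY mydumpdir AS '/u01/app/oracle/dpdump';

-- Let the schema that runs expdp/impdp read and write dump and log files there.
GRANT READ, WRITE ON DIRECTORY mydumpdir TO dp_user;

expdp on the source and impdp on the destination can then both be pointed at it with DIRECTORY=mydumpdir.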

How to migrate an existing database to Oracle RDS

I need to migrate an existing application's database into an Oracle RDS database in Amazon Web Services.
I have the dump file, which resides on an EC2 instance. The dump was not taken by me. I would also like to know how the dump should be taken so that it can be imported successfully. The EC2 instance has a regular Oracle client installed.
I have set up the Oracle RDS instance in AWS and I am able to connect to the server.
I would like to know how I can import the database dump into RDS.
I am using this command:
imp rdsuser@oracledb FILE=fulldb.dmp TOUSER=rdsuser FROMUSER=SYSTEM log=test.log buffer=100000
Any lead is appreciated.
I would also like to know which is the better method for importing an existing database:
1. Take a dump, or
2. Clone all of the database files (which would require downtime on the server).
The best strategy is to take a dump and then import it into RDS. If your DB is too big, contact AWS support for help.
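Once the dump file has been placed in the RDS instance's DATA_PUMP_DIR (for example via the S3 integration or DBMS_FILE_TRANSFER approaches described above), a Data Pump import can be driven entirely through DBMS_DATAPUMP as an alternative to classic imp. A sketch, with the file and schema names as placeholders:

DECLARE
  v_hdl NUMBER;
BEGIN
  v_hdl := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA');
  DBMS_DATAPUMP.ADD_FILE(
    handle    => v_hdl,
    filename  => 'fulldb.dmp',
    directory => 'DATA_PUMP_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.ADD_FILE(
    handle    => v_hdl,
    filename  => 'import.log',
    directory => 'DATA_PUMP_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  -- Import only the exported schema, remapping it to the RDS master user.
  DBMS_DATAPUMP.METADATA_FILTER(v_hdl, 'SCHEMA_EXPR', 'IN (''MYAPP'')');
  DBMS_DATAPUMP.METADATA_REMAP(v_hdl, 'REMAP_SCHEMA', 'MYAPP', 'RDSUSER');
  DBMS_DATAPUMP.START_JOB(v_hdl);
END;
/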
