We have an RDS (Oracle) instance, and I need to export a specific schema into a dump file. The export works and writes the dump file into DATA_PUMP_DIR. The issue is that RDS does not provide file system access.
I need the exported DMP file either on S3 or copied to another EC2 instance.
The article: LINK talks about copying a dump file between two RDS instances, but not to S3 or EC2.
A third option, which I am using.
Take a look at the alexandria-plsql-utils project, and especially at the amazon_aws_auth_pkg, amazon_aws_s3_pkg and ftp_util_pkg packages.
Install the required packages and their dependencies.
Do your dump, then with the example code below you can copy the file from Amazon RDS Oracle into an S3 bucket.
declare
  b_blob blob;
begin
  -- read the dump file from the RDS directory object into a BLOB
  b_blob := file_util_pkg.get_blob_from_file('DATA_PUMP_DIR', 'my_dump.dmp');
  -- initialize AWS credentials (replace with your own key id and secret)
  amazon_aws_auth_pkg.init('aws_key_id', 'aws_secret', p_gmt_offset => 0);
  -- upload the BLOB to the bucket as a new object
  amazon_aws_s3_pkg.new_object('my-bucket-name', 'my_dump.dmp', b_blob, 'application/octet-stream');
end;
/
There are several ways to solve this problem.
First option.
Install the free Oracle XE database on an EC2 instance (it is very easy and fast).
Export the schema from the RDS instance to the DATA_PUMP_DIR directory. Use the DBMS_DATAPUMP package (see the sketch after this list) or run expdp user/pass@rds on the EC2 instance to create a dump file.
Create a database link on the RDS instance between the RDS DB and the Oracle XE DB. If you are creating a database link between two DB instances inside the same VPC or peered VPCs, the two DB instances must have a valid route between them. See Adjusting Database Links for Use with DB Instances in a VPC.
Copy the dump files from the RDS instance to the Oracle XE DB on EC2 using DBMS_FILE_TRANSFER.PUT_FILE via the database link.
Copy the files from the DATA_PUMP_DIR directory of Oracle XE on the EC2 instance to S3.
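A minimal DBMS_DATAPUMP sketch for the export step, run on the RDS instance (the schema name and file names are placeholders):
DECLARE
  h NUMBER;
BEGIN
  -- open a schema-mode export job
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');
  -- dump file and log file go to DATA_PUMP_DIR
  DBMS_DATAPUMP.ADD_FILE(h, 'my_dump.dmp', 'DATA_PUMP_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.ADD_FILE(h, 'my_dump.log', 'DATA_PUMP_DIR',
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  -- export only the schema of interest
  DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''MY_SCHEMA'')');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.DETACH(h);
END;
/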
Second option.
Use the obsolete exp utility for the export. It has restrictions on exporting certain data types and is slower.
Run exp user/password@rds on the EC2 instance.
Copy the files from the export directory on the EC2 instance to S3.
Original export is desupported for general use as of Oracle Database 11g. The only supported use of original Export in 11g is backward migration of XMLType data to a database version 10g release 2 (10.2) or earlier. Therefore, Oracle recommends that you use the new Data Pump Export and Import utilities, except in the following situations which require original Export and Import:
Original Export and Import
It's now possible to directly access an S3 bucket from an Oracle database. Please have a look at the following documentation: https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/oracle-s3-integration.html
And here the official news that this is supported: https://aws.amazon.com/about-aws/whats-new/2019/02/Amazon-RDS-for-Oracle-Now-Supports-Amazon-S3-Integration/?nc1=h_ls
It seems that the first post was a little bit too early to get this news. But anyway, that post lists further good solutions, like the database link.
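With S3 integration enabled, the upload itself is a single call to the documented rdsadmin package (the bucket name is a placeholder):
SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
         p_bucket_name    => 'my-bucket-name',
         p_directory_name => 'DATA_PUMP_DIR') AS task_id
  FROM dual;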
Related
I have an Oracle instance in a PROD VPC (in AWS) and a PostgreSQL instance in a DEV VPC (in AWS). I want to copy all tables from Oracle to PostgreSQL. As they are in different VPCs, I am currently using the process below:
Using data pump, export the tables from Oracle PROD and import them to Oracle DEV (on DEV VPC)
Use Python to extract data from Oracle DEV in CSV format and load it into PostgreSQL using COPY
Is there any other efficient way to do this copy? I am trying to copy all work schemas (DDL and data) from source to destination.
Thanks in advance!
You could use VPC peering to get rid of the copy. See What is VPC peering.
Next, the Oracle foreign data wrapper (oracle_fdw) is a marvel to use for this kind of work. It makes your Oracle tables accessible from the Postgres database and makes copying data a piece of cake. A nice little Python script can always help if there are many tables to handle.
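A minimal sketch of the foreign data wrapper setup on the PostgreSQL side (the server name, connect string, credentials, and table names are placeholders):
-- on the DEV PostgreSQL database
CREATE EXTENSION oracle_fdw;

CREATE SERVER oracle_prod FOREIGN DATA WRAPPER oracle_fdw
  OPTIONS (dbserver '//oracle-prod.example.com:1521/ORCL');

CREATE USER MAPPING FOR CURRENT_USER SERVER oracle_prod
  OPTIONS (user 'app_user', password 'app_password');

-- expose the Oracle tables as foreign tables in a staging schema
CREATE SCHEMA oracle_stage;
IMPORT FOREIGN SCHEMA "APP_USER" FROM SERVER oracle_prod INTO oracle_stage;

-- copying a table is then a plain CREATE TABLE ... AS SELECT
CREATE TABLE public.my_table AS SELECT * FROM oracle_stage.my_table;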
I have been searching around and didn't find an answer to this question.
Is there any way to import an Oracle .dmp file stored on a local machine into RDS Oracle?
If yes, how do I do it?
If not, why is it not possible, given that other databases offer the flexibility to do these kinds of imports in more than one way?
You can't do that. When you import data with Oracle Data Pump, you must transfer the dump file that contains the data from the source database to the target database**. You can transfer the dump file using an Amazon S3 bucket or by using a database link between the two databases.
If your local machine contains a database and you have a network connection between your on-premises database and your Oracle RDS , then you can use NETWORK_LINK, although I don't recommend it. It is much better to tranfer the file using S3 bucket.
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Oracle.Procedural.Importing.html
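For completeness, a hedged sketch of the NETWORK_LINK route (the link name, schema, and connect details are placeholders; with NETWORK_LINK the data is pulled directly over the database link, so no dump file is written):
-- on the RDS database: a link pointing at the on-premises source
CREATE DATABASE LINK to_onprem
  CONNECT TO src_user IDENTIFIED BY src_password
  USING '(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=onprem-host.example.com)(PORT=1521))(CONNECT_DATA=(SERVICE_NAME=SRCDB)))';
Then run the import against the RDS endpoint:
impdp admin@rds-endpoint SCHEMAS=MY_SCHEMA NETWORK_LINK=to_onprem DIRECTORY=DATA_PUMP_DIR LOGFILE=import.log
Here DIRECTORY is used only for the log file.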
I have an Oracle SE database that we run in Amazon's RDS service as our production database. I am looking for techniques and/or tools to duplicate the database to a developer edition database that the development team can run on their local systems.
You can generate a dump file and transfer it to your local/standalone Unix system where Oracle is installed, and then import the dump file data.
You do not have direct access to the RDS instance file system.
There are several data migration solutions.
The first solution uses an S3 bucket.
Use DBMS_DATAPUMP to export to a dump file on the RDS instance.
Copy the dump file from the RDS instance to the Amazon S3 bucket.
Download the dump file from S3 to the developer edition database host.
Import the data from the dump file using impdp on the developer edition database.
The second solution uses a database link between the RDS instance and the developer edition database.
Create a database link on the RDS instance to the developer edition database.
Use DBMS_DATAPUMP to export to a dump file on the RDS instance.
Use DBMS_FILE_TRANSFER to copy the dump file from the RDS instance to the developer edition database (see the sketch after this list).
Import the data from the dump file using impdp on the developer edition database.
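A minimal sketch of the link-and-transfer steps, run on the RDS instance (the link name, connect details, and file names are placeholders):
-- create a database link to the target database
CREATE DATABASE LINK to_dev
  CONNECT TO dev_user IDENTIFIED BY dev_password
  USING '(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=dev-host.example.com)(PORT=1521))(CONNECT_DATA=(SERVICE_NAME=DEVDB)))';

-- push the dump file across the link
BEGIN
  DBMS_FILE_TRANSFER.PUT_FILE(
    source_directory_object      => 'DATA_PUMP_DIR',
    source_file_name             => 'my_dump.dmp',
    destination_directory_object => 'DATA_PUMP_DIR',
    destination_file_name        => 'my_dump.dmp',
    destination_database         => 'to_dev');
END;
/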
The third solution uses Oracle XE, if you cannot create a database link between the developer edition database and the RDS instance.
Use DBMS_DATAPUMP to export to a dump file on the RDS instance.
Create a free-tier EC2 instance (1 CPU core, 1 GB RAM, 30 GB HDD) or a paid instance with a larger disk and CPU.
Install the free Oracle XE database on the EC2 instance.
Create a database link on the RDS instance to the Oracle XE database.
Use DBMS_FILE_TRANSFER to copy the dump file from the RDS instance to Oracle XE.
Copy the files from Oracle XE on the EC2 instance to your PC via SFTP.
Copy the files from your PC to the developer edition database host.
Import the data from the dump file using impdp on the developer edition database.
And other solutions without using Data Pump:
Export/Import using Oracle SQL Developer.
Oracle original Export/Import utilities.
I have an existing .dmp file on an EC2 instance which has access to an RDS instance running Oracle 11g on AWS.
I have read Importing Data in AWS RDS, and it seems that AWS does not support this kind of direct transfer.
It requires a source Oracle DB from which you create/export a .dmp file, which you can then transfer to the destination RDS instance by establishing a db link.
My question is: is there a way I can transfer/import my existing .dmp file to the DATA_PUMP_DIR on the RDS instance?
Any suggestions?
File system access on the RDS instance is forbidden.
You can access the DATA_PUMP_DIR directory only through a db link, using the DBMS_FILE_TRANSFER package.
Option 1
You can export the data using the old exp utility on the EC2 instance. This utility also creates .dmp export files, but in a different format that is not compatible with expdp/impdp.
The exp/imp utilities can connect over SQL*Net to the target database in client-server fashion. They are obsolete and perform worse. Unlike with expdp, the dmp file is not created on the server; it is written on the side where exp is run (server or client).
$ORACLE_HOME/bin/exp parfile=parfile_exp_full FILE=export.dmp LOG=export.log
And then import the data into the RDS instance using imp.
$ORACLE_HOME/bin/imp parfile=parfile_imp_full FILE=export.dmp LOG=import.log
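For reference, a minimal hypothetical parfile_exp_full (the actual parfile is not shown in the thread; these parameters are an assumption):
# parfile_exp_full: export one schema (placeholder name)
OWNER=MY_SCHEMA
BUFFER=1000000
STATISTICS=NONE
The corresponding parfile_imp_full would carry the matching imp parameters (for example FROMUSER/TOUSER).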
Option 2
You can export the data to a CSV file using SQL*Plus: $ORACLE_HOME/bin/sqlplus -s user/pass@ec2 @csv2.sql, where csv2.sql is:
set heading off
set termout off
set feedback off
set tab off
set pause off
set verify off
set underline off
set trimspool on
set echo off
set linesize 1000
set pagesize 0
set wrap off
spool test2.csv
select code||','||name||','||code_rail from alexs.all_station;
spool off
exit;
And then import the data into the RDS instance using the sqlldr utility (a sketch follows).
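A minimal SQL*Loader sketch for the import side, assuming the CSV produced above (the control file name and connect string are placeholders):
-- all_station.ctl (hypothetical control file)
LOAD DATA
INFILE 'test2.csv'
INTO TABLE alexs.all_station
FIELDS TERMINATED BY ','
(code, name, code_rail)
Run it against the RDS endpoint:
$ORACLE_HOME/bin/sqlldr userid=user/pass@rds control=all_station.ctl log=all_station.log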
Eventually, I had to spin up another AWS instance, install Oracle XE on it, place my dump file in the DATA_PUMP_DIR, and then follow the AWS RDS Data Import guide.
It's pretty annoying that there is no other way to do this, and having no SSH access to the RDS instance just adds to that!
Also, the AWS docs are not clear about the particulars.
Since February 2019, you can import .dmp files directly from S3 buckets.
Here are the steps I followed after spending years reading AWS docs.
Put all the dump files in S3 (this is beyond the scope of this answer :D)
Create an RDS option group for S3 integration
Add the option S3_INTEGRATION to the group (select the group, then click on ADD OPTION)
Create an IAM role to authorize access to S3
Attach that role to the RDS instance
Connect to the instance using sqlplus
Follow the import procedure as described in the AWS RDS docs (go directly to Step #4)
Replace mys3bucket with the actual name of the bucket
Add the parameter p_s3_prefix if the files are located under a specific directory key
Leave DATA_PUMP_DIR as is, unless you want to override the default location.
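That procedure boils down to a single call to the documented rdsadmin package (bucket name as above):
SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
         p_bucket_name    => 'mys3bucket',
         p_directory_name => 'DATA_PUMP_DIR') AS task_id
  FROM dual;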
To follow the task's progress, enter:
SELECT text FROM table(
rdsadmin.rds_file_util.read_text_file('BDUMP','dbtask-YOUR_TASK_ID.log')
);
replacing YOUR_TASK_ID with your actual task id :)
Check the content of DATA_PUMP_DIR by typing:
select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by mtime;
This should list all of the *.dmp files downloaded from S3.
I need to migrate an existing application's database into an Oracle RDS database in Amazon Web Services.
I have the dump file, which is residing on an EC2 instance. The dump was not taken by me. I would also like to know how to take the dump so that it can be imported successfully. The EC2 instance has a regular Oracle client.
I have set up the oracle RDS instance in AWS and I am able to connect to the server.
I would like to know how can I import the database dump on RDS.
I am using this command :
imp rdsuser#oracledb FILE=fulldb.dmp TOUSER=rdsuser FROMUSER=SYSTEM log=test.log buffer=100000
Any lead is appreciated.
Also, I would like to know which is the best method to import an existing database:
1. Take a dump.
2. Or clone all the database files (that would require downtime on the server).
The best strategy is to take a dump and then import it into RDS. If your DB is very large, contact AWS support for help.
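Since the answers above recommend Data Pump over the legacy imp shown in the question, a hedged sketch of the Data Pump import, assuming the dump was created with expdp and has already been transferred to the RDS DATA_PUMP_DIR (the file and schema names mirror the question and are placeholders):
impdp rdsuser@oracledb DIRECTORY=DATA_PUMP_DIR DUMPFILE=fulldb.dmp REMAP_SCHEMA=SYSTEM:RDSUSER LOGFILE=import.log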