How to transfer Oracle dump file to AWS RDS instance?

I have an existing .dmp file on an EC2 instance that has access to an RDS instance running Oracle 11g on AWS.
I have read Importing Data into AWS RDS, and it seems that AWS does not support this kind of direct transfer.
It requires a source Oracle DB from which you create/export a .dmp file, which you can then transfer to the destination RDS instance by establishing a db link.
My question is: is there a way I can transfer/import my existing .dmp file to the DATA_PUMP_DIR on the RDS instance?
Any suggestions?

Direct file system access to the RDS instance is not allowed.
The DATA_PUMP_DIR directory can only be reached through a db link, using the DBMS_FILE_TRANSFER package.
Option 1
You can export the data using the old exp utility on the EC2 instance. This utility also creates .dmp export files, but in a different format that is not compatible with expdp/impdp.
The exp/imp utilities can connect over SQL*Net to the target database in client-server fashion. They are obsolete and slower. Unlike expdp, the dump file is not created on the server; it is written on the side where exp is run (server or client).
$ORACLE_HOME/bin/exp parfile=parfile_exp_full FILE=export.dmp LOG=export.log
And then import the data into the RDS instance using imp.
$ORACLE_HOME/bin/imp parfile=parfile_imp_full FILE=export.dmp LOG=import.log
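The contents of those parameter files aren't shown above; a minimal sketch of what parfile_exp_full could contain for a full export (the connect string is a placeholder and the options are illustrative):
USERID=system/password@source_db
FULL=Y
CONSISTENT=Y
STATISTICS=NONE
BUFFER=1000000
The imp parfile would mirror it on the RDS side (for example FULL=Y plus IGNORE=Y), with the connect string pointing at the RDS instance.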
Option 2
You can export the data to a CSV file using sqlplus: $ORACLE_HOME/bin/sqlplus -s user/pass@ec2 @csv2.sql, where csv2.sql is:
set heading off
set termout OFF
SET FEEDBACK OFF
SET TAB OFF
set pause off
set verify off
SET UNDERLINE OFF
set trimspool on
set echo off
set linesize 1000
set pagesize 0
set wrap off
spool test2.csv
select code||','||name||','||code_rail from alexs.all_station;
spool off
exit;
Then import the data into the RDS instance using the sqlldr utility, as sketched below.
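sqlldr needs a control file describing the CSV; a minimal sketch for the spool query above (the table and column names are taken from the example, the control file name and everything else are assumptions):
LOAD DATA
INFILE 'test2.csv'
APPEND
INTO TABLE alexs.all_station
FIELDS TERMINATED BY ','
(code, name, code_rail)
It could then be run from the EC2 instance with something like sqlldr user/pass@rds control=all_station.ctl log=all_station.log.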

Eventually, I had to spin up another AWS instance, install Oracle XE on it, place my dump file in the DATA_PUMP_DIR, and then follow the AWS RDS Data Import guide.
It's pretty annoying that there is no other way to do this, and having no SSH access to the RDS instance just adds to that!
Also, the AWS docs are not clear about the particulars.

Since February 2019, you can import .dmp files from S3 buckets.
Here are the steps I followed after spending ages reading the AWS docs.
Put all the dump files in S3 (this is beyond the scope of this answer :D)
Create an RDS option group for S3 integration
Add the option S3_INTEGRATION to the group (select the group, then click on ADD OPTION)
Create an IAM role to authorize access to S3
Attach that role to the RDS instance
Connect to the instance using sqlplus
Follow the import procedure as described in the AWS RDS docs (go directly to Step #4)
Replace mys3bucket with the actual name of the bucket
Add the parameter p_s3_prefix if the files are located under a specific directory key
Leave DATA_PUMP_DIR as is, unless you want to override the default location.
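For reference, the Step #4 call looks roughly like this (the bucket name and prefix are placeholders; drop p_s3_prefix if the files sit at the bucket root):
SELECT rdsadmin.rdsadmin_s3_tasks.download_from_s3(
         p_bucket_name    => 'mys3bucket',
         p_s3_prefix      => 'dumps/',
         p_directory_name => 'DATA_PUMP_DIR')
       AS task_id
FROM dual;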
To follow the task progression, enter
SELECT text FROM table(
rdsadmin.rds_file_util.read_text_file('BDUMP','dbtask-YOUR_TASK_ID.log')
);
replacing YOUR_TASK_ID with your actual task id :)
Check the content of DATA_PUMP_DIR by typing:
select * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) order by mtime;
This should list all of the *.dmp files that were pulled down from S3 into DATA_PUMP_DIR...
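Once the files are in DATA_PUMP_DIR, the actual import (the remaining steps of the AWS procedure) can be driven with DBMS_DATAPUMP; a minimal sketch, assuming a single dump file and a schema called MYSCHEMA:
DECLARE
  v_hdnl NUMBER;
BEGIN
  v_hdnl := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => NULL);
  DBMS_DATAPUMP.ADD_FILE(
    handle    => v_hdnl,
    filename  => 'my_dump.dmp',
    directory => 'DATA_PUMP_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.ADD_FILE(
    handle    => v_hdnl,
    filename  => 'import.log',
    directory => 'DATA_PUMP_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  DBMS_DATAPUMP.METADATA_FILTER(v_hdnl, 'SCHEMA_EXPR', 'IN (''MYSCHEMA'')');
  DBMS_DATAPUMP.START_JOB(v_hdnl);
END;
/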

Related

Load .DMP oracle file from local machine to RDS Oracle

I have been searching around and haven't found an answer to this question.
Is there any way to import an Oracle .dmp file stored on a local machine into RDS Oracle?
If yes, how do I do it?
If not, why is it not possible, given that other databases give the flexibility to do these kinds of imports in more than one way?
You can't do that directly. When you import data with Oracle Data Pump, you must transfer the dump file that contains the data from the source database to the target database. You can transfer the dump file using an Amazon S3 bucket or by using a database link between the two databases.
If your local machine contains a database and you have a network connection between your on-premises database and your RDS for Oracle instance, then you can use NETWORK_LINK, although I don't recommend it. It is much better to transfer the file using an S3 bucket.
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Oracle.Procedural.Importing.html

Oracle Database Backup in DBeaver

I am new to Oracle databases. I have installed DBeaver (never used this before either) to connect to the database.
I have created a connection (which I believe is called a database) and now I am able to see the database tables and everything. How do I take a backup of the Oracle database in DBeaver so I can use it locally for test purposes before making any change to the live database?
I can't find any option to take a backup of the connection/database.
To do a proper backup of your Oracle database, you should use the Oracle-provided utility Recovery Manager. It's a command line interface that's called from your DB server shell prompt via 'RMAN'.
You can also use Data Pump to export all or part of a database, which can then be imported into another database; it's not really used for recovery of an existing database.
I'm not aware of your tool having interfaces for either of these Oracle features.
You might not need a backup at all for your needs, take a look at Oracle Flashback Technology.
DBeaver does not support Oracle database export/import. See details here:
https://dbeaver.com/docs/wiki/Backup-Restore/
You need to run the sqlplus tool to create a directory object pointing at the folder where Oracle is going to import/export database dumps. Log in as sys as sysdba and enter the password you previously entered during database server installation. Example:
sqlplus sys/[your password] as sysdba
After you have successfully logged into sqlplus, run the following command (don't forget to change the path to a folder that you prefer to use):
create or replace directory DATA_PUMP_DIR as 'D:\Database Backups';
Once this is done, exit from sqlplus and enter the following command at the command line (again, not inside sqlplus):
expdp sys/[your password]@localhost:1521/[service name] directory=DATA_PUMP_DIR dumpfile=your-database-dump-file.dmp schemas=[your schema]
Once this is done and finished, you can zip your database dump if you would like to upload it somewhere else. (I had a 9 GB dump and the zipped size was 1.6 GB.)

How to generate and load test data in amazon RDS for Oracle

For a POC, we need 10 GB of data to be available in an Oracle RDS instance. Any test data is OK (like TPC data for benchmarking). For this requirement, is there any specific way to create the database and pump in the sample data?
Assuming you have opened firewall port 1521 to the RDS server, you should be able to use sqlplus, sqlcl, or sqlldr to connect to the RDS instance as follows:
USER/PASSWORD@//hostName:port/SID
e.g.
sqlplus scott/tiger@//myhost.this.that.amazon.com:1521/THE_SID
Once you have proved this works, use it in the tools to load data into the RDS instance.
E.g. .sql INSERT scripts, sqlldr scripts, etc.
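If any filler data will do, a simple PL/SQL generator run over that connection can produce roughly the required volume; the table name, batch sizes and payload length below are illustrative assumptions (about 10 million rows of ~1 KB each, loaded in committed batches):
-- hypothetical 10 GB filler table, ~1 KB per row
CREATE TABLE poc_test_data (
  id         NUMBER,
  created_at DATE,
  payload    VARCHAR2(1000)
);

BEGIN
  FOR batch IN 1 .. 100 LOOP
    INSERT /*+ APPEND */ INTO poc_test_data
    SELECT ROWNUM + (batch - 1) * 100000,
           SYSDATE - DBMS_RANDOM.VALUE(0, 365),
           RPAD(DBMS_RANDOM.STRING('X', 50), 1000, 'x')
    FROM dual
    CONNECT BY LEVEL <= 100000;
    COMMIT;  -- commit each batch so the direct-path insert can repeat
  END LOOP;
END;
/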

How to export using DATA_PUMP to S3 bucket?

We have an RDS (Oracle) instance, and I need to export a specific schema into a dump file. The export works and copies the dump file into DATA_PUMP_DIR. The issue is that RDS does not give you file system access to that directory.
I need the exported DMP file either on S3 or copied to another EC2 instance.
The article: LINK talks about copying the dump file between two RDS instances, but not to S3 or EC2.
Third option, which is the one I am using.
Take a look at the alexandria-plsql-utils project, and especially at the amazon_aws_auth_pkg, amazon_aws_s3_pkg and ftp_util_pkg packages.
Install the required packages and their dependencies.
Do your dump, then with the example code below you can copy a file from Amazon RDS Oracle into an S3 bucket.
declare
  b_blob blob;
begin
  b_blob := file_util_pkg.get_blob_from_file('DATA_PUMP_DIR', 'my_dump.dmp');
  amazon_aws_auth_pkg.init('aws_key_id', 'aws_secret', p_gmt_offset => 0);
  amazon_aws_s3_pkg.new_object('my-bucket-name', 'my_dump.dmp', b_blob, 'application/octet-stream');
end;
There are several ways to solve this problem.
First option.
Install the free Oracle XE database on the EC2 instance (it is very easy and fast).
Export the schema from the RDS instance to the DATA_PUMP_DIR directory. Use the DBMS_DATAPUMP package or run expdp user/pass@rds on EC2 to create a dump file.
Create a database link on the RDS instance between the RDS DB and the Oracle XE DB. If you are creating a database link between two DB instances inside the same VPC or peered VPCs, the two DB instances must have a valid route between them. See Adjusting Database Links for Use with DB Instances in a VPC.
Copy the dump files from the RDS instance to the Oracle XE DB on EC2 using DBMS_FILE_TRANSFER.PUT_FILE over the database link (see the sketch after this list).
Copy the files from the DATA_PUMP_DIR directory of Oracle XE on the EC2 instance to S3.
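A minimal sketch of that PUT_FILE call, run on the RDS instance; the file name and the database link name (to_xe_on_ec2) are placeholders:
BEGIN
  DBMS_FILE_TRANSFER.PUT_FILE(
    source_directory_object      => 'DATA_PUMP_DIR',
    source_file_name             => 'my_dump.dmp',
    destination_directory_object => 'DATA_PUMP_DIR',
    destination_file_name        => 'my_dump.dmp',
    destination_database         => 'to_xe_on_ec2');
END;
/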
Second option.
Use the obsolete exp utility to export. It has restrictions on the export of certain data types and is slower.
Run exp user/password@rds on the EC2 instance; the dump file is written on the EC2 side.
Copy the files from that directory on the EC2 instance to S3.
Original Export is desupported for general use as of Oracle Database 11g. The only supported use of Original Export in 11g is backward migration of XMLType data to a database version 10g release 2 (10.2) or earlier. Therefore, Oracle recommends that you use the new Data Pump Export and Import utilities, except in the following situations which require Original Export and Import:
Original Export and Import
It's now possible to directly access an S3 bucket from an Oracle database. Please have a look at the following documentation: https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/oracle-s3-integration.html
And here the official news that this is supported: https://aws.amazon.com/about-aws/whats-new/2019/02/Amazon-RDS-for-Oracle-Now-Supports-Amazon-S3-Integration/?nc1=h_ls
It seems that the first post was a little bit too early to get this news. But anyway, that post lists further good solutions like the database link.
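With that integration in place (an option group with the S3_INTEGRATION option and an IAM role attached to the instance, as described in the docs), pushing the dump file from DATA_PUMP_DIR to the bucket is a single call; the bucket name and file prefix below are placeholders:
SELECT rdsadmin.rdsadmin_s3_tasks.upload_to_s3(
         p_bucket_name    => 'my-bucket-name',
         p_prefix         => 'my_dump',
         p_directory_name => 'DATA_PUMP_DIR')
       AS task_id
FROM dual;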

How to create a dump?

I'm a junior DBA working in an IT company. In my company there are many schemas. My question is: how do I create a dump file (sometimes I work from home; how can I use that dump file there)? Please advise.
NOTE:I am using Oracle SQL Developer.
expdp helps in exporting the database and impdp helps in importing it. You can directly export one schema to another (in a different database as well) by using the network link concept.
If the network link concept is used, the creation of a separate expdp file is not required.
For example, if you have to export a schema called schema1 with password pwd1 from a source database to a target database, then:
First, you need admin privileges on your target and source schemas.
Create a database link between the source and target schemas:
CREATE PUBLIC DATABASE LINK example_link
CONNECT TO schema1 IDENTIFIED BY pwd1
USING 'server_name:port/service_name';--(put source database server_name,port and service name)
Then create a directory on your target server:
CREATE OR REPLACE DIRECTORY exp_dir AS 'F:/location';
grant read,write on directory exp_dir to schema1;
After this, log in to your target server and run the command below from the command line:
impdp dba_username/dba_pwd network_link=example_link directory=exp_dir remap_tablespace=source_tbs:target_tbs remap_schema=schema1:schema1 parallel=2
You should use the Oracle Data Pump tool. The tool allows you to export your data into a .dmp file and import it into any database. Here is a video showing how to use the Data Pump tool in SQL Developer. I think this is a relatively new feature in SQL Developer, so make sure you have an appropriate version.
Video Tutorial HERE
From the command line, you can use Data Pump with the expdp and impdp commands like so:
Set your Oracle environment by running the command below and providing your Oracle SID:
. oraenv
Then you can run your export command:
expdp directory=DATA_PUMP_DIR dumpfile=myexport.dmp logfile=mylog.log schemas=users,products,sales
The parameters are as follows:
directory - the database directory object under which the dump file and log are created (DATA_PUMP_DIR exists by default)
dumpfile - name of the dump file (should end in .dmp)
logfile - name of the log file (should end in .log)
schemas - comma separated list of the schemas you want to export
NOTE: you need DBA privileges to use Data Pump. It will prompt you for the credentials.
Data Pump Documentation is here
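To load that dump file into another database (for example a local one when working from home), the matching impdp call would look roughly like this; it simply mirrors the export example above, so adjust the directory, file and schema names as needed:
impdp directory=DATA_PUMP_DIR dumpfile=myexport.dmp logfile=myimport.log schemas=users,products,sales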
Exporting of Oracle database objects is controlled by parameters. To get familiar with the EXPORT parameters, type:
exp help=y
You will get a short description and the default settings will be shown.
The EXPORT utility may be used in three ways:
Interactive dialogue
Controlled through passed parameters
Parameter file controlled
Example of the 2nd option:
exp scott/tiger file=empdept.expdat tables=(EMP,DEPT) log=empdept.log
Take a look at these links for further readings:
Original Export and Import
The ORACLE Import/Export Utilities
