What is the fastest way to create a test database (with all data) from a quite big (400 GB) production database? - Oracle

I am a Java person and not very familiar with Oracle's available features. Please help me.
The requirement is that we are looking for some virtual (replica/mirror/view) database to be created from the production database just for testing purposes. Once we are done executing all the automation test cases, the virtual database is deleted. Are there any such concepts in Oracle?
We are on Oracle 12c.
Many apps use the same DB (it's huge).
PS: We also use Docker for deployment, as well as AWS.

Use RMAN DUPLICATE to create the test database from production.
https://oracle-base.com/articles/11g/duplicate-database-using-rman-11gr2
You can duplicate from backups or from an active database.
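For illustration, a minimal active-database duplication might look like the sketch below; the connect strings (PROD, TESTDB), passwords, and file paths are placeholders you would adapt to your environment.

    # Connect RMAN to production (target) and the new test instance (auxiliary).
    rman TARGET sys/password@PROD AUXILIARY sys/password@TESTDB

    # Inside RMAN: clone straight over the network, no staged backup needed.
    # The CONVERT settings remap datafile and redo log paths for the clone.
    DUPLICATE TARGET DATABASE TO TESTDB
      FROM ACTIVE DATABASE
      SPFILE
        SET DB_FILE_NAME_CONVERT '/u01/oradata/PROD/','/u01/oradata/TESTDB/'
        SET LOG_FILE_NAME_CONVERT '/u01/oradata/PROD/','/u01/oradata/TESTDB/'
      NOFILENAMECHECK;

Once testing is finished, the clone can simply be dropped, which matches the throwaway-database requirement.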

You can probably ask your database admin to export the tablespace to a new test machine that has the same Oracle version installed. Alternatively, if there are only very few tables, you can spool your tables out and use SQL*Loader to load them into a test database (you will need to manually create the table structures in the test environment beforehand).
In both cases, you might want to scrub out the sensitive information as per your requirements and standards.
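As a rough sketch of the spool-and-load route for one small table (the table name, columns, file names, and connect string below are made up for illustration):

    -- In SQL*Plus against production: spool one table out as delimited text.
    SET HEADING OFF FEEDBACK OFF PAGESIZE 0 LINESIZE 1000 TRIMSPOOL ON
    SPOOL employees.dat
    SELECT employee_id || ',' || last_name || ',' || salary FROM employees;
    SPOOL OFF

    -- employees.ctl: SQL*Loader control file describing that file.
    LOAD DATA
    INFILE 'employees.dat'
    INTO TABLE employees
    FIELDS TERMINATED BY ','
    (employee_id, last_name, salary)

    # Load into the pre-created test table.
    sqlldr userid=app/secret@TESTDB control=employees.ctl log=employees.log

This only scales to a handful of tables; for anything bigger, the tablespace export (or Data Pump) is the saner path.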

Related

Sybase to Oracle table Migration via Migration Wizard offline

How can I create a script of INSERTs for my Sybase to Oracle migration? The Migration Wizard only gives me the option to migrate procedures, triggers, and such; there is no option to select just tables. When I try to migrate tables offline and move data, the datamove/ folder is empty. I would also like to migrate only specific tables (the ones with long identifiers), because I was able to migrate the rest with Copy to Oracle.
I must also note that I do not want to upgrade to a newer version of Oracle. We are currently on ~12.1, so I need to limit the identifier lengths.
How can I get the offline scripts for table inserts?
You (probably!) don't want INSERTs for offline migration scripts. If you're just running INSERTs, then the online method would probably suffice.
The point of the offline strategy is to take the data from your Sybase instance to flat, delimited text files (using BCP), which we can then load back into an Oracle database using SQL*Loader or external tables; that will be exponentially faster than running INSERT scripts.
Take a look at this whitepaper where I go into offline Sybase migrations in detail.
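To make the BCP-then-SQL*Loader flow concrete, here is a hedged sketch (the server names, logins, and the customers table are placeholders, not anything from the whitepaper):

    # Sybase side: bulk-copy a table out to a pipe-delimited flat file.
    bcp mydb.dbo.customers out customers.dat -c -t '|' -S SYBSRV -U sa -P secret

    # customers.ctl: SQL*Loader control file for the Oracle side.
    LOAD DATA
    INFILE 'customers.dat'
    INTO TABLE customers
    FIELDS TERMINATED BY '|'
    (customer_id, name, created_date DATE "YYYY-MM-DD")

    # Oracle side: load the flat file into the pre-created table.
    sqlldr userid=app/secret@ORCL control=customers.ctl direct=true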
You can also consider DCO-based Sybase-to-Oracle replication via Sybase Rep Server. This way, not only will you have all the data moved, but you will also have DML updates propagated online, which makes it possible to switch your system over live.

Which is the fastest way to replicate oracle database deployed in RDS?

For example: let's say I have two databases, DB1 and DB2. My requirement is to refresh data from DB1 to DB2 every night. DB1 is the live database and DB2 is for non-business users for data analysis.
My questions:
1) What tool should I use for this requirement? I need a solution that is fast, since the database copy has to be done every day.
2) Does AWS have any tool to automate the backup and restore of the data?
There are loads of ways to do this, and the answer comes down to what storage you're using, whether the databases are on the same server, and ultimately the size of the database.
RMAN is more of a backup/recovery tool, but it is definitely a runner for cloning. If you're not sure what RMAN does, I wouldn't even start to implement it, as it's very tricky if you aren't super comfortable with Oracle DBs.
My recommendation is to just use Oracle Data Pump: export the schemas you need to a dump file, ship it over, and import them into the other database, making sure to overwrite/drop the existing schemas.
Other than doing a differential clone at the SAN level, this is probably the quickest and definitely the easiest way to get it done.
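A minimal version of that nightly Data Pump cycle could look like this (the schema name APP, passwords, and directory object are placeholders; note that on RDS you cannot scp to the host, so the dump file would instead be moved with DBMS_FILE_TRANSFER over a database link or the RDS S3 integration):

    # On DB1: export the schemas to a dump file each night.
    expdp system/password@DB1 schemas=APP directory=DATA_PUMP_DIR \
          dumpfile=app_nightly.dmp logfile=app_exp.log reuse_dumpfiles=yes

    # Ship app_nightly.dmp to DB2's server, then reload, replacing existing tables.
    impdp system/password@DB2 schemas=APP directory=DATA_PUMP_DIR \
          dumpfile=app_nightly.dmp logfile=app_imp.log table_exists_action=replace

Wrapped in a cron job, this gives you the every-night refresh with no extra tooling.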

Oracle database, moving changes between databases

We have an application where all the logic is implemented in the Oracle database using PL/SQL.
We have separate Oracle databases for development and production.
When a developer makes changes in the development database, after testing we move the changes from the development database to the production database using Toad's schema compare tool. The problem here is that the developer must have the password of the production database. We want only the admin to know this password.
Can somebody advise me of a better way of moving changes between databases without needing the production database password? What is the best practice for this?
I posted this question on the Oracle OTN forums and got some advice there. Maybe it will be interesting for somebody.
Here is a link.
I do not recommend using comparison tools to generate database migration scripts.
It is sometimes assumed that development and production databases (and also test databases) must be identical except for the current changes made by developers in the development database. Generally speaking, this assertion is not correct, because there are many kinds of differences between development and production databases, e.g. partitioned objects, additional objects for auditing (triggers, tables), replication-based objects (snapshots), different tablespaces, etc.
Every developer must know what changes he made and applied to the development database.
If a developer was able to change the schema and data in the development database, then he/she must be able to write scripts for those DDL and DML changes.
Delegating to that same developer the ability to run these migration scripts on the production database is a bad idea. But if you don't have a better way of handling database migration, then you can use one of the following (a sketch of option 1 follows the list):
1. Configure Oracle OS authentication. OS authentication allows Oracle to pass control of user authentication to the operating system.
2. TOAD can save passwords without disclosing them. The DBA can enter the required password into the local TOAD installation on the developer's PC (if developers use PCs).
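As an illustration of option 1, OS authentication boils down to something like this (the OS user jdev and the default ops$ prefix are assumptions for the example):

    -- As DBA: with os_authent_prefix at its default of 'ops$', the OS user
    -- 'jdev' maps to this externally identified database account.
    CREATE USER ops$jdev IDENTIFIED EXTERNALLY;
    GRANT CONNECT, RESOURCE TO ops$jdev;

    # The developer, logged in to the server as OS user 'jdev', then connects
    # without ever seeing a database password:
    sqlplus /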

Oracle Export Import

I am working for a company that has two Oracle databases; let's call them LIVE and TEST. An export is performed every night to take a snapshot of the database for each day. TEST is then dropped and recreated using existing table creation scripts, with the import finally putting the exported data from LIVE into the new TEST environment.
My questions are,
Is this really the best way to do this?
What better way is there?
Any URLs demonstrating these ways would be great.
Instead of imp/exp, use Data Pump.
Check Oracle GoldenGate.
Check Oracle Streams.
If you are using Enterprise Edition then you can look into transportable tablespaces as well, which have the advantage of exactly preserving the physical state of the data files so performance testing is more realistic.
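A rough outline of a transportable tablespace move (the tablespace and file names are placeholders):

    -- On LIVE: make the tablespace read-only so its datafiles are consistent.
    ALTER TABLESPACE users_ts READ ONLY;

    # Export only the metadata for the tablespace.
    expdp system/password@LIVE transport_tablespaces=USERS_TS \
          directory=DATA_PUMP_DIR dumpfile=users_ts.dmp

    # Copy the datafiles and the dump file to the TEST server, then plug in:
    impdp system/password@TEST directory=DATA_PUMP_DIR dumpfile=users_ts.dmp \
          transport_datafiles='/u01/oradata/TEST/users_ts01.dbf'

    -- Back on LIVE: return the tablespace to read/write.
    ALTER TABLESPACE users_ts READ WRITE;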

How can I replicate an Oracle 11g database (data + structure) on my local machine for development?

I am working on a test server with Oracle 11g installed. I was wondering if there is any way I can replicate the database (environment + data) on my local Linux machine. I am running CentOS 5.3 on Windows XP with Sun VirtualBox. On Windows I am using the SQL Developer client to connect to the 11g database.
There are a number of ways to move the data over:
Restore an RMAN backup on your test server
Export and import the data using exp/expdp/imp/impdp
Export and import using a transportable tablespace (Further Info)
Use database links to duplicate the data using SQL (a sketch follows this list)
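The database link option is the least infrastructure-heavy of the four; a sketch, with the TNS alias and credentials made up:

    -- On the local instance: create a link to the remote 11g database.
    CREATE DATABASE LINK testsrv_link
      CONNECT TO app IDENTIFIED BY secret
      USING 'TESTSRV';

    -- Pull a table's structure and data across in one statement.
    CREATE TABLE employees AS SELECT * FROM employees@testsrv_link;

This is handy for a subset of tables, but impractical for a full clone.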
You can use the Database Configuration Assistant to generate a template from your production database. This will give you all the parameters and tablespaces, among other things. You will need to tweak the configuration somewhat; for instance, the file paths may be wrong, and some parameters may need downsizing. You can then feed that template into DBCA to clone the database on your Linux machine.
To get the schemas and data you should use Data Pump (rather than the older Import/Export utilities). It can be run from the command line or from PL/SQL.
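For the PL/SQL route, a minimal DBMS_DATAPUMP schema export might look like the following (the schema APP and the directory object are assumptions; run it as a suitably privileged user):

    DECLARE
      h     NUMBER;
      state VARCHAR2(30);
    BEGIN
      -- Open a schema-mode export job and attach dump and log files.
      h := DBMS_DATAPUMP.open(operation => 'EXPORT', job_mode => 'SCHEMA');
      DBMS_DATAPUMP.add_file(h, 'app.dmp', 'DATA_PUMP_DIR');
      DBMS_DATAPUMP.add_file(h, 'app.log', 'DATA_PUMP_DIR',
                             filetype => DBMS_DATAPUMP.ku$_file_type_log_file);
      -- Restrict the job to the APP schema, then run it to completion.
      DBMS_DATAPUMP.metadata_filter(h, 'SCHEMA_EXPR', 'IN (''APP'')');
      DBMS_DATAPUMP.start_job(h);
      DBMS_DATAPUMP.wait_for_job(h, state);
      DBMS_OUTPUT.put_line('Job finished: ' || state);
    END;
    /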
Bear in mind that using production data in a development or test environment can cause you to run foul of data protection laws and other compliance issues. It depends on what your application does and what jurisdiction you operate under. But if your production system contains citizens' personal data you need to be very careful. There are products out there which will apply masking as part of a data import process (Oracle sells one) but they tend to be expensive. Rolling your own masking product can be tricky: if this applies to your situation be sure to get your compliance staff (legal team) involved early.
I would suggest you install Oracle XE, which is free to use on your local machine, if your development is not related to core database features. You can then use the methods given above to pump data into Oracle XE and compile your code against it, though for development I don't think you would need as much data as there is in production.
