I am working for a company that has two Oracle databases; let's call them LIVE and TEST. An export is performed every night to take a snapshot of LIVE for that day. TEST is then dropped and recreated using existing table creation scripts, and the import finally puts the exported data from LIVE into the new TEST environment.
My questions are:
Is this really the best way to do this?
What better way is there?
Any URLs demonstrating these approaches would be great.
Instead of the legacy export/import utilities (exp/imp), use Data Pump (expdp/impdp) - see the sketch after this list
Check Oracle GoldenGate
Check Oracle Streams
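For the Data Pump route, here is a minimal sketch of what the nightly refresh could look like, assuming a directory object named DATA_PUMP_DIR on both instances and an application schema called APP_OWNER (both names are placeholders):

    # On LIVE: export the application schema
    expdp system@LIVE schemas=APP_OWNER directory=DATA_PUMP_DIR \
          dumpfile=live_snap.dmp logfile=live_exp.log

    # On TEST: re-import, replacing the existing tables instead of
    # dropping and recreating the whole database
    impdp system@TEST schemas=APP_OWNER directory=DATA_PUMP_DIR \
          dumpfile=live_snap.dmp logfile=test_imp.log \
          table_exists_action=replace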
If you are using Enterprise Edition then you can look into transportable tablespaces as well, which have the advantage of exactly preserving the physical state of the data files so performance testing is more realistic.
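A rough sketch of the transportable tablespace flow, assuming a single application tablespace called APP_DATA and a made-up datafile path:

    -- On LIVE: the tablespace must be read-only during the export
    ALTER TABLESPACE app_data READ ONLY;

    # On LIVE: export just the tablespace metadata
    expdp system directory=DATA_PUMP_DIR transport_tablespaces=APP_DATA \
          dumpfile=tts.dmp logfile=tts_exp.log

    # Copy the datafile(s) and tts.dmp to the TEST host, then plug in:
    impdp system directory=DATA_PUMP_DIR dumpfile=tts.dmp \
          transport_datafiles='/u01/oradata/TEST/app_data01.dbf'

    -- On both sides: make the tablespace writable again
    ALTER TABLESPACE app_data READ WRITE;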
I am a Java person and not very familiar with the features Oracle has available. Please help me.
The requirement is that we need some kind of virtual (replica/mirror/view) database created from the production database, just for testing purposes. Once we are done executing all the automated test cases, the virtual database would be deleted. So, are there any such concepts in Oracle?
We are on Oracle 12c.
Many apps use the same DB (it's huge).
PS: We also use Docker for deployment, as well as AWS.
Use RMAN DUPLICATE to create the test database from production.
https://oracle-base.com/articles/11g/duplicate-database-using-rman-11gr2
You can duplicate from backups or duplicate from the active database.
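A bare-bones sketch of the active variant, assuming TNS aliases PROD and TESTDB and an auxiliary instance already started NOMOUNT (all names are placeholders; the article above covers the full setup):

    rman TARGET sys@PROD AUXILIARY sys@TESTDB

    # Inside RMAN: build TESTDB directly from the running production database
    DUPLICATE TARGET DATABASE TO testdb
      FROM ACTIVE DATABASE
      NOFILENAMECHECK;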
You can probably ask your database admin to export the tablespace to a new test machine that has the same Oracle version installed. If there are only a very few tables, then you can spool your tables out and use SQL*Loader to load them into a test database (you will need to manually create the structure of the tables in the test environment beforehand).
In both cases, you might want to scrub out the sensitive information as per your requirements and standards.
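If you go the spool/SQL*Loader route, it could look something like this for a single table (EMPLOYEES and its columns are placeholders):

    -- On production, spool the table out as CSV from SQL*Plus
    SET HEADING OFF FEEDBACK OFF PAGESIZE 0 LINESIZE 32767 TRIMSPOOL ON
    SPOOL employees.csv
    SELECT employee_id || ',' || last_name || ',' || salary FROM employees;
    SPOOL OFF

    # employees.ctl - SQL*Loader control file for the test side
    LOAD DATA
    INFILE 'employees.csv'
    INTO TABLE employees
    FIELDS TERMINATED BY ','
    (employee_id, last_name, salary)

    # Load into test (the table must already exist there)
    sqlldr userid=app_owner@TEST control=employees.ctl log=employees.log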
My colleague is running Oracle Database (11g) on AIX, and they would like to move this DB to RHEL. I already found Link. However, I would like to check whether someone has already done this migration or used any other good tools.
You have several options. As pointed out before, Oracle Data Pump is the easiest approach. It will lift you from any version >= 10g upwards (or even take you back down when you use the VERSION= parameter).
The caveat: the size of the database - and your downtime requirements.
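As a minimal sketch of the backwards direction, the VERSION= parameter makes the dump file readable by an older release (schema and directory names are placeholders):

    # Export from a newer source so that an 11.2 instance can import it
    expdp system schemas=APP_OWNER directory=DATA_PUMP_DIR \
          dumpfile=app_v112.dmp logfile=app_v112.log version=11.2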
In terms of larger databases, Transportable Tablespaces is the usual choice. It is more work, as you will have to rebuild meta information such as synonyms, views, PL/SQL, sequences etc. - and in your case you'll have to CONVERT the tablespaces, as you are coming from a big-endian platform and going to a little-endian one. DBMS_FILE_TRANSFER could assist you here, as it can restore and convert at the same time, whereas RMAN needs a 2-phase operation with staging space for it.
You can speed up transportable tablespaces with RMAN Incremental Backups to avoid most of the copy/convert time. And you can ease it with Full Transportable Export/Import (minimum source: 11.2.0.3 - minimum destination: 12.1.0.1) where Data Pump does the manual work of transportable tablespaces.
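A hedged sketch of what Full Transportable Export/Import could look like for this AIX-to-RHEL case (all names and paths are placeholders):

    # On the 11.2.0.3+ AIX source: one full transportable export
    expdp system full=y transportable=always version=12 \
          directory=DATA_PUMP_DIR dumpfile=ftex.dmp logfile=ftex_exp.log

    # Transfer the user tablespace datafiles (converting endianness with
    # RMAN CONVERT or DBMS_FILE_TRANSFER as discussed above), then on the
    # 12c RHEL destination:
    impdp system full=y directory=DATA_PUMP_DIR dumpfile=ftex.dmp \
          transport_datafiles='/u01/oradata/ORCL/users01.dbf'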
And of course there are other techniques, such as Create-Table-As-Select or Insert-Append-Select via database links.
Just see the big slide deck "Upgrade / Migrate / Consolidate to 12.2" for customer examples - and the "Migrate >230Tb in <24 hours" decks on my page: https://mikedietrichde.com/slides/
Cheers,
Mike
Is there some reason you can't just use Oracle Data Pump?
Create the database on RHEL, and make sure you use a compatible character set.
https://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_overview.htm
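To see which character set to match, you can query the data dictionary on the AIX source:

    -- Run on the source to see which character set the new
    -- RHEL database should be created with
    SELECT value
    FROM   nls_database_parameters
    WHERE  parameter = 'NLS_CHARACTERSET';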
We are developing a large data migration from an Oracle DB (12c) to another system with SSIS. The developers are working against a copy of the production database, but the problem is that, due to the complexity of the data transformation, we have to do things in stages by preprocessing data into intermediate helper tables, which are then used further downstream. The trouble is that all developers are using the same database and screw each other up by running things simultaneously. Does Oracle DB offer anything in terms of developer sandboxing? We could build a mechanism to handle this (e.g. have a dev ID in the helper tables, then query views that map to the dev), but I'd much rather use built-in functionality. Could I use Oracle Multitenant for this?
We ended up producing a master subset database of select schemas/tables through some fairly elaborate PL/SQL, then made several copies of this master schema so each dev has his/her own sandbox (as Alex suggested). We could have used Oracle Data Masking and Subsetting, but it's too expensive. Another option for creating the subset database would have been to use Jailer. I should note that we didn't have a need to mask any sensitive data.
Note: I would think this is a fairly common problem, so if new tools and solutions arise, please post them here as answers.
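For reference, the Multitenant route the original question asks about would look roughly like this, assuming the data lives in a pluggable database called PRODPDB (a placeholder) and your 12c patch level supports local cloning:

    -- Clone a sandbox PDB from production (on 12.1 the source PDB must be
    -- opened read-only first; 12.2 with local undo can hot clone)
    CREATE PLUGGABLE DATABASE devpdb_alex FROM prodpdb;
    ALTER PLUGGABLE DATABASE devpdb_alex OPEN;

    -- When the dev is done with the sandbox, throw it away
    ALTER PLUGGABLE DATABASE devpdb_alex CLOSE IMMEDIATE;
    DROP PLUGGABLE DATABASE devpdb_alex INCLUDING DATAFILES;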
I have a properly functioning, up-to-date 10g database locally that I don't want to mess with. I need to run some queries locally against a customer's database, which is a couple of versions behind our current software. I exported their full DB using expdp. The user is the same, and the structure is pretty much the same. What is the proper way of having both databases loaded at the same time?
If I have worded this funny, or am going about this in the wrong way, please let me know! Thanks!
Edit:
There is one main user, and another user for each component/application within the main app.
Use Import Data Pump (impdp) with the REMAP_SCHEMA option to load the exported schema into another schema in your existing database:
http://www.database.fi/2011/05/using-expdp-impdp-and-changing-schemas-with-remap_schema/
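A small sketch of the command, assuming the customer dump sits in the directory object DATA_PUMP_DIR and their schema is called APP (both placeholders):

    # Import the customer's schema under a new name alongside your own
    impdp system directory=DATA_PUMP_DIR dumpfile=customer_full.dmp \
          remap_schema=APP:APP_CUSTOMER logfile=customer_imp.log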
I have a requirement right now where my client's business people have populated a website with a bunch of data. They want the site to go live to production with the UAT data so that on launch day the site is not barren.
Now, the webservers and data centers are managed by a certain Big Blue friend of ours, and they refuse to give me a user account on the UAT database server, not even one with access restricted to only the tables my app owns. That situation can be left to another discussion.
So, originally I was simply going to connect to UAT using SQL Developer and run its nifty little INSERT statement export tool, which dumps the data from a table into a series of INSERT statements. Since I can't have access to UAT, I can't do that.
Is there a method by which I can literally hand my blue friends some PL/SQL code which will dump all the table data from specified tables to INSERT statements? Preferably to a file (instead of the console)? This way they can take those INSERT statements and execute them against UAT.
I just answered a similar question yesterday. It may not be exactly what you want (and it is still incomplete), but it probably has the information to get you started to complete the scripts yourself. Check it out.
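The core idea is small enough to sketch here: generate the INSERTs with DBMS_OUTPUT and spool them to a file from SQL*Plus. The table and columns below are placeholders, and a complete version would read ALL_TAB_COLUMNS and handle dates, NULLs and LOBs generically:

    SET SERVEROUTPUT ON SIZE UNLIMITED
    SPOOL inserts.sql
    BEGIN
      -- One INSERT statement per row; embedded quotes are doubled up
      FOR r IN (SELECT employee_id, last_name, salary FROM employees) LOOP
        DBMS_OUTPUT.put_line(
          'INSERT INTO employees (employee_id, last_name, salary) VALUES ('
          || r.employee_id || ', '
          || '''' || REPLACE(r.last_name, '''', '''''') || ''', '
          || NVL(TO_CHAR(r.salary), 'NULL') || ');');
      END LOOP;
      DBMS_OUTPUT.put_line('COMMIT;');
    END;
    /
    SPOOL OFF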
Let the Big Blue friend sort this out. If they don't give you access to the databases, then they should populate the production database. Give them a list of tables and let them export those from UAT and import them into production. Export/import or Data Pump is the standard for this kind of operation; you should not be forced to invent your own because of their lack of cooperation.
Have you considered exporting the data from your UAT DB and then importing it into your local one?