I have a requirement right now where my client's business people have populated a website with a bunch of data. They want the site to go live to production with the UAT data so that on launch day the site is not barren.
Now, the web servers and data centers are managed by a certain Big Blue friend of ours, and they refuse to give me a user account on the UAT database server, not even with access restricted to only the tables my app owns. That situation can be left to another discussion.
So, originally I was simply going to connect up to UAT using SQL Developer and run its nifty little INSERT statement export tool, which will dump the data from a table into a series of INSERT statements. Since I can't have access to UAT, I can't do that.
Is there a method by which I can literally hand my blue friends some PL/SQL code which will dump all the table data from specified tables to INSERT statements? Preferably to a file (instead of the console)? This way they can take those INSERT statements and execute them against UAT.
I just answered a similar question yesterday. It may not be exactly what you want (and it is still incomplete), but it probably has the information to get you started on completing the scripts yourself. Check it out.
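For illustration, here is a minimal sketch of the general technique (not the linked answer verbatim). It assumes a writable directory object (EXPORT_DIR here is made up and must be created by the DBA first) and handles only NUMBER, DATE, and character columns; no LOBs:

    CREATE OR REPLACE PROCEDURE dump_table_inserts (p_table IN VARCHAR2) AS
      l_file UTL_FILE.FILE_TYPE;
      l_cols VARCHAR2(32767);  -- comma-separated column list
      l_vals VARCHAR2(32767);  -- expressions rendering each value as a SQL literal
      l_expr VARCHAR2(32767);
      l_cur  SYS_REFCURSOR;
      l_line VARCHAR2(32767);
    BEGIN
      l_file := UTL_FILE.FOPEN('EXPORT_DIR', LOWER(p_table) || '_inserts.sql', 'w', 32767);
      FOR c IN (SELECT column_name, data_type
                  FROM user_tab_columns
                 WHERE table_name = UPPER(p_table)
                 ORDER BY column_id)
      LOOP
        -- Pick a literal-rendering expression per data type (#C# is a placeholder).
        CASE
          WHEN c.data_type = 'NUMBER' THEN
            l_expr := q'[NVL(TO_CHAR(#C#), 'NULL')]';
          WHEN c.data_type = 'DATE' THEN
            l_expr := q'[NVL2(#C#, 'TO_DATE(''' || TO_CHAR(#C#, 'YYYY-MM-DD HH24:MI:SS') || ''', ''YYYY-MM-DD HH24:MI:SS'')', 'NULL')]';
          ELSE  -- treat everything else as character data; embedded quotes get doubled
            l_expr := q'[NVL2(#C#, '''' || REPLACE(#C#, '''', '''''') || '''', 'NULL')]';
        END CASE;
        l_expr := REPLACE(l_expr, '#C#', c.column_name);
        l_cols := l_cols || CASE WHEN l_cols IS NOT NULL THEN ', ' END || c.column_name;
        l_vals := l_vals || CASE WHEN l_vals IS NOT NULL THEN q'[ || ', ' || ]' END || l_expr;
      END LOOP;
      -- Each fetched row is one complete, ready-to-run INSERT statement.
      OPEN l_cur FOR
        q'[SELECT 'INSERT INTO ]' || p_table || ' (' || l_cols ||
        q'[) VALUES (' || ]' || l_vals || q'[ || ');' FROM ]' || p_table;
      LOOP
        FETCH l_cur INTO l_line;
        EXIT WHEN l_cur%NOTFOUND;
        UTL_FILE.PUT_LINE(l_file, l_line);
      END LOOP;
      CLOSE l_cur;
      UTL_FILE.FCLOSE(l_file);
    END dump_table_inserts;
    /

Your DBAs would then run something like EXEC dump_table_inserts('MY_TABLE') and hand back the generated .sql file.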
Let your Big Blue friends sort this out. If they won't give you access to the databases, then they should populate the production database. Give them a list of tables and let them export those from UAT and import them into production. Export/import or Data Pump is the standard for this kind of operation; you should not be forced to invent your own because of their lack of cooperation.
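For example, a table-level Data Pump run might look like this (the schema, table, and file names are illustrative; DATA_PUMP_DIR exists by default on recent installs):

    # On the UAT host: export only the data for the listed tables.
    expdp system@UAT tables=APPOWNER.CONFIG,APPOWNER.CONTENT content=data_only \
          directory=DATA_PUMP_DIR dumpfile=uat_data.dmp logfile=uat_exp.log

    # On the production host: append the rows into the existing tables.
    impdp system@PROD directory=DATA_PUMP_DIR dumpfile=uat_data.dmp \
          table_exists_action=append logfile=prod_imp.log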
Have you considered exporting the data from your UAT db and then importing it into your local one?
I have been tasked with creating some backups for some Oracle Apex apps (Application Express v4.1.1.00.23). The request is to back up both the applications & referenced db objects (not sure if this means just structure or structure & data).
On the one hand, I would have expected standard db backups to handle most or all of this but I'm very new to Apex so it's all a learning curve.
I'm currently exporting the application from Apex and then exporting (using SQL Developer) all the database object dependencies that Apex gives me - although I see that the list doesn't include functions that are used for auth.
This seems a really clunky process that's very prone to mistakes (miss an object, save something to the wrong place, no guarantees of consistency etc).
Does Apex (my version!) offer something to do the job or is there something else I could be doing? I've had a good google but nothing has stood out.
UPDATE: I realise now that I should have included some extra info. I'm currently at a large organisation & I believe our db backups (which I guess/hope are done using rman) are done by a different department. I think the motivation for the request is so that we have some local, easily accessible backup so that if one of the developers messes something up we don't have to go through multiple layers of organisation (& undoubtedly a lot of time) to sort ourselves out. I suspect that some kind of source control would be a great starting point but I'm not sure how far I'll get with that idea - especially as we seem to have little in the way of autonomy over things like servers.
RMAN is the way to go to back up an Oracle database:
https://docs.oracle.com/database/121/BRADV/toc.htm
There is a ton of material on the hows and whys online; just google "oracle rman" and you'll find what you need (the documentation should cover you as well, of course).
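At its simplest (assuming the database runs in ARCHIVELOG mode), a full backup is only a couple of commands:

    RMAN> CONNECT TARGET /
    RMAN> BACKUP DATABASE PLUS ARCHIVELOG;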
cheers
Standard DB backups will include everything you need.
The Apex applications I develop are static, meaning the end users make no changes to the Apex application, and there is no need to make a specific backup other than to store the original apex application .sql installation files in a safe place.
If you must, you can make an export of the database schemas the application uses. For example with the expdp utility.
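For instance, with an illustrative schema name (DATA_PUMP_DIR is a directory object that exists by default on recent versions):

    expdp system schemas=APP_OWNER directory=DATA_PUMP_DIR \
          dumpfile=app_owner.dmp logfile=app_owner_exp.log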
In Apex you need to take two backups: first the workspace, and second your application. Third, note that when using export/import from the database, it tends to lose the & character in procedures (likely because SQL*Plus treats & as a substitution variable), so it is better to use RMAN and take a complete backup.
I have found this Oracle white paper Life Cycle Management with Oracle Application Express (Revision 2) which does what it says on the tin - including various strategies for exporting, backing up & managing 'lost application development'. It's a really good read and I'll be using it as a template for suggestions of how we can manage our process in future.
Using Toad for Oracle 10.6 with the DB Admin Add-in. For our migration process Dev to QA to Prod, we are starting to use the Schema Compare Module to generate the Sync Script DDL. After execution, we want to store the Sync Scripts for historical purposes. Due to policy, we are unable to copy these anywhere much. Even the Windows server where Toad runs is write-restricted.
I am thinking I could create a table with a CLOB column and store the scripts there, unless you folks tell me that this is a Really Bad Idea. I am looking for any tips on things like handling embedded special characters, or any other pitfalls that I may encounter.
Thanks,
JimR
Doing the best with what you have is a good thing. If management won't let you have version control, then jamming it in the database is better than nothing at all. The important thing is to create a process, document it, and follow it for deployments.
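A minimal sketch of such a table, with made-up names. Passing the script text as a bind variable, rather than splicing it into a SQL literal, sidesteps most quoting and special-character pitfalls:

    CREATE TABLE sync_script_history (
      script_id    NUMBER        PRIMARY KEY,
      script_name  VARCHAR2(200) NOT NULL,
      applied_on   DATE          DEFAULT SYSDATE NOT NULL,
      applied_by   VARCHAR2(30)  DEFAULT USER NOT NULL,
      script_text  CLOB
    );
    CREATE SEQUENCE sync_script_seq;

    -- From PL/SQL the CLOB travels as a bind, so its content never needs
    -- quoting or escaping:
    DECLARE
      l_script CLOB := 'ALTER TABLE t ADD (c NUMBER);';  -- the sync script text
    BEGIN
      INSERT INTO sync_script_history (script_id, script_name, script_text)
      VALUES (sync_script_seq.NEXTVAL, 'qa_sync_001.sql', l_script);
      COMMIT;
    END;
    /

If you do load scripts through SQL*Plus instead, SET DEFINE OFF stops & from being treated as a substitution variable.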
As for using schema differences to deploy, take a look at:
http://thedailywtf.com/Articles/Database-Changes-Done-Right.aspx
I have a properly functioning, up-to-date 10g database locally that I don't want to mess with. I need to do some queries locally on a customer's database, which is a couple of versions behind our current software. I have exported their full db using expdp. The user is the same, and the structure is pretty much the same. What is the proper way of having both databases loaded at the same time?
If I have worded this funny, or am going about this in the wrong way, please let me know! Thanks!
Edit:
There is one main user, and another user for each component/application within the main app.
Use Import Data Pump (impdp) with the REMAP_SCHEMA option to load the exported schema into another schema in your existing database:
http://www.database.fi/2011/05/using-expdp-impdp-and-changing-schemas-with-remap_schema/
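For example (schema names are illustrative; repeat REMAP_SCHEMA for each of the component users your edit mentions):

    impdp system directory=DATA_PUMP_DIR dumpfile=customer_full.dmp logfile=customer_imp.log \
          remap_schema=APPUSER:APPUSER_CUST remap_schema=COMP1:COMP1_CUST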
I am working for a company that has two Oracle databases; let's call them LIVE and TEST. An export is performed every night to take a snapshot of LIVE for each day. TEST is then dropped and recreated using existing table creation scripts, with the import finally putting the exported data from LIVE into the new TEST environment.
My questions are:
Is this really the best way to do this?
What better way is there?
Any URLs demonstrating these approaches would be great.
Instead of import/export, use Data Pump (see the sketch after this list).
Check out Oracle GoldenGate.
Check out Oracle Streams.
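A sketch of what the nightly Data Pump refresh might look like (schema and file names are illustrative):

    # On LIVE: export the application schema.
    expdp system@LIVE schemas=APP directory=DATA_PUMP_DIR \
          dumpfile=app_live.dmp logfile=app_exp.log

    # On TEST: drop and reload the tables straight from the dump,
    # which replaces the separate drop/recreate scripts.
    impdp system@TEST schemas=APP directory=DATA_PUMP_DIR \
          dumpfile=app_live.dmp table_exists_action=replace logfile=app_imp.log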
If you are using Enterprise Edition then you can look into transportable tablespaces as well, which have the advantage of exactly preserving the physical state of the data files so performance testing is more realistic.
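A rough outline of a transportable-tablespace move (the tablespace name and paths are made up, and the platforms must be compatible):

    # 1. On LIVE, in SQL*Plus: ALTER TABLESPACE app_data READ ONLY;
    # 2. Export the tablespace metadata:
    expdp system@LIVE directory=DATA_PUMP_DIR dumpfile=tts.dmp \
          transport_tablespaces=APP_DATA
    # 3. Copy tts.dmp and the APP_DATA datafiles to the TEST host, then plug in:
    impdp system@TEST directory=DATA_PUMP_DIR dumpfile=tts.dmp \
          transport_datafiles='/u01/oradata/TEST/app_data01.dbf'
    # 4. Back on LIVE, in SQL*Plus: ALTER TABLESPACE app_data READ WRITE;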
Hey guys,
I need to figure out a way to back up and also migrate our Oracle database from our production schema to the dev schema and the other way around.
We have a bunch of config tables that drive how systems on our platform run, and when setting up new systems or doing maintenance, we need to update our config tables. We want to be able to work on the dev schemas and, after setting up a system/feature, migrate all those configs to the production schemas.
I thought of writing a procedure where we give it the ID of the system (from the main table); it would go through all the tables and, for each row, insert it if it doesn't exist or update it if it does. This code will get very messy and complicated, especially since the whole config schema is very complex and it might be hard to handle all the keys properly.
Another option I was looking at was triggers: when setting up a new system, there would be a log of all the statements we ran while setting up/editing it, and then we would run that log against our production schema.
I'm on a co-op term and have only been working with databases for 6 months, so I don't know that much; any information/advice would be greatly appreciated.
(We use PL/SQL.)
What about using export / import (or datapump) to bring over the config tables?
Check out data comparison tools like this
Think TOAD has one built in. I'm sure there are others out there too.
It is common to have tables in a schema that are what we call "static data", i.e. the users don't change it because it controls how the application works.
Each change to config data should not be run ad-hoc in the target environment. Instead, you design and code your DML carefully in one or more scripts, which get tested in a dev environment, checked into change control, and can be re-run in any environment when required.
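For example, the insert-or-update logic the question describes collapses into one re-runnable MERGE per config table (the schema/table/column names here are made up):

    MERGE INTO prod_owner.system_config dst
    USING (SELECT system_id, param_name, param_value
             FROM dev_owner.system_config
            WHERE system_id = :p_system_id) src
       ON (dst.system_id = src.system_id AND dst.param_name = src.param_name)
     WHEN MATCHED THEN
       UPDATE SET dst.param_value = src.param_value
     WHEN NOT MATCHED THEN
       INSERT (system_id, param_name, param_value)
       VALUES (src.system_id, src.param_name, src.param_value);

One MERGE like this per table is usually far easier to keep correct than hand-rolled exists/insert/update logic, and the same script can be tested in dev and re-run in any environment.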