Oracle - side-by-side schema update technology...is there any?

Is there any technology out there that will allow you to do side-by-side updates of production schemas?
The goal is to have zero down time when applying updates to a schema in production.
WebLogic 10 has a similar feature for its Java EE apps, whereby you deploy the new version of the app and new connections go to the new version, while existing connections continue against the old one. When all the old connections complete or time out, the old version is retired and the new one carries on...zero down time.
Is there something similar in Oracle?

Yes. There is the online redefinition package:
DBMS_Redefinition
But I doubt this will give you zero downtime; it doesn't account for every possible change to a schema, it only lets you make certain table changes. I think you need to define "zero" and how extensive the changes are that you want to make. Usually, if you change the database, you have to change your client as well. If you changed your database, how would the client switch automatically from the old proc signature to the new proc signature, instantaneously?
Databases don't work like apps. Either there is an FK from tableA to tableB or there isn't; it can't be absent for current connections and present only for new connections the way your application versions can coexist. Databases just aren't the same.
That being said, there is a rumor that Oracle is working on package versioning, so you could connect to a specific version of a package to make such a migration simpler. But again, that would work for packages, and DBMS_redef works for tables, but that's not the sum total of your database.
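For the table changes it does cover, the rough flow with DBMS_REDEFINITION is: create an interim table with the new shape, start the redefinition, copy the dependents, then finish. A minimal sketch, assuming a table MYAPP.ORDERS with a primary key and an interim table MYAPP.ORDERS_INTERIM already created with the new structure (all names here are illustrative):
-- Sketch only: MYAPP.ORDERS must have a primary key and
-- MYAPP.ORDERS_INTERIM must already exist with the new structure.
DECLARE
  l_errors PLS_INTEGER;
BEGIN
  -- Check the table can be redefined online using its primary key
  DBMS_REDEFINITION.CAN_REDEF_TABLE('MYAPP', 'ORDERS',
                                    DBMS_REDEFINITION.CONS_USE_PK);
  -- Start the online redefinition (loads the interim table)
  DBMS_REDEFINITION.START_REDEF_TABLE('MYAPP', 'ORDERS', 'ORDERS_INTERIM');
  -- Clone indexes, triggers, constraints and grants onto the interim table
  DBMS_REDEFINITION.COPY_TABLE_DEPENDENTS('MYAPP', 'ORDERS', 'ORDERS_INTERIM',
                                          num_errors => l_errors);
  -- Swap the tables; only a brief lock is needed at this point
  DBMS_REDEFINITION.FINISH_REDEF_TABLE('MYAPP', 'ORDERS', 'ORDERS_INTERIM');
END;
/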

Oracle released 11gR2 today; it has edition-based redefinition: http://download.oracle.com/docs/cd/E11882_01/server.112/e10881/chapter1.htm#NEWFTCH1
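Roughly, editions let old and new copies of the PL/SQL coexist, so sessions can be moved over gradually. A minimal sketch, with made-up schema, edition and procedure names:
-- Sketch only: names are illustrative.
ALTER USER myapp ENABLE EDITIONS;
CREATE EDITION release_2 AS CHILD OF ora$base;

-- Compile the new code while connected to the new edition
ALTER SESSION SET EDITION = release_2;
CREATE OR REPLACE PROCEDURE myapp.get_order_count (p_count OUT NUMBER) AS
BEGIN
  SELECT COUNT(*) INTO p_count FROM myapp.orders;  -- new implementation
END;
/

-- New sessions can then be pointed at release_2 (for example via
-- ALTER DATABASE DEFAULT EDITION = release_2) while existing sessions
-- keep running against ora$base until they disconnect.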

It depends on what you mean by, or include in, "schema".
If you want to add or drop an index, that can be done "in-flight", although it will require a lock which may halt activity for a time. In the latest Oracle versions, it doesn't need to hold the lock for the entire time it takes to build the index, just for a moment to lock in the change. If you have short-duration transactions it shouldn't be noticeable.
In some cases that applies to tables as well (e.g. adding a nullable column or a column with a default).
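For example, a quick sketch of an in-flight index build (table and column names are made up):
-- DML can continue while the index is built; only a short lock is
-- needed to publish it.
CREATE INDEX orders_status_idx ON orders (status) ONLINE;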
If you use PL/SQL (especially packages), things can be a little more complicated. Enhancements were mooted for 11gR1 to enable in-flight application upgrades, but they were pushed out and are now expected in 11gR2 (probably out in the first half of next year).
In the meantime, a workaround is a multi-schema solution. Say your data sits in one schema ("yellow") and your current application code runs in a "blue" schema; you load your new application code into a "green" schema. You switch your connections, one by one, from blue to green. Once all your connections are using green, you can retire blue until your next upgrade (when blue holds the new app and green is retired).
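One common way to wire that up, with purely illustrative names, is to keep only synonyms and PL/SQL in the blue/green schemas and point them at the shared data schema:
-- Sketch only: run for the "green" schema, pointing it at the "yellow" data.
CREATE OR REPLACE SYNONYM green.orders    FOR yellow.orders;
CREATE OR REPLACE SYNONYM green.customers FOR yellow.customers;
-- The new application code is then deployed into "green" and connections
-- are switched over one by one.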
If you have a genuine 24/7 system, you'll probably always have to stage some upgrades. For example, add a new column as optional, upgrade the application to set it, then make it mandatory (possibly with some data change script for pre-existing rows).
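A rough sketch of that staged approach (table and column names are made up):
-- Release 1: add the column as optional so old and new code both work
ALTER TABLE customers ADD (loyalty_tier VARCHAR2(10));

-- Once the upgraded application is setting it, backfill pre-existing rows
UPDATE customers SET loyalty_tier = 'STANDARD' WHERE loyalty_tier IS NULL;
COMMIT;

-- A later release can then make it mandatory
ALTER TABLE customers MODIFY (loyalty_tier NOT NULL);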

Related

How can I determine exactly how much downtime is needed during the scheduled maintenance window in Autonomous Database?

How can I determine exactly how much downtime is needed during the scheduled maintenance window in Autonomous Database? Also, how can I find out about what's included in a given patch?
As described in the doc, your database remains available during the maintenance window. Your existing database connections may get disconnected briefly; however, you can immediately reconnect and continue using your database. If your application uses Oracle Transparent Application Continuity (TAC), you avoid these brief disconnects altogether.
In order to see what bug fixes are delivered in these maintenance windows, you can run the following query:
-- To view patch information for all available patches
SELECT * FROM DBA_CLOUD_PATCH_INFO;
-- For patch ADBS-21.7.1.1
SELECT * FROM DBA_CLOUD_PATCH_INFO WHERE PATCH_VERSION = 'ADBS-21.7.1.1';
Disclaimer: I’m a Product Manager at Oracle.

Oracle Application Express - Backups

I have been tasked with creating some backups for some Oracle Apex apps (Application Express v4.1.1.00.23). The request is to back up both the applications & referenced db objects (not sure if this means just structure or structure & data).
On the one hand, I would have expected standard db backups to handle most or all of this but I'm very new to Apex so it's all a learning curve.
I'm currently exporting the application from apex and then exporting (using SQL Developer) all the database object dependencies that Apex gives me - although I see that the list doesn't include functions that are used for auth.
This seems a really clunky process that's very prone to mistakes (miss an object, save something to the wrong place, no guarantees of consistency etc).
Does Apex (my version!) offer something to do the job or is there something else I could be doing? I've had a good google but nothing has stood out.
UPDATE: I realise now that I should have included some extra info. I'm currently at a large organisation & I believe our db backups (which I guess/hope are done using rman) are done by a different department. I think the motivation for the request is so that we have some local, easily accessible backup so that if one of the developers messes something up we don't have to go through multiple layers of organisation (& undoubtedly a lot of time) to sort ourselves out. I suspect that some kind of source control would be a great starting point but I'm not sure how far I'll get with that idea - especially as we seem to have little in the way of autonomy over things like servers.
RMAN is the way to go to back up an Oracle database:
https://docs.oracle.com/database/121/BRADV/toc.htm
There is tons of material on the hows and whys online; just google "oracle rman" and you'll find what you need (the documentation should cover you as well, of course).
cheers
Standard DB backups will include everything you need.
The Apex applications I develop are static, meaning the end users make no changes to the Apex application, and there is no need to make a specific backup other than to store the original apex application .sql installation files in a safe place.
If you must, you can make an export of the database schemas the application uses. For example with the expdp utility.
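expdp is a command-line client; if you would rather drive the same schema export from inside the database, the DBMS_DATAPUMP API can be used instead. A rough sketch, assuming a directory object DATA_PUMP_DIR and an application schema called MYAPP (both illustrative):
DECLARE
  l_handle NUMBER;
  l_state  VARCHAR2(30);
BEGIN
  -- Schema-mode export job
  l_handle := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');
  -- Where the dump file goes (the directory object must already exist)
  DBMS_DATAPUMP.ADD_FILE(handle    => l_handle,
                         filename  => 'myapp_backup.dmp',
                         directory => 'DATA_PUMP_DIR');
  -- Restrict the export to the application schema
  DBMS_DATAPUMP.METADATA_FILTER(handle => l_handle,
                                name   => 'SCHEMA_EXPR',
                                value  => q'[IN ('MYAPP')]');
  DBMS_DATAPUMP.START_JOB(l_handle);
  DBMS_DATAPUMP.WAIT_FOR_JOB(l_handle, l_state);
END;
/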
In APEX you need to take two backups: one of the workspace,
and a second of your application.
Third, when using export/import from the database, it tends to lose the & character in procedures, so it's better to use RMAN and take a complete backup.
I have found this Oracle white paper Life Cycle Management with Oracle Application Express (Revision 2) which does what it says on the tin - including various strategies for exporting, backing up & managing 'lost application development'. It's a really good read and I'll be using it as a template for suggestions of how we can manage our process in future.

Using liquibase versioning table definitions, not change sets

I'd like to version only the latest table definitions in my repository (no change sets), and have Liquibase figure out which changes are needed when patching my databases. Please note that I have a very big database schema (1000+ tables) installed at hundreds of customer sites, each with a different version, and I really don't know which objects each version has.
How can I make a liquibase-based installer for my application, given my set of table definitions, and hundreds of databases with about 12 different versions of objects on each one?
To be more specific, I'd like liquibase to compare my table definitions with the production database, and emit the alter table statements required to make the database current with my latest version.
I could contribute code if necessary in order to get this done
Liquibase and tools like it (for example flyway) are primarily designed to support database migrations. A migration is where every change to the DB is tracked so that it can be replayed on target environments thereby keeping them in sync with development (although time-shifted). It's all about keeping your schema under revision control.
Your use case is a little different. If I understand correctly you're trying to retrofit Liquibase onto a series of environments that you are not 100% certain match your application's current schema?
I would only recommend migration tools like liquibase if you intend to use them going forwards. If all you want is a DB diff tool, I would suggest you look elsewhere.
To perform an initial sync, I would suggest you investigate the diffChangeLog command, coupled with the changeLogSync command, to initialize Liquibase on the target DB.
Comparing databases and generating SQL scripts using Liquibase

Oracle database migration from 11g to 12c

I need to do a database migration from Oracle 11g to 12c. But I cannot do a direct export-and-import kind of migration, since there are a lot of schema changes that are going to happen. I already have the column mappings in a spreadsheet, with old columns and new columns and all details such as data type, constraints, etc.
There are new columns added to many tables, and the default values that should be populated are also known.
So what should be the best approach to do this migration?
There are several ways to do this. Start by getting a DBA involved.
To minimize production downtime, you could check whether making a logical standby database is feasible in your situation. In that case, make the target database a 12c one; that saves on upgrade time. This target database is in sync with the source database at all times, which makes it very valuable. Clone the target database and use that clone to test the migration steps. If the migration fails, you can easily re-create a new clone on which to correct the migration process.
Working this way could even enable bi-directional replication: replication from the migrated database back to the source database, which could make it possible to revert to the original database in the unlikely event that, after production starts on the new database, things don't work as expected.
Start by adding a DBA to the project; a good DBA can help minimize downtime and reduce risk.

Migrating and Backing up Schemas (complex database structures)

Hey guys,
I need to figure out a way to back up and also migrate our Oracle database from our production schema to the dev schema and the other way around.
We have a bunch of config tables that drive how systems on our platform run, and when setting up new systems or doing maintenance, we need to update our config tables. We want to be able to work on the dev schemas and, after setting up a system/feature, migrate all those configs to the production schemas.
I thought of writing a procedure where we give the ID of the system (from the main table), and it would go through all the tables with a select nvl(..); if the row doesn't exist it would insert it, and if it does exist it would just run an update on that row.
This code will get very messy and complicated especially since the whole config schema is very complex and it might be hard to handle all the keys properly.
Another option I was looking at was triggers: when setting up a new system there would be a log of all the statements we ran while setting up/editing the system, and we would then run that log on our production schema.
I'm on a co-op term and have only been working with databases for 6 months, so I don't know that much, and any information/advice would be greatly appreciated.
(We use PL/SQL.)
What about using export / import (or datapump) to bring over the config tables?
Check out data comparison tools like this
Think TOAD has one built in. I'm sure there are others out there too.
It is common to have tables in a schema that are what we call "static data", i.e. the users don't change it because it controls how the application works.
Each change to config data should not be run ad-hoc in the target environment. Instead, you design and code your DML carefully in one or more scripts, which get tested in a dev environment, checked into change control, and can be re-run in any environment when required.
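For the insert-or-update logic described in the question, a single MERGE per config table keeps such a script idempotent, so it can be re-run safely in any environment. A rough sketch with made-up table and column names:
-- Re-runnable config change: inserts the row if missing, updates it if present.
MERGE INTO system_config tgt
USING (SELECT 42          AS system_id,
              'ALERTS_ON' AS config_key,
              'Y'         AS config_value
       FROM dual) src
ON (tgt.system_id = src.system_id AND tgt.config_key = src.config_key)
WHEN MATCHED THEN
  UPDATE SET tgt.config_value = src.config_value
WHEN NOT MATCHED THEN
  INSERT (system_id, config_key, config_value)
  VALUES (src.system_id, src.config_key, src.config_value);
COMMIT;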
