Oracle 11g Database Synchronization - oracle

I have a WPF application with Oracle 11gR2 as the back end. We need to enable our application to work in both online and offline (disconnected) modes. We are using Oracle Standard Edition (with a single instance) as the client database. I am using sequence numbers for primary key columns. Is there any way to sync my client and server databases without any issues in the sequence-number columns? Please note that we will restrict basic (master) data to being created only on the server.

There are a couple of approaches to take here.
1- Write the sync process to rebuild the server tables (on the client) each time with a CREATE TABLE ... AS SELECT (Oracle's equivalent of SELECT INTO). Once complete, RENAME the current table to a "temp" table, and RENAME the newly created table with the proper name. The sync process should DROP the temp table as one of its first steps. Finally, recreate the indexes and you should be good to go. (A sketch follows the list.)
2- Create a backup of the server-side database, write a shell script to copy it down and restore it on the client.
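A minimal sketch of option #1, assuming a database link named SERVER_DB back to the server and a CUSTOMERS table keyed on CUSTOMER_ID (all names hypothetical):

    DROP TABLE customers_temp;                        -- first step: drop last run's temp table
                                                      -- (fails harmlessly on the very first run)
    CREATE TABLE customers_new AS
      SELECT * FROM customers@server_db;              -- rebuild from the server copy

    ALTER TABLE customers RENAME TO customers_temp;   -- swap the current table out
    ALTER TABLE customers_new RENAME TO customers;    -- swap the fresh copy in

    CREATE UNIQUE INDEX customers_pk ON customers (customer_id);  -- recreate the indexes
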
Each of these options will preserve your sequence numbers. Which one you choose really depends on your skills. If you're more of a developer, you can make #1 work. If you've got some Oracle DBA skills you should be able to make #2 work.
Since you're on 11g, there might be a cleaner way to do this using Data Pump.
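For example, a hypothetical schema-level Data Pump round trip (the APP schema name is an assumption; DATA_PUMP_DIR is the default directory object): export on the server, ship the dump file to the client, and import it there.

    expdp system schemas=APP directory=DATA_PUMP_DIR dumpfile=app.dmp logfile=app_exp.log
    impdp system schemas=APP directory=DATA_PUMP_DIR dumpfile=app.dmp table_exists_action=replace logfile=app_imp.log

A schema-mode export carries the sequences along with the tables, which is what keeps the primary keys consistent between the two databases.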

Related

Sybase to Oracle table Migration via Migration Wizard offline

How can I create a script of inserts for my Sybase to Oracle migration? The Migration Wizard only gives me the option to migrate procedures, triggers, and such, but there is no option to select just tables. When I try to migrate tables offline and move data, the datamove/ folder is empty. I would also want to migrate only specific tables (ones with long identifiers), because I was able to migrate the rest with Copy to Oracle.
I must also note that I do not want to upgrade to a new version of Oracle. I am currently on ~12.1, so I need to limit the identifiers.
How can I get the offline scripts for table inserts?
You (probably!) don't want INSERTs for offline migration scripts. If you're just running INSERTs, then the online method would probably suffice.
The point of the Offline strategy is to take the data from your Sybase instance to flat, delimited text files (using BCP), which we can THEN use to load back into an Oracle Database using SQL*Loader (SQLLDR) or external tables, which will be orders of magnitude faster than running INSERT scripts.
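For illustration, here is a minimal sketch of the Oracle half of that pipeline, assuming the BCP step (something like bcp mydb.dbo.customers out customers.dat -c -t"|" -Usa -Smyserver) produced a pipe-delimited file, and that a directory object DATA_DIR points at its folder (all names hypothetical):

    CREATE TABLE customers_ext (
      customer_id   NUMBER,
      customer_name VARCHAR2(100)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY '|'
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('customers.dat')
    );

    -- one direct-path insert replaces thousands of single-row INSERT statements
    INSERT /*+ APPEND */ INTO customers SELECT * FROM customers_ext;
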
Take a look at this whitepaper where I go into offline Sybase migrations in detail.
You can consider DCO-based Sybase-to-Oracle replication via Sybase Replication Server. This way, not only will you have all the data moved, but you will also be able to have DML updates propagated online, which will let you switch systems live.

Read-only SAS view (on double click too)

We have SAS datasets to which many people have read and write access. Users often click those tables to open them, and the table gets locked. To circumvent this problem, I tried creating views in the same library, but if people double-click a view it opens the underlying table and locks it again.
One solution I am thinking of is to create the view in a new library with the access=readonly option.
Is there a read-only view option, so that someone can double-click and the view does not lock the table? Is it possible to create this view in the same library?
I also had to deal with this problem in an environment where we didn't have SAS/SHARE. My solution was to write a batch job that ran at regular intervals doing the following (a sketch follows the list):
Divert the log to a text file.
Attempt to lock the table using a lock statement.
Release the lock immediately if successful.
Parse the log file using a data step.
Extract the usernames of anyone locking the table.
Send an email to all users of the table notifying them that user X was locking it.
Updates to the table only took a fraction of a second each, so although it was possible to catch someone making a legitimate update (or prevent them from doing so), this was very unlikely.
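A minimal sketch of that batch job, assuming a dataset MYLIB.MASTER and a writable log path (both hypothetical). The exact wording of the lock failure message, and whether it names the locking user, depends on your environment, so the search string below is an assumption:

    %macro check_lock(ds=mylib.master, log=/tmp/lockcheck.log);
      proc printto log="&log" new; run;  /* 1. divert the log to a text file */

      lock &ds;                          /* 2. attempt to lock the table */
      %if &syslckrc = 0 %then %do;
        lock &ds clear;                  /* 3. release immediately if successful */
      %end;

      proc printto; run;                 /* restore the default log */

      data lockers;                      /* 4-5. parse the log, keeping the */
        infile "&log" truncover;         /*      line that reports the lock */
        input logline $char256.;
        if index(logline, 'A lock is not available') then output;
      run;
      /* 6. e-mailing the users in LOCKERS is left out; FILENAME EMAIL would do it */
    %mend check_lock;

    %check_lock()
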
I suggest the best way around this is to create a simple 'data viewer' web application. If you have a mid-tier and a stored process server then you are ready to go; it should only take a couple of hours if you have basic JavaScript/HTML knowledge.
I wrote a detailed guide for building web apps using SAS in this SGF paper, and a quick summary in this blog post.
The hard part will be convincing your users to use the web app instead of client tools for reading the data!
In the long term it is really best to avoid using SAS datasets and use an actual database instead.
You can create views for those datasets in the same library, but save them to a new SAS folder and give the users read-only access to the folder and views. And educate your users about SAS table locks so that they won't get put off if they see lock errors.
If you want users to able to write to those tables, then I recommend having a control framework or process in place.
Example Process:
Users have to submit their code or the data that they want to add / edit,
As an admin, you apply those changes in batches, once a day or week.
Example Control Framework (a sketch follows the list):
All tables should be edited / written to using Stored Processes,
Create Stored Processes that check the table lock before editing / writing to the tables,
Users will use the SP to write to the tables,
If two users run the same SP at the same time, the second SP to run will see the lock flag and print a message telling the user to run the SP again in a few minutes.
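A minimal sketch of such a lock-checking Stored Process body, using the return code the LOCK statement leaves in SYSLCKRC (dataset and macro names are hypothetical):

    %macro safe_append(base=, new=);
      lock &base;                        /* the lock attempt is the flag check */
      %if &syslckrc = 0 %then %do;
        proc append base=&base data=&new; run;
        lock &base clear;                /* release as soon as the write is done */
      %end;
      %else %do;
        %put NOTE: &base is locked - please run the SP again in a few minutes.;
      %end;
    %mend safe_append;

    %safe_append(base=mylib.master, new=work.staged_rows)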

Oracle database migration from 11g to 12c

I need to do a database migration from Oracle 11g to 12c. But I cannot do a direct export-and-import kind of migration, since a lot of schema changes are going to happen. I already have the column mappings in a spreadsheet, with old columns and new columns and all details such as data types, constraints, etc. There are new columns added to many tables, and the default values that should be populated are also known.
So what should be the best approach to do this migration?
There are several ways to do this. Start by getting a DBA involved.
To minimize production downtime, you could check whether building a logical standby database is feasible in your situation. In that case, make the target database a 12c one; that saves upgrade time. This target database is in sync with the source database at all times, which makes it very valuable. Clone the target database and use that clone to test the migration steps. If the migration fails, you can easily re-create a new clone on which to correct the migration process.
Working in this way could even enable bi-directional replication: replication from the migrated database back to the source database, which could make it possible to revert to the original database in the unlikely event that, after production starts on the new database, things don't work as expected.
Start by adding a DBA to the project; a good DBA can help minimize downtime and reduce risk.

Oracle database sandbox

Is it possible to create a database server "sandbox"? There would be a master server that contains real data, and a sandbox server that dispatches read requests to the master server when the sandbox does not have cached data. For a write request, the sandbox should create a local copy of the data and apply the changes to that copy without any impact on the master server.
You could build such a thing; a sketch follows the steps below.
Create a local Oracle database with a database link that points back to the master database.
Copy the DDL for every object you're interested in from the master database to your local database, renaming each table (e.g. EMP becomes EMP_LOC).
Create a view in the local database for each table that does a UNION ALL between the remote and local copies of the table.
Create an INSTEAD OF trigger on the local view that writes any changes only to the local table.
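A minimal sketch for one table, assuming a database link named MASTER_DB and SCOTT-style EMP columns (all names hypothetical). Note that the view below filters out master rows that have a local copy; with the plain UNION ALL of step 3, an edited row would appear twice:

    CREATE TABLE emp_loc AS
      SELECT * FROM emp@master_db WHERE 1 = 0;       -- copy the structure only

    CREATE OR REPLACE VIEW emp AS
      SELECT * FROM emp@master_db m                  -- remote rows ...
       WHERE NOT EXISTS
             (SELECT 1 FROM emp_loc l WHERE l.empno = m.empno)
      UNION ALL
      SELECT * FROM emp_loc;                         -- ... shadowed by local edits

    CREATE OR REPLACE TRIGGER emp_instead_of_upd
      INSTEAD OF UPDATE ON emp
      FOR EACH ROW
    BEGIN
      MERGE INTO emp_loc l                           -- changes land locally only
      USING (SELECT :new.empno AS empno FROM dual) s
         ON (l.empno = s.empno)
      WHEN MATCHED THEN
        UPDATE SET l.ename = :new.ename, l.sal = :new.sal
      WHEN NOT MATCHED THEN
        INSERT (empno, ename, sal)
        VALUES (:new.empno, :new.ename, :new.sal);
    END;
    /
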
While you could do such a thing, it's not obvious why you'd want to. It would be a fair amount of work to set up and maintain, and performance could get dodgy rather easily. And it's not obvious what problem this approach solves: it wouldn't replace the need to have isolated development, test, and staging environments, and I'm hard-pressed to come up with many use cases where this sort of "sandbox" would be preferable to one of those environments.
@Justin Cave gives a good approach. However, maybe you should consider creating a virtual machine and taking a snapshot of your PROD instance whenever you want to work on something new with the latest data.

Dump Oracle table(s) data to INSERT statements

I have a requirement right now where my client's business people have populated a website with a bunch of data. They want the site to go live to production with the UAT data so that on launch day the site is not barren.
Now, the web servers and data centers are managed by a certain Big Blue friend of ours, and they refuse to give me a user account on the UAT database server, not even one with access restricted to the tables my app owns. That situation can be left to another discussion.
So, originally I was simply going to connect to UAT using SQL Developer and run its nifty little INSERT statement export tool, which dumps the data from a table into a series of INSERT statements. Since I can't have access to UAT, I can't do that.
Is there a method by which I can literally hand my blue friends some PL/SQL code which will dump all the table data from specified tables to INSERT statements? Preferably to a file (instead of the console)? This way they can take those INSERT statements and execute them against UAT.
I just answered a similar question yesterday. It may not be exactly what you want (and it is still incomplete), but it probably has the information to get you started to complete the scripts yourself. Check it out.
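For a single known table, the core of such a script can be as small as the PL/SQL block below; run it in SQL*Plus with SPOOL to send the output to a file. The DEPT table and its columns are purely illustrative, and date, LOB, and other non-character columns would need explicit TO_CHAR handling:

    SET SERVEROUTPUT ON
    SPOOL dept_inserts.sql
    BEGIN
      FOR r IN (SELECT deptno, dname, loc FROM dept) LOOP
        DBMS_OUTPUT.put_line(
          'INSERT INTO dept (deptno, dname, loc) VALUES (' ||
          r.deptno || ', ''' ||
          REPLACE(r.dname, '''', '''''') || ''', ''' ||   -- double up embedded quotes
          REPLACE(r.loc,   '''', '''''') || ''');');
      END LOOP;
    END;
    /
    SPOOL OFF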
Let the Big Blue friend sort this out. If they don't give you access to the databases, then they should populate the production database. Give them a list of tables and let them export those from UAT and import them into production. Export/import or Data Pump is the standard for this kind of operation; you should not be forced to invent your own because of their lack of cooperation.
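For example, the hypothetical table-level Data Pump commands they would run (the table names and the default DATA_PUMP_DIR directory object are assumptions):

    expdp appowner tables=CUSTOMERS,ORDERS directory=DATA_PUMP_DIR dumpfile=uat_tables.dmp
    impdp appowner tables=CUSTOMERS,ORDERS directory=DATA_PUMP_DIR dumpfile=uat_tables.dmp table_exists_action=append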
Have you considered exporting the data from your UAT DB and then importing it into your local one?
