Migrating Oracle DB with DMS: uncommitted transactions

I'm trying to create a DMS migration task to migrate from an Oracle instance into an S3 bucket. The migration should be ongoing (CDC) and use the DMS Binary Reader.
It seems that uncommitted transactions are also migrated, and rolling them back changes nothing, so what ends up in S3 is not an accurate representation of the original DB.
Is there any way to fix this? I can't seem to find any reference to this problem anywhere.
Thanks
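
For reference, the source-endpoint side of such a setup looks roughly like the sketch below (Python/boto3; the endpoint name, credentials, and connection details are placeholders). The useLogMinerReader=N;useBfile=Y extra connection attributes are what switch DMS from LogMiner to Binary Reader.

    import boto3

    dms = boto3.client("dms", region_name="us-east-1")  # region is an assumption

    response = dms.create_endpoint(
        EndpointIdentifier="oracle-source",      # hypothetical endpoint name
        EndpointType="source",
        EngineName="oracle",
        Username="dms_user",                     # placeholder credentials
        Password="***",
        ServerName="oracle.example.com",
        Port=1521,
        DatabaseName="ORCL",
        # These two attributes select Binary Reader instead of LogMiner.
        ExtraConnectionAttributes="useLogMinerReader=N;useBfile=Y",
    )
    print(response["Endpoint"]["EndpointArn"])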

Related

Sybase to Oracle table Migration via Migration Wizard offline

How can I create a script of inserts for my Sybase to Oracle migration? The Migration Wizard only gives me the option to migrate procedures, triggers, and the like; there is no option for just tables. When I try to migrate tables offline and move the data, the datamove/ folder is empty. I also want to migrate only specific tables (the ones with long identifiers), because I was able to migrate the rest with Copy to Oracle.
I must also note that I do not want to upgrade to a newer version of Oracle. I am currently on ~12.1, so I need to limit identifier lengths.
How can I get the offline scripts for table inserts?
You (probably!) don't want INSERTs for offline migration scripts. If you're just running INSERTs, then the online method would probably suffice.
The point of the offline strategy is to extract the data from your Sybase instance to flat, delimited text files (using BCP), which can then be loaded into an Oracle database using SQL*Loader or external tables; that will be exponentially faster than running INSERT scripts.
Take a look at this whitepaper where I go into offline Sybase migrations in detail.
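Roughly, the offline flow looks like this (a Python sketch shelling out to the two tools; the table name, delimiter, and credentials are all placeholders):

    import subprocess

    TABLE = "mydb.dbo.customers"  # hypothetical Sybase table

    # Step 1: BCP the table out to a pipe-delimited, character-mode flat file.
    subprocess.run(
        ["bcp", TABLE, "out", "customers.dat",
         "-c", "-t", "|",                    # character mode, '|' terminator
         "-U", "sa", "-P", "secret", "-S", "SYBASE_SERVER"],
        check=True,
    )

    # Step 2: load the flat file into Oracle with SQL*Loader, direct path.
    # customers.ctl would contain, for example:
    #   LOAD DATA INFILE 'customers.dat'
    #   INTO TABLE customers
    #   FIELDS TERMINATED BY '|'
    #   (cust_id, cust_name, created_dt)
    subprocess.run(
        ["sqlldr", "scott/tiger@ORCL", "control=customers.ctl", "direct=true"],
        check=True,
    )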
You can also consider DCO-based Sybase-to-Oracle replication via Sybase Replication Server. That way, not only will all the data be moved, but DML updates will also be propagated online, making a live switchover of your system possible.

Which is the fastest way to replicate an Oracle database deployed in RDS?

For example: let's say I have two databases, DB1 and DB2. My requirement is to refresh the data from DB1 to DB2 every night. DB1 is the live database and DB2 is for non-business users doing data analysis.
My questions:
1) Which tool should I use for my requirement? I need a solution that is fast, since the database copy has to be done every day.
2) Does AWS have any tool to automate the backup and restore of the data?
There are a load of ways to do this, and the answer comes down to what storage you're using, whether the databases are on the same server, and ultimately the size of the database.
RMAN is more of a backup/recovery tool, but it's definitely a runner for cloning. If you're not sure what RMAN does, though, I wouldn't even start to implement it, as it's very tricky if you aren't comfortable with Oracle databases.
My recommendation is to just use Oracle Data Pump: export the schemas you need to a dump file, ship it over, and import them into the other database, making sure to overwrite/drop the existing schemas.
Other than doing a differential clone at the SAN level, this is probably the quickest and definitely the easiest way to get it done.
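As a sketch, the nightly job could be as simple as the following (Python shelling out to Data Pump; the schema name, credentials, and connect strings are placeholders, and on RDS the dump file would move between instances via the platform's file-transfer options rather than a plain copy):

    import subprocess

    SCHEMAS = "APP_OWNER"  # hypothetical schema to refresh

    # Export the schema from DB1 into a dump file in DATA_PUMP_DIR.
    subprocess.run(
        ["expdp", "system/secret@DB1",
         f"schemas={SCHEMAS}", "directory=DATA_PUMP_DIR",
         "dumpfile=nightly.dmp", "logfile=nightly_exp.log",
         "reuse_dumpfiles=yes"],
        check=True,
    )

    # ... ship nightly.dmp to the DB2 host (scp, S3, shared storage) ...

    # Import into DB2, replacing existing tables so DB2 matches DB1.
    subprocess.run(
        ["impdp", "system/secret@DB2",
         f"schemas={SCHEMAS}", "directory=DATA_PUMP_DIR",
         "dumpfile=nightly.dmp", "logfile=nightly_imp.log",
         "table_exists_action=replace"],
        check=True,
    )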

Oracle -> PostgreSQL log-based replication

(To be clear, I do not code myself.)
I am looking for a solution that would allow me to replicate data between a master Oracle 11g DB and a new PostgreSQL DB. These are two different applications, but they need to exchange data in real time. There are some trigger-based approaches, but there is a big concern that they could affect the master DB's performance, which we cannot accept.
I have also come across some log-based solutions, like HVR, but the cost is way too high for 500 MB of data to be replicated.
Has any of you had a similar issue and found a way to deal with it?
Any tips and help will be really appreciated, as I am quite short on time.
Oracle archive logs have a different format than Postgres write-ahead logs. Despite the general conceptual similarity of Oracle Streams, SQL Server log shipping, Postgres streaming replication, etc., transaction logs <> redo logs <> xlogs, and you can't use one vendor's logs to roll forward another vendor's engine.
Moreover, you can't even roll logs across different versions from the same DB vendor, because of differences in the binary format.
You can get something like logical replication with Postgres Logical Decoding, Oracle GoldenGate, heterogeneous database replication, or AWS DMS. But none of the above gives you "log-based replication" between different DB vendors.
You can use a product that specializes in change-data-capture-based data integration. Striim, GoldenGate, and Attunity all allow you to do CDC from Oracle. Striim also allows you to do CDC from PostgreSQL and write to Oracle as well.
https://striim.com
https://attunity.com
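To make the Postgres half of this concrete, here is a minimal sketch of logical decoding using psycopg2 and the built-in test_decoding plugin (the DSN and slot name are placeholders, and the server must run with wal_level = logical); capturing changes out of Oracle still requires one of the products above:

    import psycopg2

    conn = psycopg2.connect("dbname=appdb user=postgres")  # placeholder DSN
    conn.autocommit = True
    cur = conn.cursor()

    # Create a logical replication slot that decodes WAL into readable rows.
    cur.execute(
        "SELECT * FROM pg_create_logical_replication_slot(%s, 'test_decoding')",
        ("demo_slot",),
    )

    # ... DML happens on the database ...

    # Consume the decoded changes accumulated in the slot.
    cur.execute(
        "SELECT lsn, xid, data FROM pg_logical_slot_get_changes(%s, NULL, NULL)",
        ("demo_slot",),
    )
    for lsn, xid, data in cur.fetchall():
        print(lsn, xid, data)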

Oracle database migration from 11g to 12c

I need to do a database migration from Oracle 11g to 12c, but I cannot do a direct export-and-import kind of migration, since a lot of schema changes are going to happen. I already have the column mappings in a spreadsheet, with old columns and new columns and all the details such as data types, constraints, etc. There are new columns added to many tables, and the default values that should populate them are also known.
So what would be the best approach to do this migration?
There are several ways to do this. Start by getting a DBA involved.
To minimize production downtime, you could check whether building a logical standby database is feasible in your situation. In that case, make the target database a 12c one; that saves on upgrade time. The target database stays in sync with the source database at all times, which makes it very valuable. Clone the target database and use the clone to test the migration steps. If the migration fails, you can easily re-create a new clone on which to correct the migration process.
Working this way could even enable bi-directional replication, i.e. replication from the migrated database back to the source database, which could make it possible to revert to the original database in the unlikely event that things don't work as expected after production starts on the new database.
Again: add a DBA to the project; a good DBA can help minimize downtime and reduce risk.
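For orientation only, a very condensed sketch of the key statements in the logical standby route (several prerequisites are omitted, such as having a physical standby in place and stopping redo apply first; the database name is a placeholder, and each script really runs on its own host as SYSDBA):

    import subprocess

    # On the PRIMARY: build the LogMiner dictionary into the redo stream.
    PRIMARY_STEPS = """
    EXECUTE DBMS_LOGSTDBY.BUILD;
    """

    # On the physical STANDBY being converted to a logical standby:
    STANDBY_STEPS = """
    ALTER DATABASE RECOVER TO LOGICAL STANDBY newdb12c;
    ALTER DATABASE OPEN RESETLOGS;
    ALTER DATABASE START LOGICAL STANDBY APPLY IMMEDIATE;
    """

    def run_as_sysdba(script: str) -> None:
        # Feed a script to SQL*Plus; in reality each script is executed on
        # its respective host, not both on one machine.
        subprocess.run(["sqlplus", "-S", "/ as sysdba"],
                       input=script, text=True, check=True)

    run_as_sysdba(PRIMARY_STEPS)   # run on the primary host
    run_as_sysdba(STANDBY_STEPS)   # run on the standby host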

Is it possible to migrate projects on SONAR from the in-memory DB to an Oracle DB?

I am currently setting up SONAR with the in-memory DB for an evaluation. Should we decide to use the tool, I would like to then migrate the analysis results to an Oracle DB to use going forward. Is this possible?
No tool is provided to do such a migration, and I advise you not to try to do so.
However, be aware that you will have the possibility to replay the history of your analyses: you can check out old versions of your code and launch an analysis on each one, using the "sonar.projectDate" parameter to change the analysis date.
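A sketch of that replay loop (Python; the tags, dates, and analysis command are placeholders, and sonar.projectDate expects the yyyy-MM-dd format; note that versions must be replayed oldest-first, since an analysis dated before an existing one is rejected):

    import subprocess

    # (git tag, analysis date) pairs for the versions to replay, oldest first.
    VERSIONS = [
        ("v1.0", "2013-01-15"),
        ("v1.1", "2013-06-01"),
        ("v2.0", "2014-02-20"),
    ]

    for tag, date in VERSIONS:
        subprocess.run(["git", "checkout", tag], check=True)
        # Analyze this old version, back-dating it on the SONAR timeline.
        subprocess.run(["sonar-runner", f"-Dsonar.projectDate={date}"],
                       check=True)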
