Cloning an Oracle Database Schema

I have an Oracle 12c instance with a schema owned by the user 'wadmin'; this instance has tables, views, data, triggers, sequences, etc.
For quick spin-up of Docker images, I need to clone the DB schema as fast as possible, so that I can create another user, 'wadmin1', link it to a new Docker container, and start my testing.
Are there any CLI tools for this? Does Oracle provide any options?

I do not know if this is exactly what you are looking for, but you can export your Oracle schema using the Oracle Data Pump tool. This involves storing the exported schema in an Oracle directory. While exporting the schema to a file you can transform the schema name, omit unnecessary tables or data, etc. The exported files can later be imported into a new database instance. You can find more information about Oracle Data Pump here: https://oracle-base.com/articles/10g/oracle-data-pump-10g#SchemaExpImp.
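For the schema-clone case in the question, a minimal sketch might look like this (assuming a directory object named DATA_PUMP_DIR already exists on the server and the connecting user has the required privileges; user names and passwords are placeholders):

# export the whole 'wadmin' schema to a dump file on the database server
expdp system/password SCHEMAS=wadmin DIRECTORY=DATA_PUMP_DIR DUMPFILE=wadmin.dmp LOGFILE=wadmin_exp.log

# re-import it under the new name 'wadmin1'
impdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=wadmin.dmp REMAP_SCHEMA=wadmin:wadmin1 LOGFILE=wadmin_imp.log

REMAP_SCHEMA is what performs the rename; if the dump contains the user definition, impdp can create the target user for you, although you will then have to set its password.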
Alternatively, you can keep the scripts that create the database in a Git repository and integrate your builds with a tool called Flyway (https://flywaydb.org/), which can be used to automate database schema creation. This is also really convenient from a source control point of view: all changes to the schema go through pull requests.
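As a sketch of the Flyway side, assuming the command-line client with its default conventions (connection details are placeholders):

# versioned migrations live in the sql/ directory by default,
# named V<version>__<description>.sql, e.g. sql/V1__create_tables.sql
flyway -url=jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1 -user=wadmin1 -password=secret migrate

Each new migration file committed to the repository is applied by the next 'flyway migrate' run, which is what makes it fit naturally into a CI pipeline.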
In my team we use Oracle Data Pump when we want to recreate the database together with the data; Flyway is used as a part of our continuous integration.

Related

Oracle DB Export does not preserve order or dependencies

I'm trying to export an Oracle DB using Oracle SQL Developer; it has tables, sequences, views, packages, etc. with dependencies on each other.
When I use Tools -> Database Export and select all DDL options, unfortunately the exported SQL file does not preserve the order, i.e. the fact that some DB objects should be created before others.
Is there a way to make the DB export utility preserve object dependencies and order? Or is there another tool you use for this task?
Thank you
Normally expdp does a pretty good job. Problems arise when there are dependencies on objects/users that are not part of the dump. This is because the counterpart, impdp, does not add grants on objects that are not created by impdp. I call that the 'not created by me' syndrome that impdp has.
If you have no external dependencies (external meaning to schemas that are not part of the dump), expdp/impdp do a good job for you. You might not be able to use them if you do not have access to the database server, since expdp writes its files on the database server.
If you happen to have access to a database server that is able to connect to the original database, you could pull the data over into your local database using a database link.
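A rough sketch of that approach, with hypothetical names (the TNS alias in the USING clause must resolve on the local server):

-- run on the local database, connected as the receiving user
CREATE DATABASE LINK src_link CONNECT TO wadmin IDENTIFIED BY secret USING 'SRCDB';

-- pull a table's structure and data across the link
CREATE TABLE employees AS SELECT * FROM employees@src_link;

Note that CREATE TABLE ... AS SELECT copies only the column definitions and data; indexes, constraints, triggers, and grants still have to be created separately.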

Sybase to Oracle table migration via Migration Wizard offline

How can I create a script of inserts for my Sybase to Oracle migration? The Migration Wizard only gives me the option to migrate procedures, triggers, and such, but there is no option to select just tables. When I try to migrate tables offline and move data, the datamove/ folder is empty. I would also like to migrate only specific tables (ones with long identifiers), because I was able to migrate the rest with Copy to Oracle.
I must also note that I do not want to upgrade to a new version of Oracle. I'm currently on ~12.1, so I need to limit the identifiers.
How can I get the offline scripts for table inserts?
You (probably!) don't want INSERTs for offline migration scripts. If you're just running INSERTs, then the online method would probably suffice.
The point of the Offline strategy is to take the data from your Sybase instance to flat, delimited text files (using BCP), which we can THEN use to load back into an Oracle Database using SQL*Loader or external tables, which will be EXPONENTIALLY faster than using INSERT scripts.
Take a look at this whitepaper where I go into offline Sybase migrations in detail.
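As a rough illustration of that flow, with hypothetical table, server, and file names (the field delimiter must match on both sides):

# on the Sybase side: dump the table to a pipe-delimited flat file
bcp mydb.dbo.employees out employees.dat -c -t"|" -U sa -S SYBSRV

# employees.ctl -- SQL*Loader control file on the Oracle side
LOAD DATA
INFILE 'employees.dat'
INTO TABLE employees
FIELDS TERMINATED BY '|'
(emp_id, emp_name, hire_date DATE "YYYY-MM-DD")

# load the flat file into Oracle
sqlldr wadmin/secret control=employees.ctl log=employees.log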
You can also consider DCO-based Sybase-to-Oracle replication via the Sybase Rep Server. This way, not only will you have all the data moved, but you will also be able to have DML updates propagated online, which will let you switch your system over live.

Synchronize Oracle DDL changes

I am using Oracle with SQL Developer and have two remote schemas: schema A and schema B. I need to update schema B according to schema A. Currently I am using SQL Developer's Database Copy feature to manually copy any DDL changes, such as a new table or a new constraint added to schema A. Is there a way I can automate the process, i.e. have schema B updated automatically every time a DDL change occurs in A? I went through some replication tools, but they had quite complex procedures and were not free.

Transferring the project to another server, with all the data, in Oracle APEX

I want to migrate the project to another server. I exported the project and generated scripts for all the tables, but I can't migrate these tables. Can someone help me with this?
Based on your description, I'd say your best bet to migrate any custom schemas is to use Data Pump. Data Pump is made up of three distinct components. They are the command-line clients, expdp and impdp; the DBMS_DATAPUMP PL/SQL package (also known as the Data Pump API); and the DBMS_METADATA PL/SQL package (also known as the Metadata API).
An example export would look like:
expdp hr TABLES=employees DUMPFILE=employees.dmp
That would generate a dump file that you could move to the destination database server (to a location that a database directory object maps to).
Then an example import would look like:
impdp hr DIRECTORY=dpump_dir1 DUMPFILE=employees.dmp TABLES=employees
Of course, there are many more options than that. Here's the official doc:
https://docs.oracle.com/en/database/oracle/oracle-database/18/sutil/index.html
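For completeness, the same export can also be driven from PL/SQL through the DBMS_DATAPUMP package mentioned above. A minimal sketch, again assuming the DATA_PUMP_DIR directory object:

DECLARE
  h  NUMBER;
  js VARCHAR2(30);
BEGIN
  -- open a schema-mode export job
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');
  -- write the dump file to the DATA_PUMP_DIR directory object
  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'hr.dmp', directory => 'DATA_PUMP_DIR');
  -- restrict the job to the HR schema
  DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'SCHEMA_EXPR', value => 'IN (''HR'')');
  DBMS_DATAPUMP.START_JOB(h);
  -- block until the job finishes
  DBMS_DATAPUMP.WAIT_FOR_JOB(handle => h, job_state => js);
END;
/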
Also, if you want to move to Oracle's new Always Free tier, then Adrian Png provides a nice overview that also touches on some APEX-related topics here:
https://fuzziebrain.com/content/id/1920/

How to load an H2 database into memory?

I have written a set of unit tests using H2 in embedded mode. Whatever changes the tests make to the DB stay there.
I know that the recommended approach is to create a blank in-memory database and create the schema when opening the connection.
However, I am looking for an alternative approach. I would like to either:
- initialize an in-memory database from an embedded database file, or
- use the embedded DB in a way that all changes are discarded as soon as the connection is closed.
How can I achieve this?
What I do in cases similar to this is write an SQL script that creates the database and populates the tables. Then the application applies it as a database migration using Flyway DB.
Another possibility is to create the database and load the tables from CSV files. Yet another would be to create the database with a different application and use the SCRIPT command to create a backup file; your main application would then run the RUNSCRIPT command to restore the database.
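A minimal sketch of the SCRIPT/RUNSCRIPT route (the file name is a placeholder):

-- in the populated on-disk database: dump schema and data to a script file
SCRIPT TO 'backup.sql';

-- in the fresh in-memory database: replay it
RUNSCRIPT FROM 'backup.sql';

H2 can also run such a script automatically when a connection is opened, via the INIT parameter of the JDBC URL, e.g. jdbc:h2:mem:test;INIT=RUNSCRIPT FROM 'backup.sql' (often combined with DB_CLOSE_DELAY=-1 so the in-memory database survives between connections).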
I use SQL scripts that create tables and other objects and/or populate them, and run these scripts at the beginning of the application.
One could also create a copy of the populated on-disk DB, package it into a ZIP/JAR archive, and open it read-only, to be used to recreate and populate the in-memory DB.
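For that last option, H2's zip file system can open such an archive directly in read-only mode, e.g. jdbc:h2:zip:~/db.zip!/test (assuming the archive contains a database named 'test'). Since a zipped database cannot be written to, tests that modify data would first copy its contents into an in-memory database, for example with the SCRIPT/RUNSCRIPT commands shown above.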
