Oracle DB to PostgreSQL conversion using ora2pg

I am trying to migrate my Oracle database to PostgreSQL using the Ora2Pg tool.
I exported the DDL file successfully, but when I try to import it into the PostgreSQL server I get errors like the one below.
There is a check constraint in Oracle with an IS JSON condition; when I exported it with Ora2Pg it was generated as
ALTER TABLE Temp_table ADD CONSTRAINT ensure_json1 CHECK (rpdata IS JSON);
When I try to execute the same statement on the PostgreSQL server, I get "syntax error at or near JSON".

You don't need this in Postgres.
Postgres has a native JSON datatype that validates JSON automatically. In Oracle you need that check constraint to "turn" a CLOB into a JSON column (without it, the values aren't validated and certain JSON operations don't work).
Just remove that constraint from your Postgres script (assuming the column is indeed defined as json or, ideally, jsonb).
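For illustration, a minimal sketch (the table and column names mirror the question; everything else is assumed): with a jsonb column the type itself rejects invalid JSON, so no extra constraint is needed.

CREATE TABLE temp_table (rpdata jsonb);

-- accepted: valid JSON
INSERT INTO temp_table (rpdata) VALUES ('{"name": "test"}');

-- rejected by the type itself with "invalid input syntax for type json"
INSERT INTO temp_table (rpdata) VALUES ('not valid json');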

Related

Export Oracle database to SQL with CLI tools

The goal is to export an existing test Oracle database to SQL so that it can be run in a test pipeline using a Docker image's /container-entrypoint-initdb.d bootstrap. Currently we mount an oradata export, but it's opaque what the actual data is, hence the desire to use SQL so that it can be maintained via git.
I'm aware you can use the GUI Oracle SQL Developer Export Wizard to export a database to SQL.
But how can it be done via command-line tools? I've tried many things, like:
impdp test/test@XEPDB1 dumpfile=export.dmp directory=DUMP_DIR sqlfile=ddl.sql
The best I can achieve is exporting the schema, but it is missing the actual data, e.g.
INSERT INTO currencies_countries (currency_id, country_id) VALUES ('EUR', 'CYP');
You have the exp utility, which exports to a dump file that can be imported into another Oracle DB using imp. More recent are expdp and impdp.
See https://oracle-base.com/articles/10g/oracle-data-pump-10g for instance.
What do you mean by "export to SQL"?
The SQL Developer client tool can export a table as SQL inserts, for instance. But doing that table by table would be quite an effort. Moreover, if you have millions of rows, those row-by-row inserts won't perform well.
It is better to write to tab-separated text files, which, in case of need, can be imported into an Oracle DB using sqlldr, or imported into some other kind of database. Still, tab-separated files won't work well for tables with CLOB, XMLTYPE or object-type columns.
You could write to a text file by spooling sqlplus output to a file, with set linesize long enough to fit the column lengths and set pagesize 0 (no pages), as sketched below.
Or you could write to a file in an Oracle directory via a PL/SQL program that uses utl_file.
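A minimal SQL*Plus spool sketch along those lines (the table and columns come from the question's example; the output path and linesize are assumptions):

SET PAGESIZE 0
SET LINESIZE 1000
SET HEADING OFF
SET FEEDBACK OFF
SET TRIMSPOOL ON
SPOOL /tmp/currencies_countries.tsv
SELECT currency_id || CHR(9) || country_id FROM currencies_countries;
SPOOL OFF

Each spooled line is one tab-separated row, ready for sqlldr on the target side.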

How to obtain BLOB data for insertion in Oracle?

Hello friends, I have a problem with the BLOB data type. I want to migrate some data from one DB to another DB, but I have not been able to migrate some tables that have BLOB columns. What I have tried is to export a single record in the following way.
First I select the record I want to export to my other DB:
select TEMPLATE_DOCUMENT_ID, blob_file from example_table where template_document_id = 32;
Then I export the result to obtain the insert statement.
I configure the export as follows.
When I do this I get a script with the data of the record that I want to migrate.
If I run that script it gives me the following error:
Error report -
ORA-01465: invalid hex number
Do you have any idea how I could get the correct data to make my insert?
NOTE: MIGRATION IS DONE FROM ONE ORACLE DATABASE TO ANOTHER ORACLE DATABASE.
Obviously the source database is Oracle. You did not mention what the target database is. In case it is Oracle as well, I would suggest using the Oracle Data Pump tools (expdp/impdp). The documentation is here: https://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_overview.htm
In case you need it (I use it quite often myself), the VIEWS_AS_TABLES option of the tool allows exporting just a subset of the data.
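For example, a hedged sketch of that approach on Oracle 12c or later (view, credential and directory names here are placeholders):

CREATE VIEW doc_32_v AS
  SELECT template_document_id, blob_file
  FROM example_table
  WHERE template_document_id = 32;

expdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=doc_32.dmp VIEWS_AS_TABLES=doc_32_v
impdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=doc_32.dmp

The view is unloaded as if it were a table, and the BLOB contents travel in binary form inside the dump file rather than as literals in a SQL script (the likely source of the ORA-01465 error).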

Table importation fails with error code ORA-31693

I've been receiving backups from an Oracle database into my Oracle database for 2 years now. My company is running version 10.2.0.1.0 and we are receiving the exports from version 12.1.0.2.0. They are using expdp and I'm using impdp. I added a new column to my database using this script:
ALTER TABLE CONTAINERS
ADD ("SHELL" NUMBER(14, 6) DEFAULT 0 );
After running the above on both databases, the table in question now fails to import whenever they send me an export. I receive the following error.
ORA-31693: Table data object "PAS"."CONTAINERS" failed to load/unload and is being skipped due to error:
ORA-02354: error in exporting/importing data
ORA-02373: Error parsing insert statement for table "PAS"."CONTAINERS".
ORA-00904: "SYS_NC00067$": invalid identifier
This error has been going on for about two weeks. I have tried to resolve the problem in multiple ways; this is my last resort, as it were.
Any help is greatly appreciated.
Did you try to track down SYS_NC00067$? It looks like a system-assigned column name. Such columns sometimes appear when you add a function-based index. Did you create a function-based index on SHELL?
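For instance, a hedged sketch of how that could be checked on the source database (the owner and table names are taken from the error message; the dictionary views are standard):

-- look for hidden, system-generated columns on the table
SELECT column_name, data_type, hidden_column
FROM dba_tab_cols
WHERE owner = 'PAS' AND table_name = 'CONTAINERS' AND column_name LIKE 'SYS_NC%';

-- and for function-based indexes that would create such a column
SELECT index_name, index_type
FROM dba_indexes
WHERE table_owner = 'PAS' AND table_name = 'CONTAINERS' AND index_type LIKE 'FUNCTION-BASED%';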

Liquibase column data types from generateChangeLog

I have two databases, one on DB2 and one on Oracle. I've generated a change log file via the generateChangeLog command. It produced a correct XML file, but only for the Oracle database. I invoked the command against the Oracle database, and the result contained column data types like NUMBER(*,0), which are not valid on DB2. How can I run generateChangeLog with unified data types in Liquibase?
Does such a list of data types exist in Liquibase? Which types are, let's say, versatile across all databases?
Reverse engineering an existing DB schema into a Liquibase XML file always creates the DBMS-specific data types. You will have to edit the generated XML file to use JDBC types.
The supported "cross-platform" types are documented in the manual:
http://www.liquibase.org/documentation/column.html
To help make scripts database-independent, the following “generic” data types will be converted to the correct database implementation:
BOOLEAN
CURRENCY
UUID
CLOB
BLOB
DATE
DATETIME
TIME
BIGINT
Specifying a java.sql.Types.* type will also be converted to the correct type. If needed, precision can be included. Here are some examples:
java.sql.Types.TIMESTAMP
java.sql.Types.VARCHAR(255)
From my experience the first list is missing INTEGER and DECIMAL, which can also be used without problems (at least for Oracle and Postgres - I don't have DB2 around to test it).
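As an illustration, a small hand-edited changeSet that uses only the generic/JDBC types listed above (the table, columns and ids are made up for the sketch):

<changeSet id="1" author="example">
    <createTable tableName="accounts">
        <column name="id" type="BIGINT"/>
        <column name="name" type="java.sql.Types.VARCHAR(255)"/>
        <column name="balance" type="CURRENCY"/>
        <column name="created_on" type="DATETIME"/>
        <column name="active" type="BOOLEAN"/>
    </createTable>
</changeSet>

Liquibase translates each of these into the matching native type on Oracle, DB2 or Postgres when the changeSet is applied.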

Loading XMLTYPE data with IMPDP

I am trying to take a schema from an existing database and place it in a new database.
I've created dependent tablespaces for the data and everything seems to work OK, except that any tables with XMLTYPE columns error out and fail with the message below. The XMLTYPE columns are unvalidated CLOBs.
KUP-11007: conversion error loading table "SCHEMA"."TABLE_NAME"
ORA-01400: cannot insert NULL into (XML_COLUMN)
KUP-11009: data for row: XML_COLUMN : 0X''
Some investigation seemed to indicate that using TABLES=TABLE_NAME instead of SCHEMAS=SCHEMA would help, but I have had no such luck.
Note that there are no constraints on this column and that some data could indeed be null (though after the import I get 0 of my several million records).
The command I am using to initiate the datapump is:
impdp TABLES=SCHEMA.TABLE_NAME DIRECTORY=DATA_PUMP_DIR DUMPFILE=oracledpexport.dmp LOGFILE=LOGFILE.LOG TABLE_EXISTS_ACTION=REPLACE
We have been facing some problems with the Oracle import process.
The IMPDP process was not able to import tables containing XML data types.
The reason for this is a bug in Oracle 11g R1.
The workaround is to use the EXP process to create the dump instead of EXPDP.
For a permanent fix, we have to explicitly store XMLType columns as CLOB.
Also, Oracle has confirmed that this issue has been fixed in Oracle 11gR2.
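A hedged sketch of what explicitly storing the XMLType column as CLOB can look like (table and column names are placeholders):

CREATE TABLE table_name (
  id          NUMBER,
  xml_column  XMLTYPE
)
XMLTYPE COLUMN xml_column STORE AS CLOB;

With this storage clause the XML is kept as ordinary CLOB data rather than relying on the release's default XMLType storage, which is the permanent fix referred to above.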
