Extraction of an Oracle table in Pentaho

When I try to extract a table from an Oracle database into Pentaho, there is no result, and I get the following message:

Related

Oracle Db to PostgreSQL conversion using ora2pg

I am trying to migrate my Oracle database to PostgreSQL using the Ora2pg tool.
The DDL file was exported successfully, but when I try to import it into the PostgreSQL server I get errors like the one below.
There is a check constraint in Oracle with an IS JSON condition; when I exported it with Ora2pg, it was generated as
ALTER TABLE Temp_table ADD CONSTRAINT ensure_json1 CHECK (rpdata IS JSON);
When I try to execute the same statement on the PostgreSQL server, I get "syntax error at or near JSON".
You don't need this in Postgres.
Postgres has a native JSON datatype that validates JSON automatically. In Oracle you need that check constraint to "turn" a CLOB into a JSON column (without it, the values aren't validated and certain JSON operations don't work).
Just remove that constraint from your Postgres script (assuming the column is indeed defined as json or, ideally, jsonb).
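As a minimal sketch (reusing the table and column from the question; the Postgres definition is assumed, not taken from the post): in Oracle the CLOB column needs the IS JSON check constraint, while in Postgres the same column is simply declared with a JSON type and invalid JSON is rejected on insert.
-- Oracle: CLOB column plus a check constraint to enforce valid JSON
CREATE TABLE Temp_table (rpdata CLOB, CONSTRAINT ensure_json1 CHECK (rpdata IS JSON));
-- PostgreSQL: declare the column as jsonb; no separate constraint is needed
CREATE TABLE temp_table (rpdata jsonb);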

Create tables in Oracle the same as they are in Postgres

I have multiple tables in PostgreSQL (extract_a, extract_b, and so on), and I now want to create all of these tables in Oracle using Pentaho.
I set the data type to Varchar2(4000) for all the fields in my tables. The tables were created successfully, but when I tried to insert the data I got the error
"Identifier is too long"
How do I create these tables in Oracle?

Is it possible to join a Hive table with an Oracle table?

I have a problem writing a query in HiveQL.
Is it possible to join a Hive table with an Oracle table?
If yes, how?
If no, why not?
To access data stored in your Hive tables, including joining on them, you will need an Oracle Big Data connector.
From the documentation:
Using Oracle SQL Connector for HDFS, you can use Oracle Database to access and analyze data residing in HDFS files or a Hive table. You can also query and join data in HDFS or a Hive table with other database-resident data. If required, you can also load data into the database using SQL.
You first access Hive tables from Oracle Database via external tables. The external table definition is generated automatically from the Hive table definition. Hive table data can then be accessed by querying this external table. The data can be queried with Oracle SQL and joined with other tables in the database.
In other words, the data stays in the Hive table, and Oracle Database reads it through the generated external table.
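For illustration, once the connector has generated an external table over the Hive table, the join itself is plain Oracle SQL. In this hypothetical sketch, hive_sales_ext is the generated external table and customers is an ordinary Oracle table:
SELECT c.customer_name, SUM(s.amount) AS total_amount
FROM customers c
JOIN hive_sales_ext s ON s.customer_id = c.customer_id
GROUP BY c.customer_name;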

Convert ntext to clob

I have to copy data from one table to another, where one table is in Oracle and the other is in MS SQL Server. I want to copy the data from the MS SQL Server table to the Oracle table. The problem is that the MS SQL Server table has one column of data type ntext, while the destination column in the Oracle table is a CLOB.
When I use the query
insert into oracle.table select * from sqlserver.table@mssql;
I get the following error:
SQL Error: ORA-00997: illegal use of LONG datatype
Can anyone advise on this, please?
I tried it through a PL/SQL procedure and it worked. I created a cursor, fetched the values into variables declared as VARCHAR2, and then ran an EXECUTE IMMEDIATE for the INSERT INTO ... SELECT * FROM <TABLE_NAME>@MSSQL.
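As a rough sketch of that row-by-row workaround (the table and column names are hypothetical, and the remote table is reached through a database link named mssql; PL/SQL can fetch the LONG-mapped ntext values into VARCHAR2 variables of up to 32767 bytes):
DECLARE
  CURSOR src_cur IS
    SELECT id, text_col FROM source_table@mssql; -- remote SQL Server table via the database link
  v_id   NUMBER;
  v_text VARCHAR2(32767);
BEGIN
  OPEN src_cur;
  LOOP
    FETCH src_cur INTO v_id, v_text;
    EXIT WHEN src_cur%NOTFOUND;
    -- the VARCHAR2 value is implicitly converted to CLOB on insert
    INSERT INTO target_table (id, text_col) VALUES (v_id, v_text);
  END LOOP;
  CLOSE src_cur;
  COMMIT;
END;
/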

Loading XMLTYPE data with IMPDP

I am trying to take a schema from an existing database and place it in a new database.
I've created the dependent tablespaces for the data, and everything seems to work OK except that any tables with XMLTYPE columns fail with the error message below. The XMLTYPE columns are unvalidated CLOBs.
KUP-11007: conversion error loading table "SCHEMA"."TABLE_NAME"
ORA-01400: cannot insert NULL into (XML_COLUMN)
KUP-11009: data for row: XML_COLUMN : 0X''
Some investigation seemed to indicate that using TABLES=TABLE_NAME instead of SCHEMAS=SCHEMA would help, but I have had no such luck.
Note that there are no constraints on this column and that some of the data could indeed be null (though after the import I get 0 of my several million records).
The command I am using to initiate the datapump is:
impdp TABLES=SCHEMA.TABLE_NAME DIRECTORY=DATA_PUMP_DIR DUMPFILE=oracledpexport.dmp LOGFILE=LOGFILE.LOG TABLE_EXISTS_ACTION=REPLACE
We have been facing some problems during the Oracle import process.
The IMPDP process was not able to import tables containing XML data types.
The reason for this is a bug in the Oracle 11g R1 version.
The workaround is to use the EXP process to create the dump instead of EXPDP.
For a permanent fix, we have to explicitly store the XMLType columns as CLOBs.
Also, Oracle has confirmed that this issue is fixed in the Oracle 11g R2 version.
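For reference, a rough sketch of the classic exp/imp route described above (the credentials, schema name, and file names here are placeholders, not from the original post):
exp system/password OWNER=SCHEMA FILE=oracleexport.dmp LOG=export.log
and then, against the target database:
imp system/password FROMUSER=SCHEMA TOUSER=SCHEMA FILE=oracleexport.dmp LOG=import.log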
