Creating DDL Statements from Calcite Schemas

I have created a bunch of Calcite schemas and tables. I was wondering how to convert them to SQL DDL statements.
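As far as I know, Calcite does not ship a schema-to-DDL generator, but you can walk each table's row type and print CREATE TABLE statements yourself. A minimal sketch, assuming the tables are registered in a SchemaPlus; quoting and dialect-specific type mapping are deliberately simplified:

    import java.util.stream.Collectors;
    import org.apache.calcite.jdbc.JavaTypeFactoryImpl;
    import org.apache.calcite.rel.type.RelDataType;
    import org.apache.calcite.rel.type.RelDataTypeFactory;
    import org.apache.calcite.schema.SchemaPlus;
    import org.apache.calcite.schema.Table;

    public final class DdlFromSchema {

      // Emit one CREATE TABLE statement per table registered in the schema.
      public static String toDdl(SchemaPlus schema) {
        RelDataTypeFactory typeFactory = new JavaTypeFactoryImpl();
        StringBuilder ddl = new StringBuilder();
        for (String tableName : schema.getTableNames()) {
          Table table = schema.getTable(tableName);
          RelDataType rowType = table.getRowType(typeFactory);
          String columns = rowType.getFieldList().stream()
              .map(f -> "  \"" + f.getName() + "\" " + sqlType(f.getType()))
              .collect(Collectors.joining(",\n"));
          ddl.append("CREATE TABLE \"").append(tableName).append("\" (\n")
              .append(columns).append("\n);\n");
        }
        return ddl.toString();
      }

      // Render a column type from the Calcite type; precision and scale
      // handling is simplistic and ignores dialect differences.
      private static String sqlType(RelDataType type) {
        StringBuilder sb = new StringBuilder(type.getSqlTypeName().getName());
        if (type.getSqlTypeName().allowsPrec() && type.getPrecision() >= 0) {
          sb.append('(').append(type.getPrecision());
          if (type.getSqlTypeName().allowsScale() && type.getScale() > 0) {
            sb.append(", ").append(type.getScale());
          }
          sb.append(')');
        }
        if (!type.isNullable()) {
          sb.append(" NOT NULL");
        }
        return sb.toString();
      }
    }

This only covers column definitions; constraints, keys, and any target-dialect type renaming would have to be layered on top.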

Related

When using dbms_metadata.get_ddl to generate the DDL for a table, can I exclude the creation of constraints?

I am trying to develop some scripts so I can recreate the Oracle schema for a development user when I need to refresh the data from a MySQL production database.
I am using the dbms_metadata.get_ddl function to generate the DDL, but it also creates all the constraints for the tables, and I need to wait on creating at least the foreign key constraints because I want to load the actual data first.
Is there a way to tell dbms_metadata.get_ddl not to create the indexes and constraints?
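For what it's worth, DBMS_METADATA has session-level transform parameters that control which clauses GET_DDL emits, including CONSTRAINTS and REF_CONSTRAINTS (the latter covers foreign keys). A minimal JDBC sketch, where the connection details are placeholders:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public final class DdlWithoutConstraints {
      public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB", "scott", "tiger");
             Statement stmt = conn.createStatement()) {
          // Ask DBMS_METADATA to omit constraint clauses for this session;
          // REF_CONSTRAINTS disables the foreign keys specifically.
          stmt.execute("BEGIN"
              + " DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM, 'CONSTRAINTS', FALSE);"
              + " DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM, 'REF_CONSTRAINTS', FALSE);"
              + " END;");
          // GET_DDL now returns bare CREATE TABLE statements.
          try (ResultSet rs = stmt.executeQuery(
              "SELECT DBMS_METADATA.GET_DDL('TABLE', table_name) FROM user_tables")) {
            while (rs.next()) {
              System.out.println(rs.getString(1));
            }
          }
        }
      }
    }

Standalone indexes are separate objects and are not emitted by GET_DDL('TABLE', ...) in the first place; only the constraint-backed ones disappear with these parameters.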

How to generate DDL scripts from tables in Oracle NoSQL?

I have some tables with a huge number of columns (more than 600) and don't have the DDL (CREATE TABLE) scripts for them. I can reconstruct the script by inspecting the table schema with the DESCRIBE keyword in Oracle NoSQL, but doing that manually is a huge pain.
Is there any way to generate DDL scripts for existing tables in an Oracle NoSQL database?
There is no way to do this at this time; it's being considered as an extension to the SQL shell. One option in user code would be to use the "describe as json ..." call and parse the JSON description of the table to construct the DDL string.
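A minimal sketch of that suggestion, assuming the describe output contains name, fields, and primaryKey keys; the exact JSON layout may differ between releases, so treat those key names as assumptions:

    import java.util.ArrayList;
    import java.util.List;
    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;

    public final class DescribeJsonToDdl {

      // Rebuild a CREATE TABLE statement from the output of
      // "describe as json table <name>". The "name", "fields", "type",
      // and "primaryKey" keys are assumptions about that layout.
      public static String toDdl(String describeJson) throws Exception {
        JsonNode root = new ObjectMapper().readTree(describeJson);
        List<String> columns = new ArrayList<>();
        for (JsonNode field : root.get("fields")) {
          columns.add(field.get("name").asText() + " " + field.get("type").asText());
        }
        List<String> pk = new ArrayList<>();
        for (JsonNode key : root.get("primaryKey")) {
          pk.add(key.asText());
        }
        return "CREATE TABLE " + root.get("name").asText()
            + " (" + String.join(", ", columns)
            + ", PRIMARY KEY (" + String.join(", ", pk) + "))";
      }
    }

Nested or complex field types would need more careful rendering than this flat name-plus-type concatenation.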

How to export data from tables with BLOBs as SQL inserts

I need to export data from one schema and import it into another. In the second schema the tables have different names, different attribute names, etc., but they are suitable for the data in the first schema. So I export the data as SQL inserts and manually rewrite the names etc. in these inserts.
My problem is with tables that have columns of type BLOB. PL/SQL Developer throws an error:
Table MySchema.ENT_NOTIFICATIONS contains one or more BLOB columns.
Cannot export in SQL format, use PL/SQL Developer format instead.
But when I use the PL/SQL Developer format (.pde), the output is some kind of raw byte data and I can't change what I need.
Is there any solution for this?
Note: I use PL/SQL Developer 10.0.5.1710 and Oracle Database 12c
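One workaround is to dump the BLOB columns yourself as hex literals in generated inserts. A JDBC sketch, where the column names and target table are stand-ins; note that a HEXTORAW literal only fits small BLOBs (roughly 2000 bytes before 12c extended strings), so larger values need chunked DBMS_LOB.APPEND calls instead:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public final class BlobRowsToInserts {
      public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB", "user", "pass");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                 "SELECT id, payload FROM ent_notifications")) {
          while (rs.next()) {
            // Hex-encode the BLOB so it survives a plain-text script
            // and can be edited like any other insert.
            byte[] blob = rs.getBytes("payload");
            StringBuilder hex = new StringBuilder(blob.length * 2);
            for (byte b : blob) {
              hex.append(String.format("%02X", b));
            }
            System.out.printf(
                "INSERT INTO target_table (id, payload) VALUES (%d, HEXTORAW('%s'));%n",
                rs.getLong("id"), hex);
          }
        }
      }
    }

Because the output is plain SQL text, the table and column renames between the two schemas can be done with an ordinary search-and-replace.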

Create tables in Oracle the same as they are in Postgres

I have multiple tables in PostgreSQL (extract_a, extract_b, and so on) and I want to create all of these tables in Oracle using Pentaho.
I set the data type VARCHAR2(4000) on all my table fields. The tables were created successfully, but when I tried to insert the data it gave me an error:
"Identifier is too long"
How do I create these tables in Oracle?
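That error usually means a table or column name exceeds Oracle's classic 30-character identifier limit (lifted to 128 bytes in 12.2), so the PostgreSQL names need to be shortened before the Oracle DDL is generated. A hypothetical helper sketch:

    import java.util.HashMap;
    import java.util.LinkedHashMap;
    import java.util.Map;

    public final class OracleIdentifiers {
      private static final int MAX_LEN = 30; // pre-12.2 Oracle limit

      // Map each source name to a name Oracle will accept; collisions
      // after truncation get a numeric suffix. Deliberately simplistic.
      public static Map<String, String> shorten(Iterable<String> names) {
        Map<String, String> mapping = new LinkedHashMap<>();
        Map<String, Integer> seen = new HashMap<>();
        for (String name : names) {
          String candidate = name.length() <= MAX_LEN
              ? name : name.substring(0, MAX_LEN);
          int count = seen.merge(candidate, 1, Integer::sum);
          if (count > 1) {
            String suffix = "_" + count;
            candidate = candidate.substring(0,
                Math.min(candidate.length(), MAX_LEN - suffix.length())) + suffix;
          }
          mapping.put(name, candidate);
        }
        return mapping;
      }
    }

The resulting mapping has to be applied consistently to both the CREATE TABLE statements and the insert step, otherwise the loads will target names that no longer exist.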

Is it possible to join a Hive table with an Oracle table?

I have a problem writing a query using HiveQL.
Is it possible to join a Hive table with an Oracle table?
If yes, how? If no, why not?
To access data stored in your Hive tables, including joining on them, you will need an Oracle Big Data connector, specifically Oracle SQL Connector for HDFS.
From the documentation:
Using Oracle SQL Connector for HDFS, you can use Oracle Database to access and analyze data residing in HDFS files or a Hive table. You can also query and join data in HDFS or a Hive table with other database-resident data. If required, you can also load data into the database using SQL.
You first access Hive tables from Oracle Database via external tables. The external table definition is generated automatically from the Hive table definition, and the Hive table's data can then be accessed by querying this external table. The data can be queried with Oracle SQL and joined with other tables in the database.
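Once the connector has generated the external table, the join itself is plain SQL on the Oracle side. A minimal JDBC sketch, where the connection string and the table names (orders, hive_clicks_ext) are made up for illustration:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public final class JoinHiveBackedTable {
      public static void main(String[] args) throws Exception {
        // hive_clicks_ext stands for the external table the connector
        // generated over a Hive table; orders is an ordinary Oracle table.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB", "scott", "tiger");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                 "SELECT o.order_id, h.click_count"
                 + " FROM orders o"
                 + " JOIN hive_clicks_ext h ON h.order_id = o.order_id")) {
          while (rs.next()) {
            System.out.println(rs.getLong(1) + " " + rs.getLong(2));
          }
        }
      }
    }

From Oracle's point of view the external table behaves like any other table, so the usual join syntax and optimizer apply; the data is read from HDFS at query time.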
