Suppose I have an existing schema (s1) with the following tables:
s1.table1
s1.table2
I would like to clone this schema to a new schema (s2) i.e.:
s2.table1
s2.table2
How can I achieve this?
The purpose is for testing without touching the original data.
I'm aware of the EXPORT command, but it appears to only be able to export one table at a time.
You can copy a table's definition to a new table, without copying the original table's data, using CREATE TABLE ... LIKE:
CREATE TABLE s2.table1 (LIKE s1.table1 INCLUDING ALL);
CREATE TABLE s2.table2 (LIKE s1.table2 INCLUDING ALL);
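Note that the target schema must exist before these statements will run; if it doesn't yet, create it first:
CREATE SCHEMA s2;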
You can then copy the rows from the original table if needed:
INSERT INTO s2.table1 SELECT * FROM s1.table1;
INSERT INTO s2.table2 SELECT * FROM s1.table2;
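If the schema has many tables, you can generate these statements from the system catalog instead of writing them by hand. A minimal sketch, assuming PostgreSQL (the LIKE ... INCLUDING ALL syntax above is PostgreSQL's) and that schema s2 already exists:
DO $$
DECLARE
    t text;
BEGIN
    FOR t IN
        SELECT table_name
        FROM information_schema.tables
        WHERE table_schema = 's1'
          AND table_type = 'BASE TABLE'
    LOOP
        -- Clone each table's definition, then copy its rows
        EXECUTE format('CREATE TABLE s2.%I (LIKE s1.%I INCLUDING ALL)', t, t);
        EXECUTE format('INSERT INTO s2.%I SELECT * FROM s1.%I', t, t);
    END LOOP;
END $$;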
I want to load data from a CSV file into Vertica. I don't want to create the table and then copy the data in two separate steps. Instead, I want to create the table, point it at the CSV file, and let Vertica figure out the column definitions (names, data types) itself and then load the data.
Something like:
create table titanic_train () as COPY FROM '/data/train.csv' PARSER fcsvparser() rejected data as table titanic_train_rejected abort on error no commit;
Is it possible?
I'd guess that if a table has hundreds of columns, automating the table creation, column definitions, and data copy would be much easier and faster than doing these steps separately.
It's always several steps, no matter what.
Use the built-in bits of Vertica:
CREATE FLEX TABLE foo();
COPY foo FROM '/data/mycsvs/foo.csv' PARSER fCsvParser();
SELECT COMPUTE_FLEXTABLE_KEYS_AND_BUILD_VIEW('foo');
-- THEN, either:
SELECT * FROM foo_view;
-- OR: create a ROS Table:
CREATE TABLE foo_ros AS SELECT * FROM foo_view;
Get a CSV-to-DDL parser from the net, such as https://github.com/marco-the-sane/d2l, install it, and then:
$ d2l -coldelcomma -chardelquote -drp -copy /data/mycsvs/foo.csv | vsql
So, in the second instance, it's one step, but it calls both d2l and vsql.
I have multiple existing tables stored in HDFS. I would like to create new tables from the existing external tables so that I can bucket, sort, and compress the data.
What is the proper way to create a table from an existing table? I could export the existing table to CSV, then create a new table and import it, but it seems like there should be a way to load the data directly from the existing table. I haven't found anything in the documentation or via Google.
For an existing table named source and a newly created table named target with fields a, b, c, d, the following reads all distinct rows from source and writes them to target:
insert overwrite table target select distinct a,b,c,d from source;
This works for both internal and external tables.
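Since the original goal was to bucket, sort, and compress, those properties belong in target's DDL before the insert. A hedged HiveQL sketch; the column types, bucket count, and the ORC/Snappy choices here are assumptions, not taken from the original:
-- Column types, bucket count, and compression codec are illustrative assumptions.
CREATE TABLE target (
    a STRING,
    b STRING,
    c INT,
    d INT
)
CLUSTERED BY (a) SORTED BY (b) INTO 32 BUCKETS
STORED AS ORC
TBLPROPERTIES ('orc.compress' = 'SNAPPY');

-- On Hive versions before 2.0, enable bucket enforcement first:
-- SET hive.enforce.bucketing = true;
INSERT OVERWRITE TABLE target SELECT a, b, c, d FROM source;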
I'm trying to move the data from one table, TABLE5, to another, TABLE5_BKP.
CREATE TABLE TABLE5_BKP AS SELECT * FROM TABLE5;
The table was created and the data copied over, but when I checked the constraints, the primary key, foreign keys, etc. had not been generated; only the other constraints, like
SYS_C2211111 Check "COLUMN1" IS NOT NULL
were created. What should I do in this case? Do I need to create the primary key, foreign keys, etc. separately? And what about indexes and other properties, which I was not able to check?
You can't implicitly create PKs, FKs, indexes, etc. just by using
CREATE TABLE tablename AS SELECT *...
You have to specify them after creating the table. I also suggest using Oracle tools such as exp/imp or Data Pump if you want to move the database structure from one database to another.
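For example, re-adding the keys by hand after the CTAS could look like the following sketch (the column and constraint names are hypothetical placeholders):
-- Hypothetical column/constraint names; adjust to match TABLE5's real definition.
ALTER TABLE TABLE5_BKP ADD CONSTRAINT table5_bkp_pk PRIMARY KEY (id);
ALTER TABLE TABLE5_BKP ADD CONSTRAINT table5_bkp_fk
    FOREIGN KEY (parent_id) REFERENCES parent_table (id);
CREATE INDEX table5_bkp_ix1 ON TABLE5_BKP (some_column);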
I want to create a table (let's say table_copy) which has the same columns as another table (let's call it table_original) in an Oracle database, so the query will be like this:
create table table_copy as (select * from table_original where 1=0);
This will create a table, but the constraints of table_original are not copied to table_copy, so what should be done in this case?
Only NOT NULL constraints are copied when using Create Table As Select (CTAS). The others have to be created manually.
You might, however, query the data dictionary views to see the definitions of the constraints and implement them on your new table using PL/SQL.
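For example, a quick look at the original table's constraints via the USER_CONSTRAINTS dictionary view (the table name is taken from the example above):
SELECT constraint_name, constraint_type, search_condition
FROM   user_constraints
WHERE  table_name = 'TABLE_ORIGINAL';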
The other tool that might be helpful is Oracle Data Pump. You could import the table using the REMAP_TABLE option, specifying the name for the new table.
Use a database tool to extract the DDL needed for the constraints (SQL Developer does the job). Edit the resulting script to match the name of the new table.
Execute the script.
If you need to do this programmatically, you can use a statement like this:
SELECT DBMS_METADATA.GET_DDL('TABLE', 'PERSON') FROM DUAL;
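A usage note: when running this in SQL*Plus, the returned CLOB is truncated at the default LONG display size, so widen it first:
SET LONG 100000
SELECT DBMS_METADATA.GET_DDL('TABLE', 'PERSON') FROM DUAL;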
I would like to know the command to convert a temporary table into a permanent table in Oracle.
The other issue is about indexes: if I convert the table, will an index that was used on the temporary table be the same on the permanent table?
You can't convert a table from a temporary table to a permanent table.
You can create a new permanent table that matches the structure of the temporary table:
CREATE TABLE new_permanent_table
AS
SELECT *
FROM old_temporary_table
WHERE 1=0;
Or you could get the DDL for the temporary table using the DBMS_METADATA package and manually edit the DDL to create the new permanent table.
Then you can create whatever indexes you would like on the new permanent table and drop the old temporary table. Once the old temporary table is dropped, you can rename the permanent table to use the name of the old temporary table if you would like.
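Put together, those final steps might look like this sketch (the index column is a hypothetical placeholder, and the drop assumes no sessions are still using the temporary table):
-- Hypothetical index; create whatever the permanent table actually needs.
CREATE INDEX new_permanent_table_ix1 ON new_permanent_table (some_column);
DROP TABLE old_temporary_table;
ALTER TABLE new_permanent_table RENAME TO old_temporary_table;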