Export Oracle database to SQL with CLI tools - oracle

The goal is to export an existing test Oracle database to SQL so that it can be run in a test pipeline using a Docker image's /container-entrypoint-initdb.d bootstrap. Currently we mount an oradata export, but it is opaque what data it actually contains, hence the desire to use SQL so that it can be maintained via git.
I'm aware you can use the GUI Oracle SQL Developer Export Wizard to export a database to SQL.
But how can this be done with command-line tools? I've tried many things, like:
impdp test/test#XEPDB1 dumpfile=export.dmp directory=DUMP_DIR sqlfile=ddl.sql
The best I can achieve is exporting the schema; however, it is missing the actual data, e.g. statements like:
INSERT INTO currencies_countries (currency_id, country_id) VALUES ('EUR', 'CYP');

You have exp, which exports to a dump file that can be imported into another Oracle DB using imp. More recent are expdp and impdp.
See: https://oracle-base.com/articles/10g/oracle-data-pump-10g for instance.
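For instance, a plain schema-mode Data Pump export and re-import might look like the following (the connect string, schema name, directory object and file names are only illustrative, borrowed from the question):
expdp test/test@XEPDB1 schemas=TEST directory=DUMP_DIR dumpfile=test_schema.dmp logfile=test_schema_exp.log
impdp test/test@XEPDB1 schemas=TEST directory=DUMP_DIR dumpfile=test_schema.dmp logfile=test_schema_imp.log
Note that this produces a binary dump file, not SQL.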
What do you mean by "export to SQL"?
The SQL Developer client tool can export a table as SQL inserts, for instance. But doing that table by table would be quite an effort. Moreover, if you have millions of rows, those row-by-row inserts won't perform well.
Better to write to tab-separated text files, which, if needed, can be imported into an Oracle DB using sqlldr, or imported into some other kind of database. Still, tab-separated files won't work well for tables having CLOB, XMLTYPE or object-type columns.
You could write to a text file by spooling sqlplus output to a file, with set linesize long enough to fit the column lengths and set pagesize 0 (no pages).
Or you could write to a file in an Oracle directory via a PL/SQL program in which you use utl_file.
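As a minimal sketch of the spooling approach, assuming the currencies_countries table and columns from the example insert in the question:
set linesize 32767
set pagesize 0
set trimspool on
set feedback off
spool currencies_countries.tsv
select currency_id || chr(9) || country_id from currencies_countries;
spool off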

Related

How do I export oracle sql developer table schema by using pl/sql scripts?

Everyone knows how to export a table schema by using the Export Wizard,
but how can I export them by using a PL/SQL script?
For example,
I want to export all table schemas whose names begin with "SYS" (e.g. SYS_ROLE, SYS_USER_ROLE, SYS_USER, etc.).
Many thanks!!
A couple of points.
DBMS_DATAPUMP can be used to export from PL/SQL, but the output dump file(s) will reside in a location directly accessible by the Oracle database instance.
The SYS 'tables' are mostly views. You could try the VIEWS_AS_TABLES option of DBMS_DATAPUMP. Or create the tables in another schema and then export those tables, e.g. do:
CREATE TABLE myuser.sys_users AS SELECT * FROM sys.all_users;
Then export the MYUSER schema.
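As a rough sketch of what that export could look like when driven from PL/SQL (the DUMP_DIR directory object, file names and MYUSER schema below are assumptions, not tested here):
DECLARE
  l_job NUMBER;
BEGIN
  -- open a schema-mode export job
  l_job := DBMS_DATAPUMP.open(operation => 'EXPORT', job_mode => 'SCHEMA');
  -- dump and log files are written to a directory object on the database server
  DBMS_DATAPUMP.add_file(handle => l_job, filename => 'myuser_exp.dmp',
                         directory => 'DUMP_DIR',
                         filetype  => DBMS_DATAPUMP.ku$_file_type_dump_file);
  DBMS_DATAPUMP.add_file(handle => l_job, filename => 'myuser_exp.log',
                         directory => 'DUMP_DIR',
                         filetype  => DBMS_DATAPUMP.ku$_file_type_log_file);
  -- restrict the job to the schema holding the copied tables
  DBMS_DATAPUMP.metadata_filter(handle => l_job, name => 'SCHEMA_EXPR',
                                value => q'[= 'MYUSER']');
  DBMS_DATAPUMP.start_job(l_job);
  DBMS_DATAPUMP.detach(l_job);
END;
/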
The SQL Developer export writes the data from the data grid being viewed to a local file in a variety of formats. I do not know for sure, but the same command processing appears to be used by SQLcl. In SQLcl you can set the output to a number of formats, including CSV, with the SET SQLFORMAT command.
See this page here:
https://oracle-base.com/articles/misc/sqlcl-format-query-results-with-the-set-sqlformat-command
So although not PL/SQL, you could write the queries as SQL scripts, set the appropriate SQLFORMAT option, SPOOL to a local file, and that should do it. You could try running this in SQL Developer as well to see if it works.
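Not tested here, but a SQLcl script along these lines should produce INSERT statements (the table name is taken from the first question above; the file name is arbitrary):
set sqlformat insert
spool currencies_countries.sql
select * from currencies_countries;
spool off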

Oracle Object Creation and FlashBack

I need to create a table from another table, along with its indexes and constraints, in another schema in Oracle. I know about the CTAS syntax, but it doesn't take the indexes and constraints with it. Is there any way to do it?
Also, is there any way to flash back a procedure, trigger or package after dropping it?
The simplest approach is to treat DDL statements like any other piece of application code, and keep them as scripts in a source control repository.
However, it's easy to be wise after the event. If you're working in an environment where the schema is a bit of a free-fire zone, there are various options.
The best thing is to use DBMS_METADATA to re-create the DDL statements. These can be saved as scripts, run in other schemas and - crucially - stored somewhere which gets backed-up, ideally source control.
Generating all the DDL for a table and its dependent objects is reasonably straightforward. The DBMS_METADATA functions return CLOBs, which is not ideal, but it is simple enough to spool them out in SQL*Plus:
SQL> set long 10000
SQL> set heading off
SQL> spool create_tab_t23.sql
SQL> select dbms_metadata.get_ddl('TABLE', 'T23') from dual;
SQL> select dbms_metadata.get_dependent_ddl('INDEX', 'T23') from dual;
SQL> select dbms_metadata.get_dependent_ddl('TRIGGER', 'T23') from dual;
SQL> spool off
Having to specify the individual object types is a bit of a nuisance. Fortunately most IDEs (Oracle SQL Developer, PL/SQL Developer, TOAD, etc.) provide handy right-click menu options to handle all this for us.
The easiest way to copy an entire Oracle table (structure, contents, indexes, constraints, triggers, etc.) is to use Oracle's export and import utilities (expdp and impdp).  These are command-line utilities that you run on the database server using parameters that you provide.  Or, you can use OEM (Oracle Enterprise Manager) to run these for you.  Note that they depend on having at least one "logical directory" defined where the "dump" file can get written to by export and read from by import.
This method works well when you want to copy a table from one schema to another, or from one database to another, and keep the same table name.  If, however, your goal is to create a copy of the table in the same schema but with a different name, the process gets more complex.  You can still use export, but then, instead of having import do the actual import directly, you have it create a text file containing all of the SQL commands it finds in the export file.  You then edit that text file to change the index, constraint and trigger names that need to be changed, and change the table name in those commands to the new table name (but do not change the table name in the "create table ..." command).  Then rename the existing table to something else and run just the "create table ..." command (with the original table name) from the script file.  Next, run import to get just the data.  Then rename the new table to the name you want it to have and rename the original table back to its original name.  After that, manually run the other SQL scripts from the script file.  You don't want those triggers, constraints and indexes in place when you do the actual data import.
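As a sketch of the Data Pump part of that workflow (the T23 table name is borrowed from the earlier answer; the DUMP_DIR directory object and credentials are placeholders):
expdp myuser/mypassword directory=DUMP_DIR dumpfile=t23.dmp tables=T23
impdp myuser/mypassword directory=DUMP_DIR dumpfile=t23.dmp sqlfile=t23_ddl.sql
impdp myuser/mypassword directory=DUMP_DIR dumpfile=t23.dmp content=DATA_ONLY
The second command writes all the SQL it finds in the dump file to t23_ddl.sql without importing anything, which gives you the text file to edit; the third loads only the rows.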

How to export data from tables with BLOBs as SQL inserts

I need to export data from one schema and import it into another. But in the second schema the tables have different names, different column names, etc., although they are suitable for the data from the first schema. So I export the data as SQL inserts and manually rewrite the names etc. in those inserts.
My problem is with tables which have columns with type BLOB. PL/SQL Developer throws error:
Table MySchema.ENT_NOTIFICATIONS contains one or more BLOB columns.
Cannot export in SQL format, use PL/SQL Developer format instead.
But when I use the PL/SQL Developer format (.pde), it is some kind of raw byte data and I can't change what I need.
Is there any solution to manage this?
Note: I use PL/SQL Developer 10.0.5.1710 and Oracle database 12c

Export oracle database tables

I am working on a large database. How do I export some database tables without having DBA privileges? Do I have to copy the structures of the tables and use the spool command to get the data into a text file, then create the tables and insert the data from the text file?
One of the methods would be to install Oracle SQL Developer and export the required table structures and data using the wizard.
Here is the link to a tutorial which can guide you if you go with this option.
http://www.oracle.com/webfolder/technetwork/tutorials/obe/db/sqldev/r30/SQLdev3.0_Import_Export/sqldev3.0_import_export.htm
A second option would be to use SQL*Loader to load data into your target tables. But for that you will first have to create the data structures in your target schema and spool the data from your source tables in CSV (comma-separated values) or any other eligible format.
Here is a link for SQL Loader.
http://docs.oracle.com/cd/B28359_01/server.111/b28319/ldr_concepts.htm
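As an illustrative sketch only (the table and column names are borrowed from the first question; file names are arbitrary), a minimal SQL*Loader control file could look like:
LOAD DATA
INFILE 'currencies_countries.csv'
INTO TABLE currencies_countries
FIELDS TERMINATED BY ','
(currency_id, country_id)
Saved as currencies_countries.ctl, it would then be run with something like:
sqlldr userid=myuser/mypassword control=currencies_countries.ctl log=currencies_countries.log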
A third option would be to create the table structures on the target schema and generate the insert statements from the source schema using a script. Here is a link to such an example.
https://pandazen.wordpress.com/2008/08/18/generate-insert-statement-script-to-extract-data-from-oracle-table/
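For instance, a minimal SQL*Plus sketch along those lines, using the currencies_countries table from the first question (it only handles plain character columns, so quoting and data types need more care for anything complex):
set pagesize 0
set linesize 1000
set feedback off
spool currencies_countries_inserts.sql
select 'INSERT INTO currencies_countries (currency_id, country_id) VALUES ('''
       || currency_id || ''', ''' || country_id || ''');'
from currencies_countries;
spool off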
I would recommend going with the SQL Developer option since it is relatively simple.

Oracle sql result to DBF file export

I would like to export data from an Oracle table into a *.dbf file (like Excel) through PL/SQL scripts. Is there any code available?
There are a number of different ways to do this. The easiest way is to use an IDE like SQL Developer or TOAD, which offer this functionality.
If you want to call it from PL/SQL, then there are no built-in Oracle functions. However, it is relatively straightforward to build something using UTL_FILE which can write out value-separated records. These can be picked up in Excel.
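A minimal sketch of that idea, assuming a writable directory object called EXPORT_DIR and reusing the currencies_countries table from the first question:
DECLARE
  l_file UTL_FILE.file_type;
BEGIN
  -- open a file in a directory object the database can write to
  l_file := UTL_FILE.fopen('EXPORT_DIR', 'currencies_countries.csv', 'w');
  FOR r IN (SELECT currency_id, country_id FROM currencies_countries) LOOP
    UTL_FILE.put_line(l_file, r.currency_id || ',' || r.country_id);
  END LOOP;
  UTL_FILE.fclose(l_file);
END;
/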
Note that the default separator, the comma (the "C" in .CSV), will cause problems if your exported data contains commas. So you will need to use the Data Import wizard rather than a right-click Open With ...
Incidentally, it is probably a bad idea to use the .dbf suffix. In an Oracle file system the presumed meaning is database file - i.e. part of the database's infrastructure. This is just a convention, but there is no point in needlessly confusing people. Preferred alternatives include .csv, .dmp or .exp.
edit
If your interest is just to export data for transferring to another Oracle database, then you should look at using the Data Pump utility. This comes with an API, so it can be used from PL/SQL. Alternatively, we can unload data through external tables declared with the Data Pump driver.
You could also consider using the External Tables feature of Oracle. This essentially allows you to map a CSV file to a 'virtual' table that you can query; with the ORACLE_DATAPUMP driver you can also write to an external table (and therefore to its file) by creating it with CREATE TABLE ... AS SELECT.
Oracle External Tables Concept Guide
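A sketch of the unload-via-external-table idea mentioned above (again assuming a writable EXPORT_DIR directory object; the table is the one from the first question):
CREATE TABLE currencies_countries_unload
  ORGANIZATION EXTERNAL (
    TYPE ORACLE_DATAPUMP
    DEFAULT DIRECTORY EXPORT_DIR
    LOCATION ('currencies_countries.dmp')
  )
AS SELECT currency_id, country_id FROM currencies_countries;
The resulting file is a binary Data Pump file rather than SQL, so it is mainly useful for moving the data to another Oracle database.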
