Oracle SQL result to DBF file export

I would like to export data from an Oracle table into a *.dbf file (like Excel) through PL/SQL scripts. Is there any code available?

There are a number of different ways to do this. The easiest way is to use an IDE like SQL Developer or TOAD, which offer this functionality.
If you want to call it from PL/SQL, then there are no built-in Oracle functions. However, it is relatively straightforward to build something using UTL_FILE which can write out value-separated records. These can be picked up in Excel.
Note that the default separator, the comma (the "C" in .CSV), will cause problems if your exported data contains commas. In that case you will need to use Excel's Data Import wizard rather than a right-click Open With ...
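A minimal sketch of such a UTL_FILE export might look like this (EXPORT_DIR is an assumed directory object and EMP a placeholder table; substitute your own names):

DECLARE
    l_file UTL_FILE.FILE_TYPE;
BEGIN
    -- EXPORT_DIR must be an Oracle directory object you have write access to
    l_file := UTL_FILE.FOPEN('EXPORT_DIR', 'emp_export.csv', 'w');
    FOR r IN (SELECT empno, ename, sal FROM emp) LOOP
        -- one comma-separated record per row
        UTL_FILE.PUT_LINE(l_file, r.empno || ',' || r.ename || ',' || r.sal);
    END LOOP;
    UTL_FILE.FCLOSE(l_file);
END;
/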
Incidentally, it is probably a bad idea to use the .dbf suffix. In an Oracle file system the presumed meaning is database file - i.e. part of the database's infrastructure. This is just a convention, but there is no point in needlessly confusing people. Preferred alternatives include .csv, .dmp or .exp.
edit
If your interest is just to export data for transferring to another Oracle database then you should look at using the Data Pump utility. This comes with an API so it can be used from PL/SQL. Alternatively, you can unload data through external tables declared with the Data Pump driver.
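A rough sketch of driving a Data Pump export from PL/SQL through the DBMS_DATAPUMP API (the schema name, file name and directory object are assumptions):

DECLARE
    l_handle NUMBER;
    l_state  VARCHAR2(30);
BEGIN
    -- open a schema-mode export job
    l_handle := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');
    -- dump file is written to the DUMP_DIR directory object
    DBMS_DATAPUMP.ADD_FILE(handle => l_handle, filename => 'scott_export.dmp', directory => 'DUMP_DIR');
    -- export only the SCOTT schema
    DBMS_DATAPUMP.METADATA_FILTER(handle => l_handle, name => 'SCHEMA_EXPR', value => '= ''SCOTT''');
    DBMS_DATAPUMP.START_JOB(l_handle);
    DBMS_DATAPUMP.WAIT_FOR_JOB(l_handle, l_state);
END;
/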

You could also consider using the External Tables feature of Oracle. This essentially allows you to map a CSV file to a 'virtual' table that you can query; with the ORACLE_DATAPUMP access driver you can also write the file, by creating the external table from a query.
Oracle External Tables Concept Guide
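For the read direction, a skeleton external table over a CSV file might look like this (the directory object, file name and columns are placeholders):

CREATE TABLE emp_ext (
    empno NUMBER,
    ename VARCHAR2(30),
    sal   NUMBER
)
ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY export_dir
    ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
    )
    LOCATION ('emp_export.csv')
);

Once created, it can be queried like any other table, e.g. SELECT * FROM emp_ext.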

Related

Export Oracle database to SQL with CLI tools

The goal is to export an existing test Oracle database to SQL so that it can be run in a test pipeline using a Docker image's /container-entrypoint-initdb.d bootstrap. Currently we mount an oradata export, but it's opaque what the actual data is, hence the desire to use SQL so that it can be maintained via git.
I'm aware you can use the GUI Oracle SQL Developer Export Wizard to export a database to SQL.
Though how to do it via command line tools? I've tried many things like:
impdp test/test@XEPDB1 dumpfile=export.dmp directory=DUMP_DIR sqlfile=ddl.sql
And the best I can achieve is exporting the schema, however it is missing the actual data, e.g.
INSERT INTO currencies_countries (currency_id, country_id) VALUES ('EUR', 'CYP');
You have exp, which exports to a dump file that can be imported into another Oracle DB using imp. More recent are expdp and impdp.
See: https://oracle-base.com/articles/10g/oracle-data-pump-10g for instance.
What do you mean by "export to SQL"?
The SQL Developer client tool can export a table as SQL inserts, for instance. But doing that table by table would be quite an effort. Moreover, if you have millions of rows, those row-by-row inserts won't perform well.
Better to write to tab-separated text files, which, in case of need, may be imported into an Oracle DB using sqlldr, or may be imported into some other kind of database. Still, tab-separated files won't work well for tables having CLOB, XMLTYPE or object-type columns.
You could write to a text file by spooling SQL*Plus output to a file, having set linesize long enough to fit the column lengths and set pagesize 0 (no page breaks).
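For example, a minimal spool script along those lines (using the currencies_countries table from the question above) might be:

set pagesize 0 linesize 200 trimspool on feedback off heading off
spool currencies_countries.tsv
select currency_id || chr(9) || country_id from currencies_countries;
spool off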
Or you could write to a file in an Oracle directory via a pl/sql program in which you use utl_file.

I want to bash a file from server side containing a list to a table in database

I've been stuck on how to do this task; Python is unavailable, and all I can do is PL/SQL, which I have only done once. Basically, what I'm trying to do is load a file that contains a list into a table in the database; this is the format it contains: 123-43763-2748. Please help on how to approach/solve this task. Thank you so much.
If I understood you correctly, you have a file that contains data (many rows, I presume?) like the one you posted. Now you'd want to store data from file into a table within an Oracle database. Is that correct?
If so, PL/SQL is not the only option you have. If you want to use it, then you'd first have to acquire access to an Oracle directory (that's an object that points to a filesystem directory, usually located on the database server). To do so, you'll have to talk to the DBA, who will grant privileges to your user. Then you'd write a PL/SQL procedure which uses the UTL_FILE package, reads the file and inserts values into the database, along the lines of the sketch below.
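Purely as an illustration of that approach (the directory object DATA_DIR, the file name and the single-column target table are assumptions):

DECLARE
    l_file UTL_FILE.FILE_TYPE;
    l_line VARCHAR2(4000);
BEGIN
    l_file := UTL_FILE.FOPEN('DATA_DIR', 'list.txt', 'r');
    LOOP
        BEGIN
            UTL_FILE.GET_LINE(l_file, l_line);
        EXCEPTION
            WHEN NO_DATA_FOUND THEN EXIT;  -- end of file reached
        END;
        -- strip the trailing comma shown in the sample data before inserting
        INSERT INTO target_table (code) VALUES (TRIM(TRAILING ',' FROM l_line));
    END LOOP;
    UTL_FILE.FCLOSE(l_file);
    COMMIT;
END;
/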
Another option - which also includes the previously mentioned directory - is to create an external table; it just points to the filesystem file, which then acts as if it were an ordinary table, so you can easily query it and use it as a source for a simple INSERT INTO statement (to store data into the target table).
Then, there's my favourite option - SQL*Loader, a utility that is capable of reading files stored on your own computer (as opposed to the previous options), and it is really, really fast. You'd create a control file (it says what data to read and where to store those values) and use it with the sqlldr executable.
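Purely as an illustration (the real details depend on your target table, as noted below), a control file for a one-column target table might look like:

LOAD DATA
INFILE 'list.txt'
APPEND INTO TABLE target_table
FIELDS TERMINATED BY ','
(code CHAR(20))

which you would then run with something like: sqlldr scott/tiger control=load_list.ctl log=load_list.log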
That's the theory; it is hard to give more than dummy code because you didn't explain any details - for example, what the target table looks like (its description), whether you have access to the database server or not, and whether you were granted certain privileges (e.g. execute on utl_file).

Why has the external table concept been established in Oracle?

SQL*Loader: Oracle uses this functionality, through the ORACLE_LOADER access driver, to move data from a flat file into the database;
Data Pump: It uses a Data Pump access driver to move data out of the database into a file in a proprietary Oracle format, and back into the database from files of that format.
Given that a data load can be done by either the SQL*Loader or Data Pump utilities, and a data unload can also be done by the Data Pump utility:
Are there any extra benefits that can be achieved by using external tables, that none of the previously mentioned utilities can do by themselves?
The Oracle table creation command below creates a table which looks like an ordinary Oracle table. Why then is Oracle telling us to call it an external table?
create table export_empl_info
organization external (
    type oracle_datapump
    default directory xtern_data_dir
    location ('empl_info_rpt.dmp')
)
as select * from empl_info;
"Are there any extra benefits that can be achieved by using external
tables, that none of the previously mentioned utilities can do by
themselves?"
SQL*Loader and Data Pump both require us to load the data into tables before we can access it within the database, whereas we access external tables directly through SELECT statements. It's a much more flexible mechanism.
"Why are then Oracle telling us to call it as an external table?"
Umm, because it is external. The data resides in a file (or files) which is controlled by the OS. We can change the data in an external table by running an OS command like
$> cp whatever.csv external_table_data.csv
There's no redo, rollback, flashback query or any of the other appurtenances of an internal database table.
I think that the primary benefits of external tables for me have been:
i) Not having to execute a host command to import data, which supports a trend in Oracle to control the entire code base from inside the database. Preprocessing in 11g allows access to remote files through ftp, use of compressed files, combining multiple files into one, etc.
ii) More efficient loads, by means of applying complex data transformations during the load process. Aggregations, merges, multitable inserts ... etc
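For instance, a load that applies an aggregation on the fly might look like this (the table and column names are made up):

INSERT /*+ APPEND */ INTO sales_summary (product_id, total_amount)
SELECT product_id, SUM(amount)
FROM   sales_ext       -- external table over the incoming file
GROUP  BY product_id;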
I've used it for data warehouse loads, but any scenario requiring loading of or access to standard data files is a candidate for use of external tables. SQL*Loader still has its place as a tool for loading to an Oracle database from a client or other host system. Data pump is for transfer of data between Oracle databases, so it's rather different.
One limitation of external tables is that they won't process stream data -- records have to be delimited. This was true in 10.2; I'm not sure whether that has changed in later versions.
Use the system catalog views ALL/DBA/USER_EXTERNAL_TABLES for information on them
RE: Why external table vs sqlldr for loading data? Mainly to have server managed parallelism vs client managed parallelism.

Script Oracle tables (DDL) with data insert statements into single/multiple sql files

I need to export the tables for a given schema into DDL scripts and INSERT statements - and have it scripted such that the order of dependencies/constraints is maintained.
I came across this article suggesting how to archive the database with data - http://www.dba-oracle.com/t_archiving_data_in_file_structures.htm - but I am not sure if the article is applicable to Oracle 10g/11g.
I have seen "export table with data" features in "SQL Developer", "Toad for Oracle", "DreamCoder for Oracle" etc., but I would need to do this one table at a time, and would still need to figure out the right order of script execution manually.
Are there any tools/scripts that can utilize oracle metadata and generate DDL script with data?
Note that some of the tables have CLOB datatype columns - so the tool/script would need to be able to handle these columns.
P.S. I need something similar to the "Generate Scripts" feature in SQL Server 2008, where one can specify the "script data" option and get back a self-sufficient script with DDL and data, generated in the order of table constraints. Please see: http://www.kodyaz.com/articles/sql-server-script-data-with-generate-script-wizard.aspx
Thanks for your help!
Firstly, recognise that this isn't necessarily possible. A view can use a function in a package that also selects from the view. Another issue is that you might need to load data into tables and then apply constraints, even though this might be slower than the other way round.
In short, you will need to do some work here.
Work out the dependencies in your system. ALL_DEPENDENCIES is the primary mechanism.
Then use DBMS_METADATA.GET_DDL to extract the DDL statements. For small data volumes, I'd extract the constraints separately for applying after the data load.
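A sketch of the DBMS_METADATA part, run from SQL*Plus (constraint generation is suppressed here so constraints can be extracted and applied separately after the data load):

SET LONG 1000000 PAGESIZE 0 LINESIZE 200
-- suppress constraints in the generated DDL; extract them separately later
EXEC DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM, 'CONSTRAINTS', FALSE);
EXEC DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM, 'REF_CONSTRAINTS', FALSE);
SPOOL tables_ddl.sql
SELECT DBMS_METADATA.GET_DDL('TABLE', table_name) FROM user_tables ORDER BY table_name;
SPOOL OFF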
In current versions you can create external tables to unload data from regular tables into OS files (and obviously go the other way round). But if you've got exotic datatypes (BLOB, RAW, XMLTYPEs, User Defined Types....) it will be more challenging.
I suggest that you use the Oracle standard export and import utilities (exp/imp) here; is there a reason why you won't consider them? Note in addition that you can use the "indexfile" option on the import to output the SQL statements (unfortunately this doesn't include the inserts) to a file instead of actually executing them.
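For example (connection details and file names are hypothetical):

exp scott/tiger file=scott.dmp owner=scott
imp scott/tiger file=scott.dmp indexfile=scott_ddl.sql full=y

The second command writes DDL from the dump file into scott_ddl.sql instead of performing the import.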

Oracle to flat file

I need to create a flat file and push information into it from an Oracle database using JSP.
I require some sample code. Help will be appreciated.
If you're looking for an easy way to write different SQL statements to a file, use this procedure: http://www.oracle-developer.net/content/utilities/data_dump.sql
Also you might want to look into DBMS_XSLPROCESSOR.CLOB2FILE.
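A minimal sketch of DBMS_XSLPROCESSOR.CLOB2FILE (EXPORT_DIR is an assumed directory object):

DECLARE
    l_clob CLOB := 'EMPNO,ENAME' || CHR(10) || '7369,SMITH';
BEGIN
    -- write the CLOB's contents out to a file in the EXPORT_DIR directory object
    DBMS_XSLPROCESSOR.CLOB2FILE(l_clob, 'EXPORT_DIR', 'sample.csv');
END;
/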
I think you need to look into Oracle external tables. These are flat files that appear as tables in the Oracle database. You would simply insert data into it using SQL (as per any other database table). Google "Oracle External Tables" for more information.
