Oracle: how to export a query to a text/csv file

I was wondering how to go about exporting a query from PL/SQL to a text or CSV file. The query I have in mind returns a huge amount of data (about 1 GB), so I'd also like the data split across multiple files:
out1.csv
out2.csv
out3.csv
I'd like to be able to decide how many files to split it across.
Anyone have any idea how to do this?

Use UTL_FILE.
A well-known discussion of this topic (probably the most complete one) can be found at Ask Tom. Note that many of the examples there date back to Oracle 8, so there may be better ways to do it in your version of Oracle.
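As a rough sketch of how UTL_FILE can also cover the split-into-N-files requirement (the directory OUT_DIR, the emp query, and the round-robin distribution below are my own assumptions, not something from the Ask Tom thread):
DECLARE
   c_files CONSTANT PLS_INTEGER := 3;   -- how many files to split across
   TYPE t_handles IS TABLE OF UTL_FILE.file_type INDEX BY PLS_INTEGER;
   l_files t_handles;
   l_row   PLS_INTEGER := 0;
BEGIN
   -- OUT_DIR must be an existing Oracle directory object you can write to
   FOR i IN 1 .. c_files LOOP
      l_files(i) := UTL_FILE.fopen('OUT_DIR', 'out' || i || '.csv', 'w', 32767);
   END LOOP;
   FOR rec IN (SELECT empno, ename FROM emp) LOOP   -- replace with your own query
      l_row := l_row + 1;
      UTL_FILE.put_line(l_files(MOD(l_row, c_files) + 1),
                        rec.empno || ',' || rec.ename);
   END LOOP;
   FOR i IN 1 .. c_files LOOP
      UTL_FILE.fclose(l_files(i));
   END LOOP;
END;
/
MOD() deals the rows out round-robin; splitting by row ranges would work just as well.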

Try this.
First create the directory MYDIR:
create or replace directory MYDIR as 'F:/DATA/';
Grant the necessary permissions on MYDIR (as the SYS user), then execute this procedure:
CREATE OR REPLACE PROCEDURE export_to_csv(refcur OUT sys_refcursor) IS
   v_file   UTL_FILE.file_type;
   v_string VARCHAR2(4000);
   CURSOR c_emp IS
      SELECT ROLE_ID, ROLE_DESC FROM role_mst;
BEGIN
   -- the ref cursor is handed back to the caller; the explicit cursor drives the file write
   OPEN refcur FOR
      SELECT ROLE_ID, ROLE_DESC FROM role_mst;
   v_file := UTL_FILE.fopen('MYDIR', 'empdata.csv', 'w', 1000);
   -- if you do not want a heading, remove the next two lines
   v_string := 'Role ID, Role Desc';
   UTL_FILE.put_line(v_file, v_string);
   FOR cur IN c_emp LOOP
      v_string := cur.ROLE_ID || ',' || cur.ROLE_DESC;
      UTL_FILE.put_line(v_file, v_string);
   END LOOP;
   UTL_FILE.fclose(v_file);
EXCEPTION
   WHEN OTHERS THEN
      dbms_output.put_line(sqlerrm);
      IF UTL_FILE.is_open(v_file) THEN
         UTL_FILE.fclose(v_file);
      END IF;
END;
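Assuming the procedure above compiles, a minimal call from an anonymous block could look like this (the caller simply discards the returned ref cursor here):
DECLARE
   l_cur SYS_REFCURSOR;
BEGIN
   export_to_csv(l_cur);  -- writes empdata.csv into MYDIR
   CLOSE l_cur;           -- the OUT ref cursor comes back open; close it if you don't need it
END;
/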

How to Convert Oracle Stored Procedure Output to CSV file

I have seen many examples where you can convert a SQL Query output to a CSV file by calling it from a stored procedure as shown in PL/SQL code below. However, in my case, I have a stored procedure called Management with 3 input parameters. Once these inputs are confirmed, the stored procedure extracts the result with 10 columns based on the code in the PL/SQL.
My aim is to convert this extract from the Management stored procedure to a CSV. The examples I found so far are all to do with converting SQL query only to CSV by calling it from a stored procedure.
I'm trying to avoid writing the CSV-conversion code inside the Management stored procedure itself; instead, I'd like to call it from another stored procedure. Is this possible?
The Management stored procedure is called from SQL Developer, and the values are returned by a SELECT query on two tables combined with multiple CASE and IF statements.
An example will be much appreciated.
CREATE OR REPLACE PROCEDURE export_to_csv
IS
   v_file   UTL_FILE.file_type;
   v_string VARCHAR2 (4000);
   CURSOR c_emp IS
      SELECT empno, ename, deptno, sal, comm
        FROM emp;
BEGIN
   v_file := UTL_FILE.fopen ('CSVDIR', 'empdata.csv', 'w', 1000);
   -- if you do not want heading then remove below two lines
   v_string := 'Emp Code, Emp Name, Dept, Salary, Commission';
   UTL_FILE.put_line (v_file, v_string);
   FOR cur IN c_emp LOOP
      v_string := cur.empno || ',' || cur.ename || ',' || cur.deptno || ',' || cur.sal || ',' || cur.comm;
      UTL_FILE.put_line (v_file, v_string);
   END LOOP;
   UTL_FILE.fclose (v_file);
EXCEPTION
   WHEN OTHERS THEN
      IF UTL_FILE.is_open (v_file) THEN
         UTL_FILE.fclose (v_file);
      END IF;
END;
The Management SP will return values and you will use those values to create the CSV file with this new SP (export_to_csv)? If that is the case, then:
Create one package spec and declare 2 SPs in it (Management and export_to_csv).
Then, in the package body, copy the logic of the Management SP and, at the end of Management, call your new SP (export_to_csv).
If you already have a package, use that package directly instead of creating a new one. A sketch of this layout is shown below.
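A hedged sketch of that layout (the package name, the Management signature, and the parameter names below are placeholders, since the real procedure wasn't posted):
CREATE OR REPLACE PACKAGE mgmt_export_pkg AS
   PROCEDURE export_to_csv;
   PROCEDURE management (p_in1 IN VARCHAR2, p_in2 IN VARCHAR2, p_in3 IN VARCHAR2);
END mgmt_export_pkg;
/
CREATE OR REPLACE PACKAGE BODY mgmt_export_pkg AS
   PROCEDURE export_to_csv IS
   BEGIN
      NULL;           -- same UTL_FILE logic as in the procedure above, writing the 10 columns
   END export_to_csv;

   PROCEDURE management (p_in1 IN VARCHAR2, p_in2 IN VARCHAR2, p_in3 IN VARCHAR2) IS
   BEGIN
      NULL;           -- existing Management logic (the two-table query with CASE/IF rules)
      export_to_csv;  -- last step: hand the extracted result over to the CSV writer
   END management;
END mgmt_export_pkg;
/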

I am getting an invalid file operation error even though the file is present on my local system

create or replace directory MYCSV as 'E:\sqlloader\';
grant read, write on directory MYCSV to public;
declare
   F      UTL_FILE.FILE_TYPE;
   V_LINE VARCHAR2(1000);
   V_id   NUMBER(4);
   V_NAME VARCHAR2(10);
   V_risk VARCHAR2(10);
BEGIN
   F := UTL_FILE.FOPEN('MYCSV', 'testfile.csv', 'R');
   IF UTL_FILE.IS_OPEN(F) THEN
      LOOP
         BEGIN
            UTL_FILE.GET_LINE(F, V_LINE, 1000);
            IF V_LINE IS NULL THEN
               EXIT;
            END IF;
            V_id   := REGEXP_SUBSTR(V_LINE, '[^,]+', 1, 1);
            V_NAME := REGEXP_SUBSTR(V_LINE, '[^,]+', 1, 2);
            V_risk := REGEXP_SUBSTR(V_LINE, '[^,]+', 1, 3);
            INSERT INTO loader_tab VALUES (V_id, V_NAME, V_risk);
            COMMIT;
         EXCEPTION
            WHEN NO_DATA_FOUND THEN
               EXIT;
         END;
      END LOOP;
   END IF;
   UTL_FILE.FCLOSE(F);
END;
/
CSV file content; I need to start loading from the 1,a,aa line, i.e. skip the first 4 lines:
portal,,
ex portal,,
,,
i_id,i_name,risk
1,a,aa
2,b,bb
3,c,cc
4,d,dd
5,e,ee
6,f,ff
7,g,gg
8,h,hh
9,i,ii
10,j,jj
I want to load the data from Excel but I am getting an invalid file operation error even though the file is present on my local system. Will someone help with this?
though file is present in my local system
It won't work unless your local system (I presume you mean your own PC) also runs the database into which you're trying to load data. An Oracle directory (in probably 99% of all cases) resides on the database server.
I want to load the data from excel
It won't work either, if that's really an Excel file. The code you posted suggests that it is a comma-separated values file (i.e. plain text), and yes, it should be such a file, not XLSX.
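To check where Oracle will actually look for the file, you can query the directory object's path (it resolves on the database server's file system, not on your PC):
SELECT directory_name, directory_path
  FROM all_directories
 WHERE directory_name = 'MYCSV';
If that path doesn't exist on the server, or the database OS user can't read the file there, FOPEN raises the invalid file operation error.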

How to export data from around 300 tables in an Oracle DB to CSV or TXT files

Is there any possibility to export data from around 300 tables within a single schema, with millions of records, to CSV or TXT using a PL/SQL procedure?
What do you propose as the fastest way to do it? For the moment I do not need to import these exported files into any other schema...
I tried with Toad, manually exporting table by table...
You can try the following steps:
write a loop to get the table names;
use cursors to fetch the data from each table;
use the SYS.UTL_FILE utilities to write the data to files in any required format.
This is a very high-level solution, but I am sure it will work; a rough sketch of steps 2 and 3 is shown below.
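A minimal sketch, assuming a writable directory MYDIR and a single hard-coded query (select * from emp); DBMS_SQL is used so that the columns of whichever table is being exported can be described and fetched at run time:
DECLARE
   l_cursor  INTEGER := DBMS_SQL.open_cursor;
   l_cols    DBMS_SQL.desc_tab;
   l_col_cnt INTEGER;
   l_value   VARCHAR2(4000);
   l_line    VARCHAR2(32767);
   l_status  INTEGER;
   l_file    UTL_FILE.file_type;
BEGIN
   l_file := UTL_FILE.fopen('MYDIR', 'emp.csv', 'w', 32767);
   DBMS_SQL.parse(l_cursor, 'select * from emp', DBMS_SQL.native);
   DBMS_SQL.describe_columns(l_cursor, l_col_cnt, l_cols);
   -- define every column as VARCHAR2 so it can be concatenated into one CSV line
   FOR i IN 1 .. l_col_cnt LOOP
      DBMS_SQL.define_column(l_cursor, i, l_value, 4000);
   END LOOP;
   l_status := DBMS_SQL.execute(l_cursor);
   WHILE DBMS_SQL.fetch_rows(l_cursor) > 0 LOOP
      l_line := NULL;
      FOR i IN 1 .. l_col_cnt LOOP
         DBMS_SQL.column_value(l_cursor, i, l_value);
         l_line := l_line || CASE WHEN i > 1 THEN ',' END || l_value;
      END LOOP;
      UTL_FILE.put_line(l_file, l_line);
   END LOOP;
   DBMS_SQL.close_cursor(l_cursor);
   UTL_FILE.fclose(l_file);
END;
/
Wrapped in a loop over the table names (step 1), the same block covers all 300 tables.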
I have created a utility with which you can generate PL/SQL procedures to export data from a table. It takes the following parameters: table name, column names, directory name, and the delimiter. You can generate 50 procedures for 50 tables in no time to export data from Oracle. Check this link: Generate PL/SQL Procedure to export data into CSV
I managed to dynamically go through all the tables, get the column names and write them to a file. I am struggling with the part where I have to fetch the data rows from the tables dynamically with an EXECUTE IMMEDIATE query. How should I save the data rows, then fetch them and write them to the files?
Here is the code:
DECLARE
   p_table      VARCHAR2(100);
   l_file       UTL_FILE.FILE_TYPE;
   l_string     VARCHAR2(10000);
   query_string VARCHAR2(4000);
BEGIN
   FOR tab IN (SELECT *
                 FROM dba_tables
                WHERE owner = 'XYZ' AND table_name LIKE 'XYZ%')
   LOOP
      p_table := tab.table_name;
      l_file := UTL_FILE.FOPEN('my_path', tab.table_name || '.txt', 'w', 10000);
      l_string := NULL;
      FOR col_he IN (SELECT *
                       FROM dba_tab_columns
                      WHERE owner = 'DWHCO' AND table_name = p_table)
      LOOP
         CASE
            WHEN l_string IS NULL THEN
               l_string := col_he.column_name;
            ELSE
               l_string := l_string || ',' || col_he.column_name;
         END CASE;
      END LOOP;
      UTL_FILE.PUT_LINE(l_file, l_string);  -- printing table columns
      query_string := 'select ' || l_string || ' from DWHCO.' || p_table;
      --Execute immediate query_string into ??????????;
      --??????
      UTL_FILE.FCLOSE(l_file);
   END LOOP;
END;
The Data Dump procedure is helpful for programmatically exporting many tables to simple formats like CSV.
First, install the package using the above link. The below code creates a directory, cycles through tables, and exports each table as CSV.
create or replace directory temp_dir as 'C:\temp';
begin
for tables in
(
select
owner||'_'||table_name||'.csv' file_name,
'select * from "'||owner||'"."'||table_name||'"' v_sql
from dba_tables
where owner = 'XYZ'
and table_name like 'XYZ%'
order by 1
) loop
data_dump
(
query_in => tables.v_sql,
file_in => tables.file_name,
directory_in => 'TEMP_DIR',
delimiter_in => ',',
header_row_in => true
);
end loop;
end;
/

xmltype character string buffer too small

In my stored procedure:
declare
   v_xml xmltype;
begin
   open v_cur for
      Select xmlelement('el', xmlagg(xmlelement('el2'))) from table;
   loop
      fetch v_cur into v_xml; -- line where the error
      -- ..... additional logic to parse v_xml
   end loop;
end;
I'm getting a "character string buffer too small" error when the record to be fetched into v_xml has a length > 4000. Do you guys have any idea on how to go about this? Thanks
If you use xmlagg(), you'll have to add .getclobval() to the surrounding xmlelement(), since the character limit on xmlagg() is 4000. Obviously this means you'll be using CLOBs instead of XMLTYPE, but you have no choice; you can cast back to XMLTYPE later if needed (a small cast-back example follows the code). Example below:
declare
   v_xml clob; -- use a CLOB
begin
   open v_cur for
      Select xmlelement("el", xmlagg(xmlelement("el2", tab_col))).getclobval() from table; -- add .getclobval()
   loop
      fetch v_cur into v_xml; -- line where the error was
      -- ..... additional logic to parse v_xml
   end loop;
end;
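A tiny illustrative block for that cast back; the literal XML here just stands in for the fetched CLOB:
DECLARE
   v_clob CLOB := '<el><el2>42</el2></el>';  -- stand-in for the CLOB fetched above
   v_xml  XMLTYPE;
BEGIN
   v_xml := XMLTYPE(v_clob);                 -- re-parse the CLOB as XMLTYPE when needed
   DBMS_OUTPUT.put_line(v_xml.getstringval());
END;
/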
Maybe you're using an old Oracle version? There have been some limitations in the past. For me it works with 10,000,000 rows:
declare
v_xml xmltype;
begin
select xmlelement("el", xmlagg(xmlelement("el2")))
into v_xml from (select 1 from dual connect by level <= 10000000);
end;

How to redirect the output of DBMS_OUTPUT.PUT_LINE to a file?

I need to debug in PL/SQL to figure out the timings of procedures; I want to use:
SELECT systimestamp FROM dual INTO time_db;
DBMS_OUTPUT.PUT_LINE('time before procedure ' || time_db);
but I don't understand where the output goes and how I can redirect it to a log file that will contain all the data I want to collect.
DBMS_OUTPUT is not the best tool to debug, since most environments don't use it natively. If you want to capture the output of DBMS_OUTPUT however, you would simply use the DBMS_OUTPUT.get_line procedure.
Here is a small example:
SQL> create directory tmp as '/tmp/';
Directory created
SQL> CREATE OR REPLACE PROCEDURE write_log AS
2 l_line VARCHAR2(255);
3 l_done NUMBER;
4 l_file utl_file.file_type;
5 BEGIN
6 l_file := utl_file.fopen('TMP', 'foo.log', 'A');
7 LOOP
8 EXIT WHEN l_done = 1;
9 dbms_output.get_line(l_line, l_done);
10 utl_file.put_line(l_file, l_line);
11 END LOOP;
12 utl_file.fflush(l_file);
13 utl_file.fclose(l_file);
14 END write_log;
15 /
Procedure created
SQL> BEGIN
2 dbms_output.enable(100000);
3 -- write something to DBMS_OUTPUT
4 dbms_output.put_line('this is a test');
5 -- write the content of the buffer to a file
6 write_log;
7 END;
8 /
PL/SQL procedure successfully completed
SQL> host cat /tmp/foo.log
this is a test
As an alternative to writing to a file, how about writing to a table? Instead of calling DBMS_OUTPUT.PUT_LINE you could call your own DEBUG.OUTPUT procedure something like:
procedure output (p_text varchar2) is
   pragma autonomous_transaction;
begin
   if g_debugging then
      insert into debug_messages (username, datetime, text)
      values (user, sysdate, p_text);
      commit;
   end if;
end;
The use of an autonomous transaction allows you to retain debug messages produced from transactions that get rolled back (e.g. after an exception is raised), as would happen if you were using a file.
The g_debugging boolean variable is a package variable that can be defaulted to false and set to true when debug output is required.
Of course, you need to manage that table so that it doesn't grow forever! One way would be a job that runs nightly/weekly and deletes any debug messages that are "old".
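A hedged sketch of the surrounding pieces (the table definition and package name are illustrative; the body of DEBUG.OUTPUT is the procedure shown above):
CREATE TABLE debug_messages (
   username VARCHAR2(128),
   datetime DATE,
   text     VARCHAR2(4000)
);

CREATE OR REPLACE PACKAGE debug AS
   g_debugging BOOLEAN := FALSE;        -- defaulted to false; set to TRUE when debugging
   PROCEDURE output (p_text VARCHAR2);  -- implemented by the procedure above
END debug;
/
-- example of the periodic clean-up the answer mentions (keep a week of messages)
DELETE FROM debug_messages WHERE datetime < SYSDATE - 7;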
use
set serveroutput on;
for example:
set serveroutput on;
DECLARE
x NUMBER;
BEGIN
x := 72600;
dbms_output.put_line('The variable X = '); dbms_output.put_line(x);
END;
If you are just testing your PL/SQL in SQL*Plus you can direct it to a file like this:
spool output.txt
set serveroutput on
declare
  time_db timestamp;
begin
  SELECT systimestamp INTO time_db FROM dual;
  DBMS_OUTPUT.PUT_LINE('time before procedure ' || time_db);
end;
/
spool off
IDEs like Toad and SQL Developer can capture the output in other ways, but I'm not familiar with how.
In addition to Tony's answer, if you are looking to find out where your PL/SQL program is spending its time, it is also worth checking out this part of the Oracle PL/SQL documentation.
Using UTL_FILE instead of DBMS_OUTPUT will redirect output to a file:
http://oreilly.com/catalog/oraclebip/chapter/ch06.html
As a side note, remember that all this output is generated on the server side.
With DBMS_OUTPUT, the text is generated on the server while it executes your query and is stored in a buffer. It is then sent to your client app once the server finishes the query; that is, you only get this information when the query ends.
With UTL_FILE, all the information logged is stored in a file on the server. When the execution finishes you have to go to that file to get the information.
Hope this helps.
It's possible to write a file directly on the DB server that hosts your database, and that file will grow along with the execution of your PL/SQL program.
This uses the Oracle directory TMP_DIR; you have to declare it, and then create the procedure below:
CREATE OR REPLACE PROCEDURE write_log(p_log varchar2)
-- opens the file in append ('A') mode; this requires
-- CREATE OR REPLACE DIRECTORY TMP_DIR as '/directory/where/oracle/can/write/on/DB_server/';
AS
l_file utl_file.file_type;
BEGIN
l_file := utl_file.fopen('TMP_DIR', 'my_output.log', 'A');
utl_file.put_line(l_file, p_log);
utl_file.fflush(l_file);
utl_file.fclose(l_file);
END write_log;
/
Here is how to use it:
1) Launch this from your SQL*PLUS client:
BEGIN
write_log('this is a test');
for i in 1..100 loop
DBMS_LOCK.sleep(1);
write_log('iter=' || i);
end loop;
write_log('test complete');
END;
/
2) on the database server, open a shell and
tail -f -n500 /directory/where/oracle/can/write/on/DB_server/my_output.log
An old thread, but there is another alternative.
Since 9i you can use a pipelined table function.
First, create a type as a table of varchar:
CREATE TYPE t_string_max IS TABLE OF VARCHAR2(32767);
Second, wrap your code in a pipelined function declaration:
CREATE FUNCTION fn_foo (bar VARCHAR2) -- your params
   RETURN t_string_max PIPELINED IS
   -- your vars
BEGIN
   -- your code
   RETURN;  -- a pipelined function still ends with a plain RETURN
END;
/
Replace every DBMS_OUTPUT.PUT_LINE call with PIPE ROW.
Finally, call it like this:
SELECT * FROM TABLE(fn_foo('param'));
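For instance, a minimal self-contained version (the function name and the piped messages are made up):
CREATE OR REPLACE FUNCTION fn_demo RETURN t_string_max PIPELINED IS
BEGIN
   -- wherever you had DBMS_OUTPUT.PUT_LINE('...'), pipe the text as a row instead
   PIPE ROW ('step 1 done');
   PIPE ROW ('step 2 done');
   RETURN;
END fn_demo;
/
SELECT * FROM TABLE(fn_demo);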
Hope it helps.
Try This:
SELECT systimestamp INTO time_db FROM dual ;
DBMS_OUTPUT.PUT_LINE('time before procedure ' || time_db);
