Generating CSV file from Oracle DB

I am trying to generate a CSV file from my Oracle DB and send it as an attachment via mail. I have an issue with the generated CSV file: certain values from the table are not read properly. For example, the value in the description column is written to the CSV file in two different cells instead of a single cell.
A value ends up split across two cells instead of sitting in one cell like the rest of the values. Here 'Testing pls ignore mails' should be treated as a single value and should be in the same cell.
I also noticed that the value is stored in the table across multiple lines.
Could this be because my code is unable to handle a value that spans multiple lines?
Here is my code for creating the CSV file. The field FC_ED_DESC is causing the problem, and I include l_clob in the attachment section of the mailing code I have written.
DECLARE
l_clob CLOB;
l_attach_text VARCHAR2 (32767);
l_attach_text_h VARCHAR2 (32767);
FC_SV_STATUS_DESC VARCHAR2(200) := 'open';
-- select query from table to get the values needed
CURSOR c1 IS
select FC_ED_RECORD_ID,
       to_char(FC_ED_UPLOADTIME,'DD.MM.YY') FC_ED_UPLOADTIME,
       FC_ED_USER_ID,
       FC_ED_BROKER,
       FC_ED_ACT_NUM,
       FC_ED_POLICY_NUM,
       FC_ED_AMOUNT,
       FC_ED_TRANS_TYPE,
       FC_ED_CURRENCY,
       to_char(FC_ED_DUE_DATE,'DD.MM.YY') FC_ED_DUE_DATE,
       FC_ED_SENDER_NAME,
       FC_ED_DESC,
       to_char(FC_ED_CREDIT_DATE,'DD.MM.YY') FC_ED_CREDIT_DATE
from MYTABLE
where FC_ED_EXPCASH_STATUS = 1;
BEGIN
-- csv file columns are these
l_attach_text_h :=
'FC_ED_RECORD_ID ,FC_ED_UPLOADTIME ,FC_ED_USER_ID ,FC_ED_BROKER ,FC_ED_ACT_NUM ,FC_ED_POLICY_NUM ,FC_ED_AMOUNT ,FC_SV_STATUS_DESC ,FC_ED_TRANS_TYPE ,FC_ED_CURRENCY ,FC_ED_DUE_DATE ,FC_ED_SENDER_NAME ,FC_ED_DESC ,FC_ED_CREDIT_DATE';
FOR employee_rec in c1
LOOP
DBMS_OUTPUT.put_line('Before loop COUNT Boss ...'||c1%ROWCOUNT);
-- each value is read using loop
l_attach_text :=
employee_rec.FC_ED_RECORD_ID ||','||
employee_rec.FC_ED_UPLOADTIME ||','||
employee_rec.FC_ED_USER_ID ||','||
employee_rec.FC_ED_BROKER ||','||
employee_rec.FC_ED_ACT_NUM ||','||
employee_rec.FC_ED_POLICY_NUM ||','||
employee_rec.FC_ED_AMOUNT ||','||
FC_SV_STATUS_DESC ||','||
employee_rec.FC_ED_TRANS_TYPE ||','||
employee_rec.FC_ED_CURRENCY ||','||
employee_rec.FC_ED_DUE_DATE ||','||
employee_rec.FC_ED_SENDER_NAME ||','||
employee_rec.FC_ED_DESC ||','||
employee_rec.FC_ED_CREDIT_DATE ||chr(13);
l_clob := l_clob || l_attach_text;
END LOOP;
-- adding values
l_clob := l_attach_text_h ||chr(13)|| l_clob;
-- ... (mail-sending code that attaches l_clob goes here)
END;
Can anyone please help me with this?

The problem is that you are not surrounding your fields with double quotes. Anything that contains a carriage return will be read as a new record in your CSV file.
Since you are using SQL Developer, the easiest way to generate a CSV file is to use the /*csv*/ hint in your query. This will properly quote all the fields.
Put this code into SQL Developer and execute it using F5 to run it in Script mode, and you should get properly quoted CSV output in the Script Output window.
SELECT /*csv*/
FC_ED_RECORD_ID,
TO_CHAR (FC_ED_UPLOADTIME, 'DD.MM.YY') FC_ED_UPLOADTIME,
FC_ED_USER_ID,
FC_ED_BROKER,
FC_ED_ACT_NUM,
FC_ED_POLICY_NUM,
FC_ED_AMOUNT,
FC_ED_TRANS_TYPE,
FC_ED_CURRENCY,
TO_CHAR (FC_ED_DUE_DATE, 'DD.MM.YY') FC_ED_DUE_DATE,
FC_ED_SENDER_NAME,
FC_ED_DESC,
TO_CHAR (FC_ED_CREDIT_DATE, 'DD.MM.YY') FC_ED_CREDIT_DATE
FROM MYTABLE
WHERE FC_ED_EXPCASH_STATUS = 1;
Another option is to just run the query you want in SQL Developer, highlight the results you want to export (or press Control + A), right-click, select Export, and choose CSV as the Format.
Update
To fix your existing code, put double quotes around each of the fields you are combining into your l_attach_text variable. See example below.
l_attach_text := '"' ||
employee_rec.FC_ED_RECORD_ID ||'","'||
employee_rec.FC_ED_UPLOADTIME ||'","'||
employee_rec.FC_ED_USER_ID ||'","'||
employee_rec.FC_ED_BROKER ||'","'||
-- ...(the rest of your fields here)
employee_rec.FC_ED_DESC ||'","'||
employee_rec.FC_ED_CREDIT_DATE || '"' || chr(13);
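If the data can also contain double quotes, plain wrapping is not quite enough: an embedded quote has to be doubled to keep the CSV valid. A minimal sketch of a helper you could declare in the DECLARE section of the block (the name csv_field is just for illustration, it is not part of the original code):
-- hypothetical helper: quotes a value and doubles any embedded double quotes
FUNCTION csv_field (p_value IN VARCHAR2) RETURN VARCHAR2 IS
BEGIN
RETURN '"' || REPLACE (p_value, '"', '""') || '"';
END csv_field;
Each field then becomes csv_field(employee_rec.FC_ED_RECORD_ID) || ',' || csv_field(employee_rec.FC_ED_DESC) || ..., which keeps the quoting logic in one place.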

Related

How to Convert Oracle Stored Procedure Output to CSV file

I have seen many examples where you can convert a SQL Query output to a CSV file by calling it from a stored procedure as shown in PL/SQL code below. However, in my case, I have a stored procedure called Management with 3 input parameters. Once these inputs are confirmed, the stored procedure extracts the result with 10 columns based on the code in the PL/SQL.
My aim is to convert this extract from the Management stored procedure to a CSV. The examples I found so far are all to do with converting SQL query only to CSV by calling it from a stored procedure.
I'm trying to avoid writing the CSV-conversion code inside the Management stored procedure itself; instead I'd like to call Management from another stored procedure that does the conversion. Is this possible?
The Management stored procedure is called from SQL Developer, and the values are returned by a select query over two tables with multiple CASE and IF statements.
An example will be much appreciated.
CREATE OR REPLACE PROCEDURE export_to_csv
IS
v_file UTL_FILE.file_type;
v_string VARCHAR2 (4000);
CURSOR c_emp
IS
SELECT empno,
ename,
deptno,
sal,
comm
FROM emp;
BEGIN
v_file :=
UTL_FILE.fopen ('CSVDIR',
'empdata.csv',
'w',
1000);
-- if you do not want heading then remove below two lines
v_string := 'Emp Code, Emp Name, Dept, Salary, Commission';
UTL_FILE.put_line (v_file, v_string);
FOR cur IN c_emp
LOOP
v_string :=
cur.empno
|| ','
|| cur.ename
|| ','
|| cur.deptno
|| ','
|| cur.sal
|| ','
|| cur.comm;
UTL_FILE.put_line (v_file, v_string);
END LOOP;
UTL_FILE.fclose (v_file);
EXCEPTION
WHEN OTHERS
THEN
IF UTL_FILE.is_open (v_file)
THEN
UTL_FILE.fclose (v_file);
END IF;
END;
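For this example to run, the CSVDIR directory object has to exist and point to a path on the database server that the database can write to; a minimal setup sketch (the path and grantee below are placeholders, not from the original question):
-- run as a privileged user; adjust the path and grantee to your environment
CREATE OR REPLACE DIRECTORY csvdir AS '/u01/app/oracle/csv_out';
GRANT READ, WRITE ON DIRECTORY csvdir TO my_app_user;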
So the Management SP returns values, and you want to use those values to create the CSV file with this new SP (export_to_csv)? If that is the case, then:
Create one package spec and declare both SPs (Management and export_to_csv) in it.
Then, in the package body, copy the logic of the Management SP and, at the end of it, call your new SP (export_to_csv); a skeleton is sketched below.
If you already have a package, use that package directly instead of creating a new one.
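A rough skeleton of that layout, assuming Management takes three input parameters (the package and parameter names below are placeholders):
CREATE OR REPLACE PACKAGE mgmt_csv_pkg
IS
PROCEDURE management (p_in1 IN VARCHAR2, p_in2 IN VARCHAR2, p_in3 IN VARCHAR2);
PROCEDURE export_to_csv;
END mgmt_csv_pkg;
/
CREATE OR REPLACE PACKAGE BODY mgmt_csv_pkg
IS
PROCEDURE export_to_csv
IS
BEGIN
NULL; -- UTL_FILE logic from the example above goes here
END export_to_csv;
PROCEDURE management (p_in1 IN VARCHAR2, p_in2 IN VARCHAR2, p_in3 IN VARCHAR2)
IS
BEGIN
NULL; -- existing Management logic goes here
export_to_csv; -- write the result to CSV as the last step
END management;
END mgmt_csv_pkg;
/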

UNIQUE NAMES IN ORACLE PL/SQL

I have a procedure that converts the result of a specific query into a CSV file, which is saved in a directory on the server. My question is: is there any way I can generate a unique name for the file every time I save it?
try this (SQL*Plus / SQL Developer script mode)
-- not tested: capture the generated name into a substitution variable first
column file_nm new_value file_nm
select to_char( sysdate,'yyyymmddhh24miss' ) || '.txt' file_nm from dual;
spool &file_nm
-- run query
spool off
If you create a SEQUENCE, you can increment the sequence each time you go to write your file.
DECLARE
l_filename VARCHAR2 (200);
BEGIN
SELECT 'somename_' || seq_name.NEXTVAL INTO l_filename FROM DUAL;
END;
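Either approach can be combined with UTL_FILE; a minimal sketch, assuming a directory object CSVDIR and a sequence seq_name already exist (both names are placeholders; referencing NEXTVAL directly in a PL/SQL expression needs 11g or later, otherwise select it from DUAL as above):
DECLARE
l_filename VARCHAR2 (200);
l_file UTL_FILE.file_type;
BEGIN
-- timestamp plus sequence keeps the name unique even within the same second
l_filename := 'export_' || TO_CHAR (SYSDATE, 'yyyymmddhh24miss') || '_' || seq_name.NEXTVAL || '.csv';
l_file := UTL_FILE.fopen ('CSVDIR', l_filename, 'w', 32767);
UTL_FILE.put_line (l_file, 'your data here');
UTL_FILE.fclose (l_file);
END;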

how to export data from around 300 tables in ORACLE DB to csv or txt files

Is there any possibility to export data from around 300 tables within a single schema, with millions of records, to CSV or TXT files using a PL/SQL procedure?
What do you propose? Which is the fastest way to do it? For the moment I do not need to import these exported files into any other schema...
I tried with Toad, manually exporting table by table...
You can try the following steps:
Write a loop to get the table names.
Use cursors to fetch the data from each table.
Use the SYS.UTL_FILE utilities to write the data to files in any required format.
This is a very high-level solution, but I am sure it will work.
I have created a utility with which you can generate PL/SQL procedures to export data from a table. It takes the following parameters: table name, column names, directory name, and the delimiter. You can generate 50 procedures for 50 tables in no time to export data from Oracle. Check this link: Generate PL/SQL Procedure to export data into CSV
I managed to dynamically go through all the tables, get the column names, and write them to a file. What I am struggling with is how to fetch the data rows from the tables dynamically when I run the query with EXECUTE IMMEDIATE. How should I fetch the data rows and write them to the files?
Here is the code:
DECLARE
p_table VARCHAR2 (100);
l_file UTL_FILE.FILE_TYPE;
l_string VARCHAR2 (10000);
query_string VARCHAR2 (4000);
BEGIN
FOR tab IN (SELECT *
FROM dba_tables
WHERE owner = 'XYZ' AND table_name LIKE 'XYZ%')
LOOP
p_table := tab.table_name;
l_file :=
UTL_FILE.FOPEN ('my_path',
tab.table_name || '.txt',
'w',
10000);
l_string := NULL;
FOR col_he IN (SELECT *
FROM dba_tab_columns
WHERE owner = 'DWHCO' AND table_name = p_table)
LOOP
CASE
WHEN l_string IS NULL
THEN
l_string := col_he.column_name;
ELSE
l_string := l_string || ',' || col_he.column_name;
END CASE;
END LOOP;
UTL_FILE.PUT_LINE (l_file, l_string); --Printng table columns
query_string := 'select ' || l_string || ' from DWHCO.' || p_table;
--Execute immediate query_string into ??????????;
--??????
UTL_FILE.FCLOSE (l_file);
END LOOP;
END;
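One way to fill in the missing piece is DBMS_SQL, which can execute a statement that is only known at run time and fetch every column as VARCHAR2. A rough, untested sketch of a block that could replace the commented-out part inside the loop (it reuses query_string and l_file from the surrounding code):
DECLARE
l_cursor INTEGER;
l_col_cnt INTEGER;
l_cols DBMS_SQL.desc_tab;
l_value VARCHAR2 (4000);
l_line VARCHAR2 (32767);
l_ignore INTEGER;
BEGIN
l_cursor := DBMS_SQL.open_cursor;
DBMS_SQL.parse (l_cursor, query_string, DBMS_SQL.native);
DBMS_SQL.describe_columns (l_cursor, l_col_cnt, l_cols);
-- define every column as VARCHAR2 so the values can simply be concatenated
FOR i IN 1 .. l_col_cnt
LOOP
DBMS_SQL.define_column (l_cursor, i, l_value, 4000);
END LOOP;
l_ignore := DBMS_SQL.execute (l_cursor);
WHILE DBMS_SQL.fetch_rows (l_cursor) > 0
LOOP
l_line := NULL;
FOR i IN 1 .. l_col_cnt
LOOP
DBMS_SQL.column_value (l_cursor, i, l_value);
l_line := l_line || CASE WHEN i > 1 THEN ',' END || l_value;
END LOOP;
UTL_FILE.put_line (l_file, l_line);
END LOOP;
DBMS_SQL.close_cursor (l_cursor);
END;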
The Data Dump procedure is helpful for programmatically exporting many tables to simple formats like CSV.
First, install the package using the above link. The below code creates a directory, cycles through tables, and exports each table as CSV.
create or replace directory temp_dir as 'C:\temp';
begin
for tables in
(
select
owner||'_'||table_name||'.csv' file_name,
'select * from "'||owner||'"."'||table_name||'"' v_sql
from dba_tables
where owner = 'XYZ'
and table_name like 'XYZ%'
order by 1
) loop
data_dump
(
query_in => tables.v_sql,
file_in => tables.file_name,
directory_in => 'TEMP_DIR',
delimiter_in => ',',
header_row_in => true
);
end loop;
end;
/

How to deal with sequence in insert from XMLTable?

I have written a PL/SQL function that takes input in XML format for the
following table:
TABLE: TBL_MEDICAL_CENTER_BILLS
Name Null Type
------------- -------- -------------
MED_RECORDNO NOT NULL NUMBER
MED_EMPID NVARCHAR2(10)
MED_BILL_HEAD NVARCHAR2(20)
MED_DATE DATE
MED_AMOUNT FLOAT(126)
Here is the function code:
FUNCTION save_medical_center_bills (medical_bill_data NVARCHAR2) RETURN CLOB IS
ret clob;
xmlData XMLType;
v_code NUMBER;
v_errm VARCHAR2(100);
BEGIN
xmlData:=XMLType(medical_bill_data);
INSERT INTO TBL_MEDICAL_CENTER_BILLS SELECT x.* FROM XMLTABLE('/medical_center_bill'
PASSING xmlData
COLUMNS MED_RECORDNO NUMBER PATH 'MED_RECORDNO' default null,
MED_EMPID NVARCHAR2(11) PATH 'employee_id',
MED_BILL_HEAD NVARCHAR2(20) PATH 'bill_head' ,
MED_DATE DATE PATH 'effective_date',
MED_AMOUNT FLOAT PATH 'bill_amount'
) x;
ret:=to_char(sql%rowcount);
COMMIT;
RETURN '<result><status affectedRow='||ret||'>success</status></result>';
EXCEPTION
WHEN OTHERS THEN
v_code := SQLCODE;
v_errm := SUBSTR(SQLERRM, 1, 100);
DBMS_OUTPUT.PUT_LINE (v_code || ' ' || v_errm);
-- '<result><status>Error</status> <error_message>'|| 'Error Code:' || v_code || ' ' || 'Error Message:' || v_errm ||'</error_message> </result>';
RETURN '<result><status>Error</status> <error_message>'|| 'Error Message:' || v_errm ||'</error_message> </result>';
END save_medical_center_bills;
However, I want to keep the table's first column MED_RECORDNO as an incrementing sequence (at the moment I am keeping it null, since I don't know how to put the sequence in the XMLTable clause), and the rest of the
inputs [MED_EMPID, MED_BILL_HEAD, MED_DATE, MED_AMOUNT] will be taken from the XML passed to the function.
I created a sequence and a trigger to keep this sequence incremented for that table column MED_RECORDNO:
CREATE SEQUENCE MED_RECORDNO_SEQ;
create or replace TRIGGER MED_RECORDNO_TRIGGER
BEFORE INSERT ON TBL_MEDICAL_CENTER_BILLS FOR EACH ROW
WHEN (new.MED_RECORDNO is null)
DECLARE
v_id TBL_MEDICAL_CENTER_BILLS.MED_RECORDNO%TYPE;
BEGIN
SELECT MED_RECORDNO_seq.nextval INTO v_id FROM DUAL;
:new.MED_RECORDNO := v_id;
END;
As you can see, my XMLTable is inserting 4 column values into a 5-column table, because column MED_RECORDNO will take its value from sequence MED_RECORDNO_SEQ via trigger MED_RECORDNO_TRIGGER.
I don't know how to do this. If you have ever experienced something like this, please share your ideas.
I sort of hinted at this in an earlier answer. You should specify the names of the columns in the table you are inserting into; this is good practice even if you are populating all of them, as it will avoid surprises if the table structure changes (or differs between environments), and it makes it much easier to spot problems like having columns or values in the wrong order.
INSERT INTO TBL_MEDICAL_CENTER_BILLS (MED_EMPID, MED_BILL_HEAD, MED_DATE, MED_AMOUNT)
SELECT x.MED_EMPID, x.MED_BILL_HEAD, x.MED_DATE, x.MED_AMOUNT
FROM XMLTABLE('/medical_center_bill'
PASSING xmlData
COLUMNS MED_EMPID NVARCHAR2(11) PATH 'employee_id',
MED_BILL_HEAD NVARCHAR2(20) PATH 'bill_head' ,
MED_DATE DATE PATH 'effective_date',
MED_AMOUNT FLOAT PATH 'bill_amount'
) x;
The insert you have should actually work (if the column order in the table matches); the trigger will still replace the null value you get from the XMLTable with the sequence value. At least, until you make the MED_RECORDNO column not-null, and you probably want to if it's the primary key.
Incidentally, if you're on 11g or higher your trigger can assign the sequence straight to the NEW pseudorecord:
create or replace TRIGGER MED_RECORDNO_TRIGGER
BEFORE INSERT ON TBL_MEDICAL_CENTER_BILLS
FOR EACH ROW
BEGIN
:new.MED_RECORDNO := MED_RECORDNO_seq.nextval;
END;
The when null check implies you sometimes want to allow a value to be specified; that is a bad idea as manually inserted values can clash with sequence values, either giving you duplicates or a unique/primary key exception.
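If you would rather not depend on the trigger at all, the sequence can also be referenced directly in the select list that feeds the insert, since NEXTVAL is allowed in the top-level select list of an INSERT ... SELECT; a sketch based on the statement above:
INSERT INTO TBL_MEDICAL_CENTER_BILLS (MED_RECORDNO, MED_EMPID, MED_BILL_HEAD, MED_DATE, MED_AMOUNT)
SELECT MED_RECORDNO_SEQ.NEXTVAL, x.MED_EMPID, x.MED_BILL_HEAD, x.MED_DATE, x.MED_AMOUNT
FROM XMLTABLE('/medical_center_bill'
PASSING xmlData
COLUMNS MED_EMPID NVARCHAR2(11) PATH 'employee_id',
MED_BILL_HEAD NVARCHAR2(20) PATH 'bill_head',
MED_DATE DATE PATH 'effective_date',
MED_AMOUNT FLOAT PATH 'bill_amount'
) x;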

How do I convert row into CLOB in the applied trigger after update?

The idea is, I want to clone the record as a CLOB when it is updated.
Why do it in such a way?
There are two different applications, A1 and A2, where A2 depends on A1.
Based on A1 values, calculations are made for A2 values.
The A2 process runs just once per day to calculate the values, but for A1 every field in the TABLE_NAME in question can be altered several times a day, and there is no history.
The aim is to automatically create a history, stored as a CLOB field in a table NEW_TABLE.
Sorry for my English, but if something is not understandable I can rewrite the question.
My Code Here:
CREATE or REPLACE TRIGGER TRIGGER_NAME
AFTER UPDATE
ON TABLE_NAME
FOR EACH ROW
DECLARE
row_record NEW_TABLE%rowtype;
c_xml CLOB;
FUNCTION GetXML(a_tablela varchar2, a_key_1 varchar2, a_key_2 varchar2)
RETURN CLOB
is
x_xml CLOB;
BEGIN
select dbms_xmlgen.getxml('select * from '||a_tablela||' where key_1 = '''||a_key_1||''' and key_2 = '''||a_key_2||'''') into x_xml from dual;
return x_xml;
END;
BEGIN
--** TABLE_NAME Automatically fetches all columns and transforms them to CLOB
c_xml := GetXML('TABLE_NAME', :new.key_1, :new.key_2);
if c_xml is not null then
row_record.TABLE_NAME :=c_xml;
end if;
INSERT INTO NEW_TABLE VALUES row_record;
EXCEPTION
when others then
raise_application_error(-20000,'ERROR: '||to_char(sqlcode));
END;
Now I get this error when I read the record with a SELECT statement:
ORA-04091: table TABLE_NAME is mutating, trigger/function may not see it
How do I convert the row into a CLOB in an AFTER UPDATE trigger?
Thanks.
The reason you can't use a select statement is that you're in the trigger and the table is changing, or 'mutating', as the error says. The only way you can get the data from the row that's being updated here is by using the :new and :old pseudorecords:
:old.column1
:new.column1
:old being the value of the column before the update, :new being the value after the update.
Example:
CREATE or REPLACE TRIGGER TRIGGER_NAME
AFTER UPDATE
ON TABLE_NAME
FOR EACH ROW
DECLARE
l_string VARCHAR2 (4000);
BEGIN
l_string := 'This is the old value for column 1: ' || :old.column1 || '. This is the new value: ' || :new.column1;
dbms_output.put_line(l_string);
END;
You won't be able to use dbms_xmlgen because it uses a select statement, which throws the mutating error exception.
I'm not sure I perfectly understand what you're trying to do, but you should be able to build the CLOB yourself just by concatenating the column names and values together, like this:
CREATE or REPLACE TRIGGER TRIGGER_NAME
AFTER UPDATE
ON TABLE_NAME
FOR EACH ROW
DECLARE
l_clob CLOB;
BEGIN
l_clob := 'Column1 ' || :old.column1 || ', Column2 ' || :old.column2; --For as many columns as are in the table
--Now you have a clob with all the old values, insert it where you want it
END;
And then go from there. If you really want the XML format you can do that yourself as well, just concatenate the strings together.
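A minimal sketch of that idea, assuming TABLE_NAME has the key_1 and key_2 columns from the question and that the CLOB goes into the TABLE_NAME column of NEW_TABLE as in the original trigger (any other columns are placeholders; the XML here is built by hand rather than by dbms_xmlgen):
CREATE OR REPLACE TRIGGER TRIGGER_NAME
AFTER UPDATE ON TABLE_NAME
FOR EACH ROW
DECLARE
l_clob CLOB;
BEGIN
-- build the XML from the :old values; no SELECT against the mutating table is needed
l_clob := '<row>'
|| '<key_1>' || :old.key_1 || '</key_1>'
|| '<key_2>' || :old.key_2 || '</key_2>'
-- ... one element per remaining column
|| '</row>';
INSERT INTO NEW_TABLE (TABLE_NAME) VALUES (l_clob);
END;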
