Reading a CSV file with PL/SQL - Oracle

I have a CSV file and I want to load it in Oracle APEX. When I click Submit, the process should read through the CSV file and return every row that contains a NULL.
I have a table named csvtest with the columns ID, NAME and AGE.
When I upload a CSV file with these fields through a file browse item in Oracle APEX, I want to read through it with PL/SQL, find all the rows that have NULL in the AGE column and return them; if no rows contain NULL, the file should be uploaded successfully.
Here is the code I have so far; I do not yet have a good way of reading the columns:
DECLARE
    F_FILE  UTL_FILE.FILE_TYPE;
    V_LINE  VARCHAR2(1000);
    V_ID    NUMBER(10);
    V_NAME  VARCHAR2(70);
    V_AGE   NUMBER(2);
BEGIN
    -- 'CSV_DIR' is a placeholder for an Oracle directory object pointing at the file's location
    F_FILE := UTL_FILE.FOPEN('CSV_DIR', 'TEMP.CSV', 'R', 32767);
    IF UTL_FILE.IS_OPEN(F_FILE) THEN
        LOOP
            BEGIN
                UTL_FILE.GET_LINE(F_FILE, V_LINE, 32767);
                IF V_LINE IS NULL THEN
                    EXIT;
                END IF;
                V_ID   := REGEXP_SUBSTR(V_LINE, '[^,]+', 1, 1);
                V_NAME := REGEXP_SUBSTR(V_LINE, '[^,]+', 1, 2);
                V_AGE  := REGEXP_SUBSTR(V_LINE, '[^,]+', 1, 3);
                INSERT INTO EMP_DEPT VALUES (V_ID, V_NAME, V_AGE);
                COMMIT;
            EXCEPTION
                WHEN NO_DATA_FOUND THEN
                    EXIT;
            END;
        END LOOP;
    END IF;
    UTL_FILE.FCLOSE(F_FILE);
END;
/

Since you're using APEX, a very easy way is to use the provided APEX_DATA_PARSER package. Just load your CSV into a BLOB (or into any table that has a BLOB column), and then it is a simple query, for example:
select line_number, col001, col002, col003, col004, col005,
       col006, col007, col008, col009, col010
       -- more columns (col011 to col300) can be selected here
  from apex_application_temp_files f,
       table( apex_data_parser.parse(
                  p_content                     => f.blob_content,
                  p_add_headers_row             => 'Y',
                  --
                  p_max_rows                    => 5,
                  p_skip_rows                   => 2,
                  p_csv_col_delimiter           => ';',
                  --
                  p_store_profile_to_collection => 'FILE_PARSER_COLLECTION',
                  p_file_name                   => f.filename ) ) p
 where f.name = :PX_FILE
It can parse CSV, JSON, XML, Excel (XLSX) and so on.
Full docs here
https://docs.oracle.com/en/database/oracle/application-express/19.1/aeapi/PARSE-Function.html#GUID-B815CF74-C469-4F78-9433-643D1339E930
and some more examples on the Oracle APEX blog
https://blogs.oracle.com/apex/super-easy-csv-xlsx-json-or-xml-parsing-about-the-apex_data_parser-package
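For the part of the question about finding the bad rows, the same query can filter on the parsed column directly. For example, assuming AGE really is the third column of the file (col003) and the file was uploaded through the page's file browse item, something along these lines should return the offending rows:

select line_number, col001 as id, col002 as name, col003 as age
  from apex_application_temp_files f,
       table( apex_data_parser.parse(
                  p_content   => f.blob_content,
                  p_skip_rows => 1,              -- skip the header line, if the file has one
                  p_file_name => f.filename ) ) p
 where f.name = :PX_FILE
   and p.col003 is null                          -- rows whose AGE column is empty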

One way to do this is to use APEX_DATA_PARSER API before you save the data to your table.
The PARSE function enables you to parse XML, XLSX, CSV or JSON files and returns a generic table of the following structure:
LINE_NUMBER COL001 COL002 COL003 COL004 ... COL300
Sample code would look something like this:
select line_number, col001, col002, col003, col004, col005, col006, col007, col008
  from table(
           apex_data_parser.parse(
               p_content   => {BLOB containing CSV file},
               p_file_name => 'test.CSV') );
Then you can easily determine the values of the columns based on their positions and decide whether to save the data to the table or not.
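As an untested sketch of that decision logic, reusing the apex_application_temp_files / :PX_FILE pattern from the previous answer, and assuming AGE is the third column (col003) and the target is the CSVTEST table from the question:

declare
    l_null_rows number;
begin
    -- count parsed rows whose AGE column is empty
    select count(*)
      into l_null_rows
      from apex_application_temp_files f,
           table( apex_data_parser.parse(
                      p_content   => f.blob_content,
                      p_skip_rows => 1,          -- skip the header line
                      p_file_name => f.filename ) ) p
     where f.name = :PX_FILE
       and p.col003 is null;

    if l_null_rows > 0 then
        raise_application_error(-20001, l_null_rows || ' row(s) have a NULL AGE');
    end if;

    -- no NULL ages found, so load the file
    insert into csvtest (id, name, age)
    select to_number(p.col001), p.col002, to_number(p.col003)
      from apex_application_temp_files f,
           table( apex_data_parser.parse(
                      p_content   => f.blob_content,
                      p_skip_rows => 1,
                      p_file_name => f.filename ) ) p
     where f.name = :PX_FILE;
end;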

Related

UTL_FILE using append mode creates repeating headers

How do I use "A"ppend mode in the UTL_FILE package but only create one header (not repeating)? Is it possible? I'm appending data, but every time it appends, it creates repeating headers.
My code:
CREATE OR REPLACE PROCEDURE p_test AS
    CURSOR c_test IS
        select blah, blah from dual;
    v_file   UTL_FILE.FILE_TYPE;
    v_header varchar2(25);
BEGIN
    v_file := UTL_FILE.FOPEN(location     => 'my_dir',
                             filename     => 'filetest09102019.csv',
                             open_mode    => 'A',
                             max_linesize => 32767);
    If file exists = 0 then   -- using fgetattr; if I use 1, repeating headers will print
        v_header := 'col1, col2, col3';
        utl_file.put_line(v_file, v_header);
    Else null; end if;        -- unfortunately headers did not print at all when false/0
    FOR cur_rec IN c_test LOOP
        UTL_FILE.PUT_LINE(v_file, data from c_test );
    END LOOP;
    UTL_FILE.FCLOSE(v_file);
EXCEPTION
    WHEN OTHERS THEN
        UTL_FILE.FCLOSE(v_file);
END;
If you are calling this procedure multiple times, you will get the header appended to the file each time the procedure is called.
You could first check whether the file exists before writing the header, e.g. using fgetattr to detect whether the file is about to be appended to (see "Check if a file exists?").
Alternatively, modify your code so that it only calls the procedure once and writes all the data in one go, without appending.
You can solve this with a bit of design. At the moment you have one procedure which opens the file in append mode, writes the header to it, then writes the data to it. What you need is a sub-routine for opening the file, which implements the following logic:
Test whether the file exists (like this one)
If the file doesn't exist, create the file in Write mode, write the header and then close the file
Open the file in Append mode.
Your existing procedure now just calls the routine described above and writes the data to the opened file. Something like this (using borrowed and untested code):
CREATE OR REPLACE PROCEDURE p_test AS
    CURSOR c_test IS
        select blah, blah from dual;
    v_file   UTL_FILE.FILE_TYPE;
    v_header varchar2(25) := 'col1, col2, col3';

    function open_file (p_filename in varchar2
                      , p_dirname  in varchar2
                      , p_header   in varchar2)
        return UTL_FILE.FILE_TYPE
    is
        fh        UTL_FILE.FILE_TYPE;
        l_fexists boolean;
        l_flen    number;
        l_bsize   number;
        l_res     number(1);
    begin
        utl_file.fgetattr(upper(p_dirname), p_filename, l_fexists, l_flen, l_bsize);
        if not l_fexists then
            -- file does not exist yet: create it and write the header once
            fh := UTL_FILE.FOPEN(location     => p_dirname,
                                 filename     => p_filename,
                                 open_mode    => 'W',
                                 max_linesize => 32767);
            utl_file.put_line(fh, p_header);
            utl_file.fclose(fh);
        end if;
        -- (re)open the file in append mode for the data rows
        fh := UTL_FILE.FOPEN(location     => p_dirname,
                             filename     => p_filename,
                             open_mode    => 'A',
                             max_linesize => 32767);
        return fh;
    end open_file;
BEGIN
    v_file := open_file(p_filename => 'filetest09102019.csv',
                        p_dirname  => 'my_dir',
                        p_header   => v_header);
    FOR cur_rec IN c_test LOOP
        UTL_FILE.PUT_LINE(v_file, data from c_test );
    END LOOP;
    UTL_FILE.FCLOSE(v_file);
EXCEPTION
    WHEN OTHERS THEN
        UTL_FILE.FCLOSE(v_file);
END;
Strictly speaking, open_file() doesn't need to be a private function in this example. But in general I think it's good practice to hide low-level stuff in separate sub-programs, because it makes the main body of the code easier to read. Also, it is frequently the case that we'll want to do this in more than one place (for more than one type of file), so it's handy to encapsulate.
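For example, calling the procedure on two consecutive runs should now append data both times but write the header only on the first run (assuming the 'my_dir' directory object from the sketch above exists):

begin
    p_test;   -- file does not exist yet: header and data are written
    p_test;   -- file already exists: only data rows are appended
end;
/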

How to export data from around 300 tables in an Oracle DB to CSV or TXT files

Is there any possibility of exporting data from around 300 tables within a single schema, with millions of records, to CSV or TXT using a PL/SQL procedure?
What do you propose? Which is the fastest way to do it? For the moment I do not need to import these exported files into any other schema...
I tried with Toad, manually exporting table by table...
You can try the following steps:
Write a loop to get the table names.
Use cursors to fetch the data from each table.
Use the SYS.UTL_FILE utilities to write the data to files in any required format.
This is a very high-level solution, but I am sure it will work.
I have created a utility with which you can generate PL/SQL procedures to export data from a table. It takes the following parameters: table name, column names, directory name and the delimiter. You can generate 50 procedures for 50 tables in no time to export data from Oracle. Check this link: Generate PL/SQL Procedure to export data into CSV.
I managed to go through all the tables dynamically, get the column names and write them to a file. The part I am struggling with is how to fetch the data rows dynamically when I EXECUTE IMMEDIATE the query: how should I fetch the data rows and write them to the files?
Here is the code:
DECLARE
    p_table      VARCHAR2(100);
    l_file       UTL_FILE.FILE_TYPE;
    l_string     VARCHAR2(10000);
    query_string VARCHAR2(4000);
BEGIN
    FOR tab IN (SELECT *
                  FROM dba_tables
                 WHERE owner = 'XYZ' AND table_name LIKE 'XYZ%')
    LOOP
        p_table := tab.table_name;
        l_file  := UTL_FILE.FOPEN('my_path',
                                  tab.table_name || '.txt',
                                  'w',
                                  10000);
        l_string := NULL;
        FOR col_he IN (SELECT *
                         FROM dba_tab_columns
                        WHERE owner = 'DWHCO' AND table_name = p_table)
        LOOP
            CASE
                WHEN l_string IS NULL
                THEN
                    l_string := col_he.column_name;
                ELSE
                    l_string := l_string || ',' || col_he.column_name;
            END CASE;
        END LOOP;
        UTL_FILE.PUT_LINE(l_file, l_string);   -- printing the table columns
        query_string := 'select ' || l_string || ' from DWHCO.' || p_table;
        -- Execute immediate query_string into ??????????;
        -- ??????
        UTL_FILE.FCLOSE(l_file);
    END LOOP;
END;
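One way to fill in that missing piece is DBMS_SQL, which can describe and fetch the columns of an arbitrary query as strings. A rough, untested sketch that would slot in where the ?????? placeholders are, reusing query_string and l_file from the block above (numeric and date formatting is left to NLS defaults):

    -- inside the table loop, after writing the header line:
    declare
        l_cursor integer := dbms_sql.open_cursor;
        l_cols   dbms_sql.desc_tab;
        l_colcnt integer;
        l_value  varchar2(4000);
        l_line   varchar2(32767);
        l_rows   integer;
    begin
        dbms_sql.parse(l_cursor, query_string, dbms_sql.native);
        dbms_sql.describe_columns(l_cursor, l_colcnt, l_cols);
        -- fetch every column generically as VARCHAR2
        for i in 1 .. l_colcnt loop
            dbms_sql.define_column(l_cursor, i, l_value, 4000);
        end loop;
        l_rows := dbms_sql.execute(l_cursor);
        while dbms_sql.fetch_rows(l_cursor) > 0 loop
            l_line := null;
            for i in 1 .. l_colcnt loop
                dbms_sql.column_value(l_cursor, i, l_value);
                l_line := l_line || case when i > 1 then ',' end || l_value;
            end loop;
            utl_file.put_line(l_file, l_line);
        end loop;
        dbms_sql.close_cursor(l_cursor);
    end;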
The Data Dump procedure is helpful for programmatically exporting many tables to simple formats like CSV.
First, install the package using the above link. The below code creates a directory, cycles through tables, and exports each table as CSV.
create or replace directory temp_dir as 'C:\temp';

begin
    for tables in
    (
        select owner || '_' || table_name || '.csv'                       file_name,
               'select * from "' || owner || '"."' || table_name || '"'   v_sql
          from dba_tables
         where owner = 'XYZ'
           and table_name like 'XYZ%'
         order by 1
    ) loop
        data_dump
        (
            query_in      => tables.v_sql,
            file_in       => tables.file_name,
            directory_in  => 'TEMP_DIR',
            delimiter_in  => ',',
            header_row_in => true
        );
    end loop;
end;
/

Handling headers in a UTL_FILE upload procedure in Oracle

I am trying to upload a CSV file that has headers into a table using UTL_FILE, but I am facing an issue. With a plain upload using UTL_FILE, the headers are also included and the data ends up recorded in double quotes (which I don't want). If I exclude the headers by using an IF condition as in the code below, with a headerlines variable, it uploads the records without headers but still in double quotes. When I manually remove the headers from the CSV file, it uploads fine without quotes. But the CSV is created anew every day, so the headers can't be removed from the file.
Please suggest a solution that matches the headers between the target table and the CSV and then uploads the CSV file data. Below is my procedure.
create or replace procedure load_file_new(p_FileDir in varchar2, p_FileName in varchar2)
as
    v_FileHandle         utl_file.file_type;
    v_NewLine            varchar2(2000);
    v_a                  varchar2(100);
    v_b                  varchar2(100);
    v_c                  varchar2(100);
    v_d                  varchar2(100);
    p_ignore_headerlines number;
begin
    v_FileHandle := utl_file.fopen(p_FileDir, p_FileName, 'r', 32767);
    p_ignore_headerlines := 1;
    if p_ignore_headerlines > 0
    then
        begin
            for i in 1 .. p_ignore_headerlines
            loop
                utl_file.get_line(v_FileHandle, v_NewLine);
            end loop;
        end;
    end if;
    loop
        begin
            utl_file.get_line(v_FileHandle, v_NewLine);
        exception
            when no_data_found then
                exit;
        end;
        v_a := regexp_substr(v_NewLine, '("[^"]*"|[^,]+)', 1, 1);
        v_b := regexp_substr(v_NewLine, '("[^"]*"|[^,]+)', 1, 2);
        v_c := regexp_substr(v_NewLine, '("[^"]*"|[^,]+)', 1, 3);
        v_d := regexp_substr(v_NewLine, '("[^"]*"|[^,]+)', 1, 4);
        merge into test using dual on (a = v_a)
        when not matched then insert (a, b, c, d)
            values (v_a, v_b, v_c, v_d)
        when matched then update set
            b = v_b, c = v_c, d = v_d where a = v_a;
    end loop;
    utl_file.fclose(v_FileHandle);
    commit;
end load_file_new;
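One way to get rid of the surrounding double quotes (this doesn't address matching headers to table columns, just the quoting) is to trim them from each captured token, right after the four regexp_substr calls inside the loop, for example:

        -- strip optional surrounding double quotes from each parsed value
        v_a := trim('"' from v_a);
        v_b := trim('"' from v_b);
        v_c := trim('"' from v_c);
        v_d := trim('"' from v_d);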

Generate the URL of a Remote File using PL/SQL

I created a Concurrent Program that creates an Excel file from a long, parametrized query using PL/SQL.
Once the program completes successfully, the file is placed in the remote server's directory and is usually around 4 MB in size.
I'm thinking of an approach to notify the requestor and enable him/her to save the file to a local directory.
However, I cannot use UTL_MAIL to attach and send the file via email because of the 32 kilobyte limitation (Does UTL_MAIL have an attachment limit of 32k?).
In the same post, Tom Kyte's preferred approach would be to:
store the attachment in the database.
email a very small email with a link; the link points to my database, using a URL.
With that, I was thinking of taking the same approach and using the block below to notify the requestor and enable him/her to download the said Excel file:
declare
    l_url_link varchar2(100); -- how can I get the URL of the file?
BEGIN
    UTL_MAIL.SEND(sender     => 'xxx#oracle.com'
                , recipients => 'Migs.Isip.23#Gmail.com'
                , subject    => 'Testmail'
                , message    => 'Your File is Ready to be downloaded, click the link here: ' || l_url_link);
END;
My questions would be:
How can I generate the URL of the remote file using PL/SQL?
Do the users need to be granted access to the remote server to download the file?
Thank you!
Oracle Database Version:
Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
PL/SQL Release 11.2.0.4.0 - Production
"CORE 11.2.0.4.0 Production"
TNS for Solaris: Version 11.2.0.4.0 - Production
NLSRTL Version 11.2.0.4.0 - Production
Here is a PL/SQL function I wrote to retrieve the URL of either the concurrent log file or the output file. If you write your Excel file to the concurrent output, this should work fine. Let me know how you get on. I have not checked whether this gives the correct mime-type or extension - not sure how EBS handles that - but the function itself will definitely compile as-is on 12.1.3.
Spec
FUNCTION get_concurrent_url (p_file_type  IN VARCHAR2
                            ,p_request_id IN NUMBER
                            ,p_expiry     IN NUMBER)
    RETURN VARCHAR2;
Body
/* Get a URL to view the log/output.
   File Type is LOG or OUT.
   Request ID is the concurrent request ID.
   Expiry is in minutes. */
FUNCTION get_concurrent_url (p_file_type  IN VARCHAR2
                            ,p_request_id IN NUMBER
                            ,p_expiry     IN NUMBER)
    RETURN VARCHAR2
IS
    CURSOR c_gwyuid
    IS
        SELECT profile_option_value
          FROM fnd_profile_options       FPO
              ,fnd_profile_option_values FPOV
         WHERE FPO.profile_option_name = 'GWYUID'
           AND FPO.application_id      = FPOV.application_id
           AND FPO.profile_option_id   = FPOV.profile_option_id;

    CURSOR c_two_task
    IS
        SELECT profile_option_value
          FROM fnd_profile_options       FPO
              ,fnd_profile_option_values FPOV
         WHERE FPO.profile_option_name = 'TWO_TASK'
           AND FPO.application_id      = FPOV.application_id
           AND FPO.profile_option_id   = FPOV.profile_option_id;

    l_request_id NUMBER;
    l_file_type  VARCHAR2 (3 BYTE);
    l_expiry     NUMBER;
    l_two_task   VARCHAR2 (100 BYTE);
    l_gwyuid     VARCHAR2 (100 BYTE);
    l_url        VARCHAR2 (1024 BYTE);
BEGIN
    l_request_id := p_request_id;
    l_file_type  := p_file_type;
    l_expiry     := p_expiry;

    FOR i IN c_gwyuid LOOP
        l_gwyuid := i.profile_option_value;
    END LOOP;

    FOR i IN c_two_task LOOP
        l_two_task := i.profile_option_value;
    END LOOP;

    IF l_file_type = 'LOG' THEN
        l_url := fnd_webfile.get_url
                     (file_type   => fnd_webfile.request_log
                     ,id          => l_request_id
                     ,gwyuid      => l_gwyuid
                     ,two_task    => l_two_task
                     ,expire_time => l_expiry);
    ELSE
        l_url := fnd_webfile.get_url
                     (file_type   => fnd_webfile.request_out
                     ,id          => l_request_id
                     ,gwyuid      => l_gwyuid
                     ,two_task    => l_two_task
                     ,expire_time => l_expiry);
    END IF;

    RETURN l_url;
END get_concurrent_url;
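Tying that back to the block in the question, the returned URL can be dropped straight into the mail body. An untested usage sketch, assuming the function is compiled into a package that is in scope, the Excel file is written as the request's output file, and FND_GLOBAL.CONC_REQUEST_ID supplies the request ID inside the running program:

declare
    l_url_link varchar2(1024);
begin
    -- 'OUT' = concurrent output file, link valid for 60 minutes
    l_url_link := get_concurrent_url(p_file_type  => 'OUT',
                                     p_request_id => fnd_global.conc_request_id,
                                     p_expiry     => 60);
    UTL_MAIL.SEND(sender     => 'xxx#oracle.com'
                , recipients => 'Migs.Isip.23#Gmail.com'
                , subject    => 'Testmail'
                , message    => 'Your file is ready to be downloaded, click the link here: ' || l_url_link);
end;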
I was able to find a solution for this using a (slightly different) method, the FND_GFM file uploader package in Oracle EBS.
FND_GFM is a package usually used in Oracle EBS when uploading files from the front-end application pages.
First, generate the Excel file (xlsx) using the code from the previous post: Create an Excel File (.xlsx) using PL/SQL.
Then the file is inserted into FND_LOBS and removed from the OS (for good housekeeping), and finally a link to it is sent by email using UTL_MAIL:
procedure generate_and_send_excel
is
    l_content   varchar2(250);
    l_file_url  varchar2(4000);
    l_directory varchar2(250);
    l_filename  varchar2(250);
    l_title     varchar2(250);   -- heading shown in the e-mail body
    l_message   clob;
    l_instance  varchar2(100);
    l_ebs_url   varchar2(100);
begin
    /* your excel generation code here */
    l_content   := 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet';
    l_directory := 'EXT_TAB_DATA';
    l_filename  := 'report.xlsx';

    select instance_name
      into l_instance
      from v$instance;

    select home_url
      into l_ebs_url
      from icx_parameters;

    IMPORT_TO_LOB (p_file_name    => l_filename    -- this is the actual filename of the saved OS file
                 , p_directory    => l_directory   -- should be a defined directory in the database
                 , p_content_type => l_content     -- standard for Excel files
                 , p_program_name => 'your prog here'
                 , p_program_tag  => 'your prog here'
                 , p_file_url     => l_file_url);  -- this will be the generated URL of your file

    utl_file.fremove(l_directory, l_filename);

    l_message := l_message || '<h2 style="color: #5e9ca0;">' || l_title || '</h2>';
    l_message := l_message || '<h3 style="color: #2e6c80;">Report is Ready for Download: ' || l_filename || '</h3>';
    l_message := l_message || '<p>File was generated on ' || sysdate || ' from ' || l_instance || '</p>';
    l_message := l_message || '<strong>Regards,</strong><br/><strong>Sample Team</strong>';
    l_message := l_message || '<br/>Sample#sample.com';

    UTL_MAIL.SEND(sender     => 'SAMPLE#SAMPLE.com'
                , recipients => 'Migs.Isip.23#gmail.com'
                , subject    => 'Hello message'
                , message    => l_message
                , mime_type  => 'text/html; charset=us-ascii');
end generate_and_send_excel;
The procedure below inserts the file into FND_LOBS (there is no seeded API available for this):
Procedure IMPORT_TO_LOB (p_file_name    IN FND_LOBS.FILE_NAME%TYPE
                       , p_directory    IN dba_directories.directory_name%type
                       , p_content_type IN FND_LOBS.file_content_type%type
                       , p_program_name IN FND_LOBS.program_name%type
                       , p_program_tag  IN FND_LOBS.program_tag%type
                       , p_language     IN FND_LOBS.language%type    default 'US'
                       , p_file_format  IN FND_LOBS.file_format%type default 'binary'
                       , p_file_url     OUT varchar2)
IS
    PRAGMA AUTONOMOUS_TRANSACTION;
    lBlob         BLOB;
    lFile         BFILE := BFILENAME(p_directory, p_file_name);
    l_ora_charset VARCHAR2(100);
    p_count       NUMBER;
BEGIN
    SELECT value
      INTO l_ora_charset
      FROM nls_database_parameters
     WHERE parameter = 'NLS_CHARACTERSET';

    insert into FND_LOBS
    (
        file_id
      , file_name
      , file_content_type
      , file_data
      , upload_date
      , expiration_date
      , program_name
      , program_tag
      , LANGUAGE
      , oracle_charset
      , file_format
    )
    values
    (
        fnd_lobs_s.NEXTVAL   -- FILE_ID
      , p_file_name          -- FILE_NAME
      , p_content_type       -- FILE_CONTENT_TYPE
      , EMPTY_BLOB()         -- FILE_DATA
      , sysdate              -- UPLOAD_DATE
      , NULL                 -- EXPIRATION_DATE
      , p_program_name       -- PROGRAM_NAME
      , p_program_tag        -- PROGRAM_TAG
      , p_language           -- LANGUAGE
      , l_ora_charset        -- ORACLE_CHARSET
      , p_file_format        -- FILE_FORMAT
    )
    RETURNING file_data INTO lBlob;

    DBMS_LOB.OPEN(lFile, DBMS_LOB.LOB_READONLY);
    DBMS_LOB.OPEN(lBlob, DBMS_LOB.LOB_READWRITE);
    DBMS_LOB.LOADFROMFILE(DEST_LOB => lBlob,
                          SRC_LOB  => lFile,
                          AMOUNT   => DBMS_LOB.GETLENGTH(lFile));
    DBMS_LOB.CLOSE(lFile);
    DBMS_LOB.CLOSE(lBlob);

    commit;

    p_file_url := fnd_gfm.construct_download_url(fnd_web_config.gfm_agent, fnd_lobs_s.currval);
END IMPORT_TO_LOB;
Note that this is an AUTONOMOUS_TRANSACTION so it needs to be committed before returning to the calling package/block.
Hope that Helps!

Oracle PLSQL: Help Required: Parsing String Collection or Concatenated String

Whenever the length of the string l_long_string is above 4000 characters, the following code throws an error:
ORA-01460: unimplemented or unreasonable conversion requested
Instead of the nested regexp_substr query, when I try to use
SELECT column_value
FROM TABLE(l_string_coll)
it throws:
ORA-22905: cannot access rows from a non-nested table item
How can I modify the dynamic query?
Notes:
- l_string_coll is of type DBMS_SQL.VARCHAR2S and comes as input to my procedure (here I have just shown it as an anonymous block).
- I'll have to manage without creating a user-defined type in the DB schema, so I am using the built-in DBMS_SQL.VARCHAR2S.
- This is not the actual business procedure, but it is close to it. (I can't post the original.)
- The dynamic query has to stay, since I use it to build the actual query with the session, the current application schema name, etc.
/*
CREATE TABLE some_other_table
    (word_id NUMBER(10), word_code VARCHAR2(30), word VARCHAR2(255));

INSERT INTO some_other_table VALUES (1, 'A', 'AB');
INSERT INTO some_other_table VALUES (2, 'B', 'BC');
INSERT INTO some_other_table VALUES (3, 'C', 'CD');
INSERT INTO some_other_table VALUES (4, 'D', 'DE');
COMMIT;
*/
DECLARE
    l_word_count  NUMBER(10)      := 0;
    l_counter     NUMBER(10)      := 0;
    l_long_string VARCHAR2(30000) := NULL;
    l_dyn_query   VARCHAR2(30000) := NULL;
    l_string_coll DBMS_SQL.VARCHAR2S;
BEGIN
    -- l_string_coll of type DBMS_SQL.VARCHAR2S comes as input to the procedure
    FOR i IN 1 .. 4100
    LOOP
        l_counter := l_counter + 1;
        l_string_coll(l_counter) := 'AB';
    END LOOP;

    -- The input collection is concatenated into a CSV string
    FOR i IN l_string_coll.FIRST .. l_string_coll.LAST
    LOOP
        l_long_string := l_long_string || l_string_coll(i) || ', ';
    END LOOP;
    l_long_string := TRIM(',' FROM TRIM(l_long_string));

    dbms_output.put_line('Length of l_long_string = ' || LENGTH(l_long_string));

    /*
    Some other tasks in PL/SQL done successfully using the concatenated string l_long_string
    */

    l_dyn_query := ' SELECT COUNT(*)
                       FROM some_other_table
                      WHERE word IN ( SELECT TRIM(REGEXP_SUBSTR(str, ''[^,]+'', 1, LEVEL)) word
                                        FROM ( SELECT :string str FROM SYS.DUAL )
                                     CONNECT BY TRIM(REGEXP_SUBSTR(str, ''[^,]+'', 1, LEVEL)) IS NOT NULL )';
                   -- WHERE word IN ( SELECT column_value FROM TABLE(l_string_coll) )';

    EXECUTE IMMEDIATE l_dyn_query INTO l_word_count USING l_long_string;

    dbms_output.put_line('Word Count = ' || l_word_count);
EXCEPTION
    WHEN OTHERS
    THEN
        dbms_output.put_line('SQLERRM = ' || SQLERRM);
        dbms_output.put_line('FORMAT_ERROR_BACKTRACE = ' || dbms_utility.format_error_backtrace);
END;
/
How can I modify the dynamic query?
First of all, based on the code you've provided, there is absolutely no need to use dynamic SQL (native or DBMS_SQL) at all.
Secondly, SQL cannot operate on "strings" that are greater than 4K bytes in length (Oracle versions prior to 12c), or 32K bytes (Oracle 12cR1 and up, if the MAX_STRING_SIZE initialization parameter is set to EXTENDED).
PL/SQL, on the other hand, allows you to work with varchar2() character strings that are greater than 4K bytes (up to 32K bytes) in length. If you just need to count words in a comma-separated string, you can simply use the regexp_count() regular expression function (Oracle 11gR1 and up) as follows:
set serveroutput on;
set feedback off;
clear screen;

declare
    l_str        varchar2(100) := 'aaa,bb,ccc,yyy';
    l_numOfWords number;
begin
    l_numOfWords := regexp_count(l_str, '[^,]+');
    dbms_output.put('Number of words: ');
    dbms_output.put_line(to_char(l_numOfWords));
end;
Result:
Number of words: 4
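The same approach works for strings well past the 4,000-byte SQL limit, because regexp_count() is evaluated entirely in PL/SQL. For instance, against a string built the same way as in the question (just over 12,000 characters):

declare
    l_long_string varchar2(30000);
begin
    -- build a 4100-word comma-separated string, as in the question
    for i in 1 .. 4100 loop
        l_long_string := l_long_string || 'AB,';
    end loop;
    l_long_string := rtrim(l_long_string, ',');

    dbms_output.put_line('Length = ' || length(l_long_string));
    dbms_output.put_line('Number of words: ' || regexp_count(l_long_string, '[^,]+'));
end;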
