Better Package for File Compression using PL/SQL? - oracle

I have an xlsx file, sample.xlsx, stored in a remote directory; it is around 1,699 KB in size.
I have tried two popular PL/SQL packages (UTL_COMPRESS and AS_ZIP) that compress it into gzip and zip, respectively.
With the code below using AS_ZIP, I have compressed the file to 1,619 KB:
declare
  g_zipped_blob blob;
  l_file_name   varchar2(100) := 'sample.xlsx';
  l_directory   varchar2(100) := 'EXT_TAB_DATA';
begin
  as_zip.add1file( g_zipped_blob, l_file_name, as_zip.file2blob( l_directory, l_file_name ) );
  as_zip.finish_zip( g_zipped_blob );
  as_zip.save_zip( g_zipped_blob, l_directory, 'my2.zip' );
  dbms_lob.freetemporary( g_zipped_blob );
end;
With the code below (taken from the original post) using UTL_COMPRESS, I have compressed the file to 1,618 KB:
DECLARE
  in_filename VARCHAR2(100) := 'sample.xlsx';
  l_directory VARCHAR2(100) := 'EXT_TAB_DATA';
  src_file    BFILE;
  v_content   BLOB;
  v_blob_len  INTEGER;
  v_file      utl_file.file_type;
  v_buffer    RAW(32767);
  v_amount    BINARY_INTEGER := 32767;
  v_pos       INTEGER := 1;
BEGIN
  src_file := bfilename(l_directory, in_filename);
  dbms_lob.fileopen(src_file, dbms_lob.file_readonly);
  v_content  := utl_compress.lz_compress(src_file, 9);
  v_blob_len := dbms_lob.getlength(v_content);
  v_file := utl_file.fopen(l_directory, in_filename || '.gz', 'wb');
  WHILE v_pos < v_blob_len LOOP
    dbms_lob.read(v_content, v_amount, v_pos, v_buffer);
    utl_file.put_raw(v_file, v_buffer, TRUE);
    v_pos := v_pos + v_amount;
  END LOOP;
  utl_file.fclose(v_file);
EXCEPTION
  WHEN OTHERS THEN
    IF utl_file.is_open(v_file) THEN
      utl_file.fclose(v_file);
    END IF;
    RAISE;
END;
Although the difference is minimal, it seems that UTL_COMPRESS compresses slightly better in terms of file size.
I was wondering whether there is some unseen advantage to using the custom AS_ZIP over the Oracle-supplied UTL_COMPRESS?
Thank you.

Anton Scheffer explains why he wrote the AS_ZIP package in this blog post. It should answer your question. Basically, it's to support additional zip formats.
It also has a link to a more recent version of the package than the one in your post.
As for which one to use, my standard line is always to use the Oracle built-in functionality unless we really need something extra from a third-party offering.
Using Oracle's standard functionality means:
Oracle Support covers us
we don't have to maintain the code
our code base is that much simpler for new joiners
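If it helps tip the balance, the decompression side is also built in: UTL_COMPRESS.LZ_UNCOMPRESS reverses LZ_COMPRESS. Here is a minimal round-trip sketch, assuming the same EXT_TAB_DATA directory and sample.xlsx from your question (untested, just to illustrate the API):
DECLARE
  l_src        BFILE := BFILENAME('EXT_TAB_DATA', 'sample.xlsx');
  l_compressed BLOB;
  l_restored   BLOB;
BEGIN
  DBMS_LOB.fileopen(l_src, DBMS_LOB.file_readonly);
  -- compress the source file, then reverse it in memory
  l_compressed := UTL_COMPRESS.lz_compress(l_src, 9);
  l_restored   := UTL_COMPRESS.lz_uncompress(l_compressed);
  DBMS_OUTPUT.put_line('original   : ' || DBMS_LOB.getlength(l_src));
  DBMS_OUTPUT.put_line('compressed : ' || DBMS_LOB.getlength(l_compressed));
  DBMS_OUTPUT.put_line('restored   : ' || DBMS_LOB.getlength(l_restored));
  DBMS_LOB.fileclose(l_src);
END;
/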

Related

How to upload a regular file (e.g. cwallet.sso) to DATA_PUMP_DIR in Oracle DB?

Currently I am doing this to upload cwallet.sso (i.e. a "normal" file, not an export, etc.) to an Oracle Autonomous Database:
BEGIN
  DBMS_CLOUD.GET_OBJECT(
    object_uri     => 'https://objectstorage.us-ashburn-1.oraclecloud.com/p/Uasdfadsfasdf7Icmer6HMkv/n/sadf/b/paul/o/cwallet.sso',
    directory_name => 'DATA_PUMP_DIR');
END;
/
I would prefer not to rely on object store, as I have the cwallet.sso locally and the upload seems an unnecessary additional step. Is there a straightforward PL/SQL command to just upload the file from a local location to DATA_PUMP_DIR (or any directory, really)? I couldn't quite tell from the docs.
Autonomous Database does offer access to "directories" and "files." Under the covers, these are implemented as a virtual filesystem with the storage coming from your database, so it is charged to you as database quota.
It's a little awkward, but you can get files into this filesystem with a PL/SQL procedure if you're able to load your input into a BLOB:
PROCEDURE write_file(
  directory_name IN VARCHAR2,
  file_name      IN VARCHAR2,
  contents       IN BLOB
)
IS
  l_file     UTL_FILE.file_type;
  l_data_len INTEGER;
  l_buffer   RAW(32000);
  l_pos      INTEGER := 1;
  l_amount   INTEGER := 32000;
BEGIN
  -- Get the length of the data to write
  l_data_len := DBMS_LOB.getlength(contents);
  -- Write the contents to the local file
  l_file := UTL_FILE.fopen(directory_name, file_name, 'wb', l_amount);
  WHILE l_pos < l_data_len
  LOOP
    DBMS_LOB.read(contents, l_amount, l_pos, l_buffer);
    UTL_FILE.put_raw(l_file, l_buffer, TRUE);
    l_pos := l_pos + l_amount;
  END LOOP;
  UTL_FILE.fclose(l_file);
EXCEPTION
  WHEN OTHERS THEN
    -- only close the handle if it was actually opened, so we don't mask the original error
    IF UTL_FILE.is_open(l_file) THEN
      UTL_FILE.fclose(l_file);
    END IF;
    RAISE;
END write_file;
How you get your data into a BLOB depends on your client.
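For example, assuming the write_file procedure above is compiled in your schema, a throwaway test call could look like this (the file name and content are placeholders; in practice the contents BLOB would be whatever your client lets you bind or build, such as a decoded base64 upload):
DECLARE
  l_raw      RAW(100) := UTL_RAW.cast_to_raw('hello from write_file');
  l_contents BLOB;
BEGIN
  -- build a small BLOB in place of a real client upload
  DBMS_LOB.createtemporary(l_contents, TRUE);
  DBMS_LOB.writeappend(l_contents, UTL_RAW.length(l_raw), l_raw);
  write_file(directory_name => 'DATA_PUMP_DIR',
             file_name      => 'write_file_test.txt',
             contents       => l_contents);
  DBMS_LOB.freetemporary(l_contents);
END;
/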

Returning a single large clob from a restful service in Oracle APEX

Is there any way to return a clob as text (or file) without splitting into smaller pieces first?
I tried creating a GET handler with PL/SQL source that would just do this:
declare
  txt clob;
begin
  ... -- set clob
  htp.p(txt);
end;
But then I was getting the error ORA-06502: PL/SQL: numeric or value error.
I then tried cutting the clob in smaller segments and calling htp.p multiple times, which worked, but I was wondering if there was a way to send the whole thing in one go.
The doc states that htp.p and htp.prn take only VARCHAR2, so you're limited by the maximum size of a VARCHAR2; if the CLOB length exceeds that, it will throw an error. This is what you can do:
Loop through the CLOB in 4K chunks and output each chunk using htp.prn. Avoid using htp.p in a loop because it generates a newline character, which could mess up the output, for example if you're generating JSON. It's also good practice to let the browser know what it's getting by setting the MIME header.
DECLARE
  l_clob CLOB;
  l_amt  INTEGER := 4000;
  l_pos  INTEGER := 1;
  l_buf  VARCHAR2(4000);
BEGIN
  owa_util.mime_header('text/html', true);
  l_clob := '....';
  LOOP
    BEGIN
      dbms_lob.read(l_clob, l_amt, l_pos, l_buf);
      l_pos := l_pos + l_amt;
      -- need htp.prn since htp.p generates a newline char at the end
      htp.prn(l_buf);
    EXCEPTION
      WHEN no_data_found THEN
        EXIT;
    END;
  END LOOP;
END;

Update oracle reports from 6i to 10g

I run Oracle Reports 6i with the code below:
IF :Global.Report_id IN ('XB_RFMODSM_DESA') THEN
  Add_Parameter(pl_login1, 'P_MONTH', TEXT_PARAMETER, :BLK_REPORT.BILL_CYCLE_CODE);
  Add_Parameter(pl_login1, 'LOCATION_CODE', TEXT_PARAMETER, vvc_location);
  Add_Parameter(pl_login1, 'FEEDER_NO1', TEXT_PARAMETER, vch_feeder);
  Rep_id := LTRIM(RTRIM(:Global.Report_id));
  Run_Product(REPORTS, Rep_id, SYNCHRONOUS, RUNTIME, FILESYSTEM, pl_login1, NULL);
END IF;
Now I want to run it in Oracle Reports 10g. What changes do I need to make?
Thanks
The usual way of doing that is to use WEB.SHOW_DOCUMENT.
Here's a (slightly formatted) copy/paste (in case the link gets broken) of an example Sarah posted on OTN forums. See if it helps.
DECLARE
  repid       REPORT_OBJECT;
  v_rep       VARCHAR2(100);
  rep_status  VARCHAR2(20);
  plid        ParamList;
  vParamValue NUMBER;
BEGIN
  plid := Get_Parameter_List('tmp');
  IF NOT Id_Null(plid) THEN
    Destroy_Parameter_List(plid);
  END IF;
  plid := Create_Parameter_List('tmp');
  Add_Parameter(plid, 'p_parameter', TEXT_PARAMETER, TO_CHAR(:block.item));
  Add_Parameter(plid, 'PARAMFORM', TEXT_PARAMETER, 'NO');
  repid := FIND_REPORT_OBJECT('REPORT6');
  SET_REPORT_OBJECT_PROPERTY(repid, REPORT_COMM_MODE, SYNCHRONOUS);
  SET_REPORT_OBJECT_PROPERTY(repid, REPORT_DESTYPE, CACHE);
  SET_REPORT_OBJECT_PROPERTY(repid, REPORT_DESFORMAT, 'PDF');
  SET_REPORT_OBJECT_PROPERTY(repid, REPORT_OTHER, 'paramform=no');
  v_rep := RUN_REPORT_OBJECT(repid, plid);
  rep_status := REPORT_OBJECT_STATUS(v_rep);
  WHILE rep_status IN ('RUNNING', 'OPENING_REPORT', 'ENQUEUED')
  LOOP
    rep_status := REPORT_OBJECT_STATUS(v_rep);
  END LOOP;
  /* Display report in the browser */
  WEB.SHOW_DOCUMENT('http://Machine_name:Port/reports/rwservlet/getjobid' ||
                    SUBSTR(v_rep, INSTR(v_rep, '_', -1) + 1) || '?' ||
                    'server=Report_server_name&P_parameter=' || :block.item ||
                    '&paramform=no');
END;
As an alternative to what #Littlefoot pointed out, you may use the Report Conversion tool of Fusion Middleware. After processing with that tool, Rp2Rro.pll should be attached through the form's attached libraries.
Then you may add an ad-hoc procedure such as:
Procedure Pr_Print_Rp2Rro(
  Rep_id       in out varchar2,
  i_param_name varchar2,
  i_param_var  varchar2,
  i_param_frm  varchar2, -- 'Yes','No'
  i_repsrv     varchar2,
  i_desname    varchar2,
  i_destype    varchar2 default 'FILE',
  i_desformat  varchar2 default 'PDF'
) Is
  pl_login1      ParamList;
  arr_param_name owa.vc_arr;
  arr_param_var  owa.vc_arr;
Begin
  pl_login1 := Get_Parameter_List('REPPARAM');
  if not Id_Null(pl_login1) then
    Destroy_Parameter_List('REPPARAM');
  end if;
  pl_login1 := Create_Parameter_List('REPPARAM');
  Add_Parameter(pl_login1, 'PARAMFORM',          Text_Parameter, i_param_frm);
  Add_Parameter(pl_login1, 'RP2RROREPORTSERVER', Text_Parameter, i_repsrv);
  Add_Parameter(pl_login1, 'RP2RRODESTYPE',      Text_Parameter, i_destype);
  Add_Parameter(pl_login1, 'RP2RRODESNAME',      Text_Parameter, i_desname);
  Add_Parameter(pl_login1, 'RP2RRODESFORMAT',    Text_Parameter, i_desformat);
  -- split the pipe-delimited parameter names and values into arrays
  for i in 1 .. 100 loop
    arr_param_name(i) := substr(i_param_name, instr(i_param_name, '|', 1, i) + 1,
                                instr(i_param_name, '|', 1, 1 + i) - instr(i_param_name, '|', 1, i) - 1);
    arr_param_var(i)  := substr(i_param_var, instr(i_param_var, '|', 1, i) + 1,
                                instr(i_param_var, '|', 1, 1 + i) - instr(i_param_var, '|', 1, i) - 1);
    if length(arr_param_name(i)) > 0 then
      Add_Parameter(pl_login1, arr_param_name(i), Text_Parameter, arr_param_var(i));
    end if;
  end loop;
  Rep_id := ltrim(rtrim(:Global.Report_id));
  Rp2rro.Rp2rro_Run_Product(Reports, Rep_id, Synchronous, Runtime, Filesystem, pl_login1, null);
End;
This could be called (from a button, for example) like:
declare
  vvc_location tabFeederDesign.vvc_location%type;
  vch_feeder   tabFeederDesign.vch_feeder%type;
  v_Rep_id     varchar2(500) := 'Rep123';
  v_server     varchar2(500) := 'mySrv';
  v_file       varchar2(500) := 'file456';
begin
  Pr_Print_Rp2Rro(v_Rep_id,
                  '|P_MONTH|LOCATION_CODE|FEEDER_NO1|',
                  '|' || :BLK_REPORT.BILL_CYCLE_CODE || '|' || vvc_location || '|' || vch_feeder || '|',
                  'No',
                  v_server,
                  v_file);
end;

File corruption with UTL_FILE script

I have definitely searched FAR and WIDE for an answer to this, but I can't find anything! I am using a UTL_FILE script to pull down some file BLOBs from an Oracle table and save them to a file directory. It works for a lot of the files, but by process of elimination I have narrowed it down to files that have an "unconventional" (albeit still valid) file name: those files are getting corrupted in the transfer. They may have only been 30 KB originally, but they export as 5 KB and cannot be opened, so I know it's not a large-file-size issue. The files open just fine through the application, have a valid MIME type, and would otherwise open fine on a file system, but UTL_FILE doesn't seem to like them. They are files that have an extra "." in them (e.g. john.smith.doc), a pound sign (e.g. Smith #12345.doc), parentheses, etc.
I cannot change the source file names in the Oracle table, but I have been concatenating an ID number onto them when saving them out so I can reference it as a key for an ETL load into a SQL file table later. Maybe I also need to write a complicated REGEXP to rename the files on the fly and strip out the bad characters, but I'm not sure that will work because I don't know at what point UTL_FILE is choking on them. If it's at the source, then that won't help.
Has anyone else encountered this problem? Here is my script:
DECLARE
  CURSOR C1 IS
    Select FILE_ID || '---' || substr(DOCUMENTLOCATION, 1, instr(DOCUMENTLOCATION, '.') - 1) || '.doc' as FILE_NAME,
           FILE_BLOB, FILE_ID
      From DOCUMENTS d inner join CASEJOURNAL c on d.FILE_ID = c.JOURNALENTRYID
     where (JOURNAL_ENTRY_TYPE = 117 or JOURNAL_ENTRY_TYPE = 3)
       AND c.DOCUMENTLOCATION Is Not Null
       AND d.MIME_TYPE = 'application/msword'
       AND FILE_ID BETWEEN 785 AND 3380;
  l_file     UTL_FILE.FILE_TYPE;
  l_buffer   RAW(32000);
  l_amount   INTEGER := 32000;
  l_pos      INTEGER := 1;
  l_blob     BLOB;
  l_blob_len INTEGER;
  l_filename varchar2(255);
BEGIN
  --Select BLOB file into variables
  FOR I in C1
  LOOP
    Select FILE_ID || '---' || substr(DOCUMENTLOCATION, 1, instr(DOCUMENTLOCATION, '.') - 1) || '.doc' as FILE_NAME,
           FILE_BLOB
      INTO l_filename, l_blob
      From DOCUMENTS d inner join CASEJOURNAL c on d.FILE_ID = c.JOURNALENTRYID
     where (JOURNAL_ENTRY_TYPE = 117 or JOURNAL_ENTRY_TYPE = 3)
       AND c.DOCUMENTLOCATION Is Not Null
       AND d.MIME_TYPE = 'application/msword'
       and d.FILE_ID = I.FILE_ID;
    -- Define the output directory
    l_file := UTL_FILE.FOPEN('\\myfiledirectory', l_filename, 'wb', 32000);
    l_pos := 1;
    l_amount := 32000;
    --Get length of BLOB file and save to variable.
    l_blob_len := DBMS_LOB.getlength(l_blob);
    -- Write the data to the file
    --If small enough for single write:
    IF l_blob_len < 32000 THEN
      UTL_FILE.PUT_RAW(l_file, l_blob);
      UTL_FILE.FFLUSH(l_file);
    --Write in pieces if larger than 32k
    ELSE
      l_pos := 1;
      WHILE l_pos < l_blob_len AND l_amount > 0
      LOOP
        DBMS_LOB.read(l_blob, l_amount, l_pos, l_buffer);
        UTL_FILE.PUT_RAW(l_file, l_buffer);
        UTL_FILE.FFLUSH(l_file);
        --Set start position for next write
        l_pos := l_pos + l_amount;
        --Set end position if less than 32k.
        l_blob_len := l_blob_len - l_amount;
        IF l_blob_len < 32000 THEN
          l_amount := l_blob_len;
        END IF;
      END LOOP;
    END IF;
    UTL_FILE.FCLOSE(l_file);
  END LOOP;
END;
The file name isn't going to affect how the bytes are written out once the file has been opened. You seem to be truncating the file if it's more than 32k. Your loop does this:
WHILE l_pos < l_blob_len AND l_amount > 0
LOOP
... but then you change both l_pos and l_blob_len within the loop; once the adjusted l_pos catches up with the reduced l_blob_len you exit the loop, too early. You don't need to adjust l_blob_len, or even adjust l_amount - that is the maximum number of bytes to read, and it doesn't matter if it's higher than what's left.
So change the loop to:
WHILE l_pos < l_blob_len AND l_amount > 0
LOOP
  DBMS_LOB.read(l_blob, l_amount, l_pos, l_buffer);
  UTL_FILE.PUT_RAW(l_file, l_buffer);
  UTL_FILE.FFLUSH(l_file);
  --Set start position for next write
  l_pos := l_pos + l_amount;
END LOOP;
Not really related to your problem, but you don't need to reselect the data inside your cursor loop. You've already got the values you need in your i cursor variable, so you can do:
FOR i IN c1
LOOP
  l_filename := i.file_name;
  l_blob     := i.file_blob;
  -- Define the output directory
  ...
Or don't bother with the l_filename and l_blob local variables at all; since you only refer to them inside the cursor loop anyway, use i.file_name and i.file_blob directly everywhere, e.g.
l_file := UTL_FILE.FOPEN('\\myfiledirectory',i.file_name,'wb',32000);
l_blob_len := DBMS_LOB.getlength(i.file_blob);
etc.
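Putting both suggestions together, the whole block could be trimmed down to something like this (just a sketch reusing the names and query from your post, not tested against your data):
DECLARE
  CURSOR c1 IS
    SELECT file_id || '---' ||
           substr(documentlocation, 1, instr(documentlocation, '.') - 1) || '.doc' AS file_name,
           file_blob, file_id
      FROM documents d
           INNER JOIN casejournal c ON d.file_id = c.journalentryid
     WHERE (journal_entry_type = 117 OR journal_entry_type = 3)
       AND c.documentlocation IS NOT NULL
       AND d.mime_type = 'application/msword'
       AND file_id BETWEEN 785 AND 3380;
  l_file     UTL_FILE.file_type;
  l_buffer   RAW(32000);
  l_amount   INTEGER;
  l_pos      INTEGER;
  l_blob_len INTEGER;
BEGIN
  FOR i IN c1 LOOP
    l_file     := UTL_FILE.fopen('\\myfiledirectory', i.file_name, 'wb', 32000);
    l_blob_len := DBMS_LOB.getlength(i.file_blob);
    -- reset for every file; l_amount is only an upper bound per read
    l_pos    := 1;
    l_amount := 32000;
    WHILE l_pos < l_blob_len AND l_amount > 0 LOOP
      DBMS_LOB.read(i.file_blob, l_amount, l_pos, l_buffer);
      UTL_FILE.put_raw(l_file, l_buffer);
      UTL_FILE.fflush(l_file);
      l_pos := l_pos + l_amount;
    END LOOP;
    UTL_FILE.fclose(l_file);
  END LOOP;
END;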

Error when decoding base 64 to blob

I am using the following function to convert a large base64-encoded file (image or voice) into a BLOB and store it in the Oracle database (Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - Production).
I am able to store it and retrieve it, but the image is getting corrupted: only a portion of the image is retrieved. I tried using small images (11 KB) and it works fine, but for larger images (88 KB to 700 KB) only a portion of the image comes back.
The problem is with the base64 decoding. Earlier I could not retrieve even the smaller image without corruption, but when I increased the buffer size it came through fine. Now the buffer size is at its maximum of 32767, as that is the limit for varchar2 and raw.
Can anyone provide a suitable workaround or solution?
function decode_base64(p_clob_in in clob) return blob is
  v_blob           blob;
  v_result         blob;
  v_offset         integer;
  v_buffer_size    binary_integer := 32767; -- 24, 48, 3072
  v_buffer_varchar varchar2(32767);
  v_buffer_raw     raw(32767);
begin
  if p_clob_in is null then
    return null;
  end if;
  dbms_lob.createtemporary(v_blob, true);
  v_offset := 1;
  for i in 1 .. ceil(dbms_lob.getlength(p_clob_in) / v_buffer_size)
  loop
    dbms_lob.read(p_clob_in, v_buffer_size, v_offset, v_buffer_varchar);
    v_buffer_raw := utl_raw.cast_to_raw(v_buffer_varchar);
    v_buffer_raw := utl_encode.base64_decode(v_buffer_raw);
    dbms_lob.writeappend(v_blob, utl_raw.length(v_buffer_raw), v_buffer_raw);
    v_offset := v_offset + v_buffer_size;
  end loop;
  v_result := v_blob;
  dbms_lob.freetemporary(v_blob);
  return v_result;
end decode_base64;
The code that I use to call the function and insert the BLOB into the table is given below.
PROCEDURE create_notes (
  p_task_id        IN  NUMBER
 ,p_note_title     IN  VARCHAR2
 ,p_note_detail    IN  VARCHAR2
 ,p_attach_name    IN  VARCHAR2
 ,p_attachment     IN  CLOB
 ,p_attach_type    IN  VARCHAR2
 ,x_return_code    OUT VARCHAR2
 ,x_return_message OUT VARCHAR2
)
IS
  l_blob_data BLOB;
BEGIN
  .
  .
  .
  IF p_attachment IS NOT NULL THEN
    SELECT incident_id INTO l_pk1_value FROM csf_ct_tasks WHERE task_id = p_task_id;
    l_blob_data := xx_utl_base64.decode_base64(p_attachment);
    INSERT INTO fnd_lobs
      (file_id, file_name, file_content_type, upload_date,
       expiration_date, program_name, program_tag, file_data,
       LANGUAGE, oracle_charset, file_format)
    VALUES
      (l_media_id, p_attach_name, p_attach_type, -- 'audio/mpeg','application/pdf','image/jpeg'
       SYSDATE,
       NULL, 'FNDATTCH', NULL, l_blob_data, -- l_blob_data, EMPTY_BLOB()
       'US', 'UTF8', 'binary')
    RETURNING file_data INTO x_blob;
    COMMIT;
  END IF;
Attaching the original picture and its decoded version, below.
I got the code below from the net. It worked like a charm. I don't know what the problem with my old code was, though.
FUNCTION base64decode(p_clob CLOB)
  RETURN BLOB
IS
  l_blob   BLOB;
  l_raw    RAW(32767);
  l_amt    NUMBER := 7700;
  l_offset NUMBER := 1;
  l_temp   VARCHAR2(32767);
BEGIN
  BEGIN
    DBMS_LOB.createtemporary(l_blob, FALSE, DBMS_LOB.CALL);
    LOOP
      DBMS_LOB.read(p_clob, l_amt, l_offset, l_temp);
      l_offset := l_offset + l_amt;
      l_raw := UTL_ENCODE.base64_decode(UTL_RAW.cast_to_raw(l_temp));
      DBMS_LOB.append(l_blob, TO_BLOB(l_raw));
    END LOOP;
  EXCEPTION
    WHEN NO_DATA_FOUND THEN
      NULL;
  END;
  RETURN l_blob;
END;
I tried your function with a v_buffer_size of 8192 and it worked fine. I've tried several numbers smaller than 32767 and they all worked fine, so try something less than that.
For those who are still looking for a correct solution: you need to decode the input data in multiples of 4 characters. If the input contains non-base64 symbols (which are silently ignored by the built-in utl_encode.base64_decode), decoding arbitrary chunk sizes can give incorrect results on large files.
I've found a lot of samples on the web that do not decode correctly, so I'm posting my code below.
FUNCTION base64_decode(p_content CLOB) RETURN BLOB
IS
  C_CHUNK_SIZE             CONSTANT INTEGER := 12000; -- should be a multiple of 4
  C_NON_BASE64_SYM_PATTERN CONSTANT VARCHAR2(20) := '[^A-Za-z0-9+/]';
  l_chunk_buf     VARCHAR2(12000);
  l_chunk_b64_buf RAW(9000);
  l_chunk_offset  INTEGER := 1;
  l_chunk_size    INTEGER;
  l_res           BLOB;

  FUNCTION get_next_full_base64_chunk(l_data         CLOB,
                                      p_cur_pos      IN OUT INTEGER,
                                      p_desired_size INTEGER,
                                      p_cur_size     IN OUT INTEGER) RETURN VARCHAR2
  IS
    l_res               VARCHAR2(12000);
    l_tail_desired_size INTEGER;
  BEGIN
    l_res := dbms_lob.substr(l_data, p_desired_size, p_cur_pos);
    p_cur_pos := p_cur_pos + p_desired_size;
    IF l_res IS NULL THEN
      RETURN NULL;
    END IF;
    -- strip whitespace, newlines and any other non-base64 characters
    l_res := regexp_replace(l_res, C_NON_BASE64_SYM_PATTERN, '');
    p_cur_size := p_cur_size + length(l_res);
    -- keep reading until the accumulated chunk length is a multiple of 4
    l_tail_desired_size := 4 - mod(p_cur_size, 4);
    IF l_tail_desired_size = 4 THEN
      RETURN l_res;
    ELSE
      RETURN l_res || get_next_full_base64_chunk(l_data, p_cur_pos, l_tail_desired_size, p_cur_size);
    END IF;
  END;
BEGIN
  dbms_lob.createtemporary(l_res, false);
  WHILE true
  LOOP
    l_chunk_size := 0;
    l_chunk_buf := get_next_full_base64_chunk(p_content, l_chunk_offset, C_CHUNK_SIZE, l_chunk_size);
    EXIT WHEN l_chunk_buf IS NULL;
    l_chunk_b64_buf := utl_encode.base64_decode(utl_raw.cast_to_raw(l_chunk_buf));
    dbms_lob.writeappend(l_res, utl_raw.length(l_chunk_b64_buf), l_chunk_b64_buf);
  END LOOP;
  RETURN l_res;
END;
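A quick way to try it out (the MY_DOCS table and its columns are made up here for illustration; any CLOB holding base64 text will do):
DECLARE
  l_b64  CLOB;
  l_blob BLOB;
BEGIN
  -- fetch the base64 text, decode it, and check the resulting size
  SELECT payload_b64 INTO l_b64 FROM my_docs WHERE id = 1;
  l_blob := base64_decode(l_b64);
  DBMS_OUTPUT.put_line('decoded bytes: ' || DBMS_LOB.getlength(l_blob));
  UPDATE my_docs SET payload = l_blob WHERE id = 1;
  COMMIT;
END;
/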
