I'm on a remote server without privileges to create a directory, and I have a CLOB column (XML content) that I want to see. I'm using a very old version of PL/SQL Developer (8.0.4) and can't update to a newer one, so with a simple "select X from T" I just get "CLOB" as the result. Searching the Internet I found a PL/SQL solution on AskTOM and tried it:
declare
  my_var varchar2(32000 char); -- tried LONG as well; didn't work either
begin
  for x in (SELECT X from T)
  loop
    my_var := dbms_lob.substr(x.X, 32000, 1);
    dbms_output.put_line(my_var);
  end loop;
end;
But when I try to run it, I get "ORA-20000: ORU-10027: buffer overflow, limit of 10000 bytes".
I tried to increase the limit with DBMS_OUTPUT.ENABLE(32000); but that also fails, with "ORA-06502: PL/SQL: numeric or value error: character string buffer too small". I can only decrease the limit below 10000.
I know I don't have SET SERVEROUTPUT ON, but when I tried to add that line I got another error, "ORA-00922: missing or invalid option". However, if I use 4000 instead of 32000 it works and shows the first 4000 bytes of data, so I don't think that line is the problem.
So I can't print the value, since the variable is too big, and I can't write the text to a file, since I don't have the privileges. Is there any other way to see that variable?
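(As an aside: one possible alternative, sketched under the assumption that the client can show VARCHAR2 columns in its result grid, is to page through the CLOB with dbms_lob.substr directly in SQL, 4000 bytes at a time, which avoids dbms_output and its buffer entirely; T and X are the table and column from the question.)
-- first 4000 bytes, then the next 4000, and so on; adjust or add offsets as needed
select dbms_lob.substr(X, 4000, 1)    as chunk_1,
       dbms_lob.substr(X, 4000, 4001) as chunk_2,
       dbms_lob.substr(X, 4000, 8001) as chunk_3
  from T;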
I'm currently looking at migrating CLOB data from Oracle into Postgres via an external file. I have created my table in Postgres using the TEXT data type, which corresponds to a CLOB in Oracle, and now I just need to get my data in.
So far I have extracted a CLOB column from Oracle into a file, as per the below. It is only 1 CLOB from 1 column, so I'm trying to load the contents of this entire CLOB into 1 column in Postgres.
CREATE TABLE clob_test (
  id       NUMBER,
  clob_col CLOB
);
DECLARE
  c CLOB;
  CURSOR scur IS
    SELECT text
      FROM dba_source
     WHERE rownum < 200001;
BEGIN
  EXECUTE IMMEDIATE 'truncate table clob_test';
  FOR srec IN scur LOOP
    c := c || srec.text;
  END LOOP;
  INSERT INTO clob_test VALUES (1, c);
  COMMIT;
END;
/
DECLARE
  buf CLOB;
BEGIN
  SELECT clob_col
    INTO buf
    FROM clob_test
   WHERE id = 1;
  dbms_advisor.create_file(buf, 'TEST_DIR', 'clob_1.txt');
END;
/
This works fine and generates the clob_1.txt file containing the full contents of the Oracle CLOB column CLOB_COL. Below is an example of the file output; it seems to contain every possible character you can think of, including "~"...
/********** Types and subtypes, do not reorder **********/
type BOOLEAN is (FALSE, TRUE);
type DATE is DATE_BASE;
type NUMBER is NUMBER_BASE;
subtype FLOAT is NUMBER; -- NUMBER(126)
subtype REAL is FLOAT; -- FLOAT(63)
...
...
...
END;
/
My problem now is: how do I get the entire contents of this 1 file into 1 record in Postgres, so that it mirrors exactly how the data was originally stored in 1 record in Oracle?
Effectively what I'm trying to achieve is similar to this; it works, but the formatting is awful and doesn't really mirror how the data was originally stored.
POSTGRES> insert into clob_test select pg_read_file('/home/oracle/clob_1.txt');
I have tried using the COPY command, but I'm having 2 issues. Firstly, if there is a carriage return it sees that as another record and splits the file up; secondly, I can't find a delimiter that isn't already used somewhere in the file. Is there some way I can bypass the delimiter and just tell Postgres to COPY everything from this file without delimiters, since it's only 1 column?
Any help would be great.
Note for other answerers: This is incomplete and will still put the data into multiple records; the question also wants all the data in a single field.
Use COPY ... FROM ... CSV DELIMITER e'\x01' QUOTE e'\x02'. The only thing this can't handle is actual binary data, which, as I understand it, is not permitted in a CLOB (I have never used Oracle myself). This only avoids the delimiter issue; it will still insert the data as one row per line of the input.
I'm not sure how to go about fixing that issue, but you should be aware that it's probably not possible to do this correctly in all cases. The largest field value PG supports is 1 GB, while a CLOB supports up to 4 GB. If you need to correctly import >1 GB CLOBs, the only route available is PG's large object interface.
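As a concrete illustration (only a sketch: the column name is assumed to match the Oracle table, COPY from a server-side file needs superuser or pg_read_server_files rights, and psql's \copy is the client-side alternative):
COPY clob_test (clob_col)
  FROM '/home/oracle/clob_1.txt'
  WITH (FORMAT csv, DELIMITER E'\x01', QUOTE E'\x02');
-- as noted above, this still loads one row per line of the file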
I use SQL Developer to run queries against Oracle databases; the DBMS_OUTPUT buffer size is left at its default (20000) in SQL Developer.
When I run the query against the DEV database, everything is OK.
When I run the same query against the Production database, in the same SQL Developer session, I get this error:
ORA-20000: ORU-10027: buffer overflow, limit of 10000 bytes.
Do you have any idea why? Is there any limitation set on the database side?
You didn't reveal how exactly you call the procedures from the dbms_output package, so I can only guess. In fact I got the same error and found the cause in my case, which might of course be different from yours.
In my case I had a buffer of length 10000 (in PL/SQL Developer, which is not important here) and I called only the dbms_output.put procedure, not dbms_output.put_line. The put procedure keeps filling the buffer and does not flush it, even if the string argument contains a newline character.
Compare these two attempts at writing 1001 ten-character lines (nine visible characters plus a line terminator):
begin
  dbms_output.enable(10000);
  for i in 1..1001 loop
    dbms_output.put('123456789' || chr(10)); -- fails here
  end loop;
end;
vs.
begin
  dbms_output.enable(10000);
  for i in 1..1001 loop
    dbms_output.put_line('123456789'); -- works
  end loop;
end;
The first example fails with ORA-20000: ORU-10027: buffer overflow, limit of 10000 bytes because it tries to put 10010 characters into the buffer.
(The real story from which this minimal example was distilled was printing package source code from the dba_source table. Every line of source code in the text column is terminated by a newline character, which must be trimmed before being passed to put_line - without trimming it would be doubled - while passing the text to the bare put procedure would cause the buffer trouble described above.)
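(For illustration, a minimal sketch of that dba_source case with the trailing newline trimmed before each put_line call; the object name is hypothetical.)
begin
  for r in (select text
              from dba_source
             where name = 'SOME_PACKAGE' -- hypothetical object name
             order by line)
  loop
    -- trim the trailing newline so put_line does not double it
    dbms_output.put_line(rtrim(r.text, chr(10)));
  end loop;
end;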
I have an Oracle PL/SQL routine that takes a BLOB as a parameter. The BLOB contains a .jpg file. I want to assign the BLOB parameter to a local variable, and then insert (or update) a BLOB column in a table with that BLOB variable.
I have tried something like this:
declare
  vATTACHMENT blob;
begin
  dbms_lob.createtemporary(vATTACHMENT, false, dbms_lob.session);
  dbms_lob.write(vATTACHMENT, dbms_lob.lobmaxsize, 1, :pATTACHMENT);
  -- do some stuff
  insert into attachments (attachment, file_name)
  values (vATTACHMENT, vFILE_NAME);
end;
But I get the following error:
ORA-06502: PL/SQL: numeric or value error
ORA-06512: at "SYS.DBMS_LOB", line 811
ORA-06512: at line 21
I have also tried a direct assignment, vATTACHMENT := :pATTACHMENT; but that doesn't work either.
I think you can use DEFAULT in a variable declaration to assign a value to it without using the assignment operator :=, e.g.:
declare
  vATTACHMENT blob DEFAULT :pATTACHMENT;
begin
  -- rest of your code...
end;
I can only get the exact error when pAttachment is null or empty (tested in 10.2.0.5). If it isn't, I get ORA-21560 instead, as it doesn't like lobmaxsize. If I do this instead, it's OK:
dbms_lob.write(vATTACHMENT, dbms_lob.getlength(:pATTACHMENT), 1, :pATTACHMENT);
But from your comments you have some issue with referencing the bind variable more than once, and for some reason I don't quite understand, you can't do assignments in PL/SQL because := is misinterpreted - which makes using PL/SQL at all somewhat impractical, I'd have thought. I'm a bit unclear whether you're running this as an anonymous block directly from your client; if so, maybe you should consider making it a stored procedure to avoid both of those issues? You could then just do:
vATTACHMENT := :pATTACHMENT;
... though then it would be a parameter passed to the procedure rather than a bind variable, and you wouldn't need to bother with the copy in the first place (as Dave Costa suggests).
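(For illustration, a minimal sketch of that stored-procedure approach, reusing the attachments table from the question; the procedure name and the file-name parameter are hypothetical.)
create or replace procedure save_attachment (
  pATTACHMENT in blob,
  pFILE_NAME  in varchar2
) as
begin
  -- the BLOB parameter can be used directly; no temporary copy is needed
  insert into attachments (attachment, file_name)
  values (pATTACHMENT, pFILE_NAME);
end save_attachment;
/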
If you are stuck with running it like this you could incur a context switch and do:
select :pATTACHMENT into vATTACHMENT from dual;
But that's not ideal; if you just want to make a copy, why aren't you using the copy procedures?
dbms_lob.copy(vATTACHMENT, :pATTACHMENT, dbms_lob.getlength(:pATTACHMENT));
... which written like that still breaks your restriction on referencing the bind variable more than once, but dbms_lob.copy does understand lobmaxsize:
dbms_lob.copy(vATTACHMENT, :pATTACHMENT, dbms_lob.lobmaxsize);
You'll still get an error (ORA-22994 I think) if :pATTACHMENT is empty or null, so you'll need to make sure it isn't before calling the block, or check inside.
The fourth parameter to DBMS_LOB.WRITE is supposed to be either a RAW or a VARCHAR2. If I understand you correctly, you're binding an existing BLOB to the :pATTACHMENT placeholder, so you're passing an incorrect type.
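(A side note: a short sketch of what dbms_lob.write actually expects when writing into a BLOB, namely a RAW buffer; the byte values below are arbitrary and purely illustrative.)
declare
  vBLOB blob;
  vBUF  raw(16) := hextoraw('FFD8FFE000104A464946'); -- a few arbitrary bytes
begin
  dbms_lob.createtemporary(vBLOB, false, dbms_lob.session);
  -- the amount is taken from the RAW buffer itself, not from lobmaxsize
  dbms_lob.write(vBLOB, utl_raw.length(vBUF), 1, vBUF);
end;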
Why do you need to assign it to a temporary BLOB at all? It seems to me that this should work:
insert into attachments (attachment, file_name)
values (:pATTACHMENT, vFILE_NAME);
For instance, this runs fine:
DECLARE
  PROCEDURE insert_blob(p_blob BLOB) IS
  BEGIN
    INSERT INTO t_dave (b) VALUES (p_blob);
  END;
BEGIN
  insert_blob(empty_blob());
END;
/
I have the following cursor definition:
cMultiplier NUMBER := 100000000000000000;
CURSOR CR_TABLE1 IS
  SELECT to_char((COL_ID * cMultiplier) + SEQ, '0999999999999999999') "NEW_COL"
    FROM TABLE1;
Then this cursor is fetched as follows:
FETCH CR_TABLE1
BULK COLLECT INTO AR_TABLE1 LIMIT I_BULK_LIMIT;
EXIT WHEN AR_TABLE1.COUNT = 0;
Where AR_TABLE1 is of type
TYPE T_TABLE1 IS TABLE OF CR_TABLE1%ROWTYPE;
AR_TABLE1 T_TABLE1;
The test value for COL_ID is 1 in all cases, and the test value for SEQ is 1234567654322 (a 13-digit number). The resulting value is inserted, with a length of 19, into a VARCHAR column of another table.
The problem is that as soon as the cursor reaches the FETCH, it throws an exception: ORA-06500: PL/SQL: storage error.
I know it has something to do with the select statement, but I am converting the value into a string (varchar). Why am I running into this issue?
What value are you assigning to I_BULK_LIMIT?
The ORA-06500 error often means the application has run out of memory. So you need to look at the variables you assign in your program, but the memory used by the array is the most likely culprit. If the limit is currently set to more than a few thousand, you should consider setting a lower limit.
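(A minimal sketch of what a smaller, fixed limit would look like, reusing the cursor and collection from the question; 100 is just an illustrative value, and TABLE1 is assumed to exist as described.)
DECLARE
  cMultiplier NUMBER := 100000000000000000;
  CURSOR CR_TABLE1 IS
    SELECT to_char((COL_ID * cMultiplier) + SEQ, '0999999999999999999') "NEW_COL"
      FROM TABLE1;
  TYPE T_TABLE1 IS TABLE OF CR_TABLE1%ROWTYPE;
  AR_TABLE1 T_TABLE1;
BEGIN
  OPEN CR_TABLE1;
  LOOP
    FETCH CR_TABLE1 BULK COLLECT INTO AR_TABLE1 LIMIT 100; -- keep each batch small
    EXIT WHEN AR_TABLE1.COUNT = 0;
    -- process AR_TABLE1 here
  END LOOP;
  CLOSE CR_TABLE1;
END;
/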
The problem is in your format mask: Oracle adds a leading space for positive numbers (or a "-" for negative ones), which makes the resulting string 20 characters long. Add FM to the format (like this: FM0999999999999999999) and Oracle will suppress the leading space.
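(For illustration, a quick comparison of the two masks using the sample values from the question; the lengths should come out as 20 and 19 respectively.)
SELECT length(to_char((1 * 100000000000000000) + 1234567654322, '0999999999999999999'))   AS with_leading_space,
       length(to_char((1 * 100000000000000000) + 1234567654322, 'FM0999999999999999999')) AS with_fm
  FROM dual;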
I have a stored procedure in Oracle and I'm using an OUT parameter in it.
I want to know how to display the output in Toad.
You just need to declare a variable to store the value in, and then do whatever you want with the data afterwards. If you just want to see the output, dbms_output is probably the easiest way to go:
declare
  -- declare a variable to store the OUT data in; make sure the datatype is correct
  v_out VARCHAR2(50);
begin
  -- call the procedure, assigning the value of the OUT parameter to the variable you declared
  my_proc(
    p_in  => 3,
    p_out => v_out
  );
  -- display the value now held in the variable
  dbms_output.put_line('Value of p_out: ' || v_out);
end;
In the Toad Schema Browser, click the 'Execute' button, which will generate some test code for calling your procedure and writing the OUT parameter via dbms_output. Check the output in the DBMS Output window (you may need to activate output there using the two leftmost icons).
In Toad, after executing a query you can see several tabs such as Data Grid, Auto Trace, DBMS Output, etc.
Go to the DBMS Output tab.
If output is turned off (red dot), click it to turn it on (green).
Now execute your query with CTRL+Enter.
The result will appear after the polling-frequency interval (in seconds).
Trial code:
DECLARE
  c NUMBER(4);
BEGIN
  c := 4;
  dbms_output.put_line(c);
END;
/