Oracle 11g maximum statement length possible

I am facing an issue with very long statements in Oracle 11g (~220k-character SELECTs with a lot of autogenerated IN (...) clauses). The selects are fired via JDBC and cause SQLException 17410 "No more data from socket".
On the other hand, selects with the same structure but shorter lengths (~100k characters) execute fine.
The problem is that the docs give no guidance on how to manage maximum statement length. There is just a note: "The limit on how long a SQL statement can be depends on many factors, including database configuration, disk space, and memory", which is not informative at all.
Can anyone share experience on how to estimate this maximum length and which database tweaks (if any) can help to increase it?

Having thousands of expressions in IN (...) looks like bad design. It is better to insert such values into a (temporary) table and restrict the result with a JOIN or IN (SELECT ...).
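As a rough sketch of that approach (the table and column names here are made up for illustration, not from the question):
-- Hypothetical global temporary table holding the IDs that would otherwise
-- end up in a huge autogenerated IN (...) list.
CREATE GLOBAL TEMPORARY TABLE tmp_filter_ids (
  id NUMBER PRIMARY KEY
) ON COMMIT DELETE ROWS;

-- Load the values first (e.g. via a JDBC batch insert) ...
INSERT INTO tmp_filter_ids (id) VALUES (:1);

-- ... then filter with a join or a subquery instead of a 220k-character statement.
SELECT o.*
  FROM orders o
  JOIN tmp_filter_ids f ON f.id = o.id;

-- or equivalently
SELECT o.*
  FROM orders o
 WHERE o.id IN (SELECT id FROM tmp_filter_ids);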
Anyway, the statement length is almost unlimited. I have an application where I run large statements like this:
DECLARE
  cmd DBMS_SQL.VARCHAR2A;
  cur INTEGER := DBMS_SQL.OPEN_CURSOR;
  res INTEGER;
BEGIN
  cmd(1) := 'INSERT INTO';
  cmd(cmd.LAST+1) := 'some columns, ';
  cmd(cmd.LAST+1) := 'and some more,'; -- each line is limited to 32767 chars
  DBMS_SQL.PARSE(cur, cmd, cmd.FIRST, cmd.LAST, TRUE, DBMS_SQL.NATIVE);
  res := DBMS_SQL.EXECUTE(cur);
  DBMS_SQL.CLOSE_CURSOR(cur);
END;
Note that DBMS_SQL.EXECUTE does not fetch any data. For a SELECT the procedure would be:
DECLARE
  cmd DBMS_SQL.VARCHAR2A;
  cur INTEGER := DBMS_SQL.OPEN_CURSOR;
  res INTEGER;
  refCur SYS_REFCURSOR;
BEGIN
  cmd(1) := 'SELECT ';
  cmd(cmd.LAST+1) := 'some columns, ';
  cmd(cmd.LAST+1) := 'and some more,'; -- each line is limited to 32767 chars
  DBMS_SQL.PARSE(cur, cmd, cmd.FIRST, cmd.LAST, TRUE, DBMS_SQL.NATIVE);
  res := DBMS_SQL.EXECUTE(cur);
  refCur := DBMS_SQL.TO_REFCURSOR(cur);
  FETCH refCur BULK COLLECT INTO ...;
  CLOSE refCur;
END;
Or use DBMS_SQL.EXECUTE_AND_FETCH if you prefer. The procedure DBMS_SQL.RETURN_RESULT may also help.
So far I never faced any limitation with this method.

Related

Finding last number generated by a sequence before it got deleted

We found that the sequence "_SEQUENCE" is not present in one of our client's UAT instances. We don't know how or when it got deleted. It is a very crucial sequence because the numbers it generates are used as unique column values across many tables in the DB. In other words, no two columns (of specific column types) in any two tables in the DB will have the same number as a value. On a few of these columns we also have unique indexes.
We can create the sequence again, but we don't know what the initial value should be, because we don't know the last number the old sequence generated. If we set a wrong number as the initial value and it happens to generate a number that is already present in such a column, we may end up with a unique key violation.
We could set the initial value to a very big number, but that is the last resort. Now:
Is it possible to find the last number the sequence "_SEQUENCE" generated before it got deleted?
Is it possible to find which process deleted the sequence "_SEQUENCE" and when?
A flashback operation is not available for a sequence, while it is available for tables. A mechanism can be built with a DDL trigger, usually created in the SYS or SYSTEM schema. As an example, consider this procedure:
create or replace procedure pr_ddl_oper as
  v_oty varchar2(75) := ora_dict_obj_type;
  v_don varchar2(75) := ora_dict_obj_name;
  v_evt varchar2(75) := ora_sysevent;
  v_olu varchar2(75) := nvl(ora_login_user,'Unknown Schema');
  v_sql ora_name_list_t;
  v_stm clob;
  v_sct owa.vc_arr;
  n     pls_integer;
  n_max pls_integer := 10000; -- max number of rows for CLOB object
                              -- to hold object's source.
begin
  v_sct(1) := 'SESSIONID';
  v_sct(2) := 'IP_ADDRESS';
  v_sct(3) := 'TERMINAL';
  v_sct(4) := 'OS_USER';
  v_sct(5) := 'AUTHENTICATION_TYPE';
  v_sct(6) := 'CLIENT_INFO';
  v_sct(7) := 'MODULE';

  for i in 1..7
  loop
    v_sct(i) := sys_context('USERENV', v_sct(i));
  end loop;

  select decode(v_sct(1), 0, null, v_sct(1)),
         decode(upper(v_sct(3)), 'UNKNOWN', null, v_sct(3))
    into v_sct(1), v_sct(3)
    from dual;

  n := ora_sql_txt(v_sql);
  if n > n_max then
    n := n_max;
  end if;

  for i in 1..n
  loop
    v_stm := v_stm || v_sql(i);
  end loop;

  insert into log_ddl_oper(event_time, usr, evnt, stmt, sessionid, ip, terminal, os_user,
                           auth_type, object_type, object_name, client_info, module_info)
  values (sysdate, v_olu, v_evt, v_stm, v_sct(1), v_sct(2), v_sct(3), v_sct(4), v_sct(5),
          v_oty, v_don, v_sct(6), v_sct(7));
end;
which could be called by this trigger, to serve as a future reference:
--| Compiling this trigger, especially for Production Systems, should be handled with care |
create or replace trigger system.trg_admin_ddl
before ddl on database
begin
  pr_ddl_oper;
end;
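The procedure assumes a logging table already exists. A possible definition, inferred from the INSERT statement above (the column sizes are illustrative and not part of the original answer):
create table log_ddl_oper (
  event_time  date,
  usr         varchar2(75),
  evnt        varchar2(75),
  stmt        clob,
  sessionid   varchar2(75),
  ip          varchar2(75),
  terminal    varchar2(75),
  os_user     varchar2(75),
  auth_type   varchar2(75),
  object_type varchar2(75),
  object_name varchar2(75),
  client_info varchar2(255),
  module_info varchar2(255)
);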
By connecting as the SYS user, you can query the LAST_NUMBER column for your dropped sequence:
select last_number
from dba_sequences
as of timestamp to_timestamp('2018-11-27 23:50:17', 'YYYY-MM-DD HH24:MI:SS') s
where s.sequence_name = 'MYSEQ';
where the timestamp value can be determined by:
select l.event_time
       --> returns to_timestamp('2018-11-27 23:50:17', 'YYYY-MM-DD HH24:MI:SS')
       --> to use for the above SQL Select statement
  from log_ddl_oper l
 where l.object_name = 'MYSEQ'
   and l.evnt = 'DROP';

Oracle 12c CLOB data type is not working as expected

I have this Oracle 12c procedure:
CREATE OR REPLACE PROCEDURE LOGINCHECK(SQLQRY IN CLOB)
AS
  C    INTEGER;
  N    INTEGER;
  RC   SYS_REFCURSOR;
  stmt clob := To_Clob('begin ' || sqlqry || '; end;');
BEGIN
  C := SYS.DBMS_SQL.OPEN_CURSOR;
  SYS.DBMS_SQL.PARSE(C, stmt, DBMS_SQL.native);
  N := SYS.DBMS_SQL.EXECUTE(C);
  SYS.DBMS_SQL.GET_NEXT_RESULT(C, RC);
  SYS.DBMS_SQL.RETURN_RESULT(RC);
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    NULL;
  WHEN OTHERS THEN
    RAISE;
END LOGINCHECK;
I call this procedure in an anonymous block like this (download the XML data from here: Link):
declare stmt clob := 'INWARDPKG.MACHINEINWARD_VALIDATING(XMLDOC => XMLTYPE.CREATEXML(paste xml from link))'; --The parameter value is a xml you can download it from above link
begin
LOGINCHECK(SQLQRY => STMT);
end;
But I am getting the error PLS-00172: string literal too long.
If I reduce the XML to 40-50 elements, i.e. remove some elements, it works fine.
In your first line, declare stmt clob := 'INWARDPKG.MACHINEINWARD_VALIDATING..., you are defining your CLOB. Since you are using a string literal to define it, you are hitting the limits of string literals (see the Oracle 12c documentation).
To solve your problem you have to build your CLOB step by step, using the DBMS_LOB package and appending strings no longer than 4000 bytes until your CLOB is complete.
The basic idea:
DECLARE
  C CLOB := TO_CLOB('First 4000 bytes');
  V VARCHAR2(4000);
BEGIN
  V := 'Next 4000 bytes';
  DBMS_LOB.WRITEAPPEND(C, LENGTH(V), V);
  -- more WRITEAPPEND calls until C is complete
  DBMS_OUTPUT.PUT_LINE('CLOB-Length: ' || DBMS_LOB.GETLENGTH(C));
END;

Selecting big size docs from oracle db

So I want to get the IDs of documents which are bigger than 60 MB:
SELECT
DOCS.ID
FROM DOCS
where LENGTH(DOCS.DOCUMENT) > (60*1024*1024)
and I get this error :
SQL Error: ORA-00932: inconsistent datatypes: expected NUMBER got LONG BINARY
00932. 00000 - "inconsistent datatypes: expected %s got %s"
The document is saved as a LONG RAW ...
Probably I should somehow cast it to LONG?
The only solution I came up with was to alter the table from LONG RAW to BLOB; then, for some divine reason, you have to recreate ALL indexes on that table... it's the only way.
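For reference, a minimal sketch of that conversion, using the table and column from the question (as noted, indexes on the table may need to be rebuilt afterwards):
-- Convert the LONG RAW column to a BLOB in place
ALTER TABLE docs MODIFY (document BLOB);

-- Afterwards the size filter can be expressed directly:
SELECT id
  FROM docs
 WHERE dbms_lob.getlength(document) > 60*1024*1024;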
Indeed this is a problem, but you can use PL/SQL as a workaround:
create or replace function get_doc_length(iDocId in number) return number is
  aLong long;
begin
  select d.document into aLong from docs d where d.id = iDocId;
  return length(aLong);
end;
/
This function (which unfortunately cannot take a LONG as a parameter, so it must be specific to the DOCS table) can then be used like this:
SELECT DOCS.ID
FROM DOCS
where get_doc_length(DOCS.ID) > (60*1024*1024)
EDIT: OK, this does not work for big LONGs. You have to dig even deeper into DBMS_SQL's arcana to make it work:
create or replace function get_doc_length(iDocId in number) return number is
  myQuery  varchar2(200);
  myCursor binary_integer;
  myRes    pls_integer;
  myDoc    clob;
  long_val long;
  long_len integer;
  buf_len  integer;
  cur_pos  number;
begin
  myQuery := 'select d.document from docs d where d.id = ' || iDocId;

  -- Create cursor, parse and bind.
  myCursor := DBMS_SQL.OPEN_CURSOR;
  DBMS_SQL.PARSE(myCursor, myQuery, DBMS_SQL.NATIVE);
  DBMS_SQL.DEFINE_COLUMN_LONG(myCursor, 1);
  myRes := DBMS_SQL.EXECUTE(myCursor);

  -- Fetch row (only one normally)
  if DBMS_SQL.FETCH_ROWS(myCursor) > 0 then
    -- Create CLOB.
    DBMS_LOB.CREATETEMPORARY(myDoc, false, DBMS_LOB.CALL);
    -- Piecewise fetching of the LONG column, appending to the CLOB.
    buf_len := 32760;
    cur_pos := 0;
    loop
      DBMS_SQL.COLUMN_VALUE_LONG(myCursor, 1, buf_len, cur_pos, long_val, long_len);
      exit when long_len = 0;
      DBMS_LOB.APPEND(myDoc, long_val);
      cur_pos := cur_pos + long_len;
    end loop;
  end if;

  DBMS_SQL.CLOSE_CURSOR(myCursor);
  return length(myDoc);
end;
/
The DBMS_SQL package makes it possible to convert a LONG into a CLOB by looping over it and appending pieces of the LONG gradually.
If you want the binary length of the CLOB instead of its character length (they may differ), you can use dbms_lob.getlength instead of length, as this other SO post shows.
SELECT DOCS.ID
FROM DOCS
WHERE dbms_lob.getlength(DOCS.DOCUMENT) > (60*1024*1024)

Oracle - How to handle 32K+ string length in variables

I am using Oracle 11g. In SQL Server, whenever I encountered strings larger than the variable size limit, I used to split the data into multiple variables as below and then join them at execution time. Oracle, however, seems to expect the combined size to stay within 32K before execution. I am getting the error "ORA-20000: ORU-10028: line length overflow, limit of 32767 bytes per line".
I am using these variables in an Oracle script (not stored procedures). The last two statements throw the above error, while individually I am able to display each value.
Thanks in advance.
DECLARE
  sViewQuery       varchar2(32000);
  sViewSelectQuery varchar2(32000);
BEGIN
  --- Assign values of 32,000-character strings (dynamic query)
  sViewSelectQuery := '32K string...';
  sViewQuery := '32K string..';
  DBMS_OUTPUT.PUT_LINE(sViewQuery || sViewSelectQuery);
  EXECUTE IMMEDIATE sViewQuery || sViewSelectQuery;
END;
You can use DBMS_SQL Package for this:
DECLARE
  stmt DBMS_SQL.VARCHAR2A;
  c    number;
  res  number;
BEGIN
  stmt(1)  := 'create view view_a (';
  stmt(2)  := 'col_a, ';
  stmt(3)  := 'col_b, ';
  stmt(4)  := 'col_c) as ';
  stmt(5)  := 'select ';
  stmt(6)  := 'col_bb, ';
  stmt(7)  := 'col_cc + col_ee + DECODE(...), ';
  stmt(8)  := 'col_dd ';
  stmt(9)  := 'from table_b ';
  stmt(10) := 'where ... ';
  -- each element can have up to 32K characters, number of elements is (almost) unlimited
  c := DBMS_SQL.open_cursor;
  DBMS_SQL.parse(c, stmt, 1, 10, TRUE, DBMS_SQL.NATIVE);
  res := DBMS_SQL.execute(c);
  DBMS_SQL.close_cursor(c);
END;
You should use a CLOB (Character Large OBject). It can handle strings longer than 32K, since a CLOB can contain up to 4 GB of data.
For more info: http://docs.oracle.com/javadb/10.3.3.0/ref/rrefclob.html
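A minimal sketch of the CLOB approach, assuming Oracle 11g or later, where native dynamic SQL accepts a CLOB argument (the statement text here is a placeholder):
DECLARE
  sViewQuery CLOB;
BEGIN
  sViewQuery := TO_CLOB('create view some_view as select ... ');        -- first piece
  sViewQuery := sViewQuery || '... rest of the very long statement ...'; -- append as many pieces as needed
  -- Note: DBMS_OUTPUT.PUT_LINE is still limited to 32767 bytes per line,
  -- which is what raises ORU-10028, so avoid printing the whole statement at once.
  EXECUTE IMMEDIATE sViewQuery;  -- native dynamic SQL accepts a CLOB in 11g+
END;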

bulk collect dynamic sql

I have to write a dynamic SQL cursor where there are several possibilities in which the select query will be generated. Hence I am choosing dynamic SQL and using the DBMS_SQL package to dynamically create a cursor and fetch the data.
However, the result set is going to be huge, around 11 GB (there are 2.4 million records and the select list will be approximately 80 columns wide, assuming about 50-byte VARCHAR per column).
Hence I cannot fetch the whole result set at once. I want to know if there is a feature wherein I can fetch the data from the cursor, keeping the cursor open, in blocks of say 1000 records at a time (I will have to do this dynamically).
Please find the code attached, which only fetches and prints the values of the columns (one sample case). I want to use BULK COLLECT here.
Thanks
---------------code sample--------------------------------------
--create or replace type TY_DIMDEAL AS TABLE OF VARCHAR2(50) ;
create or replace procedure TEST_PROC (po_recordset out sys_refcursor)
as
  v_col_cnt       INTEGER;
  v_ind           NUMBER;
  rec_tab         DBMS_SQL.desc_tab;
  v_cursor        NUMBER;
  lvar_output     number := 0;
  lvar_output1    varchar2(100);
  lvar_output3    varchar2(100);
  lvar_output2    varchar2(100);
  LVAR_TY_DIMDEAL TY_DIMDEAL;
  lvarcol         varchar2(100);
begin
  LVAR_TY_DIMDEAL := TY_DIMDEAL();
  lvar_output1 := '';
  v_cursor := dbms_sql.open_cursor;
  dbms_sql.parse(v_cursor, 'select to_char(Field1) , to_char(fiel2) , to_char(field3) from table,table2 ', dbms_sql.native);
  dbms_sql.describe_columns(v_cursor, v_col_cnt, rec_tab);

  FOR v_pos in 1..rec_tab.LAST LOOP
    LVAR_TY_DIMDEAL.EXTEND();
    DBMS_SQL.define_column(v_cursor, v_pos, LVAR_TY_DIMDEAL(v_pos), 20);
  END LOOP;
  -- DBMS_SQL.define_column( v_cursor, 1 ,lvar_output1,20);
  -- DBMS_SQL.define_column( v_cursor, 2 ,lvar_output2,20);
  -- DBMS_SQL.define_column( v_cursor, 3 ,lvar_output3,20);

  v_ind := dbms_sql.execute(v_cursor);
  LOOP
    v_ind := DBMS_SQL.FETCH_ROWS(v_cursor);
    EXIT WHEN v_ind = 0;
    lvar_output := lvar_output + 1;
    dbms_output.put_line('row number ' || lvar_output);
    FOR v_col_seq IN 1 .. rec_tab.COUNT LOOP
      LVAR_TY_DIMDEAL(v_col_seq) := '';
      DBMS_SQL.COLUMN_VALUE(v_cursor, v_col_seq, LVAR_TY_DIMDEAL(v_col_seq));
      dbms_output.put_line(LVAR_TY_DIMDEAL(v_col_seq));
    END LOOP;
  END LOOP;
end TEST_PROC;
Fetching data from a cursor in blocks of reasonable size, while keeping the cursor open, is one of PL/SQL Best Practices.
The above document (see Code 38 item) sketches an approach for when the select list is not known until runtime. Basically:
Define an appropriate type to fetch results into. Let's assume that all the returned columns will be of type VARCHAR2:
-- inside DECLARE
TYPE Ty_FetchResults IS TABLE OF DBMS_SQL.VARCHAR2_TABLE INDEX BY PLS_INTEGER;
lvar_results Ty_FetchResults;
Before each call to DBMS_SQL.FETCH_ROWS, call DBMS_SQL.DEFINE_ARRAY to enable batch fetching.
Call DBMS_SQL.FETCH_ROWS to fetch 1000 rows from the cursor.
Call DBMS_SQL.COLUMN_VALUE to copy the fetched data into your result array.
Process the results, record by record, in a FOR loop. Don't worry about the number of fetched records: if there are records to process, the FOR loop will run correctly; if the result array is empty, the FOR loop will not run.
Exit from the loop when the number of fetched records is less than the expected size.
Remember to close the cursor with DBMS_SQL.CLOSE_CURSOR.
Your loop body should look like this:
LOOP
  FOR j IN 1..v_col_cnt LOOP
    DBMS_SQL.DEFINE_ARRAY(v_cursor, j, lvar_results(j), 1000, 1);
  END LOOP;

  v_ind := DBMS_SQL.FETCH_ROWS(v_cursor);

  FOR j IN 1..v_col_cnt LOOP
    lvar_results(j).DELETE;
    DBMS_SQL.COLUMN_VALUE(v_cursor, j, lvar_results(j));
  END LOOP;

  -- process the results, record by record
  FOR i IN 1..lvar_results(1).COUNT LOOP
    -- process a single record...
    -- your logic goes here
  END LOOP;

  EXIT WHEN lvar_results(1).COUNT < 1000;
END LOOP;
-- don't forget: DBMS_SQL.CLOSE_CURSOR(v_cursor);
See also Doing SQL from PL/SQL: Best and Worst Practices.
The LIMIT clause can come to the rescue!
PL/SQL collections are essentially arrays in memory, so massive collections can have a detrimental effect on system performance due to the amount of memory they require. In some situations, it may be necessary to split the data being processed into chunks to make the code more memory-friendly. This "chunking" can be achieved using the LIMIT clause of the BULK COLLECT syntax.
You can use the LIMIT clause after the BULK COLLECT INTO clause to cap the number of rows per fetch; once a batch is processed, the next fetch returns the remaining rows. A sketch of the pattern follows below. See this article:
http://www.dba-oracle.com/plsql/t_plsql_limit_clause.htm
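A minimal sketch of that pattern, using a ref cursor over the kind of dynamic SELECT shown in the question (the record fields and the FROM clause are placeholders, not the asker's real objects):
DECLARE
  TYPE t_row IS RECORD (col1 VARCHAR2(50), col2 VARCHAR2(50), col3 VARCHAR2(50));
  TYPE t_tab IS TABLE OF t_row;
  l_rows t_tab;
  l_cur  SYS_REFCURSOR;
BEGIN
  -- the SQL text would be assembled dynamically, as in the question
  OPEN l_cur FOR 'select to_char(field1), to_char(field2), to_char(field3) from table1, table2';
  LOOP
    FETCH l_cur BULK COLLECT INTO l_rows LIMIT 1000;  -- at most 1000 rows per round trip
    FOR i IN 1 .. l_rows.COUNT LOOP
      NULL;  -- process one record here
    END LOOP;
    EXIT WHEN l_rows.COUNT < 1000;  -- fewer than LIMIT rows means the cursor is exhausted
  END LOOP;
  CLOSE l_cur;
END;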
