Error : ORA-01704: string literal too long. Dynamically assign CLOB to variable - oracle

My problem is not trivial.
I need to execute a PL/SQL script which runs INSERT INTO queries. It looks like:
DECLARE
newId NUMBER(38,0) := &1;
BEGIN
Insert into FOO ("ID", "DESCRIPTION")
values (newId+1, 'LARGE CLOB WHICH PRODUCES EXCEPTION');
-- A LOT OF INSERT QUERIES
END;
/
exit;
So I found that it's a good idea to assign the CLOB to a VARCHAR2 variable, as it can be up to 32767 bytes long. My goal is to do that for each INSERT INTO query. Like:
--assign CLOB to VARCHAR2 variable
-- INSERT variable instead of CLOB type
I want to point out that I have a lot of INSERT INTO queries in the script, so I would need to reassign the variable before each INSERT INTO query. How can I do this?

You're getting the ORA-01704 because your string literal is more than 4000 bytes, which is the size limit for string literals in SQL calls. In PL/SQL the limit is 32k, so if all your values are less than that you can assign them to a PL/SQL variable and use that for the insert:
DECLARE
newId NUMBER(38,0) := &1;
newDescription varchar2(32767); -- or clob
BEGIN
newDescription := 'LARGE CLOB WHICH PRODUCES EXCEPTION';
Insert into FOO ("ID", "DESCRIPTION")
values (newId+1, newDescription);
newDescription := 'ANOTHER LARGE CLOB WHICH PRODUCES EXCEPTION';
Insert into FOO ("ID", "DESCRIPTION")
values (newId+1, newDescription);
...
END;
/
If any of the values are more than 32k you'll need a PL/SQL CLOB variable, and will need to construct that by appending shorter (<32k) string literals, which is messy.
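As a rough sketch of that messier approach (the chunk text below is just a placeholder; everything else matches the script above):
DECLARE
newId NUMBER(38,0) := &1;
newDescription clob;
BEGIN
-- each appended literal must itself stay under the 32k PL/SQL limit
newDescription := 'FIRST CHUNK OF THE LARGE VALUE, UNDER 32K';
newDescription := newDescription || 'SECOND CHUNK, ALSO UNDER 32K';
newDescription := newDescription || 'AND SO ON FOR THE REMAINING CHUNKS';
Insert into FOO ("ID", "DESCRIPTION")
values (newId+1, newDescription);
END;
/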
Using multiple insert statements may not be the best way to go anyway. You might be able to use SQL*Loader or an external table to load the data more simply. Or you could read the values using utl_file, e.g. into the same PL/SQL variable, and then insert in a loop - which would be less code and easier to maintain.
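For instance, a rough sketch of the utl_file idea - DATA_DIR and descriptions.txt are assumed names for a directory object and a one-value-per-line file, not something from your setup:
DECLARE
newId NUMBER(38,0) := &1;
f utl_file.file_type;
line varchar2(32767);
BEGIN
f := utl_file.fopen('DATA_DIR', 'descriptions.txt', 'r', 32767);
LOOP
BEGIN
utl_file.get_line(f, line);
EXCEPTION
WHEN no_data_found THEN EXIT; -- end of file reached
END;
newId := newId + 1;
insert into FOO ("ID", "DESCRIPTION") values (newId, line);
END LOOP;
utl_file.fclose(f);
END;
/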
You could also use a collection to hold the string values:
DECLARE
TYPE stringTab IS table of varchar2(32767); -- or clob
newDescriptions stringTab := new stringTab();
BEGIN
newDescriptions.extend;
newDescriptions(newDescriptions.last) := 'LARGE CLOB WHICH PRODUCES EXCEPTION';
newDescriptions.extend;
newDescriptions(newDescriptions.last) := 'ANOTHER LARGE CLOB WHICH PRODUCES EXCEPTION';
forall i in newDescriptions.first..newDescriptions.last
insert into FOO ("ID", "DESCRIPTION")
values (&1 + 1, newDescriptions(i));
END;
/
... which will be a trade-off between performance and (maybe) readability, against memory usage by the collection. And you can populate that in the block, or again read the values into the collection from a file, if that's feasible for your situation.
You can still generate this from queries against an existing table, with something like:
set pages 0
set lines 32767
set long 32767
set define off
select 'DECLARE' || chr(10)
|| ' newId NUMBER(38,0) := &1;' || chr(10)
|| ' newDescription varchar2(32767);' || chr(10)
|| 'BEGIN'
from dual;
select ' newDescription := q''[' || description || ']'';' || chr(10)
|| ' newId := newId + 1;' || chr(10)
|| ' insert into FOO ("ID", "DESCRIPTION") values (newId, newDescription);' || chr(10)
from foo;
select 'END;' || chr(10)
|| '/' || chr(10)
|| 'exit'
from dual;
set define on
I've used the alternative quoting mechanism in case any of your string values contain single quotes, but you'll need to pick a suitable quote delimiter. And again this assumes none of your CLOB values exceeds 32k.
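For example (the delimiters here are arbitrary), any delimiter that doesn't appear in the data can be used:
select q'[it's got 'quotes' inside]' as v1,
q'{another value with [brackets] and 'quotes'}' as v2
from dual;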
I'd also reconsider if you really want to do this with a script full of insert statements; if the data is coming from a table anyway then an export/import might be more appropriate.

You can also declare a CLOB variable in PL/SQL. See the LOB semantics documentation and the DBMS_LOB package to manipulate it. It can be something like:
DECLARE
newId NUMBER(38,0) := &1;
myLOBFromDatabase CLOB;
BEGIN
SELECT lobComun INTO myLOBFromDatabase FROM table WHERE id=1;
/* Manipulate the LOB with the DBMS_LOB package at will here */
Insert into FOO ("ID", "DESCRIPTION")
values (newId+1, myLOBFromDatabase);
END;
/

I have a slightly strange solution for this, but it works.
DECLARE
JS CLOB;
BEGIN
JS := TO_CLOB( 'THE FIRST PART OF CLOB SHOULD BE LESS THEN 32K
......
......'||
'.......
THIS IS SECOND PART IS LESS THEN 32K TOO'||
'.......
LAST PART');
END;
This way I was able to insert more than 1M characters.

Related

VARCHAR2(32767) not able to handle strings in stored procedure

I am concatenating strings using a cursor (to form a query to execute later). The query that will be formed is going to be way bigger than what VARCHAR2(32767) can handle. Therefore, I am getting an error on procedure execution - ORA-06502: PL/SQL: numeric or value error: character string buffer too small.
I used the CLOB data type as well but got the error ORA-06502: PL/SQL: numeric or value error.
My code is below:
CREATE OR REPLACE PROCEDURE sp_Market
IS
Names VARCHAR2(32767);
BEGIN
DECLARE CURSOR cur IS ('Select ID, Order_of, field_name
FROM pld_medicare_config');
BEGIN
FOR i IN cur
LOOP
Names := Names || i.sqql;
END LOOP;
dbms_output.put_line(Names);
END;
END sp_Market;
How can I handle my string of queries and what data type is there to accomplish the task?
CLOB is OK (as far as I can tell); I doubt the queries you store in there are that big.
Remove the dbms_output.put_line call from the procedure; I suspect it is the one that raises the error.
I'm not sure how you got any runtime error, as your procedure won't compile.
The valid PL/SQL version would look something like this:
create or replace procedure sp_market is
names varchar2(32767);
begin
for r in (
select id, order_of, field_name
from pld_medicare_config
)
loop
names := names || ' ' || r.field_name;
end loop;
names := ltrim(names);
dbms_output.put_line(names);
end sp_market;
If names needs to be longer, change the datatype to clob.
Use the CLOB datatype and append data using the dbms_lob.writeappend procedure; see the DBMS_LOB package reference (Oracle 18c).
The error probably originates with the dbms_output.put_line call. That procedure is defined for varchar2 arguments only, which means an implicit conversion takes place during the call. It will fail for CLOB contents longer than 32767 chars/bytes.
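A minimal sketch of that approach, reusing the table and column from the question (the temporary CLOB and the chunked output are just one way to do it):
DECLARE
l_names clob;
l_offset pls_integer := 1;
l_chunk pls_integer := 4000;
BEGIN
dbms_lob.createtemporary(l_names, true);
FOR r IN (select field_name from pld_medicare_config) LOOP
-- the trailing space keeps the buffer non-null even if field_name is null
dbms_lob.writeappend(l_names, length(r.field_name || ' '), r.field_name || ' ');
END LOOP;
-- print in chunks so no single dbms_output call gets an over-long string
WHILE l_offset <= dbms_lob.getlength(l_names) LOOP
dbms_output.put_line(dbms_lob.substr(l_names, l_chunk, l_offset));
l_offset := l_offset + l_chunk;
END LOOP;
dbms_lob.freetemporary(l_names);
END;
/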
Alternatively you may declare a collection over varchar2(4000) and fill the collection elements sequentially:
CREATE OR REPLACE PROCEDURE sp_Market
IS
TYPE tLongString IS TABLE OF VARCHAR2(4000) INDEX BY BINARY_INTEGER;
cNames tLongString;
BEGIN
DECLARE CURSOR cur IS Select ID, Order_of, field_name, sqql FROM pld_medicare_config;
BEGIN
FOR i IN cur
LOOP
cNames(cNames.COUNT+1) := i.sqql;
END LOOP;
END;
END sp_Market;
Note: the code above has been rectified and will compile now.

Trying to use a FORALL to insert data dynamically to a table specified to the procedure

I need to dynamically determine the name of the table, which has the same data structure as many others, so that I can pass in a generic associative array of the same structure. Here is the proc:
PROCEDURE INSRT_INTER_TBL(P_TABLE_NAME IN VARCHAR2, P_DATA IN tt_type)
IS
BEGIN
FORALL i IN P_DATA.FIRST .. P_DATA.LAST
EXECUTE IMMEDIATE
'INSERT INTO ' || P_TABLE_NAME ||
' VALUES :1'
USING P_DATA(i);
END INSRT_INTER_TBL;
I am getting the following error
ORA-01006: bind variable does not exist
What am I missing here?
So I had to spell out all the columns needed for the insert in the insert statement, like:
PROCEDURE INSRT_INTER_TBL(P_TABLE_NAME IN VARCHAR2, P_DATA IN inter_invc_ln_item_type)
IS
BEGIN
FORALL i IN P_DATA.FIRST .. P_DATA.LAST
EXECUTE IMMEDIATE
'INSERT INTO ' || P_TABLE_NAME || ' (ITEM_PK, pk, units, amt) ' ||
' VALUES (:P_INVC_LN_ITEM_PK, :PK, :UNITS, :AMT)'
USING IN P_DATA(i).item_pk, P_DATA(i).pk, P_DATA(i).units, P_DATA(i).amt;
END INSRT_INTER_TBL;
The TABLE operator works better than a FORALL here. It uses less code and probably skips some SQL-to-PL/SQL context switches.
--Simple record and table type.
create or replace type tt_rec is object
(
a number,
b number
);
create or replace type tt_type is table of tt_rec;
--Sample schema that will hold results.
create table test1(a number, b number);
--PL/SQL block that inserts TT_TYPE into a table.
declare
p_table_name varchar2(100) := 'test1';
p_data tt_type := tt_type(tt_rec(1,1), tt_rec(2,2));
begin
execute immediate
'
insert into '||p_table_name||'
select * from table(:p_data)
'
using p_data;
commit;
end;
/
Try VALUES (:1), i.e. put parentheses around :1.

Identifier Too long Exception

I have a query of more than 4000 characters which is formed from different variables of the VARCHAR2(2000) datatype, for example:
query1 varchar2(2000) := 'string1';
query2 varchar2(2000) := 'string2';
query3 varchar2(2000) := 'string3';
I have declared a variable query varchar2(32000)
query := query1|| query2 || query3 ;
create table t (
id number,
querystring varchar2(4000));
I tried to get the first 4000 characters from the query variable, but it is not working. Can anyone please help?
declare
querystring1 varchar2(2000) := "string1";
querystring2 varchar2(2000) := "string2";
l_query varchar2(32000);
query varchar2(4000);
begin
l_query := querystring1 || querystring2 ;
select substr(l_query,1,4000) into query from dual;
insert into lib_query_table values('1',query);
end;
You are using double quotes around your query string literals, instead of single quotes. That makes Oracle interpret them as identifier names - so as soon as your literal is more than 30 characters long you'll get that exception. With shorter strings you'd still get an error, but something like 'unknown identifier'.
Replace your double quotes with single ones:
declare
querystring1 varchar2(2000) := 'string1';
querystring2 varchar2(2000) := 'string2';
l_query varchar2(32000);
query varchar2(4000);
begin
l_query := querystring1 || querystring2 ;
select substr(l_query,1,4000) into query from dual;
insert into lib_query_table values (1, query);
end;
You don't need the query from dual, you can do:
query := substr(l_query, 1, 4000);
You could skip that variable and do:
insert into lib_query_table (id, querystring)
values (1, substr(l_query, 1, 4000));
or even:
insert into lib_query_table (id, querystring)
values (1, substr(querystring1 || querystring2, 1, 4000));
As your ID column is a number you should not insert the value for that as a string '1' - just use a number. You probably want a sequence to set that value, eventually.
Also not directly related, but when you're concatenating parts of a query together, say where one string is the select list and the second is the from clause etc., make sure you have whitespace at the end of each part (or the start of the next part), or the combined string might end up invalid.
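A tiny illustration of that pitfall (the column and table names are made up):
DECLARE
l_query varchar2(32000);
BEGIN
l_query := 'select col1, col2' || 'from some_table'; -- bad: becomes ...col2from some_table
l_query := 'select col1, col2 ' || 'from some_table'; -- ok: note the trailing space
END;
/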
A VARCHAR2 is a variable-length character string with a maximum length of size bytes or characters. You must specify size for VARCHAR2; the minimum size is 1 byte or 1 character. The maximum size is 32767 bytes or characters if MAX_STRING_SIZE = EXTENDED, and 4000 bytes or characters if MAX_STRING_SIZE = STANDARD.
You can look at the CLOB data type.
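If you're not sure which mode your database is in (12c onwards), a quick check - assuming you're allowed to query v$parameter - is:
select value from v$parameter where name = 'max_string_size';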

Error : ORA-01704: string literal too long

When I try to set a value of over 4000 characters on a field that has the CLOB data type, it gives me this error:
ORA-01704: string literal too long.
Any suggestion which data type would be applicable for me if I have to set a value of unlimited length? In my case it happens to be about 15000 characters.
Note: the long string that I am trying to store is encoded in ANSI.
What are you using when you operate on the CLOB?
In any event, you can do it with PL/SQL:
DECLARE
str varchar2(32767);
BEGIN
str := 'Very-very-...-very-very-very-very-very-very long string value';
update t1 set col1 = str;
END;
/
Try splitting the characters into multiple chunks like the query below:
Insert into table (clob_column) values ( to_clob( 'chunk 1' ) || to_clob( 'chunk 2' ) );
It worked for me.
To solve this issue on my side, I had to use a combination of what was already proposed here:
DECLARE
chunk1 CLOB; chunk2 CLOB; chunk3 CLOB;
BEGIN
chunk1 := 'very long literal part 1';
chunk2 := 'very long literal part 2';
chunk3 := 'very long literal part 3';
INSERT INTO table (MY_CLOB)
SELECT ( chunk1 || chunk2 || chunk3 ) FROM dual;
END;
Hope this helps.
The splitting approach only works up to 4000 characters per chunk, depending on the characters that you are inserting; if you are inserting special (multi-byte) characters it can fail sooner, because the 4000 limit is in bytes.
The only safe way is to declare a variable.
Though it's a very old question, I think sharing my experience might still help others.
Large text can be saved in a single query if we break it down into chunks of 4000 bytes/characters and concatenate them using '||'.
Running the following query will tell you:
the required number of chunks of 4000 bytes
the remaining bytes
Since in the given example you are trying to save text containing 15000 bytes (characters):
select 15000/4000 chunk, mod(15000,4000) remaining_bytes from dual;
Result: CHUNK = 3.75, REMAINING_BYTES = 3000
That means you need to concatenate 3 chunks of 4000 bytes and one chunk of 3000 bytes, so it would be like:
INSERT INTO <YOUR_TABLE>
VALUES (TO_CLOB('<1st_4K_bytes>') ||
TO_CLOB('<2nd_4K_bytes>') ||
TO_CLOB('<3rd_4K_bytes>') ||
TO_CLOB('<last_3K_bytes>'));
Create a function that returns a CLOB:
create function ret_long_chars return clob is
begin
return to_clob('put here long characters');
end;
update table set column = ret_long_chars;
INSERT INTO table(clob_column) SELECT TO_CLOB(q'[chunk1]') || TO_CLOB(q'[chunk2]') ||
TO_CLOB(q'[chunk3]') || TO_CLOB(q'[chunk4]') FROM DUAL;
The accepted answer did not work for me in SQL Developer, but a combination of this answer and another one did:
DECLARE
str varchar2(32767);
BEGIN
str := 'Very-very-...-very-very-very-very-very-very long string value';
update table set column = to_clob(str);
END;
/

Oracle PL/SQL Null Input Parameter WHERE condition

As of now I am using IF ELSE to handle this condition
IF INPUT_PARAM IS NOT NULL
SELECT ... FROM SOMETABLE WHERE COLUMN = INPUT_PARAM
ELSE
SELECT ... FROM SOMETABLE
Is there any better way to do this in a single query, without IF-ELSE branches? As the query gets more complex there will be more input parameters like this, and the amount of IF-ELSE required would be too much.
One method would be to use a variant of
WHERE column = nvl(var, column)
There are two pitfalls here however:
if the column is nullable, this clause will filter out null values, whereas in your question the second case does not filter them out. You could modify the clause to take nulls into account, but it turns ugly:
WHERE nvl(column, impossible_value) = nvl(var, impossible_value)
Of course if somehow the impossible_value is ever inserted you will run into some other kind of (fun) problems.
The optimizer doesn't handle this type of clause very well. It will sometimes produce a plan with a UNION ALL, but if there are more than a couple of NVLs you will get a full scan even if perfectly valid indexes are present.
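One common workaround for both pitfalls (not from the original answer; the table and column names are generic) is to spell the two cases out with a UNION ALL, so each branch stays simple enough to use an index:
select * from sometable where :input_param is null
union all
select * from sometable where col = :input_param and :input_param is not null;
That said, with many optional parameters this approach grows combinatorially, which is one more reason the dynamic SQL below scales better.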
This is why when there are lots of parameters (several search fields in a big form for example), I like to use dynamic SQL:
DECLARE
l_query VARCHAR2(32767) := 'SELECT ... JOIN ... WHERE 1 = 1';
BEGIN
IF param1 IS NOT NULL THEN
l_query := l_query || ' AND column1 = :p1';
ELSE
l_query := l_query || ' AND :p1 IS NULL';
END IF;
/* repeat for each parameter */
...
/* open the cursor dynamically */
OPEN your_ref_cursor FOR l_query USING param1 /*,param2...*/;
END;
You can also use EXECUTE IMMEDIATE l_query INTO l_result USING param1;
This should work
SELECT ... FROM SOMETABLE WHERE COLUMN = NVL( INPUT_PARAM, COLUMN )
