Database update on a large table taking a huge amount of time - Oracle

I want to copy one column's data into another column in a large table containing 10 million records.
I am using a SYS_REFCURSOR to copy the data from one column into the other. It takes more than 30 minutes to copy the data. I am using Oracle 11gR2.
Is there any alternative way to do the same thing faster? Below is the script:
CREATE OR REPLACE PROCEDURE tblCursor(org_mig OUT SYS_REFCURSOR)
IS
BEGIN
  OPEN org_mig FOR
    SELECT id FROM tbl;
END;
/

DECLARE
  org_mig          SYS_REFCURSOR;
  t_id             tbl.id%TYPE;
  loop_var         NUMBER(10);
  commit_interval  NUMBER(10);
BEGIN
  loop_var        := 1;
  commit_interval := 10000;
  tblCursor(org_mig);
  LOOP
    FETCH org_mig INTO t_id;
    EXIT WHEN org_mig%NOTFOUND;

    UPDATE tbl SET col1 = col2 WHERE id = t_id;

    IF MOD(loop_var, commit_interval) = 0 THEN
      COMMIT;
    END IF;
    loop_var := loop_var + 1;
  END LOOP;
  COMMIT;
  CLOSE org_mig;
END;
/

You're doing this for every row in tbl, right? If so, you should just do this:
update tbl
set col1 = col2
/
Updating ten million rows will take some time, but a set operation will be far faster than the row-by-agonizing-row approach you've implemented. Also, batching up your commits like that is bad practice: not only does it slow things down, it can lead to ORA-01555: snapshot too old exceptions.
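If even the single statement proves too slow, one common option (my own suggestion here, not something the answer above prescribes) is to run the update as parallel DML:

-- Assumes spare parallel servers and sufficient undo; adjust the degree as needed.
ALTER SESSION ENABLE PARALLEL DML;

UPDATE /*+ PARALLEL(tbl 4) */ tbl
   SET col1 = col2;

COMMIT;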

It is still taking a long time to update.
I am trying a different approach, but I am getting an error.
-----------------------------------
Error starting at line : 43 in command -
SELECT *
FROM TABLE(test_parallel_update(CURSOR(SELECT * FROM organization)))
Error report -
SQL Error: ORA-12801: error signaled in parallel query server P003
ORA-00932: inconsistent datatypes: expected - got -
ORA-06512: at "QA249.TEST_PARALLEL_UPDATE", line 21
12801. 00000 - "error signaled in parallel query server %s"
*Cause: A parallel query server reached an exception condition.
*Action: Check the following error message for the cause, and consult
your error manual for the appropriate action.
*Comment: This error can be turned off with event 10397, in which
case the server's actual error is signaled instead.
---------------------------
Here is the script:
CREATE OR REPLACE TYPE test_num_arr AS TABLE OF NUMBER;

CREATE OR REPLACE FUNCTION test_parallel_update (
  test_cur IN SYS_REFCURSOR
)
RETURN test_num_arr
PARALLEL_ENABLE (PARTITION test_cur BY ANY)
PIPELINED
IS
  PRAGMA AUTONOMOUS_TRANSACTION;
  test_rec        organization%ROWTYPE;
  TYPE num_tab_t IS TABLE OF NUMBER(10,0);
  TYPE vc2_tab_t IS TABLE OF NUMBER(1,0);
  id              num_tab_t;
  org_type_old    num_tab_t;
  is_deleted_old  vc2_tab_t;
  cnt             INTEGER := 0;
BEGIN
  LOOP
    FETCH test_cur BULK COLLECT INTO id, org_type_old, is_deleted_old LIMIT 1000;
    EXIT WHEN id.COUNT() = 0;

    FORALL i IN id.FIRST .. id.LAST
      UPDATE organization
         SET org_type   = org_type_old(i),
             is_deleted = is_deleted_old(i)
       WHERE id = id(i);

    cnt := cnt + id.COUNT;
  END LOOP;

  CLOSE test_cur;
  COMMIT;
  PIPE ROW(cnt);
  RETURN;
END;
/
show error;
---- To Execute ----
SELECT *
FROM TABLE(test_parallel_update(CURSOR(SELECT * FROM organization)));
Note: the table is

organization
(
  id              NUMBER(10,0),
  org_type        NUMBER(10,0),
  org_type_old    NUMBER(10,0),
  is_deleted      NUMBER(1,0),
  is_deleted_old  NUMBER(1,0)
);
id is the primary key. Now I want to copy org_type_old and IS_DELETED_OLD into org_type and IS_DELETED respectively.
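For reference, applying the earlier answer's set-based suggestion to this table would be something like the following (a sketch based on the column names above, not part of the original thread):

-- Single-statement equivalent of the copy described above
UPDATE organization
   SET org_type   = org_type_old,
       is_deleted = is_deleted_old;

COMMIT;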

Related

PL/SQL Count and Procedure

Currently trying to create a PL/SQL procedure. I am a complete noob at PL/SQL, as you can tell!
We have had to create a table using SQL, and we are looking to automatically update the table with a procedure. If a customer has requested more than 4 jobs, we want to insert their details into this table as a frequent customer.
This is what I currently have:
CREATE TABLE PublisherDetails
(PublisherName VARCHAR2 (40),
City VARCHAR2 (20) ,
PhoneNo NUMBER (11),
jobNo NUMBER (10),
startDate DATE,
completionDate DATE)
;
SELECT Publisher.Name AS PublisherName,
Publisher.City, Publisher.PhoneNo,
COUNT (*) AS PublisherJobCount
FROM Publisher
INNER JOIN PrintJob
ON Publisher.Name = PrintJob.PublisherName
GROUP BY Publisher.Name, Publisher.City, Publisher.PhoneNo;
Create or replace procedure Task3
IS CountPublisherJobs NUMBER;
DECLARE No_data_Found EXCEPTION
BEGIN
SELECT count(*) INTO CountPublisherJobs
OPEN Task3;
LOOP
IF PublisherJobCount < 3
THEN INSERT INTO PublisherDetails (PublisherName, City, PhoneNo)
FROM Publisher
WHERE PublisherName = publisher.name
Else
Insert Into PublisherDetails (JobNo, StartDate, CompletionDate )
SELECT jobNo, startDate, completionDate
FROM PrintJob
Where PublisherName = publishers.name
FETCH Task3 INTO PublisherDetails, publishername, city, phoneNo;
EXIT WHEN c1%NOTFOUND;
INSERT INTO temp VALUES (PublisherName, City, PhoneNo, JobNo, StartDate, CompletionDate);
END IF;
COMMIT;
END LOOP;
EXCEPTION
WHEN NO_DATA_FOUND THEN
dbms_output.put_line('Sorry no data found');
END;
/
It's churning up errors and I am not sure why. Any help, as always, is appreciated.
There are a number of things incorrect with the syntax of your procedure.
The basic format for a stored procedure is
Create {Or Replace} Procedure PROCEDURE_NAME {(i_param IN datatype)}
Is
<<Declaration Section>>
Begin
<<code section>>
Exception
<<Exceptions>>
End PROCEDURE_NAME;
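For example, a minimal procedure following that skeleton might look like this (an illustrative sketch only, using the Publisher table from your query above; the procedure name is made up):

Create Or Replace Procedure show_publisher_phone (i_name IN VARCHAR2)
Is
  -- <<Declaration Section>>
  v_phone Publisher.PhoneNo%Type;
Begin
  -- <<code section>>
  Select PhoneNo
  Into   v_phone
  From   Publisher
  Where  Name = i_name;
  dbms_output.put_line(i_name || ': ' || v_phone);
Exception
  -- <<Exceptions>>
  When No_Data_Found Then
    dbms_output.put_line('No publisher called ' || i_name);
End show_publisher_phone;
/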
From what you have described above, you want to insert a record into a table when a condition is met in another table.
To accomplish this, I would need to see the underlying data structure; what you have provided doesn't show the tables the data is currently in (is there a JOB table, for instance? A Customer table?).
The NO_DATA_FOUND exception does not need to be declared; it is a predefined Oracle exception.
Your SELECT COUNT(*) INTO CountPublisherJobs is missing a FROM table, and any predicates you want to add, although I am not sure what you are trying to accomplish with it.
You are attempting to open a cursor on the procedure name; you have not defined a cursor with the name Task3.
You have not declared the PublisherJobCount variable you test in the IF.
I would suggest perhaps revisiting the basic structure for a stored procedure.
Edit
Based on your response, you could achieve the result using the following:
Create Or Replace Procedure addFrequentPublisher
Is
Cursor frequentPublishers Is
Select PUBLISHER_ID
From JOB
Group By
PUBLISHER_ID
Having Count(*) >= 4;
Begin
For i In frequentPublishers
Loop
Insert Into FREQUENT_CUSTOMER ...
End Loop;
End;

Bulk inserting in Oracle PL/SQL

I have around 5 million records which need to be copied from a table in one schema to a table in another schema (in the same database). I have prepared a script, but it gives me the below error.
ORA-06502: PL/SQL: numeric or value error: Bulk bind: Error in define
Following is my script
DECLARE
TYPE tA IS TABLE OF varchar2(10) INDEX BY PLS_INTEGER;
TYPE tB IS TABLE OF SchemaA.TableA.band%TYPE INDEX BY PLS_INTEGER;
TYPE tD IS TABLE OF SchemaA.TableA.start_date%TYPE INDEX BY PLS_INTEGER;
TYPE tE IS TABLE OF SchemaA.TableA.end_date%TYPE INDEX BY PLS_INTEGER;
rA tA;
rB tB;
rD tD;
rE tE;
f number :=0;
BEGIN
SELECT col1||col2||col3 as main_col, band, effective_start_date as start_date, effective_end_date as end_date
BULK COLLECT INTO rA, rB, rD, rE
FROM schemab.tableb;
FORALL i IN rA.FIRST..rE.LAST
insert into SchemaA.TableA(main_col, BAND, user_type, START_DATE, END_DATE, roll_no)
values(rA(i), rB(i), 'C', rD(i), rE(i), 71);
f:=f+1;
if (f=10000) then
commit;
end if;
end;
Could you please help me in finding where the error lies?
Why not a simple
insert into SchemaA.TableA (main_col, BAND, user_type, START_DATE, END_DATE, roll_no)
SELECT col1||col2||col3 as main_col, band, 'C', effective_start_date, effective_end_date, 71
FROM schemab.tableb;
This
f:=f+1;
if (f=10000) then
commit;
end if;
does not make any sense. f becomes 1 after the single bulk insert and stays there; f=10000 will never be true, thus you never COMMIT.
The following script worked for me, and I was able to load around 5 million rows within 15 minutes.
ALTER SESSION ENABLE PARALLEL DML
/
DECLARE
cursor c_p1 is
SELECT col1||col2||col3 as main_col, band, effective_start_date as start_date, effective_end_date as end_date
FROM schemab.tableb;
TYPE TY_P1_FULL is table of c_p1%rowtype
index by pls_integer;
v_P1_FULL TY_P1_FULL;
v_seq_num number;
BEGIN
open c_p1;
loop
fetch c_p1 BULK COLLECT INTO v_P1_FULL LIMIT 10000;
exit when v_P1_FULL.count = 0;
FOR i IN 1..v_P1_FULL.COUNT loop
INSERT /*+ APPEND */ INTO schemaA.tableA VALUES (v_P1_FULL(i));
end loop;
commit;
end loop;
close c_P1;
dbms_output.put_line('Load completed');
end;
-- Disable parallel mode for this session
ALTER SESSION DISABLE PARALLEL DML
/
ORA-06502: PL/SQL: numeric or value error: Bulk bind: Error in define
You get that error because you have a literal in the VALUES clause of the INSERT. The FORALL expects everything to be bound to an array.
Your program has a massive problem, literally: you have no LIMIT on the BULK COLLECT clause, so it is going to try to load all five million records from TableB into your collections. That will blow your session's memory limit.
The point of using BULK COLLECT and FORALL is to bite off chunks of a bigger data set and process them in batches. For that you need a loop. The loop has no FOR condition; instead, test whether the fetch returned anything and exit when the array has zero records.
DECLARE
  TYPE recA IS RECORD (
      main_col   SchemaA.TableA.main_col%TYPE
    , band       SchemaA.TableA.band%TYPE
    , user_type  SchemaA.TableA.user_type%TYPE
    , start_date DATE
    , end_date   DATE
    , roll_no    NUMBER );
  TYPE recsA IS TABLE OF recA;
  nt_a recsA;
  f    NUMBER := 0;
  CURSOR cur_b IS
    SELECT col1||col2||col3      AS main_col,
           band,
           'C'                   AS user_type,
           effective_start_date  AS start_date,
           effective_end_date    AS end_date,
           71                    AS roll_no
    FROM   schemab.tableb;
BEGIN
  OPEN cur_b;
  LOOP
    FETCH cur_b BULK COLLECT INTO nt_a LIMIT 1000;
    EXIT WHEN nt_a.COUNT() = 0;

    FORALL i IN 1 .. nt_a.COUNT
      INSERT INTO SchemaA.TableA (main_col, band, user_type, start_date, end_date, roll_no)
      VALUES nt_a(i);

    f := f + SQL%ROWCOUNT;
    IF f >= 10000 THEN
      COMMIT;
      f := 0;
    END IF;
  END LOOP;
  COMMIT;
  CLOSE cur_b;
END;
Please note that issuing commits inside a loop is contraindicated. You lay yourself open to runtime errors such as ORA-01002 and ORA-01555. If your program does crash half-way through, you will have great difficulty resuming it cleanly. By all means persist if you have problems with UNDO tablespace, but the correct answer is to get the DBA to enlarge the UNDO tablespace, not to weaken your code.
"i am using bulk insert because it gives better performance"
It is true that BULK COLLECT and FORALL ... INSERT is more performative than a CURSOR FOR loop with row-by-row single inserts. It is not more efficient than a pure SQL INSERT INTO ... SELECT. The value of the construct is that it allows us to manipulate the contents of the array before inserting it. This is handle if we have complex business rules which can only be applied programmatically.
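As a hypothetical illustration of that point (the table names come from the question, but the UPPER/TRIM rule and the collection type are made up for the example), the array can be adjusted between the fetch and the FORALL:

DECLARE
  TYPE name_tab IS TABLE OF VARCHAR2(100);
  nt_names name_tab;
  CURSOR cur_b IS
    SELECT col1||col2||col3 FROM schemab.tableb;
BEGIN
  OPEN cur_b;
  LOOP
    FETCH cur_b BULK COLLECT INTO nt_names LIMIT 1000;
    EXIT WHEN nt_names.COUNT() = 0;

    -- manipulate the array before inserting it (example business rule)
    FOR i IN 1 .. nt_names.COUNT LOOP
      nt_names(i) := UPPER(TRIM(nt_names(i)));
    END LOOP;

    FORALL i IN 1 .. nt_names.COUNT
      INSERT INTO SchemaA.TableA (main_col, user_type, roll_no)
      VALUES (nt_names(i), 'C', 71);
  END LOOP;
  COMMIT;
  CLOSE cur_b;
END;
/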
Please try after changing the first two lines of your code as below:
DECLARE
TYPE tA IS TABLE OF SchemaA.TableA.main_col%TYPE INDEX BY PLS_INTEGER;
...
...
This may be because of a data type/length mismatch. In the declaration section you missed declaring the type so that it inherits its type from the table column.
Also, as mentioned, the f logic for the commit will not do the magic for you. Better to use LIMIT with BULK COLLECT.

PL/SQL - Split the loop to load the bad data

declare
cursor c_data
is
select * from test_product_u;
error_row varchar2(4000);
v_errormsg varchar2(200);
begin
for i in c_data
loop
begin
insert into test_product_u_final (PRODUCT_NO, CREATED_DATE_RAW, DATE_FORMAT)
values (i.PRODUCT_NO, i.CREATED_DATE_RAW,i.DATE_FORMAT);
commit;
exception when others then
error_row := i.PRODUCT_NO ||';'|| i.CREATED_DATE_RAW ||';'|| i.DATE_FORMAT;
v_errormsg := SUBSTR(SQLERRM,1,64);
insert into test_product_error_new(error_no,error_row_msg,errormsg_sql)
values (ERROR_NO.NEXTVAL,error_row,v_errormsg);
end;
end loop;
end;
1 - The above code inserts all the rows, whether bad or good, into the error table. I want to split the code so that only the good data goes into the destination table and only the bad data goes into the error table. Any help here?
When I split the loop, the reference to the FOR loop record is lost.
Sample data
Product_no  CREATED_DATE_RAW  CREATED_DATE  PRICE
1           01-JAN-16         01-JAN-16     55
2           03-JAN-16         03-JAN-16     null
No need for PL/SQL or a loop here. You can use Oracle's DML error logging feature for this:
First create a table where the errors should be stored:
execute dbms_errlog.create_error_log('TEST_PRODUCT_U_FINAL', 'TEST_PRODUCT_ERRORS');
Then run the insert:
insert into test_product_u_final (PRODUCT_NO, CREATED_DATE_RAW, DATE_FORMAT)
select i.PRODUCT_NO, i.CREATED_DATE_RAW,i.DATE_FORMAT
from test_product_u i
log errors into test_product_errors
reject limit unlimited;
Documentation for dbms_errlog
Documentation for log errors into
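As a follow-up (a sketch, assuming the error table created above), the rejected rows and the reason for each rejection can then be inspected directly; the generated error table contains the target table's columns plus ORA_ERR_* metadata columns:

-- Rows rejected by the insert, with the Oracle error for each
SELECT ora_err_number$, ora_err_mesg$, product_no, created_date_raw
FROM   test_product_errors;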

how to generate a table of random data from existing database table through oracle procedure

I have to generate a table (containing two columns) of random data from a database table through an Oracle procedure. The user can indicate the number of rows required, and we have to use the table data with ID values from 1001 to 1060. I am trying to use a cursor loop and am not sure which dbms_random method I should use.
I am using the following code to create the procedure:
create or replace procedure a05_random_plant(p_count in number)
as
v_count number := p_count;
cursor c is
select plant_id, common_name
from ppl_plants
where rownum = v_count
order by dbms_random.value;
begin
delete from a05_random_plants_table;
for c_table in c
loop
insert into a05_random_plants_table(plant_id, plant_name)
values (c_table.plant_id, c_table.common_name);
end loop;
end;
/
It compiled successfully. Then I executed it with the following code:
set serveroutput on
exec a05_random_plant(5);
It shows "anonymous block completed",
but when I run the following query, I do not get any records:
select * from a05_random_plants_table;
rownum = value will not work for any value greater than 1,
hence try the below:
create or replace procedure a05_random_plant(p_count in number)
as
v_count number := p_count;
cursor c is
select plant_id, common_name
from ppl_plants
where rownum <= v_count
order by dbms_random.value;
begin
delete from a05_random_plants_table;
for c_table in c
loop
insert into a05_random_plants_table(plant_id, plant_name)
values (c_table.plant_id, c_table.common_name);
end loop;
end;
/
This query by Tom Kyte will generate almost 75K rows:
select trunc(sysdate,'year')+mod(rownum,365) TRANS_DATE,
mod(rownum,100) CUST_ID,
abs(dbms_random.random)/100 SALES_AMOUNT
from all_objects
/
You can use this example to write your query and add a where clause to it - where id between 1001 and 1060, for example.
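Combining that with the earlier fix, a sketch of the sampling query restricted to the required ID range might look like this (ordering in an inline view before applying rownum, so the limit is taken from the already-shuffled set):

SELECT plant_id, common_name
FROM  (SELECT plant_id, common_name
       FROM   ppl_plants
       WHERE  plant_id BETWEEN 1001 AND 1060
       ORDER BY dbms_random.value)
WHERE  rownum <= 5;   -- 5 = number of rows requested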
I don't think you should use a cursor (which is naturally slow); do a direct insert from a select instead:
insert into target_table (col1, col2)
select colx, coly from other_table...
Also, isn't a COMMIT missing at the end of your procedure?
So all the code in your procedure would be a DELETE, an INSERT with that SELECT, and then a COMMIT.
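Putting those suggestions together, the whole procedure could be reduced to something like this (a sketch using the table and column names from the question):

create or replace procedure a05_random_plant(p_count in number)
as
begin
  delete from a05_random_plants_table;

  -- set-based insert: no cursor loop needed
  insert into a05_random_plants_table (plant_id, plant_name)
    select plant_id, common_name
    from  (select plant_id, common_name
           from   ppl_plants
           order by dbms_random.value)
    where rownum <= p_count;

  commit;
end;
/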

Reasonable SELECT ... INTO Oracle solution for case of multiple OR no rows

I just want to SELECT values into variables from inside a procedure.
SELECT blah1,blah2 INTO var1_,var2_
FROM ...
Sometimes a large complex query will return no rows and sometimes it will return more than one -- both cases lead to exceptions. I would love to replace the exception behavior with implicit behavior similar to:
No rows = no value change; multiple rows = use the last one.
I can constrain the result set easily enough for the "multiple rows" case, but "no rows" is much more difficult in situations where you can't use an aggregate function in the SELECT.
Are there any special workarounds or suggestions? I am looking to avoid significantly rewriting queries or executing the query twice to get a row count before executing the SELECT INTO.
What's wrong with using an exception block?
create or replace
procedure p(v_job VARCHAR2) IS
v_ename VARCHAR2(255);
begin
select ename into v_ename
from (
select ename
from scott.emp
where job = v_job
order by ename desc )
where rownum = 1;
DBMS_OUTPUT.PUT_LINE('Found Rows Logic Here -> Found ' || v_ename);
EXCEPTION WHEN NO_DATA_FOUND THEN
DBMS_OUTPUT.PUT_LINE('No Rows found logic here');
end;
SQL> begin
  2    p('FOO');
  3    p('CLERK');
  4  end;
  5  /
No Rows found logic here
Found Rows Logic Here -> Found SMITH
PL/SQL procedure successfully completed.
SQL>
You could use a FOR loop. A FOR loop would do nothing when no rows are returned and would be applied to every row returned if there were multiples. You could adjust your select so that it only returns the last row.
begin
for ARow in (select *
from tableA ta
Where ta.value = ???) loop
-- do something to ARow
end loop;
end;
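For instance, applied to the earlier scott.emp example (a sketch, with the same assumptions as the first answer), the variable keeps its value when nothing matches and ends up holding the last fetched row otherwise:

declare
  v_ename scott.emp.ename%type := 'UNCHANGED';
begin
  for ARow in (select ename
               from   scott.emp
               where  job = 'CLERK'
               order by ename) loop
    v_ename := ARow.ename;   -- with multiple matches, the last row wins
  end loop;
  -- if the cursor returned no rows, v_ename is still 'UNCHANGED'
  dbms_output.put_line(v_ename);
end;
/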
