Avoiding duplicates in Oracle APEX

I am trying to avoid duplicates in a table called 'INCOMINGREQUEST'. My table looks like this:
CREATE TABLE "REGISTRY"."INCOMINGREQUEST"
( "ID" NUMBER(30,0),
"FILENUMBER" VARCHAR2(30 BYTE),
"REQUESTEDFILE" VARCHAR2(300 BYTE),
"REQUESTEDDEPARTMENT" VARCHAR2(30 BYTE),
"REQUESTDATE" DATE,
"STATUS" VARCHAR2(30 BYTE),
"URGENCY" VARCHAR2(30 BYTE),
"VOLUME" NUMBER(30,0),
"SUB" NUMBER(30,0),
"REGISTRYID" NUMBER(30,0),
"TEMPORARY" VARCHAR2(30 BYTE)
)
and the table data is as follows:
FILENUMBER  FILENAME  REQUESTER        STATUS     REQUESTEDDEPARTMENT
1/11/2      Payments  JOSHUA MITCHELL  PENDING    DAY CARE
1/11/2      Payments  JOSHUA MITCHELL  Delivered  DAY CARE
1/11/2      Payments  JOSHUA MITCHELL  PENDING    DAY CARE
1/11/2      Payments  RAWLE MUSGRAVE   PENDING    COMCORP
NB: I only included the important fields above for this scenario (the other fields in the table have data).
What I want to achieve: when the APP_USER, which in this case is the department (DAY CARE), makes the same request while the previous request is still PENDING (status), I want an error to be raised, so the third record/request above should not have happened.
The trigger I am trying is:
create or replace trigger "INCOMINGREQUEST_T1"
BEFORE
insert or update or delete on "INCOMINGREQUEST"
for each row
DECLARE counter INTEGER;
BEGIN
SELECT * INTO COUNTER FROM
(SELECT COUNT(rownum) FROM INCOMINGREQUEST WHERE requesteddepartment = V('APP_USER')
and status ='PENDING');
IF counter = 1 THEN
RAISE_APPLICATION_ERROR(-20012,'Duplicated value');
END IF;
END;
but I am getting an error:
REGISTRY.INCOMINGREQUEST is mutating, trigger/function may not see it ORA-06512: at "REGISTRY.INCOMINGREQUEST_T1", line 3 ORA-04088: error during execution of trigger 'REGISTRY.INCOMINGREQUEST_T1'

You can achieve the desired behavior with a conditional (function-based) UNIQUE index that only covers PENDING rows, built on the columns that define a duplicate:
CREATE UNIQUE INDEX INCOMINGREQUEST_IDX ON
INCOMINGREQUEST ( CASE WHEN STATUS = 'PENDING' THEN FILENUMBER END,
                  CASE WHEN STATUS = 'PENDING' THEN REQUESTEDDEPARTMENT END );
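With this in place, a second PENDING request for the same file from the same department is rejected by Oracle itself with an ORA-00001 unique constraint violation, while Delivered rows and other departments are unaffected. For example (column list trimmed for brevity):
INSERT INTO incomingrequest (filenumber, requesteddepartment, status)
VALUES ('1/11/2', 'DAY CARE', 'PENDING');   -- first pending request: succeeds
INSERT INTO incomingrequest (filenumber, requesteddepartment, status)
VALUES ('1/11/2', 'DAY CARE', 'PENDING');   -- same file, same department, still pending: ORA-00001
(If duplicate PENDING rows already exist in the table, they would have to be cleaned up before the index can be created.)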
Cheers!!

You could use a procedure to stop duplicates, and pass over the parameters you need to insert into the table.
The issue with using a Trigger to find the current status is that you cannot query information from a table you are inserting/updating/deleting from inside the trigger as the data is "Mutating".
To run the procedure use:
BEGIN
stack_prc('DAY CARE', 'PENDING');
END;
Procedure
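The original post does not include the procedure body, so the following is only a minimal sketch of what STACK_PRC might look like, assuming it takes the department and status shown in the call above (the remaining columns/parameters would be added the same way):
CREATE OR REPLACE PROCEDURE stack_prc (
  p_department IN incomingrequest.requesteddepartment%TYPE,
  p_status     IN incomingrequest.status%TYPE
) AS
  l_count PLS_INTEGER;
BEGIN
  -- Safe here: we are not inside a trigger on INCOMINGREQUEST, so no mutating-table error
  SELECT COUNT(*)
    INTO l_count
    FROM incomingrequest
   WHERE requesteddepartment = p_department
     AND status = 'PENDING';

  IF l_count > 0 THEN
    RAISE_APPLICATION_ERROR(-20012, 'Duplicated value');
  END IF;

  INSERT INTO incomingrequest (requesteddepartment, status)
  VALUES (p_department, p_status);
END stack_prc;
/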

How to delete records in parallel from oracle table

We are maintaining an audit of our application in the table 'application_audit'. I am trying to write a stored procedure to delete records from this table that we don't need any more.
So far I have written the stored procedure below, but I found that it takes a lot of time when the number of rows to be deleted is more than 100k.
Can you please help me implement parallel sessions or optimize the delete query in the stored procedure below to speed up the execution?
In production, this table will have at least 5 million rows at any given point in time, and from what I can see, if we execute this stored procedure every day there will be at least 100k records to delete. In the query below, COMPONENT_NAME='REQUESTPURGE' means that the purge has already happened for that particular request number and there is no request data for it in our active database instance, so all records in 'application_audit' with that request number become eligible for deletion.
Stored procedure:
create or replace PROCEDURE APPLICATION_AUDIT_PURGE_RECORD
IS
  purgewait number := 30;
BEGIN
  DBMS_OUTPUT.PUT_LINE('Application audit purge started with purge wait value as '||purgewait||' days');
  delete from application_audit
   where id in (select id
                  from application_audit
                 where request_number in (select request_number
                                            from application_audit
                                           where COMPONENT_NAME = 'REQUESTPURGE'
                                             and trunc(timestamp) < trunc(sysdate - purgewait)));
END APPLICATION_AUDIT_PURGE_RECORD;
Table:
CREATE TABLE "APPLICATION_AUDIT" (
"ID" NUMBER GENERATED ALWAYS AS IDENTITY NOT NULL,
"MESSAGE_TYPE" VARCHAR2(64 CHAR),
"COMPONENT_NAME" VARCHAR2(64 CHAR),
"USERNAME" VARCHAR2(32 CHAR),
"TIMESTAMP" TIMESTAMP (6) WITH TIME ZONE NOT NULL,
"REQUEST_NUMBER" VARCHAR2(64 CHAR),
"MODULE_NAME" VARCHAR2(256 CHAR),
"PROCESS_NAME" VARCHAR2(256 CHAR),
"VERSION" VARCHAR2(64 CHAR),
"TASK" VARCHAR2(64 CHAR),
"ERROR_CODE" VARCHAR2(256 CHAR),
"ERROR_MESSAGE" VARCHAR2(4000 CHAR),
"MESSAGE" VARCHAR2(4000 CHAR)
)
Edit1:
Changing delete statements in stored procedure and using indexes reduced the execution time significantly.
Updated delete statements in stored procedure:
DELETE FROM APPLICATION_AUDIT
 WHERE REQUEST_NUMBER IN (SELECT APPLICATION_AUDIT.REQUEST_NUMBER
                            FROM APPLICATION_AUDIT
                           WHERE APPLICATION_AUDIT.REQUEST_NUMBER != 'null'
                             AND APPLICATION_AUDIT.MESSAGE_TYPE = 'INFO'
                             AND APPLICATION_AUDIT.COMPONENT_NAME = 'REQUESTPURGE'
                             AND APPLICATION_AUDIT.TASK = 'DeleteRequest'
                             AND TRUNC(APPLICATION_AUDIT.TIMESTAMP) < TRUNC(SYSDATE - v_reqnumpurgewait));

DELETE FROM APPLICATION_AUDIT
 WHERE REQUEST_NUMBER = 'null'
   AND TRUNC(APPLICATION_AUDIT.TIMESTAMP) < TRUNC(SYSDATE - v_purgewait);
Index creation queries:
CREATE INDEX APPLICATION_AUDIT_IDX1 ON APPLICATION_AUDIT (COMPONENT_NAME, TIMESTAMP, (NVL(REQUEST_NUMBER,'null')));
CREATE INDEX APPLICATION_AUDIT_IDX2 ON APPLICATION_AUDIT (NVL(REQUEST_NUMBER,'null'));
I see it suffices to find one row with component_name = 'REQUESTPURGE' to delete all rows with the same request number. This means that the component_name alone doesn't tell us whether to delete a row or not; otherwise I'd have suggested using table partitions here.
As is, all I can think of is providing appropriate indexes. First of all, though, your query can be simplified to:
delete from application_audit
where request_number in
(
select request_number
from application_audit
where component_name = 'REQUESTPURGE'
and timestamp < trunc(sysdate - purgewait)
);
The indexes I suggest for this statement:
create index idx1 on application_audit (component_name, timestamp, request_number);
create index idx2 on application_audit (request_number);
Deleting 5 million records shouldn't be that time consuming.
Having said that, you can try adding a parallel hint to the DELETE statement.
First enable
ALTER SESSION ENABLE PARALLEL DML;
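With parallel DML enabled for the session, a hinted version of the simplified delete might look like this (a sketch only; the degree of 4 is arbitrary and should be tuned, and purgewait is the variable already declared in the procedure):
DELETE /*+ PARALLEL(application_audit 4) */ FROM application_audit
 WHERE request_number IN (SELECT request_number
                            FROM application_audit
                           WHERE component_name = 'REQUESTPURGE'
                             AND timestamp < TRUNC(SYSDATE - purgewait));
COMMIT;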
If that's not helping, you could look into:
Disabling indexes on the table
But, of course, any queries needing and using those indexes will be slower while your delete runs. So you're just trading one slow statement for (lots of) others. And you'll have to rebuild them afterwards which will take (possibly a looooooong) time.
You can look into chunking by SQL or rowid
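For rowid chunking, the DBMS_PARALLEL_EXECUTE package can split the table into rowid ranges and run the delete for each chunk in parallel jobs. A rough sketch (the task name, chunk size and parallel level are arbitrary, and the inner WHERE clause mirrors the simplified delete above with a hard-coded 30-day wait):
BEGIN
  DBMS_PARALLEL_EXECUTE.CREATE_TASK(task_name => 'purge_application_audit');

  -- Split the table into rowid chunks of roughly 10000 blocks each
  DBMS_PARALLEL_EXECUTE.CREATE_CHUNKS_BY_ROWID(
    task_name   => 'purge_application_audit',
    table_owner => USER,
    table_name  => 'APPLICATION_AUDIT',
    by_row      => FALSE,
    chunk_size  => 10000);

  -- Run the delete once per chunk, 4 jobs at a time
  DBMS_PARALLEL_EXECUTE.RUN_TASK(
    task_name      => 'purge_application_audit',
    sql_stmt       => 'DELETE FROM application_audit
                        WHERE rowid BETWEEN :start_id AND :end_id
                          AND request_number IN (SELECT request_number
                                                   FROM application_audit
                                                  WHERE component_name = ''REQUESTPURGE''
                                                    AND timestamp < TRUNC(SYSDATE - 30))',
    language_flag  => DBMS_SQL.NATIVE,
    parallel_level => 4);

  DBMS_PARALLEL_EXECUTE.DROP_TASK('purge_application_audit');
END;
/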
If none of these help enough, you may need to look into more radical solutions.
Such as saving the data you want to keep in a temporary table. Then dropping the current table and renaming the temporary one. e.g.:
create table tmp as select ...data you want to keep... from old_tab;
drop table old_tab;
rename tmp to old_tab;
-- run grants, indexes etc. that were on the original table
...
But you need an outage to do this.
I would suggest tracking down where the bottleneck is first, with an explain plan or a trace, as it sounds like you have an underlying problem if 5 million deletes are taking a long time.
I think your DELETE query can be simplified to:
DELETE FROM application_audit
WHERE COMPONENT_NAME = 'REQUESTPURGE'
AND TRUNC(timestamp) < TRUNC(SYSDATE - purgewait);
You can try having an index on the COMPONENT_NAME column as well.
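For instance (the index name here is just a placeholder):
CREATE INDEX application_audit_comp_idx ON application_audit (component_name);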

Amend Stored Procedure to Ignore Duplicate Records

I need to make the below amendment to this stored procedure
create or replace PROCEDURE "USP_IMPORT_FOBTPP_DATA"
AS
BEGIN
INSERT INTO FINIMP.FOBT_PARTPAYMENT
SELECT
PART_PAYMENT_ID,
ISSUING_SHOP,
TILL_NUMBER,
SLIP_NUMBER,
FOBT_NUMBER,
WHO_PAID,
WHEN_PAID,
AMOUNT_LEFT_TO_PAY,
FOBT_VALUE,
STATUS
FROM IMPORTDB.CLN_FOBTPP;
COMMIT;
END;
The amendment is to skip any records that would result in a primary key violation, so that the data load process does not break.
Target table
CREATE TABLE "FINIMP"."FOBT_PARTPAYMENT"
( "PART_PAYMENT_ID" NUMBER(*,0),
"ISSUING_SHOP" CHAR(4 BYTE),
"TILL_NUMBER" NUMBER(3,0),
"SLIP_NUMBER" NUMBER(*,0),
"FOBT_NUMBER" VARCHAR2(30 BYTE),
"WHO_PAID" CHAR(20 BYTE),
"WHEN_PAID" DATE,
"AMOUNT_LEFT_TO_PAY" NUMBER(19,4),
"FOBT_VALUE" NUMBER(19,4),
"STATUS" CHAR(2 BYTE)
);
ALTER TABLE "FINIMP"."FOBT_PARTPAYMENT" ADD CONSTRAINT "PK_FOBT_PP" PRIMARY KEY ("PART_PAYMENT_ID", "ISSUING_SHOP", "WHEN_PAID")
I am new to PL/SQL, how can I do this?
There are a number of ways to accomplish this, and the best method depends on your environment/requirements. Is the CLN_FOBTPP table considerably large? Is the USP_IMPORT_FOBTPP_DATA procedure called frequently, and does it need to meet certain performance criteria? These are all things you should consider.
One way to do this would be to start with the query that you use.
create or replace PROCEDURE "USP_IMPORT_FOBTPP_DATA"
AS
BEGIN
INSERT INTO FINIMP.FOBT_PARTPAYMENT
SELECT ...
FROM IMPORTDB.CLN_FOBTPP;
This will return all of the rows of data from IMPORTDB.CLN_FOBTPP and insert them into FINIMP.FOBT_PARTPAYMENT. Instead, you could control for this by doing:
INSERT INTO FINIMP.FOBT_PARTPAYMENT
SELECT ...
FROM IMPORTDB.CLN_FOBTPP
WHERE PART_PAYMENT_ID NOT IN (SELECT PART_PAYMENT_ID FROM FINIMP.FOBT_PARTPAYMENT)
This would go through the FOBT_PARTPAYMENT table and check to see if a row's PART_PAYMENT_ID existed in the table before doing the insert. However, this can be prohibitively expensive if the table is large or if you have performance requirements.
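Note that the primary key is actually composite (PART_PAYMENT_ID, ISSUING_SHOP, WHEN_PAID), so a variant that matches the constraint exactly would compare all three columns. A sketch, assuming the column names in CLN_FOBTPP match the target:
INSERT INTO FINIMP.FOBT_PARTPAYMENT
SELECT s.PART_PAYMENT_ID, s.ISSUING_SHOP, s.TILL_NUMBER, s.SLIP_NUMBER, s.FOBT_NUMBER,
       s.WHO_PAID, s.WHEN_PAID, s.AMOUNT_LEFT_TO_PAY, s.FOBT_VALUE, s.STATUS
  FROM IMPORTDB.CLN_FOBTPP s
 WHERE NOT EXISTS (SELECT 1
                     FROM FINIMP.FOBT_PARTPAYMENT t
                    WHERE t.PART_PAYMENT_ID = s.PART_PAYMENT_ID
                      AND t.ISSUING_SHOP    = s.ISSUING_SHOP
                      AND t.WHEN_PAID       = s.WHEN_PAID);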
Another way would be to create a temp table for each time the procedure is called, store the values in that temp table, and then add the new rows after validating the data. This would look something like:
create global temporary table temp_USP_table ("PART_PAYMENT_ID" NUMBER(*,0), "ISSUING_SHOP" CHAR(4 BYTE),...) on commit delete rows;
create or replace PROCEDURE "USP_IMPORT_FOBTPP_DATA"
AS
BEGIN
INSERT INTO temp_USP_table
SELECT ...
FROM IMPORTDB.CLN_FOBTPP;
From there, you can do a number of things. You could use the same procedure to add the new rows from the temp table into the FINIMP.FOBT_PARTPAYMENT table:
delete from temp_USP_table where PART_PAYMENT_ID in (select PART_PAYMENT_ID from FINIMP.FOBT_PARTPAYMENT);
insert into FINIMP.FOBT_PARTPAYMENT select * from temp_USP_table;
Or you could create a new procedure to load the new data from the temp_USP_table into the FINIMP.FOBT_PARTPAYMENT table, in case you'd like to do something additional to the new data before it's added to the table. Since you reference a data load, I would recommend going the temporary table route because it should allow you to load the data without issue. Once the data is loaded, you can worry about adding it to the proper table(s).
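Putting the temporary-table pieces together, the amended procedure might look roughly like this (only a sketch; it assumes temp_USP_table mirrors the target columns and it filters on the full composite key rather than PART_PAYMENT_ID alone):
create or replace PROCEDURE "USP_IMPORT_FOBTPP_DATA"
AS
BEGIN
  -- Stage the incoming data
  INSERT INTO temp_USP_table
  SELECT PART_PAYMENT_ID, ISSUING_SHOP, TILL_NUMBER, SLIP_NUMBER, FOBT_NUMBER,
         WHO_PAID, WHEN_PAID, AMOUNT_LEFT_TO_PAY, FOBT_VALUE, STATUS
  FROM IMPORTDB.CLN_FOBTPP;

  -- Discard rows whose key already exists in the target table
  DELETE FROM temp_USP_table t
   WHERE EXISTS (SELECT 1
                   FROM FINIMP.FOBT_PARTPAYMENT p
                  WHERE p.PART_PAYMENT_ID = t.PART_PAYMENT_ID
                    AND p.ISSUING_SHOP    = t.ISSUING_SHOP
                    AND p.WHEN_PAID       = t.WHEN_PAID);

  -- Load only the new rows
  INSERT INTO FINIMP.FOBT_PARTPAYMENT
  SELECT * FROM temp_USP_table;

  COMMIT;  -- the temp table was created ON COMMIT DELETE ROWS, so it empties itself here
END;
/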

Cannot insert NULL into, ERROR during execution of trigger

I have created a trigger on table Customers so that every time a record is deleted from table Customers, the same record is inserted into table Customer_Archives with the current date as Deletion_Date.
I have to insert a new customer into table Customers and then delete it. The record must be inserted correctly into table Customer_Archives.
Here's the script I have so far:
CREATE TABLE Customer_Archives
(
customer_id NUMBER NOT NULL,
customer_first_name VARCHAR2(50),
customer_last_name VARCHAR2(50) NOT NULL,
customer_address VARCHAR2(255) NOT NULL,
customer_city VARCHAR2(50) NOT NULL,
customer_state CHAR(2) NOT NULL,
customer_zip VARCHAR2(20) NOT NULL,
customer_phone VARCHAR2(30) NOT NULL,
customer_fax VARCHAR2(30),
deletion_date DATE,
CONSTRAINT customer_archives_pk
PRIMARY KEY (customer_id)
);
CREATE OR REPLACE TRIGGER Customers_before_insert
BEFORE DELETE ON Customers
FOR EACH ROW
DECLARE
ar_row Customers%rowtype;
BEGIN
INSERT INTO Customer_Archives
VALUES(ar_row.Customer_id,
ar_row.Customer_First_Name,
ar_row.Customer_Last_Name,
ar_row.Customer_Address,
ar_row.Customer_City,
ar_row.Customer_State,
ar_row.Customer_Zip,
ar_row.Customer_Phone,
ar_row.Customer_Fax,
sysdate());
dbms_output.put_line('New row is added to Customers_Archive
Table with Customer_ID:' ||ar_row.Customer_id ||'on date:' || sysdate());
END;
/
SELECT trigger_name, status FROM user_triggers;
INSERT INTO CUSTOMERS
(customer_id, customer_first_name, customer_last_name, customer_address,
customer_city, customer_state, customer_zip, customer_phone, customer_fax)
VALUES (27,'Sofia','Chen','8888 Cowden St.','Philadelphia','PA',
'19149','7654321234',NULL);
DELETE FROM CUSTOMERS
WHERE customer_id = 27;
When I try to delete the customer that I just inserted I get an error:
Error starting at line : 47 in command -
DELETE FROM CUSTOMERS
WHERE customer_id = 27
Error report -
ORA-01400: cannot insert NULL into ("TUG81959"."CUSTOMER_ARCHIVES"."CUSTOMER_ID")
ORA-06512: at "TUG81959.CUSTOMERS_BEFORE_INSERT", line 4
ORA-04088: error during execution of trigger 'TUG81959.CUSTOMERS_BEFORE_INSERT'
In your DELETE trigger you should be using the :OLD values when creating your archive record:
CREATE OR REPLACE TRIGGER CUSTOMERS_BEFORE_INSERT
BEFORE DELETE ON CUSTOMERS
FOR EACH ROW
BEGIN
INSERT INTO CUSTOMER_ARCHIVES
(CUSTOMER_ID,
CUSTOMER_FIRST_NAME,
CUSTOMER_LAST_NAME,
CUSTOMER_ADDRESS,
CUSTOMER_CITY,
CUSTOMER_STATE,
CUSTOMER_ZIP,
CUSTOMER_PHONE,
CUSTOMER_FAX,
DELETION_DATE)
VALUES
(:OLD.CUSTOMER_ID,
:OLD.CUSTOMER_FIRST_NAME,
:OLD.CUSTOMER_LAST_NAME,
:OLD.CUSTOMER_ADDRESS,
:OLD.CUSTOMER_CITY,
:OLD.CUSTOMER_STATE,
:OLD.CUSTOMER_ZIP,
:OLD.CUSTOMER_PHONE,
:OLD.CUSTOMER_FAX,
SYSDATE());
DBMS_OUTPUT.PUT_LINE('New row is added to Customers_Archive
Table with Customer_ID:' ||:OLD.Customer_id ||'on date:' || SYSDATE());
END;
In your original trigger you'd declared a row variable named ar_row but hadn't assigned anything to any of the fields - therefore they were all NULL. When a BEFORE trigger is invoked during a DELETE, the :OLD values have the values prior to the deletion, and the :NEW values are all NULL.
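With the corrected trigger in place, re-running your INSERT and DELETE should archive the row; a quick check (purely illustrative) would be:
SELECT customer_id, deletion_date
  FROM customer_archives
 WHERE customer_id = 27;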
Best of luck.

Oracle trigger to update a table when another table insert or update

I have two tables (master-detail) that I use to record orders. I need to create a trigger that updates the "TOTAL_GENERAL" field in the master table with the sum of the "SUBTOTAL" values in the detail-table rows related through the foreign key "ID_ENCABEZADO", but I get an error with the trigger.
tables:
CREATE TABLE "ENCABEZADO_ORDEN"
("ID_ENCABEZADO" NUMBER(10,0),
"NUMERO_ORDEN" NUMBER(10,0),
"FECHA" DATE,
"NOMBRE_CLIENTE" VARCHAR2(50),
"DIRECCION" VARCHAR2(50),
"TOTAL_GENERAL" NUMBER(10,0),
"LUGAR_VENTA" VARCHAR2(50),
CONSTRAINT "ENCABEZADO_ORDEN_PK" PRIMARY KEY ("ID_ENCABEZADO")
USING INDEX ENABLE
)
CREATE TABLE "DETALLE_ORDEN"
("ID_DETALLE" NUMBER(10,0),
"PRODUCTO" VARCHAR2(50),
"PRECIO_UNITARIO" NUMBER(10,2),
"CANTIDAD" NUMBER(10,0),
"SUBTOTAL" NUMBER(10,2),
"ID_ENCABEZADO" NUMBER(10,0),
CONSTRAINT "DETALLE_ORDEN_PK" PRIMARY KEY ("ID_DETALLE")
USING INDEX ENABLE
)
/
ALTER TABLE "DETALLE_ORDEN" ADD CONSTRAINT "DETALLE_ORDEN_FK" FOREIGN KEY ("ID_ENCABEZADO")
REFERENCES "ENCABEZADO_ORDEN" ("ID_ENCABEZADO") ENABLE
/
trigger:
create or replace TRIGGER "CALCULAR_TOTAL_GENERAL"
BEFORE INSERT OR UPDATE ON "DETALLE_ORDEN"
FOR EACH ROW
DECLARE
V_ID_ENCABEZADO NUMBER(10,0);
BEGIN
SELECT "ID_ENCABEZADO"
INTO V_ID_ENCABEZADO
FROM "ENCABEZADO_ORDEN"
WHERE "ID_ENCABEZADO" = :NEW."ID_ENCABEZADO";
UPDATE "ENCABEZADO_ORDEN"
SET "TOTAL_GENERAL" = (SELECT SUM("SUBTOTAL") FROM "DETALLE_ORDEN"
WHERE "ID_ENCABEZADO" = V_ID_ENCABEZADO)
WHERE "ID_ENCABEZADO" = V_ID_ENCABEZADO;
END;
This is the error message I get when I insert or update the table "DETALLE_ORDEN":
1 error has occurred
ORA-04091: table CARLOSM.DETALLE_ORDEN is mutating, trigger/function may not see it
ORA-06512: at "CARLOSM.CALCULAR_TOTAL_GENERAL", line 9
ORA-04088: error during execution of trigger 'CARLOSM.CALCULAR_TOTAL_GENERAL'
Don't use triggers for this kind of logic (for that matter, don't use triggers ever; there's almost always a better way). Also, avoid storing redundant information in base tables whenever possible.
A far better design, with minimal impact on existing code, is to
1) rename table "ENCABEZADO_ORDEN" (i.e. to "ENCABEZADO_ORDEN_TAB"), 2) disable/drop the "TOTAL_GENERAL" column, and then 3) create a view with the original name "ENCABEZADO_ORDEN" as:
CREATE OR REPLACE VIEW ENCABEZADO_ORDEN AS
SELECT O.*, (SELECT SUM(D.SUBTOTAL) FROM DETALLE_ORDEN D
WHERE D.ID_ENCABEZADO = O.ID_ENCABEZADO) TOTAL_GENERAL
FROM ENCABEZADO_ORDEN_TAB O;
This will ensure TOTAL_GENERAL is always correct (in fact, any attempt to set it directly to some other value via an update of ENCABEZADO_ORDEN will result in an immediate error).
If performance is an issue (i.e. users frequently query the TOTAL_GENERAL field in ENCABEZADO_ORDEN for orders with large numbers of detail records in DETALLE_ORDEN, causing Oracle to repeatedly fetch and sum multitudes of SUBTOTAL values), then use a materialized view instead of a basic view.
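A rough sketch of the materialized-view variant (the log and MV names are arbitrary; fast refresh on commit has prerequisites such as the materialized view log and the extra COUNT columns shown):
CREATE MATERIALIZED VIEW LOG ON DETALLE_ORDEN
  WITH ROWID (ID_ENCABEZADO, SUBTOTAL) INCLUDING NEW VALUES;

CREATE MATERIALIZED VIEW TOTALES_ORDEN_MV
  REFRESH FAST ON COMMIT
AS
SELECT ID_ENCABEZADO,
       SUM(SUBTOTAL)   AS TOTAL_GENERAL,
       COUNT(SUBTOTAL) AS CNT_SUBTOTAL,  -- needed for fast refresh of SUM
       COUNT(*)        AS CNT            -- needed for fast refresh of aggregate MVs
  FROM DETALLE_ORDEN
 GROUP BY ID_ENCABEZADO;
The ENCABEZADO_ORDEN view would then join ENCABEZADO_ORDEN_TAB to TOTALES_ORDEN_MV instead of computing the scalar subquery for every row.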

How to use SQL trigger to record the affected column's row number

I want to have an 'updateinfo' table in order to record every update/insert/delete operation on another table.
In Oracle I've written this:
CREATE TABLE updateinfo ( rnumber NUMBER(10), tablename VARCHAR2(100 BYTE), action VARCHAR2(100 BYTE), UPDATE_DATE date )
DROP TRIGGER TRI_TABLE;
CREATE OR REPLACE TRIGGER TRI_TABLE
AFTER DELETE OR INSERT OR UPDATE
ON demo
REFERENCING NEW AS NEW OLD AS OLD
FOR EACH ROW
BEGIN
if inserting then
insert into updateinfo(rnumber,tablename,action,update_date ) values(rownum,'demo', 'insert',sysdate);
elsif updating then
insert into updateinfo(rnumber,tablename,action,update_date ) values(rownum,'demo', 'update',sysdate);
elsif deleting then
insert into updateinfo(rnumber,tablename,action,update_date ) values(rownum,'demo', 'delete',sysdate);
end if;
-- EXCEPTION
-- WHEN OTHERS THEN
-- Consider logging the error and then re-raise
-- RAISE;
END TRI_TABLE;
but when checking updateinfo, the rnumber column is all zeros.
Is there any way to retrieve the correct row number?
The only option is to use the primary key column of your "demo" table (see the sketch below).
ROWNUM is not what you are looking for, read the explanation.
ROWID looks like a solution, but in fact it isn't, because it shouldn't be stored for a later use.
ROWNUM is not what you think it is. ROWNUM is a counter that has only a meaning within the context of one execution of a statement (i.e. the first resulting row always has rownum=1 etc.). I guess you are looking for ROWID, which identifies a row.
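As a sketch of that approach, assuming the demo table has a primary key column named ID (adjust to your actual key):
CREATE OR REPLACE TRIGGER tri_table
AFTER DELETE OR INSERT OR UPDATE ON demo
FOR EACH ROW
BEGIN
  IF INSERTING THEN
    INSERT INTO updateinfo (rnumber, tablename, action, update_date)
    VALUES (:NEW.id, 'demo', 'insert', SYSDATE);
  ELSIF UPDATING THEN
    INSERT INTO updateinfo (rnumber, tablename, action, update_date)
    VALUES (:NEW.id, 'demo', 'update', SYSDATE);
  ELSIF DELETING THEN
    -- on DELETE only the :OLD values are populated
    INSERT INTO updateinfo (rnumber, tablename, action, update_date)
    VALUES (:OLD.id, 'demo', 'delete', SYSDATE);
  END IF;
END tri_table;
/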
