Is there a way to tell if Oracle attempted an insert on a table from database logs?

We have a java application deployed on a production WebSphere server. The code is supposed to insert a row into a table, but it does not. I see no error messages in the application server logs. It is as if no attempt was made to insert the row. The same code deployed in a test environment does insert the row.
I would like to know if Oracle attempted to insert a row and then rolled it back for some reason. I am not familiar with Oracle at all. Is there a way to tell by looking at the database logs if an insert statement was executed on the table?
We are using Oracle 10.
Thanks

You can use a DML BEFORE INSERT trigger. The trigger will fire each time a row is inserted into the given table.
CREATE OR REPLACE TRIGGER t_log_insert
  BEFORE INSERT ON table_name
  FOR EACH ROW
  ENABLE
BEGIN
  -- write your logic here
  DBMS_OUTPUT.PUT_LINE('You just inserted a row');
END;
/
You can read more about triggers here.
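One caveat: DBMS_OUTPUT is only visible in a client session with server output enabled, so it won't tell you anything from a WebSphere application. Since the question is whether an insert was attempted and then rolled back, a trigger that logs via an autonomous transaction is more useful, because its log row is committed independently and survives the application's rollback. A minimal sketch, assuming a hypothetical log table (all names here are placeholders):

```sql
-- Hypothetical log table to record insert attempts.
CREATE TABLE insert_attempt_log (
  logged_at   TIMESTAMP    DEFAULT SYSTIMESTAMP,
  logged_user VARCHAR2(30) DEFAULT USER
);

CREATE OR REPLACE TRIGGER t_log_insert_attempt
  BEFORE INSERT ON table_name
  FOR EACH ROW
DECLARE
  PRAGMA AUTONOMOUS_TRANSACTION;  -- commits independently of the caller
BEGIN
  INSERT INTO insert_attempt_log (logged_at, logged_user)
  VALUES (SYSTIMESTAMP, USER);
  COMMIT;  -- this commit survives even if the application rolls back
END;
/
```

If a row appears in `insert_attempt_log` but not in the target table, you know the application issued the insert and the transaction was rolled back afterwards.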

Related

Oracle - When does the trigger after insert run?

I have a table in my Oracle database. The table is used as a messaging queue. Sender process writes to it and the receiver process reads from it. I want to update the inserted messages under certain conditions before receiver reads it. If I set a trigger to "UPDATE ROW AFTER INSERT", when exactly will Oracle run it? Will Oracle handle the trigger as the first thing after insert? To be clear, will Oracle run the trigger before the receiver reads the inserted message?
If you want to change the data that is about to be inserted, do it in a BEFORE ROW trigger. For an explanation of the difference between before and after triggers, see this question:
difference before and after trigger in oracle
The reader will only be able to see any data after the sender process has committed the changes.
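As a minimal sketch of that before-row approach (the queue table and `priority` column here are hypothetical, not from the question), the trigger modifies `:NEW` in place so the reader can never observe the unmodified row:

```sql
-- Hypothetical messaging-queue table with a priority column.
CREATE OR REPLACE TRIGGER trg_messages_before_ins
  BEFORE INSERT ON messages
  FOR EACH ROW
BEGIN
  -- Adjust the row before it is written; whatever the reader
  -- later selects already includes this change.
  IF :NEW.priority IS NULL THEN
    :NEW.priority := 5;
  END IF;
END;
/
```

Note that an AFTER row trigger cannot change `:NEW`, which is why the condition must be applied in the BEFORE trigger.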

rolling back ddl changes on oracle using flyway

My question: Is there anyway to disable auto-commit of DDL statements in Oracle DB?
Context:
I'm using Flyway 4 to maintain the state of my Oracle DB. As they say in their faq page, they can't rollback DDL changes in Oracle because DDL is autocommitted in this DB.
For instance, I am moving a column from a table to another table (copying existing values). So I would like to have in the same sql file an ALTER TABLE ADD, then an UPDATE, then an ALTER TABLE DROP. I happened to get an error on the DROP statement but the columns that were added by the first ALTER TABLE remained in place. I would like to be able to rollback that change too.
One work around I am using is adding a separate sql file for each DDL statement. But this is ugly. Any other way of doing this?
Any other way of doing this?
As Flyway simply runs the migration files, you can put anonymous PL/SQL blocks in them. With some effort these can then be used to manage errors via PL/SQL exceptions and undo all previous steps. If you are manipulating multiple columns, each column may need to be in a separate file, or all columns would need to be rolled back together.
Pseudocode for a single column:
declare
  l_step pls_integer := 0;
begin
  -- step 1: add the new column (t, old_col, new_col are placeholders)
  execute immediate 'alter table t add (new_col varchar2(100))';
  l_step := 1;
  -- step 2: copy the existing data across
  execute immediate 'update t set new_col = old_col';
  l_step := 2;
  -- step 3: drop the old column
  execute immediate 'alter table t drop column old_col';
exception
  when others then
    dbms_output.put_line('Failed at step ' || l_step);
    if l_step >= 1 then
      -- undo: drop the half-migrated new column so the table returns
      -- to its starting state (a failed DROP COLUMN itself cannot be
      -- rolled back, but old_col is still intact at this point)
      execute immediate 'alter table t drop column new_col';
    end if;
    raise;
end;
/
As you can see, this is a bit of effort. It may be worth explaining to your colleagues why you wish to avoid multiple SQL files, and checking whether this only affects the database.

Dynamically read the columns of the :NEW object in an oracle trigger

I have an oracle trigger that needs to copy values from the updated table to another table.
The problem is that the columns aren't known when the trigger is created. Part of this system allows the table schema to be updated by the application. (don't ask).
Essentially what I want to do is pivot the table to another table.
I have a stored procedure that will do the pivot, but I can't call it as part of the trigger because it does a select on the table being updated. Causing a "mutating" error.
What would be ideal would be to create a dynamic scripts that reads all the column names from user_tab_cols for the updated table, and reads the value from the :new object.
But of course...I can't :)
:NEW doesn't exist at the point the dynamic script is executed. So something like the following would fail:
EXECUTE IMMEDIATE 'insert into pivotTable values (:NEW.' || variableWithColumnName || ')';
So, I'm stuck.
I can't read from the table that was updated, and I can't read the value that was updated from the :NEW object.
Is there anyway to accomplish this other than rebuilding the trigger each time the schema is changed?
No. You'll need to rebuild the trigger whenever the table changes.
If you want to get really involved, you could write a procedure that dynamically generates the DDL to CREATE OR REPLACE the trigger by reading user_tab_columns. You could then create a DDL trigger that fires when the table is altered and submits a job via dbms_job that calls the procedure to recreate the trigger. That works, but it's a rather large number of moving parts, which means it can fail in all sorts of subtle and spectacular ways, particularly if the application that is making schema changes on the fly decides to add columns in the middle of the day.
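The generator procedure itself might look something like the sketch below. This is only an outline under assumptions: `pivotTable` is taken from the question, while the procedure name, the generated trigger name, and the use of an AFTER INSERT OR UPDATE row trigger are illustrative choices, and the generated column list must match what `pivotTable` actually expects.

```sql
-- Sketch: rebuild the pivot trigger from the data dictionary.
CREATE OR REPLACE PROCEDURE rebuild_pivot_trigger(p_table IN VARCHAR2) AS
  l_cols VARCHAR2(32767);
  l_ddl  VARCHAR2(32767);
BEGIN
  -- Collect the current column list for the table.
  FOR c IN (SELECT column_name
              FROM user_tab_columns
             WHERE table_name = UPPER(p_table)
             ORDER BY column_id)
  LOOP
    l_cols := l_cols || ', :NEW.' || c.column_name;
  END LOOP;

  -- Generate a trigger whose body references :NEW statically,
  -- which avoids the EXECUTE IMMEDIATE / :NEW problem entirely.
  l_ddl := 'CREATE OR REPLACE TRIGGER trg_' || p_table || '_pivot'
        || ' AFTER INSERT OR UPDATE ON ' || p_table
        || ' FOR EACH ROW'
        || ' BEGIN'
        || '   INSERT INTO pivotTable VALUES (' || LTRIM(l_cols, ', ') || ');'
        || ' END;';

  EXECUTE IMMEDIATE l_ddl;
END;
/
```

The key point is that `:NEW` is only resolvable inside a compiled trigger body, so the dynamic SQL has to generate the whole trigger rather than run inside it.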

Oracle audit trigger code used for multiple tables with different table names

I have a requirement to populate an audit column with current timestamp only if there are any updates to the table. Here is the trigger. Trigger works fine
create or replace TRIGGER test.Audit_Trigger
  BEFORE UPDATE ON test.TEST_TABLE
  FOR EACH ROW
BEGIN
  :NEW.column_dtm := current_timestamp;
END;
/
Instead of adding same trigger for every table (around 1000 tables means 1000 triggers) with only change in table name, is there any other better way to accomplish this task?
It would be nice if you could write a schema level trigger to do this, but unfortunately Oracle only supports schema level triggers for DDL, not for DML.
You could generate triggers on each table quite easily using dynamic SQL, but assuming your DB version is reasonably recent (9i or later I think), a better alternative might be to talk to your DBA about turning on fine grained auditing for table updates.
https://docs.oracle.com/cd/B19306_01/server.102/b14200/statements_4007.htm
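If you do go the dynamic SQL route, a sketch of the generator follows. This is an assumption-laden outline: it presumes every audited table has the same audit column (`column_dtm`, from the question), and the `trg_aud_` prefix plus the SUBSTR are illustrative choices to keep generated names within Oracle's 30-character identifier limit.

```sql
-- Sketch: generate one audit trigger per table that has the
-- audit column, instead of hand-writing ~1000 triggers.
BEGIN
  FOR t IN (SELECT table_name
              FROM user_tab_columns
             WHERE column_name = 'COLUMN_DTM')
  LOOP
    EXECUTE IMMEDIATE
         'CREATE OR REPLACE TRIGGER trg_aud_' || SUBSTR(t.table_name, 1, 22)
      || ' BEFORE UPDATE ON ' || t.table_name
      || ' FOR EACH ROW'
      || ' BEGIN :NEW.column_dtm := CURRENT_TIMESTAMP; END;';
  END LOOP;
END;
/
```

Note that truncating table names to 22 characters could produce duplicate trigger names for similarly named tables, so a real version would need a smarter naming scheme.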

INSERT trigger for inserting record in same table

I have a trigger that fires on inserting a new record into a table; in the trigger I want to insert a new record into the same table.
My trigger is:
create or replace trigger inst_table
  after insert on test_table
  referencing new as new old as old
  for each row
declare
  df_name varchar2(500);
  df_desc varchar2(2000);
begin
  df_name := :new.name;
  df_desc := :new.description;
  if inserting then
    for item in (select pid from tbl2 where pid not in (1))
    loop
      insert into test_table (name, description, pid)
      values (df_name, df_desc, item.pid);
    end loop;
  end if;
end;
/
It gives an error like:
ORA-04091: table TEST_TABLE is mutating, trigger/function may not see it
I think it is preventing me from inserting into the same table.
So how can I insert this new record into the same table?
Note :- I am using Oracle as database
Mutation happens any time you have a row-level trigger that modifies the table you're triggering on. The problem is that Oracle can't know how to behave: you insert a row, the trigger itself inserts a row into the same table, and Oracle gets confused. Are the inserts made by the trigger subject to the trigger action too?
The solution is a three-step process.
1.) Statement level before trigger that instantiates a package that will keep track of the rows being inserted.
2.) Row-level before or after trigger that saves that row info into the package variables that were instantiated in the previous step.
3.) Statement level after trigger that inserts into the table, all the rows that are saved in the package variable.
An example of this can be found here:
http://asktom.oracle.com/pls/asktom/ASKTOM.download_file?p_file=6551198119097816936
Hope that helps.
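A minimal sketch of that three-step approach, using the table and column names from the question. The package name, trigger names, and the `g_busy` guard flag are illustrative additions; the guard is needed because the after-statement trigger's own inserts re-fire all three triggers, which would otherwise recurse forever.

```sql
-- Step 0: package state shared by the three triggers.
CREATE OR REPLACE PACKAGE inst_state AS
  TYPE t_row  IS RECORD (name VARCHAR2(500), description VARCHAR2(2000));
  TYPE t_rows IS TABLE OF t_row INDEX BY PLS_INTEGER;
  g_rows t_rows;
  g_busy BOOLEAN := FALSE;  -- guards against the triggers re-firing themselves
END inst_state;
/

-- Step 1: statement-level before trigger resets the saved rows.
CREATE OR REPLACE TRIGGER inst_before_stmt
  BEFORE INSERT ON test_table
BEGIN
  IF NOT inst_state.g_busy THEN
    inst_state.g_rows.DELETE;
  END IF;
END;
/

-- Step 2: row-level trigger saves each inserted row (no table access,
-- so no mutating-table error).
CREATE OR REPLACE TRIGGER inst_each_row
  AFTER INSERT ON test_table
  FOR EACH ROW
DECLARE
  i PLS_INTEGER;
BEGIN
  IF NOT inst_state.g_busy THEN
    i := inst_state.g_rows.COUNT + 1;
    inst_state.g_rows(i).name        := :NEW.name;
    inst_state.g_rows(i).description := :NEW.description;
  END IF;
END;
/

-- Step 3: statement-level after trigger performs the extra inserts.
CREATE OR REPLACE TRIGGER inst_after_stmt
  AFTER INSERT ON test_table
BEGIN
  IF inst_state.g_busy THEN
    RETURN;  -- these are our own inserts re-firing the trigger
  END IF;
  inst_state.g_busy := TRUE;
  FOR i IN 1 .. inst_state.g_rows.COUNT LOOP
    INSERT INTO test_table (name, description, pid)
    SELECT inst_state.g_rows(i).name, inst_state.g_rows(i).description, pid
      FROM tbl2
     WHERE pid NOT IN (1);
  END LOOP;
  inst_state.g_rows.DELETE;
  inst_state.g_busy := FALSE;
END;
/
```

One limitation of this sketch: if an insert inside the after-statement trigger fails, `g_busy` stays TRUE for the rest of the session, so a production version would reset it in an exception handler.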
I'd say that you should look at any way OTHER than triggers to achieve this. As mentioned in the answer from Mark Bobak, the trigger inserts rows, and each row inserted by the trigger would in turn fire the trigger to insert more rows.
I'd look at either writing a stored procedure to perform the inserts, or just inserting via a sub-query rather than by values.
Triggers can be used to solve simple problems but when solving more complicated problems they will just cause headaches.
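To illustrate the sub-query suggestion with the names from the question (the literal name and description values are placeholders for whatever the application would supply): instead of inserting one row and fanning it out in a trigger, insert all the rows in one statement.

```sql
-- One insert produces the original row's copies for every pid in tbl2,
-- with no trigger and therefore no mutating-table problem.
INSERT INTO test_table (name, description, pid)
SELECT 'some name', 'some description', t.pid
  FROM tbl2 t
 WHERE t.pid NOT IN (1);
```

This keeps the fan-out logic in the application (or a stored procedure), where it is visible and testable, rather than hidden in a trigger.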
It would be worth reading through the answers to this duplicate question posted by APC, and also this article from Tom Kyte. BTW, the article is also referenced in the duplicate question, but the link is now out of date.
Although after complaining about how bad triggers are, here is another solution.
Maybe you need to look at having two tables. Insert the data into the test_table table as you currently do. But instead of having the trigger insert additional rows into the test_table table, have a detail table with the data. The trigger can then insert all the required rows into the detail table.
You may again encounter the mutating trigger error if you have a delete cascade foreign key relationship between the two tables so it might be best to avoid that.