Execute Oracle script with SQL*Plus; go back to the state before script processing if any exceptions occur

I have a lot of scripts that contain all kinds of transactions, including complex DML and DDL.
I need to find a way to run them fully or not at all. I'd like to see the following behavior: if any error occurs in the middle of script processing, go back to the state before the script processing started.
I thought I would just put the whole script into one big transaction, put a savepoint at the beginning, and roll back to the savepoint in case of any exception, but AFAIK that's impossible, as Oracle does not support nested transactions.
Do you mind sharing your thoughts on this case?

I don't think there is an easy solution for this, because you have DDL in your script. DDL issues an implicit commit before it is processed, so a rollback will not help.
As an alternative, you could use Oracle's flashback feature, but this impacts the entire database. You create a flashback restore point, run the script, and if any errors occurred, you flash the database back to the restore point. This reverts all changes in all schemas of your database, so it is best suited to a separate database used for running/testing your scripts. It is rather fast. The database must be in ARCHIVELOG mode.
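A minimal sketch of that approach (the restore point name before_script is illustrative; flashing back requires a mounted instance and an OPEN RESETLOGS afterwards):

-- before running the script
CREATE RESTORE POINT before_script GUARANTEE FLASHBACK DATABASE;
-- ... run the script; if it fails:
SHUTDOWN IMMEDIATE
STARTUP MOUNT
FLASHBACK DATABASE TO RESTORE POINT before_script;
ALTER DATABASE OPEN RESETLOGS;
-- once you no longer need it
DROP RESTORE POINT before_script;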
Another option is to use the export/import utilities (expdp/impdp). This is also hard to automate in a single script, so you do the recovery manually: you take an export dump, run the script, and if any errors happened, you restore your database schemas from the dump with impdp.
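A hedged sketch of that workflow (credentials, schema, and file names are placeholders):

# take a backup dump of the schema before running the script
expdp myuser/mypassword schemas=MYSCHEMA directory=DATA_PUMP_DIR dumpfile=before_script.dmp
# run the script; if it fails, restore the schema from the dump
impdp myuser/mypassword schemas=MYSCHEMA directory=DATA_PUMP_DIR dumpfile=before_script.dmp table_exists_action=replace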

Perhaps what you need is the WHENEVER SQLERROR EXIT clause; check it out in the SQL*Plus documentation.
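For example, at the top of the script (a minimal sketch):

WHENEVER SQLERROR EXIT SQL.SQLCODE ROLLBACK
-- any statement that fails below this line aborts the script,
-- rolls back uncommitted DML, and returns the error code to the shell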

Related

Oracle package invalidated automatically

We have an Oracle 12.1 production database. One of the packages in the database abruptly became invalid, and all sessions reported that the state of that package had been invalidated. We are sure that no manual actions (deployment, DDL, compilation, etc.) were performed on the database. Is there any other way that a package can become invalid automatically?
This package directly references 3 remote database tables using a DB link.
If any dependency undergoes a DDL operation, it will invalidate stored PL/SQL programs that depend on it. It could be a table, a synonym, a view, another PL/SQL routine, etc. I suggest you look at dba_dependencies to see what the dependencies are for your package, then look at dba_objects for every one of those objects to see which has a recent last_ddl_time value. Don't forget the remote objects on the other side of that database link. When you find it, you are well on your way to finding the root cause. If you can't figure out what DDL is hitting that object, enable DDL auditing so you can capture the event the next time it happens.
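Something along these lines (MYSCHEMA and MY_PKG are placeholders):

-- list everything the package depends on
select referenced_owner, referenced_name, referenced_type, referenced_link_name
from   dba_dependencies
where  owner = 'MYSCHEMA' and name = 'MY_PKG';

-- then check those objects for recent DDL
select owner, object_name, object_type, last_ddl_time
from   dba_objects
where  (owner, object_name) in
       (select referenced_owner, referenced_name
        from   dba_dependencies
        where  owner = 'MYSCHEMA' and name = 'MY_PKG')
order by last_ddl_time desc;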
If you find that the offending object cannot avoid undergoing DDL for some reason, then you may need to consider breaking the dependency on it by embedding your reference to it inside an EXECUTE IMMEDIATE.
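A hedged sketch of that idea (procedure, table, and link names are made up): the procedure compiles with no dependency on the remote table, because the reference is only resolved at run time.

create or replace procedure refresh_counts as
  l_count number;
begin
  -- no compile-time dependency: the remote table is referenced dynamically
  execute immediate
    'select count(*) from remote_table@remote_db' into l_count;
  dbms_output.put_line('rows: ' || l_count);
end;
/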

Rollback after coming out of the session in Oracle SQL*Plus

I am writing a wrapper shell or Perl script which opens an Oracle session using sqlplus and then executes some SQL files by scanning a directory. So as part of this, let's say we have multiple SQL files in a directory,
e.g.: first.sql, second.sql, third.sql
I am planning to create a single file (AllSqlFilesInDirectory.sql) with the content below.
>cat AllSqlFilesInDirectory.sql
@first.sql
@second.sql
@third.sql
>
Now I am planning to run the file AllSqlFilesInDirectory.sql by opening an Oracle SQL*Plus session.
After executing it, I plan to come out of the SQL*Plus session and then search for any errors in the log file.
If there are any errors, I would like to execute a rollback. But I think that, as I am out of that SQL*Plus session, a rollback is not possible. I am just concerned about the DML statements that were executed as part of those multiple SQL files in the directory.
So I have these doubts:
Can I simply ignore this and not be concerned about rollback at all?
Can I do a rollback for a session which was already closed?
If the above is valid, then how can I do it?
Can I simply ignore this and not be concerned about rollback at all?
That's a business question you'd have to answer. If you want the changes to be rolled back when there is an error, you'd need to do a rollback.
Can I do a rollback for a session which was already closed?
As a practical matter, probably not. Technically, you can, using flashback transaction backout, but that is generally way more complexity than you'd normally want to deal with.
If the above is valid, then how can I do it?
Rather than writing to a log file and parsing the log file to determine whether there were any errors, it is most likely vastly easier to simply put a
whenever sqlerror exit rollback
at the top of your script. That tells SQL*Plus to roll back the transaction and exit whenever an error is encountered, so you don't have to write logic to parse the log file.
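A sketch of what the generated master script could look like, using the file names from the question:

-- AllSqlFilesInDirectory.sql: abort and roll back on the first error
WHENEVER SQLERROR EXIT FAILURE ROLLBACK
@first.sql
@second.sql
@third.sql
-- everything succeeded: make the changes permanent
COMMIT;
EXIT SUCCESS

The wrapper script then only needs to check the sqlplus exit code instead of parsing the log.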
Whenever sqlerror documentation

Tibco JDBC Update dry run

Is it possible to do a dry run in Tibco for the JDBC Update activities? Meaning that I want to run those activities but not actually update the database.
Even running in a test mode, if that's possible, would be good.
The only option I see is having a copy of the targeted tables in a separate schema: duplicate the data, and temporarily point the JDBC connection of your activity at this secondary, temporary/test database.
Since you can use global variables, no code is changed between test and delivery (a typical goal), and you can compare both DB tables to see if the update WOULD HAVE run well...
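For the comparison, a simple sketch (schema and table names are placeholders):

-- rows the dry run produced in the test copy that are absent from production
select * from test_schema.target_table
minus
select * from prod_schema.target_table;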
I think I found a way. I haven't tested it yet, but theoretically it should work.
The solution is to install P6Spy and to create a custom module that will throw an exception when trying to execute an INSERT/UPDATE.
You could wrap the activity in a transaction group and roll back whenever you only want to test the statement. Otherwise, just exit the group normally so the data gets committed.

How to copy a JOB, PROGRAM, SCHEDULE definition to run it in another Oracle 11g DB?

I need to copy some jobs, schedules, and programs from an Oracle 11g DB to another one, but when I try to look at the SQL of a job/schedule/program in SQL Developer 4.1.3, the SQL tab shows nothing in edit mode.
When I open a table and click on the SQL tab, the SQL to create the table is shown. I was expecting similar behavior for a job/schedule/program.
How can I copy a JOB/PROGRAM/SCHEDULE definition to run it in another Oracle 11g DB?
The easiest way for Jobs is to use DBMS_METADATA:
select dbms_metadata.get_ddl('PROCOBJ', job_name)
from user_scheduler_jobs;
I'm not 100% sure about schedules / programs, though.
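Since PROCOBJ covers scheduler objects in general, the same call may work for programs and schedules too; a hedged sketch:

-- get_ddl returns a CLOB, so widen SQL*Plus output first
set long 100000
set pagesize 0

select dbms_metadata.get_ddl('PROCOBJ', program_name)  from user_scheduler_programs;
select dbms_metadata.get_ddl('PROCOBJ', schedule_name) from user_scheduler_schedules;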
I think you can also use the expdp utility with include=PROCOBJ. All scheduler objects (jobs, programs, schedules) are of object type PROCOBJ, so you should get all the jobs owned by the schema you are exporting. (If you want the DDL, you can use impdp with sqlfile= to write the dump into a SQL file, or use the method suggested by Frank Schmitt.)
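A hedged sketch of that approach (credentials, directory, and file names are placeholders):

# export only the scheduler objects from the schema
expdp myuser/mypassword schemas=MYSCHEMA directory=DATA_PUMP_DIR dumpfile=procobj.dmp include=PROCOBJ
# write the DDL into a SQL file instead of importing it
impdp myuser/mypassword directory=DATA_PUMP_DIR dumpfile=procobj.dmp sqlfile=procobj.sql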
Unfortunately, whether you use this method or the one suggested by Frank Schmitt, I think you won't be able to get schedules owned by SYS.

Is there a way to enable procedure logging for an Oracle Scheduled Job?

I'm trying to enable logging on an Oracle Scheduled Job so that when I look at the run details of the job, I can discern what the job's procedure worked on and what it did. Currently, the procedure logs through dbms_output.put_line(), which is convenient for procedures in SQL*Plus, since you can see the output just by enabling set serveroutput on. However, I cannot figure out how to make this logging information show up in the job run details. Specifically, I'm looking at:
select additional_info from dba_scheduler_job_run_details;
Also, this data seems to be displayed in the run details within Enterprise Manager under the instances of the job.
So, is there a way, a setting or a different package, to have simple logging for an Oracle Scheduled Job?
You could add something at the end of the call that invokes dbms_output.get_lines, loads the data into a CLOB, and stores that somewhere for your perusal. You'd probably want something to call DBMS_OUTPUT.ENABLE at the start too.
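A minimal sketch of that capture step (the job_output_log table is hypothetical):

declare
  l_lines dbms_output.chararr;
  l_count integer := 32767;  -- upper bound on lines to fetch
  l_clob  clob;
begin
  dbms_output.get_lines(l_lines, l_count);  -- l_count returns the actual line count
  for i in 1 .. l_count loop
    l_clob := l_clob || l_lines(i) || chr(10);
  end loop;
  insert into job_output_log (captured_at, output) values (systimestamp, l_clob);
  commit;
end;
/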
Personally, I'd avoid DBMS_OUTPUT.PUT_LINE and write your own logging routine instead, either to a flat file or to a table (using autonomous transactions).
Autonomous transactions are the way to go!
You define a log table with the information you want to log and write a procedure that inserts into this table and commits, as an autonomous transaction. That way, you can see what's going on even while the job is still running.
AFAIK there is no other way to get the information while the job is still running; DBMS_OUTPUT will only display after the job is done.
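A minimal sketch of such a logger (table and procedure names are illustrative):

create table job_log (
  log_time timestamp default systimestamp,
  message  varchar2(4000)
);

create or replace procedure log_message(p_message in varchar2) as
  pragma autonomous_transaction;
begin
  insert into job_log (message) values (p_message);
  commit;  -- commits only this autonomous transaction, not the caller's work
end;
/

Calls to log_message from inside the job show up in job_log immediately, even while the job is still running.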
