Run a Windows command from an Oracle database trigger

I have an Oracle Database (11g) trigger that runs after an insert on a table, and I need this trigger to run an external program through a Windows command like this:
c:\my_external_apps\app1.exe arg1 arg2 arg3
I am trying this code, but it doesn't work:
CREATE OR REPLACE TRIGGER GE_MAIN_NOTIFICATION_SEND
AFTER INSERT ON TABLE
REFERENCING OLD AS OLD NEW AS NEW
FOR EACH ROW
BEGIN
  SYS.DBMS_SCHEDULER.create_program(
    program_name   => 'UPLOADNC',
    program_type   => 'EXECUTABLE',
    program_action => 'C:\WINDOWS\SYSTEM32\CMD.exe /C c:\my_external_apps\app1.exe arg1 arg2 arg3 ',
    enabled        => TRUE);
END;
and this is the error:
ORA-04088: error during execution of trigger 'DURRA.GE_MAIN_NOTIFICATION_SEND'
27486. 00000 - "insufficient privileges"
*Cause: An attempt was made to perform a scheduler operation without the
required privileges.
*Action: Ask a sufficiently privileged user to perform the requested
operation, or grant the required privileges to the proper user(s).
How can I do that? I am a beginner with Oracle Database.

The error is telling you that you haven't been granted the privileges necessary to call dbms_scheduler.create_program. I expect that you are missing the create job privilege.
However, if you resolve that problem, your next problem will be that dbms_scheduler.create_program does an implicit commit and commits are not allowed inside triggers. That means that you cannot call dbms_scheduler.create_program from a trigger (unless you made the trigger an autonomous transaction which would create a separate set of issues). The right way to solve the problem would almost certainly be to use the older dbms_job package. Since that package doesn't implicitly commit, you can submit a job as part of a larger transaction.
Of course, if you're using the dbms_job package to do your job scheduling, you lose out on the ability of dbms_scheduler to call out to the operating system. Instead, you'd need to do something like creating a Java stored procedure that calls out to the operating system. There are multiple examples of this on the web, including one from Tom Kyte.
So, at a high level, your trigger would call dbms_job.submit to submit the job. The job would then call your Java stored procedure. Your Java stored procedure would make the actual call out to the operating system of the database server.
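A minimal sketch of that shape, assuming a hypothetical Java stored procedure already sits behind a PL/SQL wrapper called run_os_command and that the table is named my_table (adjust both to your schema; the Java class itself is not shown):

-- Hypothetical call spec for a Java stored procedure that executes an
-- OS command; the Java class (OSCommand) must be created separately.
CREATE OR REPLACE PROCEDURE run_os_command (p_cmd IN VARCHAR2)
AS LANGUAGE JAVA
NAME 'OSCommand.run(java.lang.String)';
/

-- The trigger only submits a job; dbms_job does not commit, so this is
-- legal inside a trigger. The job runs after the transaction commits.
CREATE OR REPLACE TRIGGER ge_main_notification_send
AFTER INSERT ON my_table
FOR EACH ROW
DECLARE
  l_job BINARY_INTEGER;
BEGIN
  DBMS_JOB.SUBMIT(
    job  => l_job,
    what => 'run_os_command(''c:\my_external_apps\app1.exe arg1 arg2 arg3'');');
END;
/

Because dbms_job is transactional, a rollback of the insert also rolls back the submitted job, which is usually exactly what you want from a trigger.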

Related

Submitted Oracle job using dbms_job.submit and it failed but I don't know where to look for an error message

We are initiating the rebuilding of many materialized views by using dbms_job.submit to execute a stored procedure that performs the rebuilding. However, I am having trouble figuring out how to determine whether a submitted job failed. The job is failing, but I cannot identify what the issue is. So I am trying to start out with a simple example, which is probably failing on a permission issue, but I don't know where to look for the error message.
I have the following test procedure that I want to initiate using dbms_job.submit.
CREATE OR REPLACE PROCEDURE MYLANID.JUNKPROC
AS
  lv_msg VARCHAR2(3000);
BEGIN
  INSERT INTO MYLANID.junk_log ( msg ) VALUES ('Hello World');
  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    lv_msg := SUBSTR(sqlerrm, 1, 3000);
    INSERT INTO MYLANID.junk_log ( msg ) VALUES (lv_msg);
END;
/
Note that the procedure uses this table:
CREATE TABLE MYLANID.JUNK_LOG (
EVENT_TIME TIMESTAMP(6) DEFAULT systimestamp,
MSG VARCHAR2(3000 BYTE))
To submit the above procedure as a job, I execute the following anonymous block.
declare l_jobid binary_integer;
BEGIN
dbms_job.submit(job => l_jobid, what => 'BEGIN MYLANID.JUNKPROC; END;');
DBMS_OUTPUT.PUT_LINE('l_jobid:' || l_jobid);
commit;
END;
I then execute the following SQL...
select * from all_jobs;
...to see one record that represents my submitted job. When I re-query the all_jobs view, I see that this record quickly disappears from the view within a few seconds, presumably when the job completes. All is happy so far. I would like to use the presence of a record in the all_jobs view to determine whether a submitted job is running or has failed. I expect to be able to tell if it failed by looking at the ALL_JOBS.FAILURES column having a non null value > 0.
The problem, probably a permission issue, begins when I switch all of the occurrences of "MYLANID" in the above SQL to "ANOTHERSCHEMA", another schema that I also have access to. For example, I create the following
Table: ANOTHERSCHEMA.JUNK_LOG
Procedure: ANOTHERSCHEMA.JUNKPROC
I am even able to execute the stored procedure successfully in a query window while logged in as MYLANID:
EXEC ANOTHERSCHEMA.JUNKPROC
However, if I execute the following code to submit a job that involves running the same ANOTHERSCHEMA procedure but by submitting it as a JOB...
declare l_jobid binary_integer;
BEGIN
dbms_job.submit(job => l_jobid, what => 'BEGIN ANOTHERSCHEMA.JUNKPROC; END;');
DBMS_OUTPUT.PUT_LINE('l_jobid:' || l_jobid);
commit;
END;
...then, when I query the jobs ALL_JOBS view...
select * from all_jobs;
...I see that the job has a positive value in the FAILURES column, and I have no record of what the error was. This FAILURES count continues to gradually increment over time as Oracle presumably retries (up to 16 times?) until the job is marked BROKEN in the ALL_JOBS view.
But this is just a simple example, and I don't know where to look for the error message that would tell me why the job using the ANOTHERSCHEMA references failed.
Where do I look for the error log of failed jobs? I'm wondering if this will be somewhere only the DBA can see...
Update:
The above is just a simple test example. In my actual real world situation, my log shows that the job was submitted but I never see anything in USER_JOBS or even DBA_JOBS, which should show everything. I don't understand why the dbms_job.submit procedure would return the job number of the submitted job indicating that it was submitted but no job is visible in the DBA_JOBS view! The job that I did submit should have taken a long time to run, so I don't expect that it completed faster than I could notice.
First off, you probably shouldn't be using dbms_job. That package has been superseded for some time by the dbms_scheduler package which is significantly more powerful and flexible. If you are using Oracle 19c or later, Oracle automatically migrates dbms_job jobs to dbms_scheduler.
If you are using an Oracle version prior to 19c and a dbms_job job fails, the error information is written to the database alert log. That tends to be a bit of a pain to query from SQL particularly if you're not a DBA. You can define an external table that reads the alert log to make it queryable. Assuming you're on 11g, there is a view, x$dbgalertext, that presents the alert log information in a way that you can query it but DBAs generally aren't going to rush to give users permission on x$ tables.
If you use dbms_scheduler instead (or if you are on 19c or later and your dbms_job jobs get converted to dbms_scheduler jobs), errors are written to dba_scheduler_job_run_details. dbms_scheduler in general gives you a lot more logging information than dbms_job does so you can see things like the history of successful runs without needing to add a bunch of instrumentation code to your procedures.
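For example, with dbms_scheduler you can pull the status and error text of recent runs straight from the data dictionary; the job name below is only a placeholder:

-- USER_/ALL_SCHEDULER_JOB_RUN_DETAILS are available if you cannot see the DBA view.
SELECT log_date,
       job_name,
       status,
       error#,
       additional_info
FROM   dba_scheduler_job_run_details
WHERE  job_name = 'MY_REFRESH_JOB'
ORDER  BY log_date DESC;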

Display comments to user while executing Oracle procedure

Is it possible to display comments to the user while executing a procedure in a package? My package has 3 procedures, and I am calling each one after the other. I want to display comments on the console like "procedure xyz is executing" and "procedure executed successfully". I added DBMS_OUTPUT.PUT_LINE('PROCEDURE EXECUTED SUCCESSFULLY') inside the procedure, but it didn't work for me.
FYI, I am using Oracle 11g on a Windows 7 system.
You can't use DBMS_OUTPUT to display information on a procedure while it is running. This is because DBMS_OUTPUT.put_line doesn't display data on screen, rather, the data is put in a queue that is later read by the calling client (This queue is also invisible outside of its transaction). If you use SQL*Plus the queue is read and displayed automatically at the end of the procedure if you have SET SERVEROUTPUT ON.
Other means exist to follow the progress of a procedure while it is running:
You could write to a file instead. UTL_FILE.put_line will write directly if the parameter autoflush is set to true.
You could set session variables with DBMS_APPLICATION_INFO. These variables can be read with another session by querying v$session.
You could use AUTONOMOUS_TRANSACTIONS to log progress information in a dedicated table. This table can be queried by another session simultaneously.
As you can see you would need another process to read the information while it is written. In some applications, this would be achieved by running the main batch job in a new separate process, for example by calling DBMS_JOB or DBMS_SCHEDULER while the calling transaction loops on the progress table or file until the job is complete.
SQL*Plus is not an interactive client in this sense; you will need a somewhat more sophisticated environment to achieve this functionality.
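As an illustration of the DBMS_APPLICATION_INFO option, here is a minimal sketch; the module name, loop, and the work inside it are hypothetical:

-- Inside the long-running PL/SQL: publish progress into V$SESSION.
BEGIN
  DBMS_APPLICATION_INFO.SET_MODULE(module_name => 'MY_BATCH',
                                   action_name => 'starting');
  FOR i IN 1 .. 3 LOOP
    DBMS_APPLICATION_INFO.SET_ACTION('running procedure ' || i);
    -- ... call procedure number i here ...
  END LOOP;
  DBMS_APPLICATION_INFO.SET_ACTION('completed');
END;
/

-- From a second session, watch the progress:
SELECT sid, module, action
FROM   v$session
WHERE  module = 'MY_BATCH';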

DDL Statements in DBMS_JOB

I am trying to schedule a job using DBMS_JOB (I can't use DBMS_SCHEDULER for security reasons), which uses a DDL statement.
DECLARE
job_num NUMBER;
BEGIN
DBMS_JOB.SUBMIT(job => job_num,
what => 'BEGIN EXECUTE IMMEDIATE ''CREATE TABLE temp1 (ID NUMBER)''; END;'
);
DBMS_OUTPUT.PUT_LINE('JobID'||job_num);
DBMS_JOB.RUN(job_num);
END;
/
It fails to execute, giving me this error message:
ORA-12011: execution of 1 jobs failed
ORA-06512: at "SYS.DBMS_IJOB", line 548
ORA-06512: at "SYS.DBMS_JOB", line 278
ORA-06512: at line 8
On removing the DBMS_JOB.RUN() statement from inside the anonymous block, I am able to at least create (and save) the job. When I check the job, it has saved this as the code to execute
BEGIN EXECUTE IMMEDIATE 'CREATE TABLE temp1 (id NUMBER) '; END;
If I execute it standalone, it obviously executes. The only time it fails is when I try to execute the entire thing through the call to DBMS_JOB.RUN().
Is there a restriction on using DDL statements as a parameter in DBMS_JOB? I can't find any pointer in documentation for this.
While echoing the sentiments of the other commenters (creating tables on the fly is a red flag that often indicates you really ought to be using global temporary tables), a couple of questions:
Is there a reason that you need the DBMS_JOB.RUN call? Your call to DBMS_JOB.SUBMIT is telling Oracle to run the job asynchronously as soon as the parent transaction commits. So, normally, you'd call DBMS_JOB.SUBMIT and then just COMMIT.
Does the user that is submitting the job have the CREATE TABLE privilege granted directly? My guess is that the user only has the CREATE TABLE privilege granted via a role. That would allow you to run the anonymous PL/SQL block interactively but not in a job. If so, you'll need the DBA to grant you the CREATE TABLE privilege directly, not via a role.
When a job fails, an entry is written to the alert log with the error message. Can you (or, more likely, the DBA) get the error message and the error stack from the alert log and post it here (assuming it is something other than the privileges issue from #2).
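For point 2, a quick way to see the difference between direct and role-based grants (privilege and user names below are only examples):

-- Privileges granted directly to you: usable inside jobs and definer's-rights PL/SQL.
SELECT privilege FROM user_sys_privs;

-- Privileges that only arrive through a role: NOT usable inside jobs.
SELECT role, privilege FROM role_sys_privs;

-- If CREATE TABLE appears only in the second query, ask the DBA for a direct grant:
-- GRANT CREATE TABLE TO your_user;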

Newbie question: using Oracle Scheduler to start a job based on an event

I have a table in an Oracle 11.2 database. I want the database to run an executable file on a remote server if a specific cell in table1 is updated to a value of 1 AND if the number of existing rows in table2 is > 0. I don't have much experience with what is possible in databases; is the following approach feasible?
1. Create a job using Oracle Scheduler. The job runs immediately, and is used to run an external executable program on a remote server. The job exists, but is not run until step 5 below (is this possible?).
http://docs.oracle.com/cd/E11882_01/server.112/e17120/schedadmin001.htm#BAJHIDDC
2. Attach a DML trigger to the column of the table that fires on an UPDATE statement.
http://docs.oracle.com/cd/E11882_01/appdev.112/e25519/triggers.htm#CIHEHBEB
3. Have the trigger invoke a PL/SQL subprogram.
http://docs.oracle.com/cd/E11882_01/appdev.112/e25519/triggers.htm#CIHEGACF
4. In the PL/SQL subprogram, perform the following business logic: if a specific cell in table1 equals 1, AND if the number of rows in table 2 is greater than 0, proceed to step 5, otherwise stop (exit, quit).
5. Run the job in step 1.
Or, if the job/scheduler is not made to provide this functionality, is there another way to achieve the same thing? That is, have a change in a database table trigger an external job.
UPDATE 1:
I wonder if it's possible to implement steps 1-5 above by just using Oracle Scheduler with DBMS_SCHEDULER.CREATE_JOB using parameter event_condition?
http://docs.oracle.com/cd/E11882_01/server.112/e25494/scheduse005.htm#CHDIAJEB
Here's an example from the above link:
BEGIN
DBMS_SCHEDULER.CREATE_JOB (
job_name => 'process_lowinv_j1',
program_name => 'process_lowinv_p1',
event_condition => 'tab.user_data.event_type = ''LOW_INVENTORY''',
queue_spec => 'inv_events_q, inv_agent1',
enabled => TRUE,
comments => 'Start an inventory replenishment job');
END;
The above code creates a job that starts when an application signals the Scheduler that inventory levels for an item have fallen to a low threshold level.
Could the above code somehow be modified to perform the intended steps? For example, could steps 2-4 above be eliminated by using event_condition here instead? etc. If so, what would it look like, for example, how to set the queue_spec?
Assuming that you install the Oracle Scheduler Agent on the remote server, DBMS_SCHEDULER can run an executable on the remote machine. I would have the DML trigger enqueue a message into an Oracle Advanced Queue (AQ) and use that queue to create an event-based job (DML triggers are not allowed to commit or rollback the transaction but running a DBMS_SCHEDULER job implicitly issues a commit so DML triggers cannot directly run a job). The event-based job and the job that runs the remote executable would be part of a job chain.
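A rough sketch of that shape is below; every object name is hypothetical, and the EXECUTABLE scheduler program that actually runs the remote command via the scheduler agent is assumed to be defined separately:

-- Payload type and queue for the event (names are illustrative).
CREATE OR REPLACE TYPE t_refresh_event AS OBJECT (event_type VARCHAR2(30));
/
BEGIN
  DBMS_AQADM.CREATE_QUEUE_TABLE(queue_table        => 'refresh_qt',
                                queue_payload_type => 't_refresh_event',
                                multiple_consumers => TRUE);
  DBMS_AQADM.CREATE_QUEUE(queue_name => 'refresh_q', queue_table => 'refresh_qt');
  DBMS_AQADM.START_QUEUE('refresh_q');
END;
/

-- Event-based job that fires when a matching message arrives on the queue.
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'run_remote_exe_j1',
    program_name    => 'run_remote_exe_p1',   -- an EXECUTABLE program defined elsewhere
    event_condition => 'tab.user_data.event_type = ''TABLE1_FLAG_SET''',
    queue_spec      => 'refresh_q',
    enabled         => TRUE);
END;
/

-- Trigger: enqueue only; enqueues are transactional, so no commit occurs here.
CREATE OR REPLACE TRIGGER table1_flag_trg
AFTER UPDATE OF flag_col ON table1
FOR EACH ROW
WHEN (NEW.flag_col = 1)
DECLARE
  l_opts  DBMS_AQ.ENQUEUE_OPTIONS_T;
  l_props DBMS_AQ.MESSAGE_PROPERTIES_T;
  l_msgid RAW(16);
  l_cnt   NUMBER;
BEGIN
  -- Business rule from the question: only signal when table2 has rows.
  SELECT COUNT(*) INTO l_cnt FROM table2;
  IF l_cnt > 0 THEN
    DBMS_AQ.ENQUEUE(queue_name         => 'refresh_q',
                    enqueue_options    => l_opts,
                    message_properties => l_props,
                    payload            => t_refresh_event('TABLE1_FLAG_SET'),
                    msgid              => l_msgid);
  END IF;
END;
/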

Privileges for an Oracle 9i statistics job

We want to move our automated statistics gathering from an external script into Oracle 9i's job scheduler. It's a very simple job, and the code basically looks like this:
DBMS_JOB.SUBMIT(
JOB => <output variable>,
WHAT => 'DBMS_STATS.GATHER_DATABASE_STATS(
cascade => TRUE, options => ''GATHER AUTO'');',
NEXT_DATE => <start date>,
INTERVAL => 'SYSDATE + 7');
The job gets created successfully and runs, but fails with the error:
ORA-12012: error on auto execute of job 25
ORA-20000: Insufficient privileges to analyze an object in Database
ORA-06512: at "SYS.DBMS_STATS", line 11015
...
The part I don't get is that the user I submitted the job under has the right permissions to gather those database statistics; if I run the command manually, it works. I was curious whether Oracle was ignoring any role-based privileges the user had, like it does when creating procedures, so I directly granted the user ANALYZE ANY, but still no dice.
Are there some other permissions I'd have to directly grant the user to make this work? I'd rather not have to make a separate job for each schema (which does work if I submit the job under the schema's owner).
What version of 9i are you on? I recall reading an AskTom thread about 9.2.0.1 having an issue and needing an extra grant select (I will look up the thread).
Also, since you are running database-wide stats rather than stats on a single schema, you may also need the ANALYZE ANY DICTIONARY privilege in addition to ANALYZE ANY.
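If it does come down to privileges, the grants would have to be made directly to the submitting user rather than through a role; the user name below is a placeholder:

-- Direct grants (not via a role), since jobs ignore role-based privileges.
GRANT ANALYZE ANY TO stats_user;
GRANT ANALYZE ANY DICTIONARY TO stats_user;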
