Display comments to user while executing Oracle procedure

Is it possible to display messages to the user while executing a procedure in a package? My package has 3 procedures, and I call them one after the other. I want to display messages on the console like "procedure xyz is executing" and "procedure executed successfully". I added DBMS_OUTPUT.PUT_LINE('PROCEDURE EXECUTED SUCCESSFULLY') inside the procedures, but it didn't work for me.
FYI, I am using Oracle 11g on a Windows 7 system.

You can't use DBMS_OUTPUT to display information from a procedure while it is running. This is because DBMS_OUTPUT.PUT_LINE doesn't write data to the screen; instead, the data is put in a buffer that is read later by the calling client (this buffer is also invisible to other sessions). If you use SQL*Plus, the buffer is read and displayed automatically at the end of the procedure, provided you have SET SERVEROUTPUT ON.
Other means exist to follow the progress of a procedure while it is running:
You could write to a file instead. UTL_FILE.put_line will write to the file immediately if the parameter autoflush is set to true (a sketch follows at the end of this answer).
You could set session variables with DBMS_APPLICATION_INFO. These variables can be read from another session by querying v$session.
You could use autonomous transactions (PRAGMA AUTONOMOUS_TRANSACTION) to log progress information in a dedicated table. This table can be queried by another session at the same time.
As you can see, you would need another process to read the information while it is being written. In some applications this is achieved by running the main batch job in a separate process, for example via DBMS_JOB or DBMS_SCHEDULER, while the calling session loops on the progress table or file until the job is complete.
SQL*Plus is not an interactive client in this sense; you will need a more sophisticated environment to achieve this functionality.
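For illustration, here is a rough sketch of the UTL_FILE option; the DIRECTORY name APP_LOG_DIR and the file name are assumptions, and the executing user needs write privileges on that directory:
CREATE OR REPLACE PROCEDURE log_progress (p_message IN VARCHAR2)
IS
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  -- 'a' opens the file in append mode; APP_LOG_DIR is an assumed DIRECTORY object
  l_file := UTL_FILE.FOPEN('APP_LOG_DIR', 'batch_progress.log', 'a');
  -- autoflush => TRUE pushes the line to disk immediately, so it can be read
  -- (e.g. with tail -f on the database server) while the procedure is still running
  UTL_FILE.PUT_LINE(l_file, TO_CHAR(SYSDATE, 'HH24:MI:SS') || ' ' || p_message, autoflush => TRUE);
  UTL_FILE.FCLOSE(l_file);
END;
/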

Related

PL/SQL - retrieve output

Is there a way to retrieve output from PL/SQL continuously rather than waiting until the SP completes its execution? By continuously I mean, for example, as it executes each EXECUTE IMMEDIATE.
Any other mechanism to retrieve pl/sql output?
As per Oracle docs
Output that you create using PUT or PUT_LINE is buffered in the SGA. The output cannot be retrieved until the PL/SQL program unit from which it was buffered returns to its caller. So, for example, Enterprise Manager or SQL*Plus do not display DBMS_OUTPUT messages until the PL/SQL program completes.
As far as I know there is a way, but not with DBMS_OUTPUT.PUT_LINE. The technique I use is:
create a log table which will accept the values you'd normally display using DBMS_OUTPUT.PUT_LINE. The columns I use are:
ID (a sequence, to be able to sort data)
Date (to know what happened when; might not be enough for sorting purposes because operations that take very short time to finish might have the same timestamp)
Message (a VARCHAR2 column, large enough to accept the whole information)
create a logging procedure which inserts values into that table. It should be an autonomous transaction so that you can COMMIT within it (and access the data from other sessions) without affecting the main transaction (a sketch follows below)
Doing so, you'd
start your PL/SQL procedure
call the logging procedure whenever appropriate (basically, where you'd put the DBMS_OUTPUT.PUT_LINE call)
in another session, periodically query the log table as select * from log_table order by ID desc
Additionally, you could write a simple Apex application with one report page which selects from the logging table and refreshes periodically (for example, every 10 seconds or so), so you can watch the main PL/SQL procedure's progress.
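A minimal sketch of the logging table and procedure described above (the table, sequence and procedure names here are placeholders, not from the original answer):
create table log_table (
  id      number primary key,
  datum   timestamp default systimestamp,
  message varchar2(4000)
);
create sequence log_seq;

create or replace procedure p_log (p_message in varchar2) is
  pragma autonomous_transaction;
begin
  insert into log_table (id, message) values (log_seq.nextval, p_message);
  -- commits only this autonomous transaction; the caller's work is untouched
  commit;
end;
/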
The approach that Littlefoot has provided is what I normally use as well.
However, there is another approach that you can try for a specific use case. Let's say you have a long-running batch job (like a payroll process for example). You do not wish to be tied down in front of the screen monitoring the progress. But you want to know as soon as the processing of any of the rows of data hits an error so that you can take action or inform a relevant team. In this case, you could add code to send out emails with all the information from the database as soon as the processing of a row hits an error (or meets any condition you specify).
You can do this using the functions and procedures provided in the 'UTL_MAIL' package. UTL_MAIL Documentation from Oracle
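A hedged sketch of that idea, assuming UTL_MAIL is installed, the SMTP_OUT_SERVER parameter is configured, and your user has EXECUTE on UTL_MAIL; the addresses and wording are placeholders:
BEGIN
  -- ... process one row of the batch ...
  NULL;
EXCEPTION
  WHEN OTHERS THEN
    UTL_MAIL.SEND(
      sender     => 'batch@example.com',          -- placeholder
      recipients => 'payroll-team@example.com',   -- placeholder
      subject    => 'Payroll batch: row failed',
      message    => 'Row processing failed: ' || SQLERRM);
    RAISE;
END;
/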
For monitoring progress without the overhead of logging to tables and autonomous transactions, I use:
DBMS_APPLICATION_INFO.SET_CLIENT_INFO( TO_CHAR(SYSDATE, 'HH24:MI:SS') || ' On step A' );
and then monitor v$session.client_info for your session. It's all in memory and won't persist, of course, but it is a quick and easy, near-zero-cost way of posting progress.
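For example, from a second session (assuming you have SELECT access to V$SESSION):
SELECT sid, serial#, client_info
FROM   v$session
WHERE  client_info IS NOT NULL;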
Another option (on Linux/UNIX) that I like for persistent, centralised logging, which again avoids writing to the database, is interfacing to syslog and having Splunk or similar pick the messages up. If you have Splunk or similar, this makes the monitoring viewable without having to connect to and query the database directly. See this post for how to do this:
https://community.oracle.com/thread/2343125

Calling SQL*Plus commands in PL/SQL block

Is there any way to set SERVEROUTPUT on/off using PL/SQL procedures/packages? I want to make some changes to how my data is displayed on the SQL*Plus screen, as in my previous post.
You cannot call SQL*Plus commands (which run only on the client) from PL/SQL (which runs only on the server).
In the particular case where you simply want to enable and disable message output, however, you can call the PL/SQL procedures dbms_output.disable and dbms_output.enable.
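For example (passing NULL as the buffer size simply means an unlimited buffer):
BEGIN
  DBMS_OUTPUT.DISABLE;                       -- anything written with PUT/PUT_LINE is now discarded
  -- ... calls that would otherwise produce output ...
  DBMS_OUTPUT.ENABLE(buffer_size => NULL);   -- re-enable with an unlimited buffer
END;
/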
If you are depending on the data being written via dbms_output to be displayed to a human user, however, you are almost certainly doing something wrong. Production processes should be writing important data to some other location (i.e. a table somewhere), not writing to dbms_output and hoping that the client application happens to be configured to display the data.

Run SCRIPT from PL/SQL Block

How can I use "START script" in a PL/SQL block?
I want to use something like this
declare
begin
proc(para1,para2);
execute immediate 'start prompt1' ;
end;
/
Also, I want to know: can I get a value from prompt1 into the PL/SQL block from which I am calling the script? I need to use that value to perform some operations in the PL/SQL block.
It is 2017. Scripts are a clunky and brittle hangover from the last millennium. Oracle has a fantastic range of functionality we can execute in PL/SQL, plus there are Java Stored Procedures, and there's scheduling for starting jobs. Other than running DDL to create or amend schemas there is hardly any need for scripts in an Oracle database environment; even DDL scripts should be triggered from an external client, probably a build tool such as TeamCity.
In particular I would regard attempting to run a SQL script from a PL/SQL program as an architectural failure. What are you doing with the script which you cannot do with a stored procedure?
As for passing input to a stored procedure, that's what parameters are for. PL/SQL isn't interactive, we need a client to enter the values. Depending on the scenario this can be done asynchronously (values in a file or a table) or synchronously (calling the stored procedure from SQL*Plus, SQL Developer or a bespoke front end).
Having said all that, in the real world we work with messy architectures with inter-dependencies between the database and the external OS. So what can we do?
We can write a Java Stored Procedure to execute shell commands. This is the venerable solution, having been around since Oracle 8i. Find out more.
In 10g Oracle replaced DBMS_JOB with DBMS_SCHEDULER. One of the enhancements of this tool is its ability to run external jobs, i.e. shell scripts (a sketch follows these notes). Find out more.
Since Oracle 11g R1 external tables support pre-processor scripts, which run shell commands before querying the table. Find out more.
Note that all these options demand elevated access (grants on DIRECTORY objects, security credentials, etc). These can only be granted by privileged users (i.e. DBAs). Unless our database has an astonishingly lax security configuration there is no way for us to run an arbitrary shell script from PL/SQL.
Finally, it is not clear what benefit you expect from running a SQL script in PL/SQL. Remember that PL/SQL runs on the database server, so it can't see scripts on the client machine. This seems relevant in the light of the requirement to accept user input.
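As an illustration of the DBMS_SCHEDULER route, here is a minimal sketch; the script path is an assumption and must exist on the database server, and external jobs also require the appropriate OS credential configuration:
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name   => 'RUN_MY_SHELL_SCRIPT',
    job_type   => 'EXECUTABLE',
    job_action => '/home/oracle/scripts/my_script.sh',  -- assumed path
    enabled    => TRUE,
    auto_drop  => TRUE);
END;
/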
Perhaps the simplest solution is reconfiguration of the original script. Split out the necessary PL/SQL call into a block and then just call the named script:
begin
proc(para1,para2);
end;
/
@prompt1.sql
You can write a PL/SQL block in SQL*Plus that checks for a parameter from a table and then executes a script. In the script to be executed (MyScript.sql below), the statement terminators must be ";" instead of "/".
declare
vMyParameter number := 0;
begin
select count(*) into vMyParameter
from MyTable
where MyCheckValue = 'Y';
if vMyParameter = 1 then
@MyFolder/MyScript.sql;
end if;
end;
/
If you're using SQL*Plus (or a tool that uses it) then you can do something like this:
set serveroutput on
variable a number;
begin
:a := &prompt;
dbms_output.put_line(:a);
end;
/
If it runs in batch then you can do:
variable a number;
begin
:a := &1;
dbms_output.put_line(:a);
end;
and get the value for :a as a parameter:
sqlplus sdad/fdsfd@fdggd @<your_script.sql> <val_for_a>
Another practice is to execute one *.bat with parameters, for example:
c:/oracle/bin/sqlplus.exe -w @c:/name.sql %1 %2 @c:/output.sql
execute immediate 'start prompt1' ;
EXECUTE IMMEDIATE is for executing SQL statements, not arbitrary SQL*Plus commands.
can i get a value from prompt1 into my PL/SQL block where am calling the script
You can run a script, but I doubt you can capture input from a SQL script, especially within a PL/SQL block.

Oracle, save/map csv string to a table using utl_file and external tables

I use a PL/SQL procedure that calls a web service. This web service returns a large CSV string, which I hold in a CLOB. Since I do not want to parse the CSV by hand, I thought of using external tables. So what I need to do is store the CSV data in a corresponding table.
What I am doing at the moment is storing the CLOB using UTL_FILE; the stored file is the data source of an external table. When I am the only user this works very well. But since databases are multi-user, I have to watch out for someone else calling the procedure and overwriting the external table's data source file. What is the best way to avoid a mess in the table's data source? Or what is the best way to store a CSV string into a table?
Thanks
Chris
You want to make sure that the procedure is run by at most one session. There are several ways to achieve this goal:
The easiest way would be to lock a specific row at the beginning of your procedure (SELECT ... FOR UPDATE NOWAIT). If the lock succeeds, go on with your batch. If it fails it means the procedure is already being executed by another session. When the procedure ends, either by success or failure, the lock will be released. This method will only work if your procedure doesn't perform intermediate commits (which would release the lock before the end of the procedure).
You could also use the DBMS_LOCK package to request a lock specific to your procedure. Use the DBMS_LOCK.REQUEST function to request a lock. You can ask for a lock that will only be released at the end of your session, which allows intermediate commits to take place (a sketch follows this list).
You could also use AQ (Oracle's queuing system); I have little experience with AQ, though, so I have no idea whether it would be a sensible method.
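A rough sketch of the DBMS_LOCK option (the lock name is arbitrary, and your user needs EXECUTE on DBMS_LOCK, which usually has to be granted by a DBA):
DECLARE
  l_lock_handle VARCHAR2(128);
  l_result      INTEGER;
BEGIN
  -- map an application-specific lock name to a lock handle
  DBMS_LOCK.ALLOCATE_UNIQUE('load_csv_external_table', l_lock_handle);
  -- try to take the lock in exclusive mode without waiting
  l_result := DBMS_LOCK.REQUEST(lockhandle        => l_lock_handle,
                                lockmode          => DBMS_LOCK.X_MODE,
                                timeout           => 0,
                                release_on_commit => FALSE);
  IF l_result <> 0 THEN
    RAISE_APPLICATION_ERROR(-20001, 'Another session is already loading the file.');
  END IF;
  -- ... write the file and query the external table here ...
  l_result := DBMS_LOCK.RELEASE(l_lock_handle);  -- release the lock when done
END;
/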
Maybe you should generate a temporary filename for each CSV? Something like:
SELECT TO_CHAR(systimestamp, 'YYYYMMDDHH24MISSFF') filename FROM dual
You can use UTL_FILE.FRENAME.
In similar situations, I have the external table pointing to a fixed file name (e.g. "fred.txt").
When I get a new source file in, I use UTL_FILE.FRENAME to try to rename it to fred.txt. If the rename fails, then another process is running, so you return a busy error, or wait, or whatever.
When the file has finished processing, I rename it again (normally with some date/timestamp).
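A rough sketch of that pattern, assuming a DIRECTORY object called EXT_DIR and placeholder file names:
DECLARE
  l_ts VARCHAR2(20) := TO_CHAR(SYSDATE, 'YYYYMMDDHH24MISS');
BEGIN
  -- claim the "slot": with overwrite => FALSE this fails if fred.txt already exists
  UTL_FILE.FRENAME(src_location  => 'EXT_DIR', src_filename  => 'incoming.csv',
                   dest_location => 'EXT_DIR', dest_filename => 'fred.txt',
                   overwrite     => FALSE);
  -- ... query the external table and process the data here ...
  -- move the processed file out of the way again, stamped with the date/time
  UTL_FILE.FRENAME('EXT_DIR', 'fred.txt', 'EXT_DIR', 'fred_' || l_ts || '.txt');
EXCEPTION
  WHEN UTL_FILE.RENAME_FAILED THEN
    RAISE_APPLICATION_ERROR(-20002, 'Another load appears to be in progress.');
END;
/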

Oracle scheduler job log output

I'm using dbms_scheduler to execute a PL/SQL stored procedure. I would like to be able to have that code create some text logging output and associate it with the run to verify it's working, but I can't find anything to do that in the docs. Is there a facility to do this that I'm missing? This is an 11g database running under Unix. It would be real nice if I could use dbms_output so I could also run it from sqlplus and get output.
There are a bunch of Oracle Scheduler data dictionary views that will help you monitor jobs. Here are two documentation pages related to that:
Monitoring Jobs
Monitoring and Managing the Scheduler
Moreover, Oracle Scheduler declares some internal Scheduler variables that you can use like any other PL/SQL identifier in your PL/SQL stored procedure. Here is the list of these variables.
If you want to log application specific information, I suggest you create your own log table. You can then insert into this table from within your stored procedure. You can even insert any of the Scheduler's internal variables there, like job_name and job_scheduled_start.
I make a table JOB_LOG
and insert into that table from inside your procedure...
I agree with what the others have said. Here's the actual nuts and bolts, but with a nice interface, too. The way I typically do this:
Make a logging table:
CREATE TABLE job_log (
ts TIMESTAMP DEFAULT SYSTIMESTAMP PRIMARY KEY
, message VARCHAR2(255)
);
Make a stored proc that easily writes into your log table:
CREATE OR REPLACE PROCEDURE job_logger (v_message VARCHAR2)
IS
PRAGMA AUTONOMOUS_TRANSACTION; -- keeps the COMMIT below from committing the caller's transaction
BEGIN
INSERT INTO job_log(message) VALUES (v_message);
COMMIT;
END;
/
Then within your job, you are probably running a stored procedure. Within your own stored procedure, simply add lines that call the job_logger() procedure to write to your log. This keeps the ugly INSERT ... COMMIT clutter out of your interesting stored proc code.
CREATE OR REPLACE PROCEDURE foo
IS
BEGIN
job_logger('Starting job foo.');
...
{your code here}
...
job_logger('Another message that will be logged.');
...
job_logger('Completed running job foo.');
EXCEPTION
WHEN OTHERS THEN
...
job_logger('Oops, something bad happened!');
...
END;
/
Your log table is automatically timestamped and indexed by the primary key. To view the log, you might run this
SELECT * FROM job_log ORDER BY ts DESC;
Now if you would rather not use the Oracle Scheduler, and want instead to use the DBMS_OUTPUT way of writing output and run this under a Unix shell, that is possible also.
You would make a script that calls sqlplus, somewhat like this. If your user is SCOTT and the stored proc is called FOO,
#!/bin/sh
. /whatever/script/that/sets/your/oracle/environment
echo "
set serveroutput on feedback off
exec foo
" | sqlplus -s -l scott/tiger#orcl
Note, the -s flag suppresses the Oracle SQL Plus banner for cleaner output. The -l flag makes it so that sqlplus will abort if the password is bad or something else wrong, rather than try to prompt for username. Feedback off suppresses the PL/SQL "Anonymous block completed" message.
If you want to schedule this, you can call it from cron like this:
00 00 * * * /path/to/the/above/script.sh > /where/you/want/your/output/saved.log 2>&1
