Hints in PL/SQL packages - oracle

Wondering, can hints work in PL/SQL packages?
Recently, I had to tune a long-running query in a PL/SQL package because it was causing a "snapshot too old" error. I took the query out of the package and tuned it individually: I used the hints required for my case, and its running time decreased significantly. But I am not sure whether hints work inside a PL/SQL package as well. Could you please clarify whether they work in PL/SQL packages or not?
Thanks in advance
Regards

Why wouldn't they work? It is not the package itself that contains hints; a package is just a container for your functions, procedures, types, ... Code within these program units can contain hints, and they work just the same as they do in pure SQL.
On the other hand, just being curious: what does that procedure - that raised the "snapshot too old" error - do? Is there, by any chance, a loop with a COMMIT within the loop? If so, don't do that because committing in the loop often causes such an error (which means that maybe you don't need any hints). Or, even better, see if that piece of code can be rewritten so that it doesn't use a loop at all because row-by-row processing can be really slow.
Therefore, consider posting that code; someone might be able to improve it.
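To illustrate the anti-pattern described above, here is a sketch (table and column names are hypothetical) of a row-by-row loop with a commit inside it, followed by the set-based alternative:

```sql
-- Anti-pattern: committing inside the loop releases undo that the
-- cursor's consistent read may still need, which is a classic cause
-- of ORA-01555 ("snapshot too old").
begin
  for r in (select id from big_table where processed = 'N') loop
    update big_table set processed = 'Y' where id = r.id;
    commit;  -- commit inside the loop: avoid this
  end loop;
end;
/

-- Preferred: one set-based statement with a single commit at the end.
update big_table set processed = 'Y' where processed = 'N';
commit;
```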

Hints work with SQL queries whether the SQL is ad hoc or inside a subprogram (package, function, or procedure).
Hints are additional instructions for the optimizer to consider while executing the SQL.
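For example, a minimal sketch (package, table, and column names are hypothetical) of a hinted query inside a packaged procedure:

```sql
create or replace package body my_pkg as
  procedure load_orders is
  begin
    -- The hint is attached to the SQL statement, not to the package;
    -- it works exactly as it would in a standalone query.
    insert into orders_stage (order_id, amount)
    select /*+ FULL(o) PARALLEL(o, 4) */ o.order_id, o.amount
    from   orders o
    where  o.created > sysdate - 1;
  end load_orders;
end my_pkg;
/
```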

Related

How to build statistics on a procedures execution in oracle 12.2?

I have a ~2,300 line package, which is split into many procedures and functions. It is running slower than I would like. Many years ago, on a previous release of Oracle (9i or 11g), I had a similar problem and was able to build a hierarchical structure which contained everything that was executed in the procedure/package and how much time was spent on each item.
I cannot seem to find a tutorial/blog that shows how to accomplish this. It is probably done with the DBMS_STATS package, but I find Oracle's documentation unsuitable for task-oriented problem solving. It may be great if you want to learn everything there is to know about a subject, but generally all I need to know is how to solve the issue I am currently working on.
At any rate can someone point me to how I can get the runtime statistics of a run of an Oracle Procedure?
There are two options:
dbms_profiler - records the time spent on each statement executed. docs
dbms_hprof - similar, but collects statistics on the hierarchy of PL/SQL calls. docs
Either method requires some setup using SYS (DBA) access. Setup instructions for dbms_profiler
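As a sketch of the dbms_profiler approach (the procedure name is hypothetical; the plsql_profiler_* tables are created by the profiler's setup script, proftab.sql):

```sql
-- Start a profiler run, execute the code of interest, stop the run.
declare
  l_run binary_integer;
begin
  dbms_profiler.start_profiler(run_comment => 'tuning run');
  my_pkg.my_slow_procedure;  -- hypothetical: the code being measured
  dbms_profiler.stop_profiler;
end;
/

-- Then query the collected data, slowest lines first.
select u.unit_name, d.line#, d.total_occur, d.total_time
from   plsql_profiler_units u
join   plsql_profiler_data  d
       on d.runid = u.runid and d.unit_number = u.unit_number
order  by d.total_time desc;
```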
Have you tried the dbms_utility.get_time method, as described in THIS post?
Link to the original question, which is almost the same as yours:
https://www.quora.com/How-can-I-log-the-execution-time-of-a-stored-procedures-in-a-table-in-Oracle-database
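A sketch of that technique, assuming a hypothetical procedure my_proc; dbms_utility.get_time returns elapsed time in hundredths of a second:

```sql
declare
  l_start pls_integer;
begin
  l_start := dbms_utility.get_time;
  my_proc;  -- hypothetical: the procedure being timed
  dbms_output.put_line('Elapsed: '
    || (dbms_utility.get_time - l_start) / 100 || ' seconds');
end;
/
```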

PL/SQL package call via JDBC performance issue

I have to use a PL/SQL package as an API for importing data into an Oracle database. I'm doing this from a Java application with the latest ojdbc driver. All statements (of course PreparedStatements) I use during the import are initialized only once and reused for every set to import.
Now I'm facing the following problem: the first call to a procedure of the package takes over 90% of the time for one set. I have to call about 10 procedures during the import; the first one takes about 4 seconds, the rest about 0.4 seconds. It doesn't matter if it's the 10th or the 100,000th set to import, the first procedure call always takes that long.
Important to know: if I call another procedure in first position, that one takes the 90%. So, maybe I'm wrong, but is it something about package initialization? If I'm (re-)using prepared statements, shouldn't that happen only on the first call?
The PL/SQL package has about 10,000 lines of code and also calls several other packages during the import.
So now my questions are:
What are possible reasons for this problem? And what are potential solutions?
Are there any tools I can use to identify the culprit?
EDIT: I was able to identify the cause of the slow import. It had nothing to do with wrong code or anything like that. The reason was simply the kind of data I used in my test scenario. My mistake was always importing the same data.
If thread one updated a data set in the first procedure, it held a lock on that row until the commit after the complete import. Threads two to n were trying to update exactly the same row. The result is effectively a serialization of all threads.
First of all, this is not normal. So there is definitely something awry with your code. But without being able to see your source there's no way we're going to be able to spot the problem. And frankly I don't want to debug 10000 LOC, not even mine let alone yours. Sorry.
So the best we can do is give you some pointers.
One:
"The first call of an procedure of the package takes over 90% of the
time for one set. .... if I'm calling another procedure on first
position this on takes the 90%"
Perhaps there is some common piece of coding which every procedure executes that behaves differently depending on whether the calling procedure is the first one to execute it in any given run. You need to locate that rogue code.
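One such candidate is a package initialization section. As a sketch (the package and helper names are hypothetical): the anonymous block at the bottom of a package body runs once per session, on the first reference to the package, which can make whichever procedure happens to be called first appear slow:

```sql
create or replace package body import_pkg as  -- hypothetical package

  procedure load_customers is
  begin
    null;  -- real work here
  end load_customers;

-- Package initialization section: executed once per session,
-- at the first call to any subprogram in the package.
begin
  populate_lookup_caches;  -- hypothetical: expensive one-time setup
end import_pkg;
/
```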
Two:
" I've used the profiler in pl/sql developer. The execution is very
fast there. "
Your program behaves differently depending on whether you call it from PL/SQL Developer or from JDBC. So there is a strong possibility that the problem lies not in the PL/SQL code but in the JDBC code. Acquiring database connections is definitely one potential source of pain. Depending on your architecture, network traffic may be another problem: are you returning lots of data to the Java program which is then used in subsequent procedural calls?
In short: you either need to identify something common in your PL/SQL code which can cause the same outcome in different procedural calls, or identify what happens differently when you call the program from PL/SQL Developer and from JDBC.

What is the recommended practice for writing procedures with re-usable code?

I want to check with some of the more experienced Oracle developers here on the best-practices for backend development.
I write a lot of packages that print data in XML format and are used by HTTP services.
For this purpose, I loop through cursors and print the data using htp.p.
e.g.
for i in c_my_cursor loop
  htp.p('<element>' || i.data_field || '</element>');
end loop;
Now I've heard that cursors are bad for performance (is this true?). Moreover, there are similar cursors used in different packages, which I feel from a maintenance perspective would be better to switch to functions.
But what can I return from the function? I don't think a cursor would work. What do you folks use?
Cursors are not inherently bad. What is bad is processing Row By Agonizing Row, rather than using set processing. SQL is all about The Joy Of Sets. The problem with cursors is that PL/SQL developers often reach for them automatically; this frequently leads us down the RBAR route, when a straight SQL statement would be more efficient.
Functions are no more efficient than procedures. Choose functions or procedures according to whether you're retrieving something or doing something.
In your case I would consider whether Oracle's built-in XML functionality will work in your particular case. This partly depends on which version of Oracle you're using, but pretty much any version since 8i would work with the specific example you posted. Find out more.
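For instance, a sketch of the built-in SQL/XML functions replacing the cursor loop entirely (table and column names are hypothetical):

```sql
-- One set-based statement builds the whole fragment; no PL/SQL loop.
select xmlagg(
         xmlelement("element", t.data_field)
       ) as payload
from   my_table t;
```

A function could then return the resulting XMLTYPE (or a SYS_REFCURSOR over the query) to its callers, which also addresses the "what can I return from the function?" question.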

How do you measure the performance of a stored procedure?

I'm using a version of SQL Server 2005 that does not support the profiler, trying to figure out how best to compare the performance of two stored procedures. I've run the execution plan for each, but it's not clear to me which of the provided metrics I should be focusing on. Do I go through and add up the various costs? What's the best approach?
Thanks in advance.
Look at this article: Measuring SQL Performance
If you don't want to register for the free account, here is solution 1:
DECLARE @start datetime, @stop datetime
SET @start = GETDATE()
EXEC your_sp
SET @stop = GETDATE()
SELECT DATEDIFF(ms, @start, @stop) AS elapsed_ms
2nd:
SET STATISTICS TIME ON
EXEC your_sp
3rd:
SET STATISTICS IO ON
EXEC your_sp
Btw, this site has some nice articles. I'd recommend registering; it's free.
The question is what are you optimizing for? Is it for speed or resources used?
If speed, then in the query analyzer I would look at the execution between several runs, make changes and time them again.
If it is resources, then I would look through the execution plan. In that case I would start with the worst offenders and work my way down the list. Adding them up will tell you the overall performance, but in most cases it is an item or two that is the bottleneck.
if you are using something like
SET SHOWPLAN_ALL ON
look at the TotalSubtreeCost column value for the row with EXEC YourProcedureName
this might help:
http://technet.microsoft.com/en-us/library/ms180765.aspx
Like most questions, the answer depends... In the final analysis, the only measure that matters is end-user perception, which can be affected by many things, including not only the stored procedure but also network performance, usage patterns (is the sProc being called 20x/day, or 1,000x/second?), etc. - and the sProc may not be the determining factor.
But if the stored procedure is the "piece of the puzzle" that is having the major adverse impact on the end-user perception of some function, then you have to look at the elapsed time to run the stored procedure. That in turn can be affected by numerous underlying metrics, and to do anything about it you need to analyse them all to determine which is the major or overriding contributor to the overall stored proc performance.
You could always rig a test harness to call your stored procedures and measure the call times. Unfortunately you're not going to get the details about exactly which parts of the Stored Procedure are causing the slow-down.
You could always run the Stored Procedure by hand in Query Analyzer and measure the results that way as well. The .NET harness just automates the process for you.
The simple low-brow solution is to run them with print statements printing the execution time over the various parts. This won't help if the performance problem is more subtle and found in production only, but if you can reproduce it in your test environment, you should be fine.
One handy technique, if you are trying to compare the performance of two procs or statements, is to select both blocks of SQL in Query Analyzer and run the query plan. The plan will tell you the cost percentage of each block relative to the other. This is not foolproof - I have seen it tell me one was cheaper when it was clearly more expensive when actually run - but for the most part it is a nice, quick trick.

Benchmarking Oracle 10G on Windows XP

I am not a DBA. However, I work on a web application that lives entirely in an Oracle database (Yes, it uses PL/SQL procedures to write HTML to clobs and then vomits the clob at your browser. No, it wasn't my idea. Yes, I'll wait while you go cry.).
We're having some performance issues, and I've been assigned to find some bottlenecks and remove them. How do I go about measuring Oracle performance and finding these bottlenecks? Our unhelpful sysadmin says that Grid Control wasn't helpful, and that he had to rely on "his experience" and queries against the data dictionary and "v$" views.
I'd like to run some tests against my local Oracle instance and see if I can replicate the problems he found so I can make sure my changes are actually improving things. Could someone please point me in the direction of learning how to do this?
Not too surprising there are entire books written on this topic.
Really what you need to do is divide and conquer.
First, just ask yourself some standard common-sense questions. For example: has performance degraded slowly over time, or was there a big recent drop?
After the obvious, a good starting point is to narrow down where to spend your time - the top queries are a decent start. That will point you to particular queries which run for a long time.
If you know specifically which screens in your front-end are slow, and which stored procedures they use, I'd add some logging: simple DBMS_OUTPUT.put_line calls with wall-clock information at key points. Then I'd run those procedures interactively in SQL Navigator to see which part of the stored procedure is slow.
Once you start narrowing it down, you can evaluate why a particular query is slow. EXPLAIN PLAN will be your best friend to start with.
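A minimal sketch of that (the query is a placeholder for whichever slow statement you are investigating):

```sql
explain plan for
  select o.order_id from orders o where o.created > sysdate - 1;

-- Pretty-print the plan just captured in the plan table.
select * from table(dbms_xplan.display);
```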
It can be overwhelming to analyze database performance with Grid Control, and I would suggest starting with the simpler AWR report - you can find the scripts to generate them in $ORACLE_HOME/rdbms/admin on the db host. This report will rank the SQL seen in the database by various categories (e.g. CPU time, disk I/O, elapsed time) and give you an idea where the bottlenecks are on the database side.
One advantage of the AWR report is that it is a SQL*Plus script and can be run from any client - it will spool HTML or text files to your client.
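For reference, the report is generated by a script in that directory; from SQL*Plus:

```sql
-- '?' expands to ORACLE_HOME in SQL*Plus. The script prompts for the
-- report format (HTML or text) and the snapshot interval to cover,
-- then spools the report to a file on the client.
@?/rdbms/admin/awrrpt.sql
```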
edit:
There's a package called DBMS_PROFILER that lets you do what you want, I think. I found out my IDE will profile PL/SQL code as I would guess many other IDE's do. They probably use this package.
http://www.dba-oracle.com/t_dbms_profiler.htm
http://www.databasejournal.com/features/oracle/article.php/2197231/Oracles-DBMSPROFILER-PLSQL-Performance-Tuning.htm
edit 2:
I just tried the Profiler out in PL/SQL Developer. It creates a report on the total time and occurrences of snippets of code during runtime and gives code location as unit name and line number.
original:
I'm in the same boat as you, as far as the crazy PL/SQL generated pages go.
I work in a small office with no programmer particularly versed in advanced features of Oracle. We don't have any established methods of measuring and improving performance. But the best bet, I'd guess, is to try out different PL/SQL IDEs.
I use PL/SQL Developer by Allround Automations. It's got a testing functionality that lets you debug your PL/SQL code, and it may have some benchmarking feature I haven't used yet.
Hope you find a better answer. I'd like to know too. :)
"I work on a web application that
lives entirely in an Oracle database
(Yes, it uses PL/SQL procedures to
write HTML to clobs and then vomits
the clob at your browser"
Is it the APEX product? That's the web application environment now included as a standard part of the Oracle database (although technically it doesn't spit out CLOBs).
If so, there is a whole bunch of instrumentation already built into the product/environment (e.g. it keeps a rolling two-week history of activity).