Is there an Oracle Open Cursor (ORA-01000) leak in ColdFusion?

Using CFMX7 and Oracle 10g Enterprise on a query-intensive, active web site, I'm seeing that some of the Oracle connections in my web server's connection pool are accumulating open cursors. (In JDBC parlance, this might be called a ResultSet object leak.)
This is a confusing situation in Oracle; read here for an explanation.
http://www.orafaq.com/node/758
Anyhow, it's not cached PreparedStatements that are leaking; it's actually ResultSets.
My DBAs have set the OPEN_CURSORS parameter to 500 per connection. Fairly frequently, my connections get up to 450+, which triggers a DBA alarm (because we hope to avoid smacking web app users with ORA-01000 cursor exhaustion errors).
Does anybody know if there's a bug in ColdFusion (MX7) that causes this problem? Is there any way to programmatically use CF to generate a ResultSet object leak (call it a cfquery leak in CF)? Any suggestions?

Here is some information that might be helpful.
http://jehiah.cz/a/maximum-open-cursors-exceeded
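ColdFusion MX7 talks to Oracle through a JDBC driver, so the leak described above ultimately comes down to ResultSet/Statement objects that are never closed on a pooled connection. As a rough illustration only (the DataSource and table name below are hypothetical, and this is plain JDBC, not CF's generated code), here is how the leaky and safe patterns differ:

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import javax.sql.DataSource;

    public class CursorLeakSketch {

        // Leaky: the Statement and ResultSet are never closed. Returning the
        // connection to the pool does not necessarily close them, so the Oracle
        // cursor stays pinned on the physical connection and counts toward
        // OPEN_CURSORS.
        static int leakyCount(DataSource ds) throws Exception {
            Connection con = ds.getConnection();
            Statement stmt = con.createStatement();
            ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM some_table");
            rs.next();
            int n = rs.getInt(1);
            con.close(); // logical close only: the connection goes back to the pool
            return n;
        }

        // Safe: try-with-resources closes the ResultSet, Statement, and Connection
        // in reverse order, releasing the cursor immediately.
        static int safeCount(DataSource ds) throws Exception {
            try (Connection con = ds.getConnection();
                 Statement stmt = con.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM some_table")) {
                rs.next();
                return rs.getInt(1);
            }
        }
    }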

Related

Oracle UCP Pool leaking cursor?

Our application has been using OracleDataSource successfully for several years and we are now evaluating switching to the new Oracle Universal Connection Pool (UCP).
With the new UCP pool, our application runs into ORA-01000: maximum open cursors exceeded after some time.
Some people seem to have had similar problems:
https://stackoverflow.com/a/4683797/217862
https://stackoverflow.com/a/29892459/217862
Is there any known workaround / fix?
Note: we do close sessions and statements correctly and follow all known JDBC/Hibernate best practices. The app runs 24/7, the data access layer code is >8 years old, and it has been exhaustively tested. We are using Oracle 12c.
Well, it turned out that we only thought we were following all known best practices. In some places we were using ScrollableResults without closing them properly, and in that case the underlying cursor apparently leaks, even after the Hibernate session is closed. We fixed all occurrences we found in the code, and as an additional defensive measure we configured the pool's maxConnectionReuseTime option to ensure connections are renewed periodically.
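For illustration, a minimal sketch of the two measures described above, assuming Hibernate's ScrollableResults API and Oracle UCP's PoolDataSource; the entity name, URL, and reuse time are placeholders:

    import org.hibernate.ScrollableResults;
    import org.hibernate.Session;
    import oracle.ucp.jdbc.PoolDataSource;
    import oracle.ucp.jdbc.PoolDataSourceFactory;

    public class CursorLeakFixSketch {

        // Fix 1: always close ScrollableResults explicitly; otherwise the
        // underlying Oracle cursor can outlive the Hibernate session.
        static void iterateOrders(Session session) {
            ScrollableResults rows = session.createQuery("from Order o").scroll();
            try {
                while (rows.next()) {
                    // process the current row ...
                }
            } finally {
                rows.close(); // releases the server-side cursor
            }
        }

        // Fix 2 (defensive): recycle pooled connections periodically so a missed
        // leak cannot accumulate forever on one physical connection.
        static PoolDataSource buildPool(String url, String user, String password) throws Exception {
            PoolDataSource pds = PoolDataSourceFactory.getPoolDataSource();
            pds.setConnectionFactoryClassName("oracle.jdbc.pool.OracleDataSource");
            pds.setURL(url);
            pds.setUser(user);
            pds.setPassword(password);
            pds.setMaxConnectionReuseTime(1800); // seconds; illustrative value
            return pds;
        }
    }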
Note: it didn't take us a year to find the problem, only a few days; I simply forgot to answer the question after we figured it out...

BoneCP vs WebLogic's own DB connection pool

I have a servlet which connects to Oracle DB using JDBC (ojdbc6.jar) and BoneCP. I now need to port my BoneCP-using code to something which will work in WebLogic out-of-the-box, without having BoneCP in the package.
What would be the recommended approach? Which WebLogic feature can I use, specifically to get an equivalent of BoneCP's:
Performance
Ability to log failed SQL statements
Auto-resume from lost DB connection
Thanks in advance.
The best approach would be to create a standard Oracle JDBC connection pool pointing to your database. Tune it according to your needs (number of connections, etc.). Next, refactor out of your code any explicit reference to your former connection pool implementation. If you have been working against the java.sql.* interfaces, there should be few to no such references at all.
Once all that is refactored, you will have only a small bit of code (or a config file) telling your app to look up something implementing javax.sql.DataSource under a given JNDI name and to get Connections out of it. The rest should be the same: just do whatever you need and close your ResultSets, Statements, and Connections as you presumably have been doing until now.
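As a minimal sketch of what that ends up looking like (the JNDI name "jdbc/MyOracleDS" and the table are placeholders; the data source itself is configured in the WebLogic console):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import javax.naming.InitialContext;
    import javax.sql.DataSource;

    public class WebLogicDataSourceSketch {

        // Look up the container-managed pool under whatever JNDI name you gave
        // the data source in the WebLogic console.
        static DataSource lookup() throws Exception {
            InitialContext ctx = new InitialContext();
            return (DataSource) ctx.lookup("jdbc/MyOracleDS");
        }

        // Per-request usage stays pure java.sql.*: borrow, use, close.
        static String findName(DataSource ds, long id) throws Exception {
            try (Connection con = ds.getConnection();
                 PreparedStatement ps = con.prepareStatement(
                         "SELECT name FROM customers WHERE id = ?")) {
                ps.setLong(1, id);
                try (ResultSet rs = ps.executeQuery()) {
                    return rs.next() ? rs.getString(1) : null;
                }
            }
        }
    }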
As for your other questions, you will find extensive information on how to monitor your connection pool, and on its failure-recovery policies, here (pick the documentation matching your app server version; this is the one I have used):
http://docs.oracle.com/cd/E15051_01/wls/docs103/jdbc_admin/jdbc_datasources.html
Regarding performance, I have no hard data or benchmarks comparing the two implementations; for your peace of mind, I can say that I have never found a database performance problem caused by the connection pool implementation. That does not mean it cannot happen, but it is the last place I would look ;)

To close or not to close an Oracle Connection?

My application has performance issues, so I started to investigate from the root: the connection to the database.
Best practice says, "Open a connection, use it, and close it as soon as possible," but I don't know what overhead this causes, so my questions are:
1. Is "open, use, and close connections as soon as possible" the best approach when using ODP.NET?
2. Is there a way to use connection pooling with ODP.NET, and if so, how?
I'm thinking about creating a List to store some connection strings and writing logic to choose the "best" connection every time I need one. Is this the best way to do it?
Here is a slide deck containing Oracle's recommended best practices:
http://www.oracle.com/technetwork/topics/dotnet/ow2011-bp-performance-deploy-dotnet-518050.pdf
You automatically get a connection pool when you create an OracleConnection. For most middle tier applications you will want to take advantage of that. You will also want to tune your pool for a realistic workload by turning on Performance Counters in the registry.
Please see the ODP.NET online help for details on connection pooling. Pool settings are added to the connection string.
Another issue people run into a lot with OracleConnections is that the garbage collector does not realize how truly resource intensive they are and does not clean them up promptly. This is compounded by the fact that ODP.NET is not fully managed and so some resources are hidden from the garbage collector. Hence the best practice is to Close() AND Dispose() all Oracle ODP.NET objects (including OracleConnection) to force them to be cleaned up.
This particular issue will be mitigated in Oracle's fully managed provider (a beta will be out shortly)
(EDIT: ODP.NET, Managed Driver is now available.)
Christian Shay
Oracle
ODP.NET is a data provider for ADO.NET.
The best practice for ADO.NET is: open, get the data (into memory), close, then work with the in-memory data.
For example, use an OracleDataReader to load data into an in-memory DataTable, then close the connection.
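For readers coming from the JDBC side of this thread, here is a hedged sketch of the same open / read into memory / close pattern in plain java.sql terms (the DataSource, table, and column names are placeholders; the ODP.NET equivalent would use an OracleDataReader and a DataTable as described above):

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.ArrayList;
    import java.util.List;
    import javax.sql.DataSource;

    public class ReadThenCloseSketch {

        // Open, copy the rows you need into memory, close everything, then work
        // with the in-memory list; the pool keeps the physical connection alive.
        static List<String> loadProductNames(DataSource ds) throws Exception {
            List<String> names = new ArrayList<>();
            try (Connection con = ds.getConnection();
                 Statement stmt = con.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT name FROM products")) {
                while (rs.next()) {
                    names.add(rs.getString("name"));
                }
            } // connection returns to the pool here, cursor released
            return names;
        }
    }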
For a single transaction this is best, but for multiple statements that you commit together at the end it might not be the best solution. You need to keep the connection open until the transaction is either committed or rolled back. How do you manage that, and how do you check that the connection still exists in that case (e.g., after a network failure)? There is a ConnectionState.Broken property, but it does not work at this point.

Possible complications for not closing database cursors?

What are the possible complications and repercussions if you do not close cursors for your Oracle database?
Since your developers are complaining about the performance hit of repeatedly re-opening cursors, the proper solution on the database side would be to close cursors in your code but set the session_cached_cursors parameter so that the database maintains a cache of each session's recently used cursors. Having them not close their cursors is going to cause the ORA-01000 error that you're seeing and will waste other server resources.
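A minimal sketch of that combination, assuming a JDBC client and illustrative values (SESSION_CACHED_CURSORS is more commonly set instance-wide by the DBA than per session, and the table is a placeholder):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class SessionCursorCacheSketch {

        // Illustrative: enable the session cursor cache for this session only.
        static void enableCursorCache(Connection con) throws Exception {
            try (Statement stmt = con.createStatement()) {
                stmt.execute("ALTER SESSION SET SESSION_CACHED_CURSORS = 50");
            }
        }

        // The application still closes every cursor it opens; repeated executions
        // of the same SQL can then be satisfied from the session cursor cache
        // instead of paying for a full re-parse each time.
        static int countOrders(Connection con, long customerId) throws Exception {
            try (PreparedStatement ps = con.prepareStatement(
                    "SELECT COUNT(*) FROM orders WHERE customer_id = ?")) {
                ps.setLong(1, customerId);
                try (ResultSet rs = ps.executeQuery()) {
                    rs.next();
                    return rs.getInt(1);
                }
            }
        }
    }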

In Classic asp, can I store a database connection in the Session object?

Can I store a database connection in the Session object?
It is generally not recommended. Storing a connection string in an Application variable, together with a small helper function/class, is the much preferred method. Here is some reference. (Dead link removed because it now leads to a phishing site.)
I seem to recall that doing so has the effect of single-threading your application, which would be a bad thing.
In general, I wouldn't store any objects in Application variables (and certainly not in session variables).
When it comes to database connections, it's a definite no-no; besides, there is absolutely no need.
If you use ADO to communicate with the database, and you use the same connection string (yes, by all means, store this in an Application variable) for all your database connections, connection pooling will be implemented behind the scenes. This means that when you release a connection, it isn't actually destroyed; it is put to one side for the next caller who wants the same connection. So the next time you request that connection, it is pulled 'off the shelf' rather than having to be explicitly created and instantiated, which is quite a nice efficiency improvement.
From this link: http://support.microsoft.com/default.aspx/kb/243543
You shouldn't store a database connection in Session.
From what I understand, if you do, then subsequent ASP requests for the same user must use the same thread.
Therefore, if you have a busy site, it's likely that 'your' thread will already be in use by someone else, so you will have to wait for it to become available.
Multiply this up by many more users and you get everyone waiting for everyone else's thread and a not very responsive site.
As CJM said, there is no need to store a connection in a Session object: connection pooling is much better.
