Should I explicitly close RethinkDB connections?

I'm a little hazy on how connections in RethinkDB work. I'm opening a new connection every time I execute a query, and I never close it once the query finishes.
Is this a good practice? Or should I be explicitly closing connections once queries are finished?
(I'm using the JS driver. I don't believe the documentation speaks to this)
[Edited because the previous post title was vague.]

You should explicitly close connections, otherwise you will eventually exhaust the database server's resources. I'm assuming you are running Node.js, which will keep connections open until you kill the application.
Preferably you would use a pool to reduce the overhead of connecting. For a pre-made solution, look into rethinkdbdash, which offers basically the same API as the official driver but with built-in pooling.
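As a rough sketch with the official driver (the localhost defaults and the users table are just placeholders), an explicit close in a finally block looks like this:

    const r = require('rethinkdb');

    async function listUsers() {
      // Open a connection, run the query, and close the connection even if
      // the query throws.
      const conn = await r.connect({ host: 'localhost', port: 28015 });
      try {
        const cursor = await r.table('users').run(conn);
        return await cursor.toArray();
      } finally {
        await conn.close();
      }
    }

With rethinkdbdash the pool is managed for you, so there is no connection to pass to run() or to close yourself (again just a sketch, assuming the default host and port):

    const r = require('rethinkdbdash')({ host: 'localhost', port: 28015 });

    async function listUsers() {
      // The driver checks a connection out of its pool for the query and
      // returns it when the query completes.
      return r.table('users').run();
    }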

Related

How do you use go-sql-driver when you have a sharded MySQL database solution?

Reading this article: http://go-database-sql.org/accessing.html
It says that the sql.DB object is designed to be long-lived and that we should not Open() and Close() databases frequently. But what should I do if I have 10 different MySQL servers, sharded so that each server holds 511 databases, for example the way Pinterest shards their data with MySQL?
https://medium.com/@Pinterest_Engineering/sharding-pinterest-how-we-scaled-our-mysql-fleet-3f341e96ca6f
Wouldn't I then be hitting different nodes and different databases all the time? As I understand it, that means I would have to Open and Close the database connection constantly, depending on which node and database I need to access.
It also says that:
If you don’t treat the sql.DB as a long-lived object, you could experience problems such as poor reuse and sharing of connections, running out of available network resources, or sporadic failures due to a lot of TCP connections remaining in TIME_WAIT status. Such problems are signs that you’re not using database/sql as it was designed.
Will this be a problem? How should I solve this issue then?
I am also interested in this question. I guess a solution could look like this:
Minimize the number of idle connections in each pool with db.SetMaxIdleConns(N).
Keep a map[serverID]*sql.DB; when there is no handle for a server yet, open one and add it to the map (see the sketch below).
Make data more local, so backends usually go to “their” databases. However, Pinterest does not seem to do this.
Increase the number of sockets and open-file limits on the backend machines so they can keep more connections open.
Set a reasonable idle timeout so very old unused connections can be closed.
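A minimal sketch of the map-of-handles idea (the DSN, shard ID, and pool sizes here are made up for illustration):

    package main

    import (
        "database/sql"
        "fmt"
        "sync"

        _ "github.com/go-sql-driver/mysql"
    )

    // shardPool keeps one long-lived *sql.DB per MySQL server.
    // Each *sql.DB manages its own connection pool internally.
    type shardPool struct {
        mu  sync.Mutex
        dbs map[int]*sql.DB // serverID -> handle
    }

    func newShardPool() *shardPool {
        return &shardPool{dbs: make(map[int]*sql.DB)}
    }

    // get returns the handle for a server, opening it on first use.
    func (p *shardPool) get(serverID int, dsn string) (*sql.DB, error) {
        p.mu.Lock()
        defer p.mu.Unlock()
        if db, ok := p.dbs[serverID]; ok {
            return db, nil
        }
        db, err := sql.Open("mysql", dsn)
        if err != nil {
            return nil, err
        }
        db.SetMaxIdleConns(2)  // keep the idle footprint per server small
        db.SetMaxOpenConns(20) // cap total connections to each server
        p.dbs[serverID] = db
        return db, nil
    }

    func main() {
        pool := newShardPool()
        db, err := pool.get(3, "user:pass@tcp(shard3.example.com:3306)/")
        if err != nil {
            panic(err)
        }
        fmt.Println(db.Stats().OpenConnections)
    }

Since sql.Open is lazy and each *sql.DB pools its own connections, one long-lived handle per physical server is usually enough; the logical database on that server can be selected by qualifying table names in the query rather than by opening a separate handle per database.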

How to pool a connection in google apps script without ScriptDB?

I am looking to pool a connection in a similar way as described in this question. However, I do not wish to use ScriptDB, which was deprecated in May 2014.
How would this be achieved without the ScriptDB class? Or is connection pooling simply not possible in Google Apps Script?
Actually, the answer in the linked question does not work, as you cannot store JDBC connection objects in ScriptDb, or anywhere else for that matter. It is not possible to build a connection pool or cache in Apps Script. Even if you 'stringify' the connection, it will not work when you parse it back.
You always have to create a new connection for every instance/execution of the script.
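For example, each execution opens and closes its own connection (a sketch; the host, database, credentials, and table are placeholders):

    function readRows() {
      // A fresh connection per execution; Apps Script cannot reuse JDBC
      // objects across executions.
      var conn = Jdbc.getConnection('jdbc:mysql://db.example.com:3306/mydb', 'user', 'password');
      try {
        var stmt = conn.createStatement();
        var results = stmt.executeQuery('SELECT id, name FROM customers LIMIT 10');
        while (results.next()) {
          Logger.log(results.getInt(1) + ' ' + results.getString(2));
        }
        results.close();
        stmt.close();
      } finally {
        conn.close();
      }
    }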

To close or not to close an Oracle Connection?

My application has performance issues, so I started to investigate from the root: the connection to the database.
Best practice says: "Open a connection, use it, and close it as soon as possible." But I don't know the overhead this causes, so my questions are:
1 - Is "open, use, close connections as soon as possible" the best approach when using ODP.NET?
2 - Is there a way to use connection pooling with ODP.NET, and if so, how?
I'm thinking about creating a List to store some connection strings and writing logic to choose the "best" connection every time I need one. Is this the best way to do it?
Here is a slide deck containing Oracle's recommended best practices:
http://www.oracle.com/technetwork/topics/dotnet/ow2011-bp-performance-deploy-dotnet-518050.pdf
You automatically get a connection pool when you create an OracleConnection. For most middle tier applications you will want to take advantage of that. You will also want to tune your pool for a realistic workload by turning on Performance Counters in the registry.
Please see the ODP.NET online help for details on connection pooling. Pool settings are added to the connection string.
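For example (a sketch; the data source, credentials, and pool sizes are made up), the pooling attributes go straight into the connection string, and a using block guarantees Close() and Dispose():

    using Oracle.DataAccess.Client; // Oracle.ManagedDataAccess.Client for the managed driver

    class Example
    {
        // The pool is created and managed by ODP.NET; these attributes just tune it.
        const string ConnString =
            "User Id=scott;Password=tiger;Data Source=orcl;" +
            "Pooling=true;Min Pool Size=1;Max Pool Size=20;Connection Lifetime=120";

        static void Main()
        {
            // 'using' calls Dispose() (and therefore Close()) even if an exception
            // is thrown; the session goes back to the pool instead of being torn down.
            using (var conn = new OracleConnection(ConnString))
            using (var cmd = new OracleCommand("SELECT COUNT(*) FROM emp", conn))
            {
                conn.Open();
                System.Console.WriteLine(cmd.ExecuteScalar());
            }
        }
    }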
Another issue people run into a lot with OracleConnections is that the garbage collector does not realize how truly resource intensive they are and does not clean them up promptly. This is compounded by the fact that ODP.NET is not fully managed and so some resources are hidden from the garbage collector. Hence the best practice is to Close() AND Dispose() all Oracle ODP.NET objects (including OracleConnection) to force them to be cleaned up.
This particular issue will be mitigated in Oracle's fully managed provider (a beta will be out shortly)
(EDIT: ODP.NET, Managed Driver is now available.)
Christian Shay
Oracle
ODP.NET is a data provider for ADO.NET.
The best practice for ADO.NET is: open, get the data (into memory), close, then work with the in-memory data.
For example, use an OracleDataReader to load the data into a DataTable in memory, then close the connection.
Regards
For a single transaction this is fine, but for multiple operations committed at the end it might not be the best solution: you need to keep the connection open until the transaction is either committed or rolled back. How do you manage that, and how do you check that the connection still exists in that case (e.g. after a network failure)? There is a ConnectionState.Broken value, but it does not work for this at present.

Is it possible to list all database connections currently in the pool?

I'm getting ActiveRecord::ConnectionTimeoutError in a daemon that runs independently from the rails app. I'm using Passenger with Apache and MySQL as the database.
Passenger's default pool size is 6 (at least that's what the documentation tells me), so it shouldn't use more than 6 connections.
I've set ActiveRecord's pool size to 10, even though I thought that my daemon should only need one connection. My daemon is one process with multiple threads that calls ActiveRecord here and there to save stuff to the database that it shares with the rails app.
What I need to figure out is whether the threads simply can't share one connection or if they just keep asking for new connections without releasing their old connections. I know I could just increase the pool size and postpone the problem, but the daemon can have hundreds of threads and sooner or later the pool will run out of connections.
The first thing I would like to verify is that Passenger is indeed using just 6 connections and that the problem lies with the daemon. How do I test that?
Second, I would like to figure out whether every thread needs its own connection or whether they just need to be told to reuse the connection they already have. If they do need their own connections, maybe they just need to be told not to hold on to them when they're not using them? The threads are, after all, sleeping most of the time.
You can get at the connection pools that ActiveRecord is using through ActiveRecord::Base.connection_handler.connection_pools; it should be an array of connection pools. You will probably only have one in there, and it has a connections method on it to get an array of the connections it knows about.
You can also call ActiveRecord::Base.connection_handler.connection_pools.each(&:clear_stale_cached_connections!) and it will check in any checked-out connections whose thread is no longer alive.
I don't know if that helps or confuses things more.
As of February 2019, clear_stale_cached_connections! has been deprecated and its behaviour moved to reap (see the linked commit).
The previous accepted answer, updated:
ActiveRecord::Base.connection_handler.connection_pools.each(&:reap)
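For the daemon's threads, a minimal sketch of borrowing a connection only while it is needed (this assumes the daemon has already loaded the Rails environment; Widget is a placeholder model, and pool.stat needs Rails 5.1+):

    # Threads that only touch the database occasionally should not hold a
    # connection while they sleep; with_connection checks one out of the pool
    # for the duration of the block and checks it back in afterwards.
    Thread.new do
      loop do
        ActiveRecord::Base.connection_pool.with_connection do
          Widget.create!(name: "from daemon")
        end
        sleep 60 # no connection is held while the thread sleeps
      end
    end

    # Inspecting the pool:
    pool = ActiveRecord::Base.connection_pool
    puts pool.connections.size # connections the pool currently knows about
    puts pool.stat             # size, busy, idle, waiting, checkout_timeout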

In Classic asp, can I store a database connection in the Session object?

Can I store a database connection in the Session object?
It is generally not recommended. Storing the connection string in an Application variable, with a nice helper function/class, is a much preferred method. (There used to be a reference link here, but it has been removed because it now leads to a phishy site.)
I seem to recall that doing so will have the effect of single-threading your application, which would be a bad thing.
In general, I wouldn't store any objects in Application variables (and certainly not in Session variables).
When it comes to database connections, it's a definite no-no; besides, there is absolutely no need.
If you use ADO to communicate with the database and you use the same connection string (yes, by all means store this in an Application variable) for all your database connections, connection pooling is implemented behind the scenes. This means that when you release a connection, it isn't actually destroyed; it is put to one side for the next guy who wants the same connection. So the next time you request the same connection, it is pulled 'off the shelf' rather than having to be explicitly created and instantiated, which is quite a nice efficiency improvement.
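A minimal sketch of that pattern in classic ASP (the provider, server, and credentials are made up):

    ' In global.asa: store only the connection STRING, never a connection object.
    Sub Application_OnStart
        Application("ConnString") = "Provider=SQLOLEDB;Data Source=dbserver;" & _
                                    "Initial Catalog=MyDb;User ID=web;Password=secret;"
    End Sub

    ' In a page: open, use, and close per request. Because the string is identical,
    ' OLE DB session pooling reuses the underlying connection behind the scenes.
    Dim conn, rs
    Set conn = Server.CreateObject("ADODB.Connection")
    conn.Open Application("ConnString")
    Set rs = conn.Execute("SELECT TOP 10 Name FROM Customers")
    Do While Not rs.EOF
        Response.Write Server.HTMLEncode(rs("Name")) & "<br>"
        rs.MoveNext
    Loop
    rs.Close
    conn.Close
    Set rs = Nothing
    Set conn = Nothing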
From this link http://support.microsoft.com/default.aspx/kb/243543
You shouldn't store a database connection in Session.
From what I understand, if you do, then subsequent ASP requests for the same user must use the same thread.
Therefore, if you have a busy site, it's likely that 'your' thread will already be in use by someone else, so you will have to wait for it to become available.
Multiply this up by lots more users and you get everyone waiting for everyone else's thread and a not very responsive site.
As said by CJM, there is no need to store a connection in a Session object: connection pooling is much better.
