vladb has asked for the wisdom of the Perl Monks concerning the following question:

At my workplace we run a number of processes, each of which uses at least one database connection. This strains our database server's resources, which led me to look at ways of reducing the number of open database connections at any one time by having a central process 'serve' a readily available database handle to any script requiring one.

I've been searching PerlMonks for nodes covering the same issue, but didn't find exactly what I was looking for. For example, the author of this node talks about sharing a database connection between a parent process and a number of child processes. What I'm after, however, is a stand-alone process that would 'serve' database connections to any number of 'external' processes upon request.

I'm now wondering how I should approach this problem. One option would be to have the central process share database handle objects (such as DBI handles) with external scripts. For this I might find the Shared.pm module useful, though I'm not sure whether sharing a database handle object is a totally different matter from sharing a scalar or an array. Another option would be to simply have the central process accept requests from external processes in the form of SQL statements and return data over a socket connection.
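For what it's worth, the second option (SQL over a socket) can be sketched with core modules plus DBI. Everything here is an illustrative assumption, not a worked design: the port number, the DSN and credentials, and the trivial one-SELECT-per-line protocol are all made up.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use IO::Socket::INET;
use DBI;

# The central process holds the one real connection.
# DSN, user, and password are placeholders.
my $dbh = DBI->connect('dbi:mysql:dbname=test', 'user', 'pass',
                       { RaiseError => 1, AutoCommit => 1 });

my $server = IO::Socket::INET->new(
    LocalPort => 9000,   # arbitrary port for illustration
    Listen    => 5,
    Reuse     => 1,
) or die "Cannot listen: $!";

while (my $client = $server->accept) {
    # Naive protocol: client sends one SELECT per line,
    # server replies with tab-separated rows and a '.' terminator.
    while (defined(my $sql = <$client>)) {
        chomp $sql;
        my $rows = eval { $dbh->selectall_arrayref($sql) };
        if ($@) {
            print $client "ERROR: $@\n";
            next;
        }
        print $client join("\t", @$_), "\n" for @$rows;
        print $client ".\n";
    }
    close $client;
}
```

As written this serves one client at a time; a real version would fork per client or multiplex with select(), and would need to worry about quoting, transactions, and authentication.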

I'd appreciate it if you could point me in the right direction. I'm somewhat sure there must be some Perl modules that would enable me to implement this easier than if I had to do it on my own.


"There is no system but GNU, and Linux is one of its kernels." -- Confession of Faith

Replies are listed 'Best First'.
Re: Sharing database connections over multiple (non child) processes.
by perrin (Chancellor) on Mar 06, 2002 at 17:22 UTC
    As others have pointed out, you can't really share DBI handles between processes. The easiest thing to do by far would be to spend a few dollars on more RAM for your database server and keep on using lots of connections. However, if you really need to limit your connections you can use DBD::Proxy for this. It's slow, but it works.
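To make the DBD::Proxy suggestion concrete: the server side is the dbiproxy script that ships with DBI::ProxyServer, and clients wrap their real DSN inside a Proxy DSN. The hostname, port, inner DSN, credentials, and table below are placeholders.

```perl
# On the machine with database access, start the proxy first:
#   dbiproxy --localport=3334
use strict;
use warnings;
use DBI;

# The client connects through the proxy; the real DSN
# is carried inside the dsn= part of the Proxy DSN.
my $dbh = DBI->connect(
    'dbi:Proxy:hostname=dbhost.example.com;port=3334;dsn=dbi:mysql:dbname=test',
    'user', 'pass',
    { RaiseError => 1 },
);

my $sth = $dbh->prepare('SELECT id, name FROM widgets');
$sth->execute;
while (my @row = $sth->fetchrow_array) {
    print "@row\n";
}
$dbh->disconnect;
```

The rest of the client code is unchanged from plain DBI, which is the main attraction of this approach despite the speed penalty.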

    Another approach is to use Apache/mod_perl as an application server, putting your core code in there and calling it from clients via SOAP or XML-RPC. This would let you limit the number of connections open at any given time by controlling the number of Apache processes, but it also means a significant rewrite of your code.
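The client end of that application-server approach might look like this with SOAP::Lite; the namespace URI, endpoint URL, and method name are invented for the sketch, and the mod_perl side would be a module that holds the persistent DBI connection and exposes methods like this one.

```perl
use strict;
use warnings;
use SOAP::Lite;

my $soap = SOAP::Lite
    ->uri('http://example.com/My/OrderService')      # namespace (made up)
    ->proxy('http://appserver.example.com/soap');    # endpoint (made up)

# get_order() is a hypothetical remote method; the database
# work happens on the server, not in this client.
my $result = $soap->get_order(42);
die $result->faultstring if $result->fault;
print $result->result, "\n";
```

The clients never touch DBI at all, so the connection count is bounded by the Apache process count regardless of how many clients there are.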

Re: Sharing database connections over multiple (non child) processes.
by mpeppler (Vicar) on Mar 06, 2002 at 16:52 UTC
    In general it is not possible to share DBI handles across processes. (I don't know whether it can be made to work with Sybase through DBI, though it does work if you use the Sybase::CTlib module directly.)

    You might want to look at something called sqlrelay (available from www.firstworks.com, I believe). I've not used it, so I don't know how solid it is or whether it will work with your database server.

    Michael

Re: Sharing database connections over multiple (non child) processes.
by Moonie (Friar) on Mar 06, 2002 at 16:12 UTC