Re: Looking for a way to use one database connection with multiple CGI scripts
by Tanktalus (Canon) on Nov 23, 2009 at 17:51 UTC
You don't mention whether it's running under mod_perl or FastCGI; DBI's FAQ might answer your question (more for mod_perl than FastCGI). If it's really a plain CGI application that gets executed on request, and is otherwise not loaded/running, then you'll have to switch to mod_perl, FastCGI or something similar, where your DBI connections can be cached.
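To make the caching idea concrete, here is a minimal sketch of the Apache::DBI approach under mod_perl; the DSN, user and password are placeholders for whatever your real database uses:

```perl
# In startup.pl (mod_perl): load Apache::DBI *before* anything
# that loads DBI, so it can install its handle-caching layer.
use Apache::DBI;
use DBI;

# In the CGI/handler code, connect as usual; Apache::DBI
# transparently reuses an existing handle made with the same
# connect arguments instead of opening a fresh connection.
my $dbh = DBI->connect(
    'dbi:mysql:database=mydb;host=localhost',   # placeholder DSN
    'user', 'password',                         # placeholder credentials
    { RaiseError => 1, AutoCommit => 1 },
);
```

The script itself doesn't change at all; the caching happens behind the normal DBI->connect call, which is what makes this approach attractive.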
Though, I have to admit, I'm a bit confused by your wording. If you don't reuse connections, you never have to worry about them timing out, since you'd be creating a new connection each time your CGI code was executed.
Re: Looking for a way to use one database connection with multiple CGI scripts
by WizardOfUz (Friar) on Nov 23, 2009 at 17:49 UTC
I'm afraid there is no simple way to share database connections between different (plain) CGI processes. You could use something like DBI::ProxyServer or POE::Component::EasyDBI, for example, to implement a stand-alone connection pool manager for your CGI processes, but I wouldn't recommend it. In persistent environments like FastCGI or mod_perl, however, it is possible (and actually quite easy) to reuse DBI database handles on a per process basis. Take a look at Apache::DBI for inspiration.
Re: Looking for a way to use one database connection with multiple CGI scripts
by almut (Canon) on Nov 23, 2009 at 17:55 UTC
that way it won't have unused connections waiting to time out or anything that could cause bad things to happen
I'm afraid that, in general, more bad things are likely to happen when you try to use shared DB connections with CGI scripts ;) Anyhow, if you really need to do so (for performance reasons), you might want to look into connect_cached, or Apache::DBI (when using mod_perl). But in that case, be particularly careful with forking in combination with persistent DB connections (see InactiveDestroy).

Also note that "normal" CGI scripts start up a new process for every HTTP request; what you'd really want are persistent request handlers that allow you to keep a DB connection open, as offered by FastCGI or mod_perl. Otherwise, you'd need a separate "connection broker" (a persistently running process) that you connect to from the CGI script (e.g. via a socket) in order to delegate the DB query to a free DB handle from a pool of handles managed by the broker (in the hopes that the communication overhead with the broker is less than the overhead of simply opening a new DB connection every time...)
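A small runnable sketch of the connect_cached behaviour, using an in-memory SQLite database as a stand-in for your real one (the DSN is a placeholder):

```perl
use strict;
use warnings;
use DBI;

# Two connect_cached calls with identical arguments return the
# same underlying handle; DBI keeps it in a per-process cache.
my $dsn  = 'dbi:SQLite:dbname=:memory:';   # SQLite stands in for your real DB
my $dbh1 = DBI->connect_cached($dsn, '', '', { RaiseError => 1 });
my $dbh2 = DBI->connect_cached($dsn, '', '', { RaiseError => 1 });

print $dbh1 == $dbh2 ? "same handle\n" : "new handle\n";

# If you fork while holding a persistent handle, mark the child's
# copy so its destruction doesn't tear down the parent's connection:
# $dbh1->{InactiveDestroy} = 1;   # in the child, before it exits
```

Note that the cache is per process, so this only pays off in a persistent environment; a plain CGI script would still connect anew on every request.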
Re: Looking for a way to use one database connection with multiple CGI scripts
by CountZero (Bishop) on Nov 23, 2009 at 22:25 UTC
Either you manage a pool of connections à la Apache::DBI, or you open the connection at the start of each script and close it at the end. In the first case you will have a limited number of connections "waiting" before they are used, but that is OK: the only cost is some memory. In the second case you have no "idle" connections, but you pay the cost of setting up and tearing down a connection each time the script runs.

Having different (or even the same) scripts use the same connection simultaneously is much more likely to cause problems. Anyhow, a modern database server allows hundreds (if not thousands) of simultaneous connections. The manual of MySQL 5.1 says:

The maximum number of connections MySQL can support depends on the quality of the thread library on a given platform, the amount of RAM available, how much RAM is used for each connection, the workload from each connection, and the desired response time. Linux or Solaris should be able to support at least 500 to 1,000 simultaneous connections routinely and as many as 10,000 connections if you have many gigabytes of RAM available and the workload from each is low or the response time target undemanding. Windows is limited to (open tables × 2 + open connections) < 2048 due to the Posix compatibility layer used on that platform.
CountZero "A program should be light and agile, its subroutines connected like a string of pearls. The spirit and intent of the program should be retained throughout. There should be neither too little nor too much, neither needless loops nor useless variables, neither lack of structure nor overwhelming rigidity." - The Tao of Programming, 4.1 - Geoffrey James
Re: Looking for a way to use one database connection with multiple CGI scripts (Pg)
by erix (Prior) on Nov 23, 2009 at 19:17 UTC
It seems to me you want a connection pooler: a program that, at startup, prepares a smallish pool of connections. Programs request a connection from the pool and, immediately after the DB access, relinquish it back into the pool.
A popular and light-weight connection pooler for the PostgreSQL database is PgBouncer, from the Skype developers.
PgBouncer site / manual
and its pgfoundry repository
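For illustration, a minimal pgbouncer.ini might look something like this; the database name, ports, paths and pool sizes here are placeholders, not recommendations:

```ini
[databases]
; clients connecting to "mydb" on the pooler are mapped
; onto the real PostgreSQL server behind it
mydb = host=127.0.0.1 port=5432 dbname=mydb

[pgbouncer]
listen_addr = 127.0.0.1
listen_port = 6432
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
pool_mode = transaction   ; return server connection to the pool after each transaction
max_client_conn = 100
default_pool_size = 20
```

Your CGI scripts then connect to port 6432 instead of 5432, and each short-lived script borrows an already-open server connection instead of paying the connect cost itself.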
Re: Looking for a way to use one database connection with multiple CGI scripts
by pklausner (Scribe) on Nov 23, 2009 at 22:57 UTC
An alternative to mod_perl et al. is SpeedyCGI. You still fork one process per CGI request, but the actual script gets loaded into a background daemon that persists for a given time or number of invocations. The connection set-up would go into the script's global variable space, i.e. into the backend: the first caller initializes it, and subsequent calls re-use it until the backend terminates (and takes any memory leaks with it).
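The "global variable space" idea above can be sketched like this; the DSN and credentials are placeholders, and the ping check is one common way to guard against a handle that has silently gone stale between requests:

```perl
#!/usr/bin/perl
# Run under SpeedyCGI by pointing the shebang at speedy instead of
# perl; the code below then persists inside a backend process.
use strict;
use warnings;
use DBI;

# A file-scoped handle survives between invocations of the same
# backend; only the first request pays the connection cost.
our $dbh;
unless ( $dbh && $dbh->ping ) {
    $dbh = DBI->connect(
        'dbi:Pg:dbname=mydb',      # placeholder DSN
        'user', 'password',        # placeholder credentials
        { RaiseError => 1, AutoCommit => 1 },
    );
}

# ... handle the request using $dbh as usual ...
```

The same pattern works under any persistent environment (FastCGI, mod_perl); what SpeedyCGI changes is only how the persistent process gets started and recycled.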