Re: Best practices for closing database connections?
by hippo (Archbishop) on Mar 17, 2022 at 10:01 UTC
|
> Should a single connection be opened, and all queries funneled through that before closing it (and the script itself)?
This would be my default approach unless there is a good reason for either multiple concurrent connections within your one script or the expectation of a long wait between queries (which would mean holding a connection open for no good reason). The definition of "a long wait" will depend on your situation.
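A minimal sketch of that default approach — connect once, funnel everything through the one handle, disconnect at the end. DSN, credentials, table and the user id are all made up:
use strict;
use warnings;
use DBI;

# One connection for the whole run; DSN, credentials and table are made up.
my $dbh = DBI->connect('dbi:mysql:database=myapp', 'someuser', 'somepass',
    { RaiseError => 1, AutoCommit => 1 });

my $user_id = 42;    # stand-in for a real session value

my $prefs = $dbh->selectrow_hashref(
    'SELECT * FROM preferences WHERE user_id = ?', undef, $user_id);

# ... every other query in the script goes through the same $dbh ...

$dbh->disconnect;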
| [reply] |
Re: Best practices for closing database connections?
by Discipulus (Canon) on Mar 17, 2022 at 11:06 UTC
|
Hello Polyglot,
> Should the connection be explicitly closed? and if so, should some command, e.g. "commit", be given first? Should a single connection be opened, and all queries funneled through that before closing it (and the script itself)?
In my limited experience, the usual practice is to disconnect as soon as possible, obviously after committing your transactions (autocommit is also available). And yes, a single connection, as the wiser hippo suggests.
Tricks with DBI is still worth reading after (ugh!) 22 years, as are The fine art of database programming, Databases made easy and, notably, DBI recipes by our sleeping brother gmax.
PS from Programming the Perl DBI
> Explicit disconnection from the database is not strictly necessary if you are exiting from your program after you have performed all the work, but it is a good idea. We strongly recommend that you get into the habit of disconnecting explicitly.
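A minimal sketch of that habit with AutoCommit switched off; DSN, credentials, table name and user id are made up:
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:mysql:database=myapp', 'someuser', 'somepass',
    { RaiseError => 1, AutoCommit => 0 });

my $user_id = 42;    # stand-in for a real session value

$dbh->do('UPDATE users SET last_login = now() WHERE id = ?', undef, $user_id);

$dbh->commit;        # make the work permanent first
$dbh->disconnect;    # then close explicitly rather than relying on exit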
L*
There are no rules, there are no thumbs..
Reinvent the wheel, then learn The Wheel; may be one day you reinvent one of THE WHEELS.
| [reply] [d/l] [select] |
|
Thank you for the tips and those links. Those are helpful.
The problem I have with understanding or using placeholders, as mentioned in the first link, is that my queries are all different. For example, I query the user's privileges, usage statistics, preferences, the language parameters for the browser interface, the content for various divs on the page, etc., in addition to setting certain parameters in the database, such as updating the user's last login/connection time. Much of this is repeated with each subsequent AJAX-driven request from the client's browser, but each request is a new and separate invocation of the script. There's hardly a single query that would consistently run multiple times within one execution of the script, so I am baffled as to the benefits of placeholders in such a case.
In fact, seeing as each of these queries originates in a separate subroutine of the script, it will be challenging enough for me to open one global connection for all such queries and then close it before the script exits, rather than opening the connection for each query as I have done to date.
| [reply] [d/l] |
Re: Best practices for closing database connections?
by davido (Cardinal) on Mar 17, 2022 at 14:25 UTC
|
Let DBIx::Connector handle the issues around long-lived database connections, such as dropped connections. It's a really good module for this sort of thing. It is a lot more efficient to keep a frequently used DB connection open, but then you have to start dealing with handles going dead on you for various reasons. DBIx::Connector manages this so you don't have to.
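A minimal sketch of how that tends to look; DSN, credentials and the query are made up:
use strict;
use warnings;
use DBIx::Connector;

my $conn = DBIx::Connector->new('dbi:mysql:database=myapp', 'someuser', 'somepass',
    { RaiseError => 1, AutoCommit => 1 });

# 'fixup' re-connects and re-runs the block if the handle turns out to be dead.
my $prefs = $conn->run(fixup => sub {
    $_->selectrow_hashref('SELECT * FROM preferences WHERE user_id = ?',
        undef, 42);
});
The 'ping' and 'no_ping' connection modes are also available if you would rather not have the block retried automatically.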
| [reply] |
Re: Best practices for closing database connections?
by erix (Prior) on Mar 17, 2022 at 11:43 UTC
|
You haven't said which database system. In case of PostgreSQL: pgbouncer, a connection pooler, might be useful.
pgbouncer website
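Connecting through pgbouncer is just a normal DBI connection pointed at its listen port (6432 by default) instead of PostgreSQL's 5432; pgbouncer then holds the pool of real server connections. A minimal sketch with made-up database name and credentials:
use strict;
use warnings;
use DBI;

# Same code as a direct connection; only the port differs.
my $dbh = DBI->connect(
    'dbi:Pg:dbname=myapp;host=127.0.0.1;port=6432',
    'someuser', 'somepass',
    { RaiseError => 1 },
);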
| [reply] |
Re: Best practices for closing database connections?
by NERDVANA (Priest) on Mar 22, 2022 at 09:32 UTC
|
The current "best practices" are to use a framework like Mojolicious, Catalyst, or Dancer. These run as a service that handles incoming requests behind a more robust, battle-hardened front-end proxy such as Apache, Nginx, or Traefik. Each framework has its own way of managing a single database connection object per worker, usually wrapped with DBIx::Class, persistent for the life of the worker, and backed by built-in code that reconnects and re-runs your query if the connection drops mid-query. This is the state of the art for Perl web development, and it fairly closely mirrors the state of the art in Python or Ruby. Mojo is actually a bit ahead of the curve here.
But, since you are trying to fix an old script, you might not be interested in totally rewriting it with those new tools. So assuming you are running a process per request like classic CGI, you just need to create some kind of global database handle and then refer back to it any time you need to run a query. There are modules that can do this for you, but you *could* do something as simple as this:
# in the main CGI script
sub dbh {
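# connect on the first call; later calls return the cached handle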
$main::dbh ||= DBI->connect(...);
}
main::dbh()->selectall_arrayref(...);
where you just refer to main::dbh() every time you want to access your connection.
Solutions using purpose-built modules will be prettier than this, but also have a bit more of a learning curve.
As a side-note, I 100% agree with everyone else that you should unconditionally use query placeholders for every query.
| [reply] [d/l] [select] |
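To illustrate that side-note: a minimal sketch with made-up table, column and credential names, showing that a placeholder pays off even when the statement runs only once per script invocation:
use strict;
use warnings;
use CGI;
use DBI;

my $q     = CGI->new;
my $login = $q->param('login');   # untrusted input from the browser

my $dbh = DBI->connect('dbi:mysql:database=myapp', 'someuser', 'somepass',
    { RaiseError => 1 });

# The ? keeps the untrusted value out of the SQL text, so quoting and
# injection are handled by the driver -- no statement reuse required.
my $user = $dbh->selectrow_hashref(
    'SELECT id, privileges FROM users WHERE login = ?',
    undef, $login,
);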
|
I seriously doubt anyone has been running a Perl process per request for the last two decades. Framework or no framework, you had mod_perl, PerlIS.dll, ...
Jenda
1984 was supposed to be a warning,
not a manual!
| [reply] |
Re: Best practices for closing database connections?
by bliako (Abbot) on Mar 17, 2022 at 14:43 UTC
|
If you are using Mojolicious (not sure if you can do this with CGI), you can store the DB connection and reuse it.
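One way that might look, as a minimal Mojolicious::Lite sketch; the DSN, credentials, table and route are made up:
use Mojolicious::Lite -signatures;
use DBI;

# One cached handle per worker process.
helper dbh => sub ($c) {
    state $dbh = DBI->connect('dbi:mysql:database=myapp', 'someuser', 'somepass',
        { RaiseError => 1, AutoCommit => 1 });
    return $dbh;
};

get '/prefs' => sub ($c) {
    my $prefs = $c->dbh->selectrow_hashref(
        'SELECT * FROM preferences WHERE user_id = ?',
        undef, $c->param('user_id'),
    );
    $c->render(json => $prefs);
};

app->start;
A dropped connection would still need handling on top of this, which is where something like DBIx::Connector, mentioned above, comes in.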
| [reply] |