in reply to Best practices for closing database connections?

The current "best practices" are to use a framework such as Mojolicious, Catalyst, or Dancer. These run as a persistent service behind a more robust, battle-hardened front-end proxy like Apache, Nginx, or Traefik. Each framework has its own way of managing a single database connection object per worker, usually wrapped with DBIx::Class, usually persistent for the life of the worker, and usually with built-in code that reconnects and re-runs your query if the connection drops mid-query. This is the state of the art for Perl web development, and it fairly closely mirrors the state of the art in Python or Ruby. Mojo is actually a bit ahead of the curve here.
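The reconnect-on-drop behavior those frameworks bake in can be sketched with plain DBI: keep one handle per worker, ping it before use, and reconnect if it has died. This is only a sketch, with an in-memory SQLite database standing in for a real network database:

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use DBI;

# In-memory SQLite DSN, used here purely for illustration; a real
# deployment would point at Postgres, MySQL, etc.
my $dsn = 'dbi:SQLite:dbname=:memory:';

my $dbh;

# Return a live handle, reconnecting if the old one has gone away --
# a bare-bones version of what the frameworks' connectors do for you.
sub dbh {
    unless ( $dbh && $dbh->ping ) {
        $dbh = DBI->connect( $dsn, '', '', { RaiseError => 1 } );
    }
    return $dbh;
}

dbh()->do('CREATE TABLE t (n INTEGER)');
dbh()->do('INSERT INTO t (n) VALUES (42)');

# Simulate a dropped connection; the next dbh() call reconnects.
$dbh->disconnect;
my $fresh = dbh();
```

The frameworks go further (re-running the interrupted query, transaction awareness), but the ping-then-reconnect loop is the core of it.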

But, since you are trying to fix an old script, you might not be interested in totally rewriting it with those new tools. So assuming you are running a process per request like classic CGI, you just need to create some kind of global database handle and then refer back to it any time you need to run a query. There are modules that can do this for you, but you *could* do something as simple as this:

    # in the main CGI script
    sub dbh { $main::dbh ||= DBI->connect(...); }

    # anywhere you need to run a query
    main::dbh()->selectall_arrayref(...);
where you just refer to main::dbh() every time you want to access your connection.

Solutions using purpose-built modules will be prettier than this, but they also come with a bit more of a learning curve.
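Short of pulling in a module, DBI itself ships a middle ground: connect_cached() returns the same handle for identical connection arguments, pinging the cached handle first and reconnecting if it has died. A minimal sketch, again with an in-memory SQLite DSN as a stand-in:

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use DBI;

# Placeholder DSN for illustration.
my $dsn = 'dbi:SQLite:dbname=:memory:';

# connect_cached() keys its cache on the connection arguments, so every
# call with the same DSN and attributes hands back one shared handle.
sub dbh { DBI->connect_cached( $dsn, '', '', { RaiseError => 1 } ) }

my $h1 = dbh();
my $h2 = dbh();
# $h1 and $h2 are the same underlying handle
```

This gets you the "one connection per process, revived if it drops" behavior without any global variable of your own.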

As a side-note, I 100% agree with everyone else that you should unconditionally use query placeholders for every query.
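For anyone following along, placeholders mean letting the driver bind the values, so user input can never change the shape of the SQL. A small demonstration (table and values are made up; in-memory SQLite for illustration):

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect( 'dbi:SQLite:dbname=:memory:', '', '',
    { RaiseError => 1 } );

$dbh->do('CREATE TABLE users (name TEXT, role TEXT)');

# Even a hostile-looking value is stored verbatim, never executed,
# because it is passed as a bind value rather than interpolated.
my $name = "o'brien; DROP TABLE users";
$dbh->do( 'INSERT INTO users (name, role) VALUES (?, ?)',
    undef, $name, 'monk' );

my ($stored) = $dbh->selectrow_array(
    'SELECT name FROM users WHERE role = ?', undef, 'monk' );
```

Compare that with interpolating $name into the SQL string, where the embedded quote alone would be a syntax error at best and an injection at worst.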

Replies are listed 'Best First'.
Re^2: Best practices for closing database connections?
by Jenda (Abbot) on Mar 22, 2022 at 15:25 UTC

    I seriously doubt anyone ran Perl process per request for two decades. Framework or no framework, you had mod_perl, PerlIS.dll, ...

    Jenda
    1984 was supposed to be a warning,
    not a manual!

      I know of a few places where this happens still.