in reply to Leashing DBI

I try to keep code that calls DBI functions in its own subroutines, and I get the database handle, set environment variables, and so on from a Perl module. I also tell Perl what the schema of a database is going to be, and I check inside these separate routines that everything is legal (although, being lazy, I tend to ignore the hashes I've made up that tell me which columns can be NULL).
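A minimal sketch of that kind of module (the package name MySite::DB, the schema hash, and the connection details are all made up for illustration; I use an in-memory SQLite database here so the sketch stands alone, where a real site would read a DSN and credentials from the environment):

```perl
package MySite::DB;
use strict;
use warnings;
use DBI;

# Hypothetical schema description: table => { column => can_be_NULL? }.
# The "legal field" checks mentioned above would consult this hash.
our %schema = (
    users => { id => 0, arigatopoints => 1 },
);

my $dbh;    # cached handle so every script shares one connection

sub dbh {
    return $dbh if $dbh;
    # Hypothetical DSN: an in-memory SQLite database, purely so this
    # sketch is self-contained and runnable without a server.
    $dbh = DBI->connect( "dbi:SQLite:dbname=:memory:", "", "",
        { RaiseError => 1, AutoCommit => 1 } );
    return $dbh;
}

1;
```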

So I can get a field or update arbitrary fields with this kind of thing:

    my $oldpoints = &getfield("users", $id, "arigatopoints");
    $ret = &updatefields($table, $id, \@fields, \@vals);
But it depends on what I'm trying to do. One report I generate is built around parsing a single complex query. But on a web site I tend to do lots of little things like the code above, and the bulk of the code is site logic. I put the getfield and updatefields routines in my library module, out of the way, so I can get at all the SQL at once if I need to, and I can include that module in other programs too.
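For what it's worth, the two helpers can be as small as this. A sketch only: I pass the handle in explicitly here so the code stands alone, whereas my real versions fetch it from the library module, and the table and field names are assumed to have already passed the schema checks (only values go through placeholders):

```perl
use strict;
use warnings;
use DBI;

# Fetch one column from one row, looked up by id.
sub getfield {
    my ( $dbh, $table, $id, $field ) = @_;
    my ($val) = $dbh->selectrow_array(
        "SELECT $field FROM $table WHERE id = ?", undef, $id );
    return $val;
}

# Update an arbitrary set of fields on one row; @$fields and @$vals
# are parallel arrays, as in the calling code above.
sub updatefields {
    my ( $dbh, $table, $id, $fields, $vals ) = @_;
    my $set = join ", ", map { "$_ = ?" } @$fields;
    return $dbh->do( "UPDATE $table SET $set WHERE id = ?",
        undef, @$vals, $id );
}
```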

If you are doing lots of little things, it might pay to abstract them out of the center of things and see how few subs you can get away with that contain SQL. But the fact is, I recently sped up a script from 3 minutes to under 10 seconds just by doing one complex query instead of lots of little ones interspersed with Perl. SQL engines are fast.
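To make the shape of that change concrete (the table, column, and $dbh setup are invented; the principle is just pushing the aggregation into SQL instead of round-tripping through Perl):

```perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect( "dbi:SQLite:dbname=:memory:", "", "",
    { RaiseError => 1 } );
$dbh->do("CREATE TABLE users (id INTEGER PRIMARY KEY, arigatopoints INTEGER)");
$dbh->do("INSERT INTO users VALUES (?, ?)", undef, $_, $_ * 10 ) for 1 .. 3;

# Slow shape: one little query per row, glued together in Perl.
my $slow_total = 0;
for my $id ( 1 .. 3 ) {
    my ($pts) = $dbh->selectrow_array(
        "SELECT arigatopoints FROM users WHERE id = ?", undef, $id );
    $slow_total += $pts;
}

# Fast shape: let the SQL engine do the whole thing in one pass.
my ($fast_total) =
    $dbh->selectrow_array("SELECT SUM(arigatopoints) FROM users");
```

Both give the same answer; the second makes one trip to the engine instead of one per row, which is where the 3-minutes-to-seconds kind of win comes from.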