in reply to Sheer Size

At my previous job, I worked on a Perl project that ran to about 20k lines that we wrote ourselves, plus about 200k lines generated from templates that we still had to edit by hand. This was an interactive program that acted as a network element in testing cellphone base stations.

We had ZERO performance problems from the program. We didn't even try to optimize once. We simply wrote good Perl code and measured our performance by how fast we could make changes to the code. (Since we were in a testing group, we had to keep up with the changes in the programs we were testing, as new features were turned on.) We got our turnaround time down to under an hour, in most cases.

We also (Thank the Gods!) had no DBI work to speak of, so I can't say much about that.

Just as a thought: if this program will always be connecting to the same database(s), I would look at something like mod_perl, but for databases. That is, a long-running process that opens the connection once and keeps it open. Then each Perl program that starts can connect to this process and ask it to do work on its behalf. That should reduce your overhead in DB connections and the like.
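A minimal sketch of that idea, assuming DBI and a Unix-domain socket (the socket path, DSN, and credentials below are made-up placeholders, not anything from a real setup): a long-running proxy opens the database handle once at startup, then serves one-line requests from the short-lived client scripts.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use IO::Socket::UNIX;
use DBI;

# Hypothetical placeholders -- adjust for your environment.
my $socket_path = '/tmp/db-proxy.sock';
my $dsn         = 'dbi:Pg:dbname=testdb';

# Open the database connection ONCE, when the proxy starts.
my $dbh = DBI->connect($dsn, 'user', 'pass', { RaiseError => 1 })
    or die "Cannot connect: $DBI::errstr";

unlink $socket_path;
my $server = IO::Socket::UNIX->new(
    Type   => SOCK_STREAM,
    Local  => $socket_path,
    Listen => 5,
) or die "Cannot listen on $socket_path: $!";

# Each short-lived client sends one SQL statement per line and
# reads back a crude pipe-delimited result, reusing $dbh every time.
while (my $client = $server->accept) {
    while (my $sql = <$client>) {
        chomp $sql;
        my $rows  = eval { $dbh->selectall_arrayref($sql) };
        my $reply = $@ ? "ERROR: $@" : join('|', map { "@$_" } @$rows);
        print $client "$reply\n";
    }
    close $client;
}
```

A client script would then connect to `/tmp/db-proxy.sock` with IO::Socket::UNIX, print its SQL, and read the reply, skipping the per-run DBI connect entirely. (A real version would want authentication and placeholders instead of raw SQL; this only illustrates the persistent-connection idea.)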

------
We are the carpenters and bricklayers of the Information Age.

Don't go borrowing trouble. For programmers, this means: "Worry only about what you need to implement."