Each time I create a new "key" for a value counts as a run (perhaps I should have used a different word), so I am doing over a million runs.
It's not clear whether you answered my question. How many times does the shell load and execute your multi-module perl script? Once, or over a million times?
If the latter, then I would strongly suggest that you refactor things so that you can generate a large quantity of keys in a single shell-command-line run -- running millions of processes in sequence (each one a presumably bulky script with a database connection and 5 queries) is bound to be costing you a lot of time. By generating a large batch of keys in a single process, you might save a lot on DB connection, statement preparation, etc., in addition to OS process-management overhead.
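For what it's worth, here is a rough (untested) sketch of what I mean -- the DSN, the table and column names, and the normalize() routine are only placeholders, and your real script presumably has five queries rather than two. The point is simply that the connect and prepare happen once, and only the execute/fetch happens per value:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Connect and prepare once, then loop over every value in a single process.
    # DSN, credentials, table and column names are made up for illustration.
    my $dbh = DBI->connect('dbi:Pg:dbname=appdb', 'user', 'password',
                           { RaiseError => 1, AutoCommit => 1 });

    my $lookup = $dbh->prepare('SELECT id FROM normalized_keys WHERE raw_value = ?');
    my $insert = $dbh->prepare('INSERT INTO normalized_keys (raw_value, normalized) VALUES (?, ?)');

    while (my $value = <STDIN>) {    # feed the million+ values on one stream
        chomp $value;
        $lookup->execute($value);
        next if $lookup->fetchrow_arrayref;       # key already generated
        $insert->execute($value, normalize($value));
    }

    $dbh->disconnect;

    sub normalize { return lc shift }   # stand-in for the real normalization logic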
And/or maybe you can speed things up a bit by running multiple instances in parallel? (But then you have to make sure they don't interfere with each other, or overlap, causing redundant runs.)
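One minimal way to partition the work so the instances can't overlap -- again just a sketch, where read_all_values() and process_values() are made-up stand-ins for your actual loading code and the batched loop above -- is to fork N workers and give each one every Nth value:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Fork N workers, each taking a disjoint slice of the values so that
    # no value is processed twice.
    my $workers = 4;
    my @values  = read_all_values();

    for my $w (0 .. $workers - 1) {
        my $pid = fork;
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {             # child: take every Nth value, offset by $w
            process_values( @values[ grep { $_ % $workers == $w } 0 .. $#values ] );
            exit 0;
        }
    }
    waitpid(-1, 0) for 1 .. $workers;   # parent waits for all children to finish

    sub read_all_values { return () }   # placeholder
    sub process_values  { return    }   # placeholder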
Once per customer, since we're doing a version upgrade and the "normalized keys" are a new feature. We have quite a few customers...
And I suppose each customer has their own particular data requiring their own distinct set of normalized keys? Obviously, any amount of commonality across customers should be exploited (keep results from one to use on another, if at all possible).
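For example (purely illustrative -- the cache file name and normalize() are placeholders), you could persist a value-to-key map between customer runs, so anything already computed for an earlier customer is just looked up rather than recomputed:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Storable qw(nstore retrieve);

    # Keep a value => normalized-key map on disk so that keys already
    # computed for one customer can be reused for the next.
    my $cache_file = 'normalized_keys.cache';
    my $cache      = -e $cache_file ? retrieve($cache_file) : {};

    sub normalized_key {
        my ($value) = @_;
        return $cache->{$value} //= normalize($value);   # compute only on a miss
    }

    sub normalize { return lc shift }   # placeholder for the real normalization logic

    # ... process one customer's values via normalized_key() ...

    nstore($cache, $cache_file);   # persist for the next customer's run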