Hi tilly,
Thank you: this is really sound advice. In a way it's common sense, but it was common sense I hadn't been applying.
I first quantified just "how much too slow" the overall program is: it's averaging about 75 seconds per 1000 records, and I'm aiming for 24 seconds, so that gave me a concrete target.
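For the curious, here's roughly how I'm measuring that; a minimal sketch where process_record() is just a stand-in for the real per-record work:

    use strict;
    use warnings;
    use Time::HiRes qw(gettimeofday tv_interval usleep);

    sub process_record { usleep(1_000) }   # stand-in: ~1ms of "work" per record

    # Time one batch and report throughput in the same units as my target.
    my $batch = 1000;
    my $t0    = [gettimeofday];
    process_record($_) for 1 .. $batch;
    my $elapsed = tv_interval($t0);
    printf "%.1f seconds per %d records\n", $elapsed, $batch;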
Next, I looked at CPU usage, and indeed it *is* climbing quite slowly -- indicating a DB or network bottleneck. I checked for indices and all seemed okay, so I simply copied the relevant DB tables to the machine running the analysis. BOOM: averaging 49 seconds per 1000 records.
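In case it helps anyone else doing the same check: verifying that the hot query actually uses an index is quick with EXPLAIN from DBI. A rough sketch only -- the DSN, credentials, and the property_values table/column names are invented, and it assumes MySQL:

    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:mysql:database=analysis;host=localhost',
                           'user', 'password', { RaiseError => 1 });

    # EXPLAIN shows whether the per-record lookup hits an index
    # (non-empty "key" column) or falls back to a full table scan.
    my $plan = $dbh->selectall_arrayref(
        'EXPLAIN SELECT value FROM property_values WHERE record_id = ?',
        { Slice => {} },
        42,
    );
    for my $row (@$plan) {
        printf "table=%s key=%s rows=%s\n",
            $row->{table} // '?', $row->{key} // 'NO INDEX', $row->{rows} // '?';
    }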
So big-picture thinking got me halfway there. I'm now profiling to find exactly where the code is slow, and to decide whether it's worth loading the property tables into memory.
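Here's the kind of comparison I have in mind for that decision; a sketch with made-up table and column names, using the core Benchmark module to pit an in-memory hash against a per-record SELECT:

    use strict;
    use warnings;
    use Benchmark qw(cmpthese);
    use DBI;

    my $dbh = DBI->connect('dbi:mysql:database=analysis', 'user', 'password',
                           { RaiseError => 1 });

    # Load the whole property table into a hash once up front...
    my %property = map { $_->[0] => $_->[1] } @{
        $dbh->selectall_arrayref('SELECT record_id, value FROM property_values')
    };

    # ...then compare hash lookups against one SELECT per record.
    my $sth = $dbh->prepare('SELECT value FROM property_values WHERE record_id = ?');
    cmpthese(-2, {
        in_memory => sub { my $v = $property{42} },
        per_query => sub {
            $sth->execute(42);
            my ($v) = $sth->fetchrow_array;
        },
    });

For the profiling itself, a profiler like Devel::NYTProf (run with perl -d:NYTProf script.pl, then nytprofhtml for the report) gives a line-by-line picture of where the time goes.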
Good solid advice, thanks. ++tilly.