in reply to Efficiency of a Hash with 1 Million Entries
friendoffriend.pl runs very quickly
Yes, but it's executed 1,000,000 times.
Just loading perl takes 3ms:

$ time perl -e1

real    0m0.003s
user    0m0.004s
sys     0m0.000s
Loading a basic module triples that:

$ time perl -e'use Scalar::Util;'

real    0m0.009s
user    0m0.004s
sys     0m0.004s
For every record, you pay that startup cost all over again: launching perl, compiling the script, and loading any modules it uses. All of that could be avoided if you rewrote friendoffriend.pl as a module and called it from a single long-running process.
If that avoidable per-record overhead adds up to 12ms, converting friendoffriend.pl into a module saves 12ms × 1,000,000 = 12,000 seconds, or 3 hours and 20 minutes.
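As a rough sketch of what that conversion could look like (the names FriendOfFriend and process_record are made up here, and the placeholder body stands in for whatever friendoffriend.pl actually does per record):

```perl
use strict;
use warnings;

# Hypothetical sketch only: FriendOfFriend and process_record are
# invented names; the real per-record logic from friendoffriend.pl
# would replace the placeholder body.
package FriendOfFriend;

sub process_record {
    my ($record) = @_;
    # ... the former body of friendoffriend.pl would go here ...
    return scalar reverse $record;    # placeholder work
}

package main;

# One perl startup and one compile, then a cheap in-process
# function call per record instead of a fresh perl per record.
my @records = ('alice', 'bob', 'carol');
print FriendOfFriend::process_record($_), "\n" for @records;
```

The driver loop calls the sub a million times inside one process, so the 12ms startup tax is paid once rather than once per record.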
So, will this large hash significantly impact execution time?
It shouldn't. Assuming all memory access costs the same, inserting into and fetching from a Perl hash takes O(1) time. In other words, the time to do those actions does not increase as the size of the hash increases.
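A quick way to check this claim for yourself (the sizes and key names here are arbitrary) is to time fetches against hashes of very different sizes; if fetching is O(1), the times should be roughly flat:

```perl
use strict;
use warnings;
use Time::HiRes ('time');

# Rough benchmark sketch: time 100_000 fetches against a small hash
# and a large one. O(1) fetches means similar times for both.
for my $size (1_000, 1_000_000) {
    my %h;
    $h{"key$_"} = $_ for 1 .. $size;

    my $start = time();
    my $sum = 0;
    $sum += $h{"key$size"} for 1 .. 100_000;    # repeated fetches
    printf "%9d entries: %.4fs for 100_000 fetches\n",
        $size, time() - $start;
}
```

The build step itself grows linearly with $size, of course; it's the per-fetch (and per-insert) cost that stays constant.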
If you use enough memory to require swapping, that would break the assumption.
Update: Reorganised to be clearer. Added actual numbers.