in reply to Use of undef slowing down perl?

The next time you do this, try using the Benchmark module. It builds in all the grunt work for comparing two snippets.
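A minimal sketch of that kind of comparison (the snippet names and workloads here are purely illustrative, not the original poster's code):

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

# Compare two ways of clearing a hash between fills.
# cmpthese(-2, ...) runs each sub for about 2 CPU seconds
# and prints a rate-comparison table.
cmpthese(-2, {
    clear_with_undef => sub {
        my %h;
        $h{$_} = 1 for 1 .. 1_000;
        undef %h;     # releases the hash's storage
    },
    clear_with_empty => sub {
        my %h;
        $h{$_} = 1 for 1 .. 1_000;
        %h = ();      # empties the hash but keeps its allocated buckets
    },
});
```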

Replies are listed 'Best First'.
Re^2: Use of undef slowing down perl?
by ikegami (Patriarch) on Oct 12, 2005 at 17:28 UTC

    Benchmark isn't useful here; we don't have two snippets to compare. Benchmark averages multiple runs of different code, whereas we want the times of two successive runs of the same code.

    The problem at hand is not a comparison of the performance of two snippets. It is a slowdown (presumably related to virtual memory) on the second run, even though the second run should be reusing the memory freed by the first.
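    Measuring that is a matter of timing each pass separately rather than averaging them. A sketch with Time::HiRes (the workload size is an illustrative stand-in, not the original poster's code):

```perl
use strict;
use warnings;
use Time::HiRes qw(gettimeofday tv_interval);

my %h;
for my $pass (1, 2) {
    my $t0 = [gettimeofday];
    # Illustrative workload: fill the hash with many entries.
    $h{$_} = 'x' x 100 for 1 .. 100_000;
    printf "pass %d: %.3fs\n", $pass, tv_interval($t0);
    undef %h;    # free the hash; pass 2 should be able to reuse this memory
}
```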

      No, wrong. You want to compare the average and total speed of insertion before and after that undef(%foobar) operation. This is clearly something I'd use Benchmark for. Perhaps there's a big time hit when the hash is first reused, but it goes away as the hash is used more.

        The problem is related to the available memory size. By my estimates, about available_MB_of_mem * 10000 iterations need to be run before the undef. Any other amount will change the outcome and hide the problem.

        The question isn't *whether* something is different about the first pass; the question is *what* is different.

        True, you could use Benchmark to repeat the bit after the undef, but that's not what you were suggesting initially. You were suggesting using it for *both* snippets.